Ker-Jiun Wang1, Prakash Thakur2, John Apostolides3, Emily Ackerman1, Zhi-Hong Mao1
Student Design Competition (Poster)
12:00 - 14:00 | Tue 7 Nov | Auditorium Foyer, E1/E2, Upper Atrium Space | TPO
In the US, over 17.5 million people (and over 1 billion people with disabilities worldwide) live with serious movement disabilities due to limb loss, quadriplegia, stroke, or spinal and brain injuries and pathologies, which limit their ability to interact seamlessly with the world. Current assistive technologies require complicated, cumbersome, and expensive equipment that is neither user-friendly nor portable and often demands extensive fine motor control. EXGBuds is a patent-pending technology that enables hands-free control of smart devices through simple eye movements. Our innovation uses machine learning and non-invasive biosensors worn over the ears to identify eye movement activity with over 95% accuracy. Guided by voice cues, users can control different applications, such as a cell phone, powered wheelchair, smart home, or other Internet of Things (IoT) devices. Fabricated by 3D printing or from paperboard, EXGBuds wearables are fully customizable: sensors placed at different locations can monitor various physiological activities, such as heart and breathing rates, and alert users when levels become dangerous. EXGBuds combines a universal controller, mobility aid, and early warning system in one simple, affordable device. The "X" is a variable representing the many available biosignals, enabling a myriad of applications.
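To make the idea of classifying eye-movement activity from ear-worn biosignals concrete, here is a minimal illustrative sketch. It is not the authors' pipeline: the signal model, thresholds, and function names are all hypothetical, and it reduces "machine learning" to a simple polarity rule on a simulated EOG-like trace, where a leftward or rightward saccade appears as a negative or positive step.

```python
# Hypothetical sketch: detecting horizontal eye movements from a
# simulated EOG-like signal. All names, amplitudes, and thresholds
# are illustrative assumptions, not the EXGBuds implementation.
import random

def simulate_saccade(direction, n=100, noise=0.05):
    """Return a noisy step signal: positive step for 'right', negative for 'left'."""
    amp = 1.0 if direction == "right" else -1.0
    return [amp * (0.0 if i < n // 2 else 1.0) + random.gauss(0, noise)
            for i in range(n)]

def classify(signal):
    """Label a window by the mean shift between its second and first half."""
    half = len(signal) // 2
    delta = sum(signal[half:]) / half - sum(signal[:half]) / half
    if delta > 0.5:
        return "right"
    if delta < -0.5:
        return "left"
    return "none"

random.seed(0)
trials = [("left", simulate_saccade("left")),
          ("right", simulate_saccade("right"))]
accuracy = sum(classify(s) == label for label, s in trials) / len(trials)
print(accuracy)
```

In a real device, the threshold rule would be replaced by a trained classifier over features from multiple sensor channels, which is where the reported 95% accuracy would be measured.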