EPSRC Reference: |
EP/F01869X/1 |
Title: |
A Tongue Movement Command and Control System Based on Aural Flow Monitoring |
Principal Investigator: |
Vaidyanathan, Dr R |
Other Investigators: |
|
Researcher Co-Investigators: |
|
Project Partners: |
|
Department: |
Mechanical Engineering |
Organisation: |
University of Bristol |
Scheme: |
First Grant Scheme |
Starts: |
01 April 2008 |
Ends: |
30 September 2011 |
Value (£): |
269,165
|
EPSRC Research Topic Classifications: |
Human-Computer Interactions |
Med.Instrument.Device& Equip. |
|
EPSRC Industrial Sector Classifications: |
|
Related Grants: |
|
Panel History: |
Panel Date | Panel Name | Outcome |
19 Sep 2007 | Healthcare Engineering Panel (ENG) | Announced |
|
Summary on Grant Application Form |
Although there is a well-recognized need in society for effective tools to enable the physically impaired to be more independent and productive, existing technology still fails to meet many of their needs. In particular, nearly all mechanisms designed for human control of peripheral devices require users to generate the input signal through bodily movements, most often with their hands, arms, legs, or feet. Such devices clearly exclude individuals with limited appendage control. Spinal cord injuries, repetitive strain injuries, severe arthritis, loss of motion due to stroke, and central nervous system (CNS) disorders are all examples of these impairments. Past work has attempted to address this need by exploiting the potential of the oral cavity for control input, since victims of stroke, spinal damage, and arthritis can often still move their tongue and mouth. Developed mechanisms include track balls, small joysticks, and retainers inserted in the mouth to be manipulated by the tongue, as well as sip-and-puff tubes which respond to concentrated exhaling and inhaling. Despite numerous successful implementations, these devices can be difficult to operate, are problematic if they fall from or irritate the mouth, may impair verbal communication, and present hygiene issues since they must be inserted within the mouth.

The objective of this programme is to surmount these issues through the development of a unique tongue-movement communication and control strategy in a stand-alone device that can be calibrated for patient use with all manner of common household devices and tailored to control assistive mechanisms. The strategy is based on detecting specific tongue motions by monitoring air pressure in the human outer ear, and subsequently providing control instructions corresponding to that tongue movement for peripheral control.
Our research has shown that various movements within the oral cavity create unique, traceable pressure changes in the human ear, which can be measured with a simple sensor (e.g. a microphone) and analyzed to produce commands that can in turn be used to control mechanical devices or other peripherals. Results from this programme will enable patients with quadriplegia, arthritis, limited movement due to stroke, or other conditions causing limited or painful hand/arm movement to interface with their environment and control all manner of equipment, for increased independence and quality of life.
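The pipeline described above (ear-canal microphone → signal analysis → control commands) can be illustrated with a minimal sketch. This is not the project's actual algorithm; the short-time-energy detector, window size, and threshold below are hypothetical stand-ins for whatever classifier the programme developed, shown only to make the sensing-to-command idea concrete.

```python
import math

# Hypothetical parameters: real values would come from calibration.
WINDOW = 64        # samples per analysis frame
THRESHOLD = 0.01   # energy level separating "event" from background

def short_time_energy(samples, window=WINDOW):
    """Mean squared amplitude over consecutive non-overlapping frames."""
    energies = []
    for start in range(0, len(samples) - window + 1, window):
        frame = samples[start:start + window]
        energies.append(sum(x * x for x in frame) / window)
    return energies

def detect_commands(samples, threshold=THRESHOLD, window=WINDOW):
    """Return indices of frames whose energy exceeds the threshold,
    i.e. candidate tongue-movement events in the ear-pressure signal."""
    energies = short_time_energy(samples, window)
    return [i for i, e in enumerate(energies) if e > threshold]

# Synthetic ear-canal signal: silence with one pressure burst
# occupying frame index 2 (samples 128-191).
signal = [0.0] * 256
for n in range(128, 192):
    signal[n] = 0.5 * math.sin(0.4 * n)

print(detect_commands(signal))  # frames overlapping the burst
```

In a real device, each detected event (or pattern of events) would be mapped to a peripheral-control instruction; the grant's contribution lies in showing that distinct tongue motions produce distinguishable pressure signatures, which this toy energy detector does not attempt to classify.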
|
Key Findings |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Potential use in non-academic contexts |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Impacts |
Description |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk |
Summary |
|
Date Materialised |
|
|
Sectors submitted by the Researcher |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Project URL: |
http://faculty.nps.edu/ravi/HMI/Bristol%20Tongue%20Control%20Video.avi |
Further Information: |
|
Organisation Website: |
http://www.bris.ac.uk |