Tongue Movement for Control of Medical/Robotic Assist Mechanisms

BioRobots, LLC, with support from Think-A-Move, Ltd., is proposing to research and implement a novel technique to provide independent or augmented control of multi-degree-of-freedom (upper) limb prosthetics using tongue movements. The core of this technique is a method of detecting specific tongue motions within the mouth by monitoring air pressure near the human ear, and then issuing the control instruction that corresponds to each tongue movement. Our past work has shown that various movements within the oral cavity create unique, traceable pressure changes within the human ear; these changes can be measured with a simple sensor (e.g. a microphone) and analyzed to produce command signals for controlling a mechanical device.
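
To make the signal chain concrete, the sketch below illustrates one plausible processing pipeline: pressure samples from an in-ear microphone are framed, reduced to a few simple features, and matched against stored gesture templates to emit a device command. The feature set, the gesture and command names, the thresholds, and the nearest-template classifier are all illustrative assumptions for this sketch, not the actual BioRobots/Think-A-Move algorithm.

    # Illustrative sketch only: maps in-ear pressure signals to device commands.
    # Features, templates, thresholds, and gesture/command names are hypothetical,
    # not the actual BioRobots/Think-A-Move implementation.
    import numpy as np

    FRAME = 1024  # samples per analysis frame (assumed block size)

    def features(frame: np.ndarray) -> np.ndarray:
        """Reduce one frame of microphone samples to a small feature vector:
        RMS energy, zero-crossing rate, and normalized dominant-frequency bin."""
        rms = np.sqrt(np.mean(frame ** 2))
        zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0
        spectrum = np.abs(np.fft.rfft(frame))
        peak_bin = float(np.argmax(spectrum)) / len(spectrum)
        return np.array([rms, zcr, peak_bin])

    # Hypothetical gesture templates, e.g. collected in a calibration session.
    TEMPLATES = {
        "tongue_left":  np.array([0.40, 0.10, 0.05]),
        "tongue_right": np.array([0.40, 0.30, 0.15]),
        "tongue_up":    np.array([0.70, 0.20, 0.10]),
    }

    COMMANDS = {  # gesture -> prosthesis command (illustrative mapping)
        "tongue_left": "wrist_rotate_ccw",
        "tongue_right": "wrist_rotate_cw",
        "tongue_up": "hand_open",
    }

    def classify(frame: np.ndarray, reject: float = 0.5) -> str | None:
        """Nearest-template classification with a rejection threshold so that
        unrelated oral activity (chewing, speech) does not trigger commands."""
        f = features(frame)
        gesture, dist = min(
            ((g, np.linalg.norm(f - t)) for g, t in TEMPLATES.items()),
            key=lambda gd: gd[1],
        )
        return gesture if dist < reject else None

    if __name__ == "__main__":
        # Synthetic stand-in for one block of earpiece samples.
        rng = np.random.default_rng(0)
        frame = 0.4 * np.sin(2 * np.pi * 50 * np.arange(FRAME) / FRAME)
        frame += 0.01 * rng.standard_normal(FRAME)
        gesture = classify(frame)
        if gesture is not None:
            print(f"detected {gesture} -> send {COMMANDS[gesture]}")

In a real system the templates would be learned per user during calibration, and the rejection threshold tuned so that everyday mouth activity is ignored.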

Prototype System

A narrated video of the prototype system applied to mobile robot and/or wheelchair control may be downloaded here:

Tongue for Telerobotic Operation BioRobots.mpg     MPG     320x240     4:57     35.3 MB

Tongue for Telerobotic Operation BioRobots.mov     QuickTime     240x180     4:57     13.8 MB

Note that this video was assembled for an award presentation at the 2007 International Conference on Intelligent Robots and Systems (IROS).

Motivation

Upper limb prosthetics are presently difficult and awkward for a patient to control for anything other than simple tasks. Given that the human arm has seven degrees of freedom (dofs) and the hand has more than 20, patient input to direct even simple functions remains a significant challenge. While future work involving neural input, physiological signal-based control, and transplanted nerve regeneration holds great potential, such systems are not ready for patient use today. Under this program we propose the development of a complete system that will give patients higher degree-of-freedom (dof) control of prosthetic limbs by moving their tongue, while needing nothing more than a simple earpiece to provide control signals. This program will afford patients with prosthetic limbs far greater independence than present-day control systems allow. Furthermore, beyond independent (stand-alone) control, our system will integrate seamlessly with other current and future prosthetic limb control systems. Tongue movements can be used for direct (active) control by mapping them onto control commands for artificial limbs, or for indirect (passive) control by augmenting other control schemes (e.g. neural or EMG-based control), as sketched below. At this time, we are not aware of any alternative system that could be available to patients in the near future with the potential to control comparable degrees of freedom without bulky equipment or external bodily movement.
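
As one hypothetical illustration of the indirect (passive) mode, a tongue gesture could select which joint an existing proportional EMG channel drives, so that a single signal can serve several degrees of freedom. The gesture names, joint list, and cycling scheme below are assumptions for illustration, not a specified design.

    # Hypothetical sketch: tongue gestures augment EMG control by selecting
    # the active joint; the EMG amplitude then drives that joint.
    from dataclasses import dataclass

    JOINTS = ["shoulder", "elbow", "wrist", "hand"]  # illustrative dof list

    @dataclass
    class AugmentedController:
        active: int = 0  # index into JOINTS

        def on_tongue_gesture(self, gesture: str) -> None:
            """Passive use of the tongue channel: cycle which joint the EMG
            signal is routed to, without issuing motion commands itself."""
            if gesture == "tongue_forward":
                self.active = (self.active + 1) % len(JOINTS)
            elif gesture == "tongue_back":
                self.active = (self.active - 1) % len(JOINTS)

        def on_emg_sample(self, amplitude: float) -> tuple[str, float]:
            """Route the proportional EMG amplitude to the selected joint."""
            return JOINTS[self.active], amplitude

    ctrl = AugmentedController()
    ctrl.on_tongue_gesture("tongue_forward")   # select "elbow"
    print(ctrl.on_emg_sample(0.6))             # -> ("elbow", 0.6)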

Our system outstrips traditional input-control devices. It performs the same functions as those devices using subtle, unobtrusive movements and gestures of the tongue within the mouth, rather than requiring uncomfortable movements or persistent skin contact, and it does not require anything to be placed within the oral cavity or on the body. Internal tongue movements, many of them fundamental gestures recognizable from user to user, are captured near the ear by an ear-piece microphone and analyzed to determine the intent of the user. Not only is the system more comfortable than mechanical-input devices, it can also perform more functions than a joystick (the tongue can be moved in over 20 directions), and it is quicker to use than other types of physiological control (commands are given directly).

The primary benefit of the proposed research will be to the mental health of potentially millions of people worldwide who use robotic assist prosthetic devices. By enabling these patients to operate their prostheses more fully, this research will provide them with fuller, happier lives. The effects on the patients and their loved ones could be dramatic.

All material contained herein is ©2008 BioRobots LLC, unless otherwise specified. Permission is granted to certain parties only to download and view these files.