A team of MIT researchers designed a wearable sensor that could help ALS patients communicate with their loved ones… and would cost just $10!
The stretchable sensor uses components that are easy to mass-produce, which accounts for its low cost.
How does this sensor work?
Those affected by amyotrophic lateral sclerosis gradually lose the ability to control their facial muscles, which in turn means they also gradually lose the ability to speak.
The sensor MIT produced is a thin, stretchable, almost-invisible device worn on the face, where it measures twitches, smiles and other small facial movements, allowing the wearer to communicate with others.
“Not only are our devices malleable, soft, disposable, and light, they’re also visually invisible. You can camouflage it and nobody would think that you have something on your skin,” said Canan Dagdeviren, the LG Electronics Career Development Assistant Professor of Media Arts and Sciences at MIT and the leader of the research team, in the project announcement.
The researchers have tested the patch-like sensor on two ALS patients, one female and one male, and found that it could pick up the wearer’s smile, open mouth and pursed lips expressions.
Dagdeviren started working on the project after meeting the world-renowned physicist Stephen Hawking, who communicated using an infrared sensor that picked up twitches in his cheek; the encounter prompted her to look for better ways to help people with neuromuscular diseases communicate.
Because the devices used by Hawking and others with similar conditions are bulky and sometimes unreliable, Dagdeviren and her team wanted to create something lightweight and comfortable to wear.
The sensor they created uses four piezoelectric sensors printed on a thin silicone sheet, which pick up mechanical deformation of the skin and convert that information into an easily measured electric voltage.
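In rough terms, a piezoelectric sensor produces a voltage proportional to how much it is deformed. The sketch below illustrates that relationship for four channels, like the four sensors on the patch; the sensitivity coefficient and strain values are made-up illustrations, not figures from the MIT research.

```python
# Hypothetical sketch: mapping skin strain to piezoelectric voltage.
# The sensitivity coefficient and strain values are illustrative only.

def strain_to_voltage(strain, sensitivity=0.5):
    """Approximate piezoelectric response: voltage proportional to strain.

    strain: fractional skin deformation (e.g. 0.02 for a 2% stretch)
    sensitivity: illustrative volts-per-unit-strain coefficient
    """
    return sensitivity * strain

# Four sensor channels, as on the MIT patch; readings are made up.
channels = [0.010, 0.002, 0.015, 0.004]
voltages = [strain_to_voltage(s) for s in channels]
print(voltages)  # one easily measured voltage per sensor
```

A real device would of course involve amplification, filtering and calibration, but the core idea is this linear deformation-to-voltage conversion.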
“We had subjects doing different motions, and we created strain maps of each part of the face,” said Rachel McIntosh, another one of the researchers.
That data was then used to train a machine-learning algorithm to recognize facial movements.
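The announcement does not say which learning algorithm was used, so as a minimal stand-in, the sketch below classifies a four-channel strain reading by its nearest expression centroid; all feature values and labels are invented for illustration.

```python
# Illustrative nearest-centroid classifier for strain readings.
# Algorithm choice and all numbers are assumptions, not the MIT method.

# Training data: four-channel strain readings per labeled expression.
training = {
    "smile":       [[0.012, 0.010, 0.002, 0.001], [0.011, 0.009, 0.003, 0.002]],
    "open mouth":  [[0.002, 0.003, 0.014, 0.012], [0.001, 0.004, 0.013, 0.011]],
    "pursed lips": [[0.006, 0.001, 0.001, 0.008], [0.007, 0.002, 0.002, 0.009]],
}

def centroid(samples):
    """Mean feature vector of a list of equal-length samples."""
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

centroids = {label: centroid(samples) for label, samples in training.items()}

def classify(reading):
    """Label a new reading by its nearest expression centroid."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(reading, centroids[label]))

print(classify([0.010, 0.008, 0.002, 0.002]))  # → smile
```

The real system likely used a more sophisticated model, but the pipeline shape is the same: labeled strain maps in, predicted expression out.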
In the trials, the algorithm identified the correct facial expressions of ALS patients with around 75 percent accuracy, an encouraging result given that accuracy for healthy subjects was 87 percent.
You can read more about this sensor and the research here.