New wearable device detects the hand gesture you intend to make
New York
Imagine typing on a computer without a keyboard, playing a video game without a controller or driving a car without a wheel. A team of scientists has developed a new device that can recognise hand gestures based on electrical signals detected in the forearm.
The system, which couples wearable biosensors with artificial intelligence (AI), could one day be used to control prosthetics or to interact with almost any type of electronic device, said engineers from the University of California, Berkeley.
"Prosthetics are one important application of this technology, but besides that, it also offers a very intuitive way of communicating with computers." said Ali Moin, who helped design the device as a doctoral student in UC Berkeley's Department of Electrical Engineering and Computer Sciences.
Reading hand gestures is one way of improving human-computer interaction.
"And, while there are other ways of doing that, by, for instance, using cameras and computer vision, this is a good solution that also maintains an individual's privacy," Moin said in a new paper appeared in the journal Nature Electronics.
To create the hand gesture recognition system, the team collaborated with Ana Arias, a professor of electrical engineering at UC Berkeley, to design a flexible armband that can read the electrical signals at 64 different points on the forearm.
The electrical signals are then fed into an electrical chip, which is programmed with an AI algorithm capable of associating these signal patterns in the forearm with specific hand gestures.
The team succeeded in teaching the algorithm to recognise 21 individual hand gestures, including a thumbs-up, a fist, a flat hand, holding up individual fingers and counting numbers.
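To make the described pipeline concrete, here is a minimal sketch of how 64-channel forearm signals might be reduced to features and mapped to one of 21 gesture labels. The article does not describe the team's actual algorithm, electrode processing or feature extraction, so the windowing, root-mean-square features, classifier choice and all data below are illustrative assumptions only.

```python
# Illustrative sketch only: not the Berkeley team's actual method.
# Assumptions (hypothetical): raw signals are cut into fixed-length windows,
# each window is reduced to 64 RMS values (one per electrode), and a generic
# classifier stands in for the on-chip AI model.
import numpy as np
from sklearn.linear_model import LogisticRegression

N_CHANNELS = 64   # electrodes on the armband (from the article)
N_GESTURES = 21   # gestures the team taught the system to recognise

def rms_features(window: np.ndarray) -> np.ndarray:
    """Reduce one (samples x 64) window of forearm signals to 64 RMS values."""
    return np.sqrt(np.mean(window ** 2, axis=0))

# Placeholder calibration data: one feature vector per recorded gesture window.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(2100, N_CHANNELS))     # stand-in feature vectors
y_train = np.repeat(np.arange(N_GESTURES), 100)   # stand-in gesture labels

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# At run time, each incoming window is reduced to features and classified.
new_window = rng.normal(size=(200, N_CHANNELS))   # stand-in raw window
predicted = clf.predict(rms_features(new_window).reshape(1, -1))[0]
print(f"Predicted gesture id: {predicted}")
```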
Like other AI software, the algorithm has to first "learn" how electrical signals in the arm correspond with individual hand gestures.
To do this, each user has to wear the cuff while making the hand gestures one by one.
"In gesture recognition, your signals are going to change over time, and that can affect the performance of your model," Moin said.
"We were able to greatly improve the classification accuracy by updating the model on the device."
Another advantage of the new device is that all of the computing occurs locally on the chip: No personal data are transmitted to a nearby computer or device.
Not only does this speed up the computing time, but it also ensures that personal biological data remain private.
While the device is not ready to be a commercial product yet, it could likely get there with a few tweaks, said Jan Rabaey, senior author of the paper.