- The new device improves human-computer interaction, serving as an alternative to camera-based computer vision.
- With a few tweaks, the device will likely soon become a commercial product.
Researchers at the University of California, Berkeley are developing a new device: a wearable sensor that monitors human muscle activity and can recognize a thumbs-up, a fist, a flat hand, counting numbers, holding up individual fingers, and 16 other individual hand gestures based on electrical signals detected in the forearm.
The system combines wearable biosensors with artificial intelligence (AI) and may one day change the way we communicate with electronic devices, or even let us control prosthetics. Experts say the device offers an alternative route to improving human-computer interaction.
The team that developed the hand-gesture recognition system collaborated with Ana Arias, professor of electrical engineering at UC Berkeley, to design a flexible armband that reads electrical signals at 64 different points on the forearm. These signals are fed into an electronic chip programmed with an AI algorithm. The algorithm associates signal patterns in the forearm with specific hand gestures, but it first needs to “learn” how the electrical signals in the arm correspond to individual gestures. To do this, the user makes each hand gesture one by one while wearing the cuff.
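The calibration step described above — holding each gesture while the cuff records, so that signal patterns can later be matched to gesture labels — can be illustrated with a toy nearest-centroid classifier. This is only a minimal sketch of the idea, not the team's published algorithm; the gesture names, activity patterns, and class design here are hypothetical.

```python
import numpy as np

N_CHANNELS = 64  # electrode sites on the forearm band

class GestureCalibrator:
    """Toy stand-in for the on-device learning step: one template per gesture."""

    def __init__(self):
        self.templates = {}  # gesture name -> mean 64-channel pattern

    def calibrate(self, gesture, recordings):
        # `recordings` is an (n_samples, 64) array captured while the user
        # holds `gesture`; the stored template is the average pattern.
        self.templates[gesture] = np.asarray(recordings).mean(axis=0)

    def recognize(self, sample):
        # Return the gesture whose template is closest to the new sample.
        sample = np.asarray(sample)
        return min(self.templates,
                   key=lambda g: np.linalg.norm(self.templates[g] - sample))

# Hypothetical calibration session with synthetic signals:
rng = np.random.default_rng(1)
thumbs = np.zeros(N_CHANNELS)
thumbs[:10] = 1.0            # made-up activity pattern for a thumbs-up
fist = np.zeros(N_CHANNELS)
fist[-10:] = 1.0             # made-up activity pattern for a fist

cal = GestureCalibrator()
cal.calibrate("thumbs_up", thumbs + rng.normal(0, 0.05, (20, N_CHANNELS)))
cal.calibrate("fist", fist + rng.normal(0, 0.05, (20, N_CHANNELS)))
print(cal.recognize(thumbs))  # prints thumbs_up
```

The key point the sketch captures is that no gesture model exists until the wearer performs each gesture at least once; recognition is then a comparison of fresh signals against the learned patterns.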
Ali Moin, a doctoral student in UC Berkeley’s Department of Electrical Engineering and Computer Sciences, explained how it works: when the hand muscles are about to contract, the brain sends electrical signals through neurons in the neck and shoulders to the muscle fibers in the arms and hands. The electrodes in the cuff detect this electrical field, though not precisely, since they cannot pinpoint exactly which fibers are triggered. But, Moin explains, with a high density of electrodes the algorithm can still learn to recognize certain patterns.
Moin, who helped design the device, said, “Prosthetics are one important application of this technology, but besides that, it also offers a very intuitive way of communicating with computers…Reading hand gestures is one way of improving human-computer interaction. And, while there are other ways of doing that, by, for instance, using cameras and computer vision, this is a good solution that also maintains an individual’s privacy.”
A major advantage of the new device is its use of a hyperdimensional computing algorithm, an advanced form of AI that can update itself with new information. Moin explains that a user’s signals change over time, which degrades the model’s performance; the team succeeded in improving classification accuracy by updating the model on the device itself.
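The self-updating property can be sketched with a generic hyperdimensional computing classifier: each 64-channel reading is encoded as a very high-dimensional vector, each gesture is represented by an accumulated "prototype" vector, and incorporating a new example is just adding its encoding to the prototype. This is a textbook HD-computing sketch, not the team's published algorithm; the dimensionality, quantization scheme, and gesture samples are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000          # hypervector dimensionality (illustrative)
N_CHANNELS = 64     # electrode sites in the cuff
N_LEVELS = 8        # quantization levels per channel

# Fixed random bipolar "item" hypervectors: one per channel, one per level.
channel_hvs = rng.choice([-1, 1], size=(N_CHANNELS, D))
level_hvs = rng.choice([-1, 1], size=(N_LEVELS, D))

def encode(sample):
    """Bind each channel's id vector with its quantized level, then bundle."""
    levels = np.clip((np.asarray(sample) * N_LEVELS).astype(int), 0, N_LEVELS - 1)
    bound = channel_hvs * level_hvs[levels]   # elementwise bind, per channel
    return np.sign(bound.sum(axis=0))         # bundle via majority vote

class HDClassifier:
    def __init__(self):
        self.prototypes = {}  # gesture label -> accumulated hypervector

    def train(self, sample, label):
        # Online update: new examples are simply added into the prototype,
        # so the model keeps adapting as the user's signals drift.
        hv = encode(sample)
        self.prototypes[label] = self.prototypes.get(label, np.zeros(D)) + hv

    def predict(self, sample):
        hv = encode(sample)
        return max(self.prototypes,
                   key=lambda lb: np.dot(np.sign(self.prototypes[lb]), hv))

# Hypothetical readings: a fist activates the first half of the channels,
# a flat hand the second half (made-up patterns for demonstration).
fist_sample = np.concatenate([np.full(32, 0.9), np.full(32, 0.1)])
flat_sample = np.concatenate([np.full(32, 0.1), np.full(32, 0.9)])

clf = HDClassifier()
clf.train(fist_sample, "fist")
clf.train(flat_sample, "flat_hand")
```

Because training is a single vector addition, updating the model on-device with a fresh example is as cheap as classifying one, which is what makes this style of algorithm attractive for continuous on-chip learning.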
Another noteworthy advantage is that all of the computing takes place locally on the chip, which speeds up processing. It also means personal data is never transmitted to another device, so all personal biological data stays private.
Jan Rabaey, the Donald O. Pedersen Distinguished Professor of Electrical Engineering at UC Berkeley and senior author of the paper, said, “The problem is that then you’re stuck with that particular model. In our approach, we implemented a process where the learning is done on the device itself. And it is extremely quick: You only have to do it one time, and it starts doing the job. But if you do it more times, it can get better. So, it is continuously learning, which is how humans do it.” He added that the device is not yet ready to be a commercial product, but will likely get there with a few tweaks.