A Wearable Device That Could Control Computers and Prosthetics

Researchers from UC Berkeley have developed a gesture-detecting wearable device that could be used to control electronics and prosthetics. It uses a combination of AI software and biosensors to identify the hand gestures a person intends to make by analyzing electrical signals from their arm.

[Image: Human hand reaching out to a robotic hand]

A Prosthetic Device That Brings Convenience

It is not the first gesture recognition system designed for human-computer interaction, but this one offers distinct advantages. Most importantly, it uses a neuro-inspired hyperdimensional computing algorithm that updates itself as it receives new information, such as changes in the electrical signals when the arm gets sweaty.

Ali Moin, a coauthor of the study, explained that in gesture recognition, the signals of the person wearing the device change over time, which can easily degrade the model's performance. By updating the model directly on the device, the team was able to greatly improve its classification accuracy.
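To make the on-device updating concrete, here is a minimal sketch of a hyperdimensional computing classifier in Python. It is an illustration of the general technique, not the team's actual algorithm: the dimensionality, the bipolar encoding, the gesture names, and the `update` rule with its learning rate are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality (illustrative choice)

def random_hv():
    # A random bipolar hypervector with entries in {-1, +1}.
    return rng.choice([-1, 1], size=D)

# Hypothetical gesture prototypes, as might be learned at enrollment.
prototypes = {"fist": random_hv(), "open": random_hv()}

def classify(encoded):
    # Pick the prototype with the highest similarity (dot product).
    return max(prototypes, key=lambda g: np.dot(prototypes[g], encoded))

def update(gesture, encoded, lr=0.1):
    # Online update: nudge the stored prototype toward the new example,
    # so the model adapts as the wearer's signals drift (e.g. sweat).
    p = prototypes[gesture]
    blended = (1 - lr) * p + lr * encoded
    prototypes[gesture] = np.where(blended >= 0, 1, -1)
```

In high dimensions, random hypervectors are nearly orthogonal, so a noisy reading of a gesture still lands closest to its own prototype; the `update` step keeps that prototype tracking the wearer's current signal statistics.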

[Image: Prosthetics controlled by a wearable device]

A Wearable Future

The team screen-printed the biosensing array onto a thin sheet of polyethylene terephthalate (PET), a polymer resin commonly used to make plastic containers and synthetic fibers. They chose the material for the armband because of its flexibility, which allows it to conform to the muscle movements of the forearm. The array consists of 64 electrodes, each of which detects electrical signals from a different point on the arm. This data is then fed into a chip that uses the algorithm to associate the signals with specific hand gestures.
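One common way such a 64-channel reading could be folded into a single hypervector for the classifier is spatial bundling. The sketch below is a hedged illustration, not the chip's actual encoding: the per-electrode identity vectors, the weighting scheme, and the binarization step are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000        # hypervector dimensionality (illustrative)
N_ELECTRODES = 64  # matches the array size described in the article

# One random bipolar "identity" hypervector per electrode (assumed encoding).
electrode_hvs = rng.choice([-1, 1], size=(N_ELECTRODES, D))

def encode(window):
    # window: 64 signal amplitudes, one per electrode.
    # Weight each electrode's identity vector by its reading,
    # bundle by summation, then binarize back to {-1, +1}.
    window = np.asarray(window)
    bundled = (window[:, None] * electrode_hvs).sum(axis=0)
    return np.where(bundled >= 0, 1, -1)
```

The result is a fixed-size bipolar vector regardless of how the raw amplitudes vary, which is what makes the simple prototype-comparison step cheap enough to run on a small on-body chip.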

Prosthetics are one important application of this technology, but it also offers an intuitive new way to communicate with computers: reading hand gestures can markedly improve human-computer interaction. All of the computing happens locally on the chip, which protects the user's biological data and speeds up the system. Ali Moin believes this combination of performance and security could make the system a viable commercial product.