Controlling hand prostheses can be enhanced by tracking eye movements. According to Henning Müller, Professor of Business Information Systems, a new dataset could enable the development of more effective prostheses, opening up completely new perspectives.
With its 34 muscles and 20 joints, the hand is an essential part of the body. Consequently, a hand amputation has a tremendous physical and psychological impact.
Thanks to what are known as myoelectric prostheses, which detect electrical muscle signals via electrodes on the skin, people who have lost a hand can at least regain certain functions. However, their dexterity is often limited, and their reliability depends on the strength of the signals in the forearm muscles. To improve the control of prostheses, the data from the myoelectric signals can be combined with data from other sources, such as eye movements. This approach is being pursued by Henning Müller, Professor of Business Information Systems at the HES-SO Valais-Wallis in Sierre, Switzerland.
“When we reach for an object, our eyes focus on it for several hundred milliseconds,” Henning Müller explains. Eye tracking thus yields valuable information about the perception of an object that a person is reaching for, as well as about the movement that is required to grip it.
In addition, eyesight remains unaffected, in contrast to the muscles of the amputated limb, which atrophy and reduce the quality of the myoelectric signals. In combination with eye tracking, machine vision, the computer-controlled recognition of objects in the field of vision, could also be used to partially automate hand prostheses.
The goal is to assign typical hand movements to the data provided by the muscles of the amputated forearm and additional information sources. Müller therefore developed an experimental set-up with 45 test subjects: 15 people with an amputated hand and a control group of 30 people with otherwise similar characteristics.
All participants had twelve electrodes attached to their forearms as well as sensors to their arms and heads. They also wore special glasses that recorded their eye movements.
The test subjects performed ten widely used hand movements. These and the objects used for the test had previously been selected in collaboration with the HES-SO Valais Institute of Physiotherapy. For example, the subjects were required to pick up a pen or a fork or to play with a ball.
By applying computer modelling to these movements, Henning Müller was able to create a new multimodal dataset of hand movements. This dataset not only contains the data recorded by the electrodes, but also information on the speed of forearm movements, eye movements, machine vision, and head movements.
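The fusion idea behind such a multimodal dataset can be sketched in code. The following is a minimal illustration, not the study's actual pipeline: it assumes windowed root-mean-square features from the twelve forearm electrodes mentioned in the article, a two-dimensional gaze fixation point from the eye tracker, and a simple nearest-centroid classifier. The gesture names, amplitudes, and gaze encoding are all invented for the example.

```python
import math
import random

random.seed(0)

N_ELECTRODES = 12  # matches the twelve electrodes in the experimental set-up
GESTURES = ["pick up pen", "pick up fork", "squeeze ball"]  # illustrative labels

def emg_rms(window):
    """Root-mean-square amplitude per electrode over one time window.

    window: list of samples, each a list of N_ELECTRODES voltage values.
    """
    n = len(window)
    return [math.sqrt(sum(s[ch] ** 2 for s in window) / n)
            for ch in range(N_ELECTRODES)]

def fuse(window, gaze):
    """Concatenate EMG features with the gaze fixation point (x, y)."""
    return emg_rms(window) + list(gaze)

def sample(gesture_id):
    """Simulate one recording: each gesture gets a distinct EMG
    amplitude profile and a distinct fixation location (pure assumption)."""
    amp = 0.2 + 0.4 * gesture_id
    window = [[random.gauss(0.0, amp) for _ in range(N_ELECTRODES)]
              for _ in range(200)]
    gaze = (gesture_id + random.gauss(0, 0.05),
            0.5 * gesture_id + random.gauss(0, 0.05))
    return fuse(window, gaze)

def mean_vec(vectors):
    return [sum(component) / len(vectors) for component in zip(*vectors)]

# "Train" by averaging fused feature vectors per gesture.
centroids = {g: mean_vec([sample(g) for _ in range(20)]) for g in range(3)}

def classify(features):
    """Nearest-centroid decision over the fused feature vector."""
    return min(centroids, key=lambda g: math.dist(features, centroids[g]))

print(GESTURES[classify(sample(2))])
```

The point of the sketch is the `fuse` step: because the simulated gestures differ in both muscle amplitude and fixation location, the combined feature vector separates them more robustly than either signal alone, which mirrors the rationale for pairing myoelectric data with eye tracking.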
The new dataset also contains information that could be useful to other disciplines involved in research into the coordination of eye and hand movements. These include neuroscience, robotics in healthcare, artificial intelligence, and even psychology.
The interdisciplinary study, carried out by the HES-SO, the University Hospital of Zurich, and the Italian Institute of Technology in Milan, was conducted within the framework of the Swiss National Science Foundation’s Sinergia program. Sinergia funds collaboration between two to four research groups conducting interdisciplinary research that promises groundbreaking findings.
Written by SDA
Photos by SDA-KEYSTONE