Friday, September 17

You can see with your hands


A new technology developed at the Weizmann Institute of Science in Israel converts visual information into tactile signals, allowing you to “see” distant objects with your hands. The user perceives the information through active exploration of the environment, without the confusing interference of artificial stimulation aids.

According to a press release, the new system correctly identifies objects in under 20 seconds, an unprecedented level of performance compared with existing tactile-vision methods. The research behind the new technology was published in the journal iScience.

The results of this innovation rest on a commitment to dynamic, always-active exploration of the environment. Because users experience sensations directly through their hands, interaction is more natural, which makes the device easier to use and allows it to outperform other similar systems.

Being an intuitive solution that does not rely on cumbersome interfaces that distance the user from real sensations, the technology achieved equally positive results in tests with both sighted and blind people.

It is therefore attractive both to blind people, who gain a tool to “discover” a world they cannot see, and to sighted users, who can employ the movement of their hands almost as a new sense.

Active and intuitive exploration

The system necessarily requires active sensing: no artificial stimulation is involved. Because users must rely on their hands, the object detected at a distance is “touched” and “seen” at the same time, as the user scans the environment to reach the desired target.

The technology has been dubbed ASenSub, and it includes a small, lightweight camera that sits on the user’s hand. The images it captures during environmental scans are transformed into tactile signals by an actuator matrix distributed across the lower surface of three fingers of the same hand that carries the camera.
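The camera-to-fingertip mapping described above can be pictured as a simple down-sampling step: each camera frame is divided into regions, and the average brightness of each region drives one tactile actuator. The sketch below is illustrative only; the grid sizes, the one-strip-per-finger layout, and the normalisation are assumptions for the example, not the actual ASenSub hardware specification.

```python
# Illustrative sketch: block-average a grayscale camera frame into a
# small grid of actuator intensities, one sub-grid per finger pad.
# Layout and resolution are assumed, not taken from the ASenSub device.
import numpy as np

def frame_to_tactile(frame: np.ndarray, n_fingers: int = 3,
                     taxel_rows: int = 4, taxel_cols: int = 4) -> np.ndarray:
    """Map a grayscale frame (H x W, values 0-255) to per-finger
    actuator intensities in [0, 1], shape (n_fingers, rows, cols)."""
    # One vertical strip of the image per finger.
    strips = np.array_split(frame.astype(float), n_fingers, axis=1)
    out = np.zeros((n_fingers, taxel_rows, taxel_cols))
    for f, strip in enumerate(strips):
        sh, sw = strip.shape
        for r in range(taxel_rows):
            for c in range(taxel_cols):
                # Average brightness of this block drives one actuator.
                block = strip[r * sh // taxel_rows:(r + 1) * sh // taxel_rows,
                              c * sw // taxel_cols:(c + 1) * sw // taxel_cols]
                out[f, r, c] = block.mean() / 255.0
    return out

# Example: a bright square in the centre of a dark frame should
# activate mainly the middle finger's actuators.
frame = np.zeros((120, 120), dtype=np.uint8)
frame[40:80, 40:80] = 255
signals = frame_to_tactile(frame)
```

As the user sweeps the hand, each new frame yields a new intensity grid, so the tactile pattern changes continuously with the scanning motion, which is what makes the exploration feel active rather than like reading a static display.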

Although the sensations are virtual, they acquire a realism not seen in previous technologies. For example, if a triangle is scanned with the device, the sensation does not correspond to a flat figure, as if it were a graph; instead, the triangle is perceived with its relief and all the details the object presents.


The plasticity of the human brain

The concept behind this technological application is called sensory substitution: information normally obtained through one sense is conveyed via signals perceived with another. Exploring this class of technologies broadens our understanding of the mechanisms of human perception.

At the same time, it could become a very useful tool for improving the quality of life of people who are blind or severely visually impaired. However, there is no indication that other applications of this type have been incorporated into blind people’s daily lives, mainly because they lack the intuitive and dynamic character that is required. This new development could set a new trend in this regard.

Researchers have highlighted that this new technology is not only very useful for visually impaired people, but also demonstrates the enormous plasticity of the human brain. By actively exploring the environment, the brain is able to “invent” a new sense in a few seconds, proving once again that its possibilities are practically endless.

Reference

Active sensory substitution allows fast learning via effective motor-sensory strategies. Yael Zilbershtain-Kra, Shmuel Graffi, Ehud Ahissar and Amos Arieli. iScience (2021). DOI: https://doi.org/10.1016/j.isci.2020.101918

Photo: In the new system, a special device (1) creates tactile signals on the basis of visual information captured by a small camera (2). Credit: Weizmann Institute of Science / Yael Zilbershtain-Kra et al / iScience.


www.informacion.es
