This year, the Mahowald Prize for neuromorphic engineering was awarded for the development of sophisticated electronic skins with tactile sensing that can be processed at low power. This could have major implications for both robotics and prosthetics. Nitish Thakor and his colleagues in the Neuroengineering and Biomedical Instrumentation Lab at Johns Hopkins University (JHU) have been working on this multi-faceted project for several years.
The research spans the sensors themselves and how to interpret their output, the incorporation of other data such as hand position and (useful) pain signals, and the non-invasive communication of all this information to the user's brain.
Prosthetics are particularly well suited to a neuromorphic approach. First, low power is of the essence if you are unable to function fully when your battery runs out. Second, you don’t need your prosthetic to be particularly programmable; you just need it to do a restricted set of tasks really efficiently, so there’s no disadvantage in using custom circuitry. Finally, prosthetics are designed to interface with biological systems, so having them work using biological principles makes sense.
However, the sensing properties of a human hand are complicated. Touch, or tactile sensing, involves four different types of receptors that measure indentation, movement, stretch and vibration in the skin, plus thermoreceptors to measure temperature. Though they are activated in different ways, what they all have in common is that their output consists of neural impulses, or spikes. These travel at speeds that depend on the timeliness of the information they carry. For instance, spikes from pain receptors (sensing damaging forces) and those signaling when contact starts or stops are carried on nerve fibers that conduct 10× to 500× faster than those carrying simple force-amplitude information.
Touchy-feely
From an engineering point of view, it’s not just the detection that’s critical—many different tactile devices can potentially do the job—but also how the sensor encodes the information. As with event-based vision sensors, neuromorphic efficiency comes from sparsity: The spikes signal not what is but what’s changing, which means the data is inherently compressed and ready to be interpreted by an appropriate neural network.
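To make that change-based encoding concrete, here is a minimal sketch of delta encoding for a single taxel. The function name and threshold value are invented for illustration and are not the JHU implementation; the point is that a steady grip produces no events at all, so the data stream is sparse by construction.

```python
# Minimal sketch of event-based (delta) encoding for one taxel.
# Names and the threshold are illustrative, not the JHU implementation.

def encode_taxel(readings, threshold=0.05):
    """Emit +1/-1 spikes only when pressure changes by more than `threshold`.

    readings: iterable of (timestamp, pressure) samples from one taxel.
    Returns a sparse list of (timestamp, polarity) events.
    """
    events = []
    last = None
    for t, p in readings:
        if last is None:
            last = p
            continue
        if p - last > threshold:
            events.append((t, +1))   # pressure increased: ON spike
            last = p
        elif last - p > threshold:
            events.append((t, -1))   # pressure decreased: OFF spike
            last = p
    return events

# A steady grip generates no events; only contact changes produce spikes.
samples = [(0, 0.0), (1, 0.0), (2, 0.3), (3, 0.3), (4, 0.3), (5, 0.1)]
print(encode_taxel(samples))  # [(2, 1), (5, -1)]
```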
Extracting meaning from the spikes involves looking at them collectively over a time window. As with sound, an instantaneous snapshot is not useful: Just as you only hear a rhythm over time, you can only feel a texture as you pull your finger across a surface to sense repeated cycles of increase and decrease in pressure. The speed at which you pull your finger across can also make a difference because it impacts the accuracy of the force measurement at any given time. This means there is an inherent engineering tradeoff between speed and accuracy.
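As a rough illustration of window-based decoding (the window length, bin count and function name here are arbitrary choices for the example, not the published method), the spikes falling inside a recent time window can be binned into a rate pattern that serves as a texture feature. The same surface swiped at a different speed yields a different pattern, which is why the speed of the swipe has to be known or normalized away.

```python
import numpy as np

# Illustrative sketch: convert a spike train from one taxel into a
# windowed rate feature for texture classification.

def spike_rate_features(spike_times, window=0.5, bins=10):
    """Histogram the spikes from the last `window` seconds into `bins` rate bins."""
    spike_times = np.asarray(spike_times, dtype=float)
    t_end = spike_times.max() if spike_times.size else window
    recent = spike_times[spike_times > t_end - window]
    counts, _ = np.histogram(recent, bins=bins, range=(t_end - window, t_end))
    return counts / (window / bins)  # spikes per second in each bin

# The same texture swiped slowly vs. quickly gives different rate patterns.
slow_swipe = [0.05, 0.15, 0.25, 0.35, 0.45]
fast_swipe = [0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40, 0.45]
print(spike_rate_features(slow_swipe))
print(spike_rate_features(fast_swipe))
```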
Members of the JHU team have done a lot of work in this area, including demonstrating neuromorphic (brain-inspired) texture classification, showing the importance of the relative change between taxels (tactile pixels) in avoiding ambiguity, and making effective use of fabric- and silicone-based soft fingers. In addition, they've shown that, with just a few taxels, they can determine the orientation of edges with high accuracy when sensing an object. Finally, just last year, they showed how to combine tactile sensing with proprioception, which supplies the speed of movement across a surface and thereby helps interpret the signals and improve classification accuracy.
Closing the loop
When dealing with prosthetics, however, feeling surfaces is not enough. Just like bodies, prosthetics can be damaged, and pain is the mechanism we use to stop this. To work, nociceptors (from the word “noxious”) have to be fast, so the JHU researchers developed a circuit with just three neurons that works out whether a stimulus is likely to be harmful. One measures force, another measures area and a third determines whether these together represent an amount of pressure that could cause damage. If yes, spikes can be sent by fast channels directly to the spinal cord to induce a reflex action away from danger.
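A back-of-the-envelope sketch of that three-neuron idea might look like the following. The thresholds, weights and function names are invented for illustration; the actual circuit is spiking analog hardware, not Python. The key point is that the decision neuron combines force and contact area into an effective pressure and only fires when that pressure looks damaging.

```python
# Toy sketch of the three-neuron nociception idea described above.
# All values and names are illustrative, not the JHU circuit.

def force_neuron(force_n):
    """Output grows with total applied force (newtons)."""
    return max(0.0, force_n)

def area_neuron(area_mm2):
    """Output grows as contact area shrinks: same force on a smaller area means more pressure."""
    return max(0.0, 1.0 / max(area_mm2, 1e-6))

def nociceptor(force_n, area_mm2, pressure_threshold=0.5):
    """Decision neuron: fire a 'pain' spike if force x (1/area) exceeds a damage threshold."""
    drive = force_neuron(force_n) * area_neuron(area_mm2)  # roughly pressure in N/mm^2
    return drive > pressure_threshold

# A 10 N grip spread over 50 mm^2 is safe; the same force on a 5 mm^2 edge is not.
print(nociceptor(10.0, 50.0))  # False -> no reflex
print(nociceptor(10.0, 5.0))   # True  -> trigger withdrawal reflex
```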
Tying the sensors into the prosthetic-human action/reaction loop has been a critical part of the work. The group has been using targeted transcutaneous electrical nerve stimulation (tTENS) to get a spiking signal from the prosthesis into the brain via the skin of an amputee. In a 2020 paper, they showed that the information can indeed be used by the brain when the stimulation matches what the brain would be expecting from the phantom hand—the hand that is no longer there but which the brain still remembers and residually expects input from. According to the paper, this “leads to faster information transfer and increased number of functional connection paths among somatosensory, motor and multisensory processing systems.”

Researchers have already shown that robots can combine vision and tactile sensing to better coordinate action, so most of these technologies should directly benefit this field, too. Given that the largest mass market for robots is expected to be in the area of personal and home care—particularly in support of our aging population—the ability to feel the subtleties of physical interactions is likely to become increasingly important. Whether it’s to support someone frail, handle breakable glassware or slice a tomato, success will inevitably require sensitive touch.