Artificial skin capable of feeling ‘pain’ could lead to new generation of touch-sensitive robots

An electronic skin which can learn from feeling ‘pain’ could help create a new generation of smart robots with human-like sensitivity.

A team of engineers from the University of Glasgow developed the artificial skin with a new type of processing system based on ‘synaptic transistors’, which mimic the brain’s neural pathways in order to learn. A robot hand which uses the smart skin shows a remarkable ability to learn to react to external stimuli.

In a new paper published today in the journal Science Robotics, the researchers describe how they built their prototype computational electronic-skin (e-skin), and how it improves on the current state of the art in touch-sensitive robotics.

Scientists have been working for decades to build artificial skin with touch sensitivity. One widely-explored method is spreading an array of contact or pressure sensors across the electronic skin’s surface to allow it to detect when it comes into contact with an object.

Data from the sensors is then sent to a computer to be processed and interpreted. The sensors typically produce a large volume of data which takes time to process and respond to, introducing delays that could reduce the skin’s effectiveness in real-world tasks.

The Glasgow team’s new form of electronic skin draws inspiration from how the human peripheral nervous system interprets signals from the skin in order to reduce latency and power consumption.

As soon as human skin receives an input, the peripheral nervous system begins processing it at the point of contact, reducing it to only the vital information before it is sent to the brain. That reduction of sensory data allows efficient use of communication channels needed to send the data to the brain, which then responds almost immediately for the body to react appropriately.
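
That edge-first reduction is straightforward to illustrate in software. The sketch below is purely illustrative and not part of the Glasgow system: it filters a raw stream of sensor samples down to only the readings that change meaningfully, with the noise-floor value an assumption chosen for the example.

```python
# Purely illustrative sketch of point-of-contact data reduction: instead of
# forwarding every raw sample, keep only the samples that change meaningfully
# (the "vital information"). The noise floor is an assumed value.

NOISE_FLOOR = 0.05  # assumed minimum change worth reporting

def reduce_at_edge(samples: list[float]) -> list[tuple[int, float]]:
    """Return only (index, value) pairs where the signal changed notably."""
    events = []
    last = samples[0]
    for i, sample in enumerate(samples[1:], start=1):
        if abs(sample - last) > NOISE_FLOOR:
            events.append((i, sample))
            last = sample
    return events

raw = [0.0, 0.01, 0.02, 0.5, 0.51, 0.9, 0.1]  # mostly static, two touch events
print(reduce_at_edge(raw))  # far fewer items than the raw stream
```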

To build an electronic skin capable of a computationally efficient, synapse-like response, the researchers printed a grid of 168 synaptic transistors made from zinc-oxide nanowires directly onto a flexible plastic surface. They then connected the synaptic transistors to the skin sensor present over the palm of a fully-articulated, human-shaped robot hand.

When the sensor is touched, it registers a change in its electrical resistance – a small change corresponds to a light touch, while a harder touch creates a larger change in resistance. This input is designed to mimic the way sensory neurons work in the human body.

In earlier generations of electronic skin, that input data would be sent to a computer to be processed. Instead, a circuit built into the skin acts as an artificial synapse, reducing the input to a simple spike of voltage whose frequency varies according to the level of pressure applied to the skin, speeding up the process of reaction.
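
The encoding described here can be sketched in software. The example below is not the authors’ circuit; the resting resistance, sensitivity, and maximum spike rate are assumed values chosen only to show the idea of pressure mapping to a resistance change, which in turn sets the frequency of a spike train.

```python
import numpy as np

# Illustrative software sketch (not the authors' hardware) of the encoding
# described above: pressure changes the sensor's resistance, and the
# resistance change sets the frequency of a voltage spike train.
# All constants below are assumed values for the example.

R_BASE = 10_000.0    # assumed resting sensor resistance, ohms
SENSITIVITY = 500.0  # assumed resistance change per unit of pressure, ohms
MAX_RATE_HZ = 100.0  # assumed maximum spike frequency
MAX_PRESSURE = 10.0  # assumed full-scale pressure, arbitrary units

def spike_rate(pressure: float) -> float:
    """Map the pressure-driven resistance change to a spike frequency."""
    delta_r = SENSITIVITY * pressure  # harder touch -> larger change
    delta_r_max = SENSITIVITY * MAX_PRESSURE
    return MAX_RATE_HZ * min(delta_r / delta_r_max, 1.0)

def spike_train(pressure: float, duration_s: float = 1.0, dt: float = 1e-3) -> np.ndarray:
    """Generate a Poisson spike train at the pressure-dependent rate."""
    rate_hz = spike_rate(pressure)
    steps = int(duration_s / dt)
    rng = np.random.default_rng(0)
    return (rng.random(steps) < rate_hz * dt).astype(int)

print(spike_train(1.0).sum())  # light touch: sparse spikes
print(spike_train(9.0).sum())  # hard jab: dense spikes
```

A harder press therefore shows up not as a larger data payload but as a denser spike train, which is what keeps the downstream processing cheap.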

The team used the varying output of that voltage spike to teach the skin appropriate responses to simulated pain, which would trigger the robot hand to react. By setting a threshold of input voltage to cause a reaction, the team could make the robot hand recoil from a sharp jab in the centre of its palm.

In other words, it learned to move away from a source of simulated discomfort through a process of onboard information processing that mimics how the human nervous system works.
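
A minimal sketch of that threshold rule, with a purely illustrative threshold value, might look like this:

```python
# Illustrative sketch (not the authors' circuit) of the threshold rule:
# spike activity above a set level is treated as simulated "pain" and
# triggers a recoil. The threshold is an assumed value.

PAIN_THRESHOLD_HZ = 50.0  # assumed spike-frequency threshold for recoil

def respond(spike_count: int, window_s: float = 1.0) -> str:
    """Recoil if the observed spike frequency crosses the pain threshold."""
    rate_hz = spike_count / window_s
    return "recoil" if rate_hz >= PAIN_THRESHOLD_HZ else "hold"

print(respond(12))  # gentle touch: hold
print(respond(85))  # sharp jab in the palm: recoil
```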

The development of the electronic skin is the latest breakthrough in flexible, stretchable printed surfaces from the University of Glasgow’s Bendable Electronics and Sensing Technologies (BEST) Group, led by Professor Ravinder Dahiya.

Professor Dahiya, of the University’s James Watt School of Engineering, said: “We all learn early on in our lives to respond appropriately to unexpected stimuli like pain in order to prevent us from hurting ourselves again. Of course, the development of this new form of electronic skin didn’t really involve inflicting pain as we know it – it’s simply a shorthand way to explain the process of learning from external stimulus.

“What we’ve been able to create through this process is an electronic skin capable of distributed learning at the hardware level, which doesn’t need to send messages back and forth to a central processor before taking action. Instead, it greatly accelerates the process of responding to touch by cutting down the amount of computation required.

“We believe that this is a real step forward in our work towards creating large-scale neuromorphic printed electronic skin capable of responding appropriately to stimuli.”

Fengyuan Liu, a member of the BEST group and a co-author of the paper, added: “In the future, this research could be the basis for a more advanced electronic skin which enables robots to explore and interact with the world in new ways, or prosthetic limbs capable of near-human levels of touch sensitivity.”

The team’s paper, titled ‘Printed Synaptic Transistors based Electronic Skin for Robots to Feel and Learn’, is published in Science Robotics. The research was supported by funding from the Engineering and Physical Sciences Research Council (EPSRC).
