When we imagine the future of robotics, humanoid machines often take centre stage: robots that can assist in manufacturing, support healthcare workers, or work safely alongside humans. Yet despite dramatic advances in AI, vision and actuation, one fundamental capability still limits what robots can do in the physical world: a genuine sense of touch.
This challenge, and the progress being made to address it, is the focus of an upcoming IET Bristol Local Network event, Tactile Robotics: Past and Future, which takes place on 22 April 2026. The lecture will be delivered by Professor Nathan F. Lepora, Professor of Robotics & AI at the Bristol Robotics Laboratory, and one of the leading researchers working on robotic touch and dexterous manipulation.
While the event itself promises an engaging overview, the topic is far broader than a single evening talk. Tactile sensing is increasingly recognised as a cornerstone technology for the next generation of robotics, and its story is one of steady progress, occasional setbacks, and renewed momentum.
Why touch matters in robotics
Vision systems have benefited enormously from improvements in cameras, data processing and machine learning. But vision alone is not enough to manipulate the physical world with human-like dexterity. Tasks such as grasping irregular objects, adjusting grip force, or detecting slippage all rely heavily on tactile feedback.
Humans use touch not just to detect contact, but to infer shape, texture, compliance and motion. Translating these rich sensory signals into robotic systems is non-trivial. It requires advances across materials science, sensor design, signal processing and control, as well as AI models capable of interpreting noisy and high-dimensional data in real time.
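As a toy illustration of the signal-processing side, the sketch below smooths a noisy one-dimensional force trace with an exponential moving average and flags the first sample where contact is detected. The function name, the filter constant and the threshold are illustrative choices, not part of any particular tactile system; real controllers work with far richer, multi-channel data.

```python
def detect_contact(samples, alpha=0.3, threshold=0.5):
    """Smooth a noisy force trace with an exponential moving average
    and return the index of the first smoothed sample above threshold,
    or None if contact is never detected."""
    ema = 0.0
    for i, x in enumerate(samples):
        ema = alpha * x + (1 - alpha) * ema  # EMA update
        if ema > threshold:
            return i
    return None

# Noisy near-zero readings, then a press starting at index 5.
trace = [0.05, -0.02, 0.04, 0.01, 0.03, 1.1, 1.2, 1.15, 1.3]
print(detect_contact(trace))  # → 6 (smoothing delays detection by a sample)
```

The one-sample lag in the output hints at the real engineering trade-off: heavier filtering rejects noise but slows the robot's reaction to contact and slip.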
Without touch, robots remain rigid and cautious. With it, they become adaptable, precise, and far more capable of interacting safely with people and delicate objects.
A brief history of tactile robotics
One of the strengths of Professor Lepora’s work is his historical perspective on the field. Tactile robotics has progressed in distinct phases:
- 1965–1979: Origins
  Early experiments explored simple contact sensors and force measurements, often limited by bulky hardware and slow electronics.
- 1980–1994: Foundations and growth
  Researchers began developing tactile arrays and studying how touch could be used for object recognition and manipulation.
- 1995–2009: The “tactile winter”
  Progress slowed due to high costs, manufacturing complexity, and limited computational power to process tactile data effectively.
- 2010–2024: Expansion and diversification
  Renewed interest, driven by soft robotics and advances in AI, led to rapid innovation across multiple approaches.
This recent expansion is particularly significant. Tactile sensing is no longer a niche research interest; it is becoming a practical component of real-world robotic systems.
Key technical approaches shaping the field
Modern tactile robotics draws on several complementary technologies, many of which are likely to feature in the upcoming lecture.
Electronic skins (e-skins)
E-skins use flexible substrates embedded with pressure, stretch or temperature sensors to cover robotic hands or arms. These systems aim to replicate the distributed sensitivity of human skin while remaining robust and scalable for real-world use.
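One simple thing a distributed pressure array enables is localising where contact occurred. The sketch below, a hypothetical example rather than any specific e-skin's API, computes the pressure-weighted centroid of a small taxel grid:

```python
def contact_centroid(grid):
    """Pressure-weighted centre of contact for a 2-D taxel array
    (list of rows of pressure values). Returns (row, col) in taxel
    coordinates, or None if nothing is touching the skin."""
    total = sum(sum(row) for row in grid)
    if total == 0:
        return None
    r = sum(i * sum(row) for i, row in enumerate(grid)) / total
    c = sum(j * v for row in grid for j, v in enumerate(row)) / total
    return (r, c)

# A press near the lower-right of a 3x3 patch of taxels.
pad = [
    [0.0, 0.0, 0.0],
    [0.0, 2.0, 1.0],
    [0.0, 1.0, 0.0],
]
print(contact_centroid(pad))  # → (1.25, 1.25)
```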
Vision-based tactile sensing
Rather than relying solely on electrical signals, vision-based sensors use cameras to observe deformations in a soft material when contact occurs. This approach allows very high-resolution touch sensing using well-understood computer vision techniques.
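A common pattern in such sensors is to print markers on the inside of the soft skin and track how they move between camera frames. Assuming the markers have already been detected and matched (which in practice is done with standard computer-vision tools), estimating the deformation reduces to simple vector arithmetic, sketched here with hypothetical helper names:

```python
def displacement_field(before, after):
    """Per-marker displacement vectors between two frames of tracked
    marker centroids (lists of (x, y) pixel positions, same order)."""
    return [(xa - xb, ya - yb) for (xb, yb), (xa, ya) in zip(before, after)]

def mean_shear(before, after):
    """Average lateral marker displacement: a crude proxy for the
    shear load on the sensor surface."""
    d = displacement_field(before, after)
    n = len(d)
    return (sum(dx for dx, _ in d) / n, sum(dy for _, dy in d) / n)

before = [(10.0, 10.0), (20.0, 10.0), (10.0, 20.0)]
after  = [(11.0, 10.5), (21.0, 10.5), (11.0, 20.5)]
print(mean_shear(before, after))  # → (1.0, 0.5): a uniform shift
```

A uniform shift like this suggests shear; markers spreading apart would instead indicate normal pressure bulging the skin outward.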
Soft and biomimetic touch
Soft robotics has transformed tactile design by allowing sensors and actuators to deform safely on contact. Biomimetic approaches take inspiration from human and animal physiology, improving both safety and sensory richness.
The tactile internet and telepresence
Touch is increasingly being integrated into remote operation systems, enabling operators to “feel” what a robot experiences. This has applications in surgery, hazardous environments, and advanced training.
Together, these techniques are pushing tactile robotics from the lab into industrial, medical and consumer domains.
From research to real-world impact
Looking ahead, Professor Lepora suggests that the next generation of tactile robotics – emerging from 2025 onwards – could see widespread commercial adoption. This shift matters not just for robotics specialists, but for engineers working across automation, AI, healthcare and manufacturing.
More capable robotic touch has the potential to:
- Enable human-like dexterity in robotic hands
- Improve our understanding of human intelligence and sensorimotor control
- Transform teleoperation and remote collaboration
- Support safer human–robot interaction in shared environments
As systems mature, tactile sensing is likely to become as standard as vision in autonomous machines.
About the event
Tactile Robotics: Past and Future takes place on Tuesday 22 April 2026, from 6:30–8:30pm, at the Merchant Venturers Building at the University of Bristol. The event is free to attend and includes time for networking with local engineers and researchers.
The lecture is led by Professor Nathan F. Lepora, who heads the Dexterous Robotics Group at Bristol Robotics Laboratory. The session also contributes towards IET Continuing Professional Development (CPD) hours.
Whether you are working directly in robotics or simply interested in how engineering is enabling more capable and intelligent machines, this event offers both technical insight and a clear view of where the field is heading.
If you’re curious about how robots might one day match (or even exceed) human capabilities in touch and manipulation, this is an excellent opportunity to explore one of robotics’ most important frontiers.
Registration details are available via the IET events page.
Join the conversation
- Where do you think tactile sensing will make the biggest difference over the next decade: industrial robotics, healthcare, humanoids, or remote operation?
- What do you see as the hardest technical challenge in giving robots a reliable sense of touch: sensor design, data interpretation, integration with control systems, or something else?
- Have you worked with tactile sensors, haptics, or soft robotics in practice, and what lessons did you learn?
- How important do you think touch will be alongside vision and AI in achieving truly intelligent robots?
Share your thoughts and experiences in the comments below. And if these questions resonate, the Tactile Robotics: Past and Future event is a great opportunity to explore them further with researchers and engineers working at the forefront of the field.