Haptic technology that can simulate touch
Northwestern University engineers recently revealed a device that produces precise, programmable movements to mimic complex sensations, paving the way for haptic technology that simulates real touch.
While sitting on the skin, the compact, lightweight, wireless device applies force in any direction to generate a variety of sensations, such as vibrations, stretching, pressure, sliding, and twisting. This device, detailed in a study published in the journal Science, is also capable of combining sensations and operating fast or slow to simulate a more nuanced sense of touch.
The device is powered by a small rechargeable battery and uses Bluetooth to connect wirelessly to virtual reality headsets and smartphones. Because it is small and efficient, it can be placed anywhere on the body, combined with other actuators in arrays or integrated into existing wearable electronics.
The vision is for it to enhance virtual experiences, help individuals with visual impairments navigate their surroundings, reproduce the feeling of different textures on flat screens for online shopping, provide tactile feedback for remote healthcare visits, and even allow people with hearing impairments to ‘feel’ music.
“Almost all haptic actuators really just poke at the skin,” said John A. Rogers from Northwestern, who led the device design. “But skin is receptive to much more sophisticated senses of touch. We wanted to create a device that could apply forces in any direction — not just poking but pushing, twisting and sliding. We built a tiny actuator that can push the skin in any direction and in any combination of directions. With it, we can finely control the complex sensation of touch in a fully programmable way.”
Rogers is the Louis A. Simpson and Kimberly Querrey Professor of Materials Science and Engineering, Biomedical Engineering, and Neurological Surgery, with appointments in the McCormick School of Engineering and Northwestern University Feinberg School of Medicine. Rogers co-led the work with Northwestern’s Yonggang Huang, the Jan and Marcia Achenbach Professor in Mechanical Engineering and professor of civil and environmental engineering at McCormick. Northwestern’s Kyoung-Ho Ha, Jaeyoung Yoo and Shupeng Li are the study’s co-first authors.
Challenges of haptic technologies
This research comes as visual and auditory technologies have grown explosively, delivering immersion through devices such as high-fidelity surround-sound speakers and fully immersive virtual reality goggles. Haptic technologies, however, have plateaued; most modern systems offer little more than buzzing patterns of vibration.
This developmental gap stems from the complexity of human touch. The sense of touch involves different types of mechanoreceptors, or sensors, each with its own sensitivity and response characteristics, located at varying depths within the skin. When these mechanoreceptors are stimulated, they send signals to the brain, which interprets them as touch.
Replicating this sophistication requires precise control over the type, magnitude and timing of the stimuli delivered to the skin, a significant challenge that current technologies are struggling to overcome.
“Part of the reason haptic technology lags video and audio in its richness and realism is that the mechanics of skin deformation are complicated,” said J. Edward Colgate from Northwestern, a haptics pioneer and study co-author. “Skin can be poked in or stretched sideways. Skin stretching can happen slowly or quickly, and it can happen in complex patterns across a full surface, such as the full palm of the hand.”
To simulate the complexity of touch, the researchers developed the first actuator with full freedom of motion (FOM). The actuator is not constrained to a single type of movement or limited set of movements, and can instead move and apply forces in all directions along the skin. In doing so, these forces engage all mechanoreceptors in the skin, individually and in combination with one another.
“It's a big step toward managing the complexity of the sense of touch,” added Colgate, Walter P. Murphy Professor of Mechanical Engineering at McCormick. “The FOM actuator is the first small, compact haptic device that can poke or stretch skin, operate slow or fast, and be used in arrays. As a result, it can be used to produce a remarkable range of tactile sensations.”
The device measures just a few millimetres across and harnesses a tiny magnet and a set of wire coils arranged in a nested configuration. As electricity flows through the coils, they generate a magnetic field; when this field interacts with the magnet, it produces a force strong enough to move, push, pull or twist the magnet. By combining actuators into arrays, the team could reproduce the feeling of pinching, stretching, squeezing and tapping.
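To make the idea of a fully programmable force more concrete, the sketch below treats the actuator as an idealised linear system in which the force on the magnet is proportional to the currents driven through three orthogonal coils. The coupling matrix, its coefficient values and the helper function are illustrative assumptions for this example, not the design reported in the study.

```python
import numpy as np

# Idealised linear model (an assumption for illustration, not the paper's
# actual design): the force on the magnet is approximately F = K @ i, where
# i holds the three coil currents and K is a coupling matrix set by the coil
# geometry and the magnet's field. The coefficients below are placeholders.
K = np.array([
    [2.0e-3, 0.0,    0.0],    # x-coil mainly produces in-plane (shear) force  [N/A]
    [0.0,    2.0e-3, 0.0],    # y-coil mainly produces the orthogonal shear    [N/A]
    [0.0,    0.0,    5.0e-3], # z-coil mainly produces normal (poking) force   [N/A]
])

def currents_for_force(f_desired):
    """Solve K @ i = f_desired for the coil currents (least squares)."""
    i, *_ = np.linalg.lstsq(K, f_desired, rcond=None)
    return i

# A sliding sensation calls for mostly in-plane (shear) force ...
shear = currents_for_force(np.array([1.0e-3, 0.0, 0.0]))  # 1 mN along x
# ... while a poke calls for force normal to the skin.
poke = currents_for_force(np.array([0.0, 0.0, 2.0e-3]))   # 2 mN into the skin
print("Coil currents for shear [A]:", shear)
print("Coil currents for poke  [A]:", poke)
```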
“Achieving both a compact design and strong force output is crucial,” said Huang. “Our team developed computational and analytical models to identify optimal designs, ensuring each mode generates its maximum force component while minimising unwanted forces or torques.”
On the other side of the device, the team added an accelerometer, which allows the device to gauge its orientation in space. Using this information, the system can provide haptic feedback based on the user’s context. If the actuator is on a hand, for instance, the accelerometer can detect whether the user’s hand is palm up or palm down. The accelerometer can also track the actuator’s movement, providing information about its speed, acceleration and rotation.
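As a rough illustration of how an accelerometer reading could supply that context, the snippet below classifies palm orientation from the gravity component along the device's z-axis. The axis convention, threshold and labels are assumptions made for this example, not details taken from the study.

```python
import numpy as np

# Illustrative only: classify palm orientation from a 3-axis accelerometer
# sample (m/s^2), assuming the device sits on the back of the hand with its
# z-axis pointing away from the skin. Not the study's firmware.
GRAVITY = 9.81  # m/s^2

def palm_orientation(accel_xyz):
    """Return a coarse orientation label from one accelerometer sample."""
    z = accel_xyz[2]
    if z > 0.5 * GRAVITY:
        return "palm down"
    if z < -0.5 * GRAVITY:
        return "palm up"
    return "on edge or in motion"

print(palm_orientation(np.array([0.3, 0.1, 9.7])))    # -> palm down
print(palm_orientation(np.array([0.2, -0.4, -9.6])))  # -> palm up
```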
“If you run your finger along a piece of silk, it will have less friction and slide faster than when touching corduroy or burlap,” explained Rogers. “You can imagine shopping for clothes or fabrics online and wanting to feel the texture.”
Beyond replicating everyday tactile experiences, the platform can also transfer information through the skin. By changing the frequency, intensity and rhythm of the haptic feedback, the team converted the sound of music into physical touch. They were also able to alter tones simply by changing the direction of the vibrations, and feeling these vibrations allowed users to differentiate between various instruments.
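One plausible way to perform that kind of mapping is sketched below: each short audio frame is reduced to an intensity (from its amplitude) and a vibration frequency (from its dominant spectral peak, compressed into a band the skin resolves well), with direction left as a free channel that could, for example, distinguish instruments. The 20 to 300 Hz tactile band, the frequency scaling and the frame-based approach are assumptions for this illustration, not the team's published pipeline.

```python
import numpy as np

# Rough sketch of a music-to-haptics mapping, for illustration only.
def audio_frame_to_haptics(frame, sample_rate, direction=(1.0, 0.0, 0.0)):
    """Map one short audio frame to (vibration frequency, intensity, direction)."""
    # Intensity from the frame's RMS amplitude (0..1 for normalised audio).
    intensity = float(np.sqrt(np.mean(frame ** 2)))
    # Dominant frequency from the magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    dominant = float(freqs[np.argmax(spectrum)])
    # Compress the audible pitch into a band the skin resolves well (assumed 20-300 Hz).
    tactile_freq = float(np.clip(dominant / 10.0, 20.0, 300.0))
    return tactile_freq, intensity, direction

# Example: a 50 ms frame of a 440 Hz tone at a 44.1 kHz sample rate.
sr = 44_100
t = np.arange(int(0.05 * sr)) / sr
tone = 0.5 * np.sin(2 * np.pi * 440.0 * t)
print(audio_frame_to_haptics(tone, sr))  # ~44 Hz vibration at moderate intensity
```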
“We were able to break down all the characteristics of music and map them into haptic sensations without losing the subtle information associated with specific instruments,” Rogers explained. “It’s just one example of how the sense of touch could be used to complement another sensory experience. We think our system could help further close the gap between the digital and physical worlds. By adding a true sense of touch, digital interactions can feel more natural and engaging.”