Research team co-led by CityU develops high-resolution wearable electro-tactile rendering device that virtualizes the sense of touch


A collaborative research team co-led by the City University of Hong Kong (CityU) has developed a wearable electro-tactile rendering system that can mimic the sense of touch with high spatial resolution and a fast response rate. The team demonstrated its potential for application in a braille display, for adding the sense of touch to the metaverse in functions such as virtual reality shopping and gaming, and for potentially making life easier for astronauts, deep-sea divers and others who must wear thick gloves.

“We can hear and see our families over long distances via phones and cameras, but we still can’t feel or hug them. We are physically isolated by space and time, especially during this long-lasting pandemic,” said Dr. Yang Zhengbao, associate professor in the Department of Mechanical Engineering at CityU, who co-led the study. “Although there have been great advances in the development of sensors that digitally capture tactile characteristics with high resolution and high sensitivity, we still lack a system that can effectively virtualize the sense of touch, one that can register and reproduce tactile sensations across space and time.”

In collaboration with Chinese tech giant Tencent’s Robotics X lab, the team developed a new electrotactile rendering system to display various tactile sensations with high spatial resolution and fast response rate. Their findings were published in the scientific journal Science Advances under the title “Super-resolution Wearable Electro-tactile Rendering System”.

Limits of existing techniques

Existing techniques for reproducing tactile stimuli fall broadly into two categories: mechanical and electrical stimulation. By applying mechanical force or localized vibration to the skin, mechanical actuators can produce stable and continuous tactile sensations. However, they tend to be bulky, which limits spatial resolution when they are integrated into a portable or wearable device. Electrotactile stimulators, on the other hand, which evoke tactile sensations by passing a local electrical current through the skin at the electrode location, can be lightweight and flexible while offering higher resolution and a faster response. But most of them rely on high-voltage direct-current (DC) pulses (up to hundreds of volts) to penetrate the stratum corneum, the outermost layer of the skin, and stimulate the receptors and nerves beneath, which raises safety concerns. In addition, the spatial resolution of electrotactile rendering still needs to be improved.

The latest electro-tactile actuator developed by the team is very thin and flexible and can easily be integrated into a finger cot. This fingertip wearable can display different tactile sensations, such as pressure, vibration and texture roughness, with high fidelity. Instead of using DC pulses, the team developed a high-frequency alternating stimulation strategy and succeeded in lowering the operating voltage to below 30 V, ensuring safe and comfortable tactile rendering.
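
For illustration only, the sketch below shows what a sub-30 V, high-frequency, charge-balanced drive signal could look like in Python; the carrier frequency, amplitude and pulse shape are assumptions for demonstration, not parameters reported by the team.

```python
# Minimal sketch (not the authors' implementation): a high-frequency,
# charge-balanced stimulation waveform capped below 30 V.
import numpy as np

def stimulation_waveform(duration_s=0.01, carrier_hz=4000, amplitude_v=28.0,
                         sample_rate=1_000_000):
    """Biphasic square-wave burst: alternating +/- phases keep the net charge
    injected into the skin near zero. All parameters here are illustrative."""
    t = np.arange(0, duration_s, 1.0 / sample_rate)
    waveform = amplitude_v * np.sign(np.sin(2 * np.pi * carrier_hz * t))
    assert np.abs(waveform).max() < 30.0, "stay below the 30 V safety ceiling"
    return t, waveform

t, v = stimulation_waveform()
print(f"{len(v)} samples, peak |V| = {np.abs(v).max():.1f} V")
```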

They also proposed a novel super-resolution strategy that can render tactile sensations at locations between the physical electrodes, rather than only at the electrode sites. This increases the spatial resolution of their stimulators by more than three times (from 25 to 105 points), so the user experiences a more realistic tactile perception.
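
The paper’s exact weighting scheme is not described here, but the basic idea of placing a “virtual” stimulus between electrodes can be sketched as sharing the drive intensity among the nearest electrodes of a grid. The 5 × 5 grid (25 physical points), pitch and bilinear weighting rule below are assumptions for illustration, not the scheme reported in the study.

```python
# Hedged sketch of the super-resolution idea: a tactile point between
# electrodes is approximated by splitting intensity among its neighbours.
import numpy as np

ELECTRODE_PITCH_MM = 2.0  # assumed spacing of the physical electrode grid

def virtual_point_weights(target_mm, grid_shape=(5, 5), pitch=ELECTRODE_PITCH_MM):
    """Per-electrode intensity map (0..1) that approximates a stimulus at
    target_mm = (x, y) by bilinear weighting of the four nearest electrodes."""
    weights = np.zeros(grid_shape)
    gx, gy = target_mm[0] / pitch, target_mm[1] / pitch   # position in grid units
    i0, j0 = int(np.floor(gx)), int(np.floor(gy))         # lower-left electrode
    fx, fy = gx - i0, gy - j0                             # fractional offsets
    for di, dj, w in [(0, 0, (1 - fx) * (1 - fy)), (1, 0, fx * (1 - fy)),
                      (0, 1, (1 - fx) * fy),       (1, 1, fx * fy)]:
        if 0 <= i0 + di < grid_shape[0] and 0 <= j0 + dj < grid_shape[1]:
            weights[i0 + di, j0 + dj] = w
    return weights

# A point midway between four electrodes drives each of them at 25% intensity.
print(virtual_point_weights((3.0, 3.0)))
```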

High spatial resolution tactile stimuli

“Our new system can elicit tactile stimuli with high spatial resolution (76 points/cm²), similar to the density of the related receptors in human skin, and a fast response rate (4 kHz),” said Mr. Lin Weikang, a PhD student at CityU, who fabricated and tested the device.

The team carried out various tests to demonstrate the application possibilities of the new wearable electrotactile rendering system. For example, they proposed a new braille strategy that is much easier for people with visual impairments to learn.

The proposed strategy breaks letters and numeric digits down into individual strokes, presented in the same order in which they are written. Wearing the electrotactile rendering system on the fingertip, the user can recognize the presented character by feeling the direction and sequence of the strokes. “It would be particularly useful for people who lose their sight later in life, allowing them to continue to read and write using the same alphabetic system they are used to, without needing to learn the whole braille dot system,” said Dr. Yang.
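
As a rough illustration of stroke-based rendering, a character could be mapped to an ordered list of directional cues and played back one stroke at a time. The stroke decompositions below are invented for this sketch, not the encoding used in the study.

```python
# Illustrative sketch only: stroke sequences are hypothetical examples.
from time import sleep

# Hypothetical stroke decompositions: each character becomes an ordered list
# of directions, roughly following how it is handwritten.
STROKES = {
    "L": ["down", "right"],
    "T": ["right", "down"],
    "7": ["right", "down-left"],
}

def render_character(char, present_stroke, stroke_gap_s=0.3):
    """Send each stroke of `char` to a stimulator callback, in writing order."""
    for direction in STROKES[char]:
        present_stroke(direction)   # e.g. sweep the active electrodes that way
        sleep(stroke_gap_s)         # brief pause so strokes are felt separately

# Dry run: print the cues instead of driving hardware.
render_character("L", lambda d: print(f"stroke: {d}"))
```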

Enabling touch in the metaverse

The new system is also well suited to VR/AR applications and games, adding the sense of touch to the metaverse. The electrodes can be made highly flexible and scaled up to cover larger areas, such as the palm. The team demonstrated that a user can virtually feel the texture of clothes in a virtual fashion store. The user also feels an itching sensation at the fingertip when licked by a VR cat. And when stroking the fur of a virtual cat, the user experiences variations in roughness as the strokes change direction and speed.
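
One common way such speed-dependent roughness can be rendered (an assumption here, not necessarily the team’s method) is to scale the stimulation frequency with stroke speed divided by the texture’s spatial period, as in the short sketch below; the texture period is an assumed value.

```python
def texture_frequency_hz(stroke_speed_mm_s, texture_period_mm=1.5):
    """Temporal frequency a fingertip would feel when sliding over a periodic
    texture: frequency = speed / spatial period (period is an assumed value)."""
    return stroke_speed_mm_s / texture_period_mm

for speed_mm_s in (20, 60, 120):  # faster strokes -> higher rendered frequency
    print(f"{speed_mm_s} mm/s -> {texture_frequency_hz(speed_mm_s):.0f} Hz")
```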

The system can also be useful for transmitting fine tactile detail through thick gloves. The team successfully integrated the thin, lightweight electrodes of the electrotactile rendering system with flexible tactile sensors on a safety glove. The tactile sensor array captures the pressure distribution on the outside of the glove and transmits the information to the user in real time through tactile stimulation. In the experiment, the user was able to quickly and accurately locate a tiny steel puck just 1 mm in radius and 0.44 mm thick, based on tactile feedback from the glove equipped with sensors and stimulators. This demonstrates the system’s potential to enable high-fidelity tactile perception that is currently unavailable to astronauts, firefighters, deep-sea divers and others who must wear thick protective suits or gloves.
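
Conceptually, the glove’s sensing-to-feedback path can be sketched as a simple real-time loop; the array size, pressure scale and hardware callbacks below are hypothetical stand-ins, not the interfaces of the real system.

```python
# Minimal sketch of the sensing-to-feedback loop: read the glove's pressure
# map and mirror it onto the fingertip stimulator in real time.
import time
import numpy as np

MAX_PRESSURE_KPA = 50.0  # assumed full-scale reading of the glove's sensor array

def feedback_loop(read_pressure_map, drive_electrodes, rate_hz=200, duration_s=1.0):
    """Map sensed pressure (kPa) to stimulation intensities (0..1) each cycle."""
    period = 1.0 / rate_hz
    for _ in range(int(rate_hz * duration_s)):
        pressure = read_pressure_map()                      # e.g. a 5x5 array, kPa
        intensity = np.clip(pressure / MAX_PRESSURE_KPA, 0.0, 1.0)
        drive_electrodes(intensity)                         # one value per electrode
        time.sleep(period)

# Dry run with random data standing in for real hardware.
feedback_loop(lambda: np.random.uniform(0, 50, (5, 5)),
              lambda levels: None, duration_s=0.05)
```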

“We expect our technology to benefit a wide range of applications, such as information transmission, surgical training, teleoperation and multimedia entertainment,” Dr. Yang added.

The co-first authors of the research are Mr. Lin, Mr. Zhang Dongsheng and Mr. Lee Wangwei of Tencent’s Robotics X Lab. The corresponding authors are Dr. Yang and Dr. Wei Lei of the Robotics X Lab. The research was supported by the Robotics X Lab and by the Hong Kong Research Grants Council’s General Research Fund and Early Career Scheme.
