Sensing an Evolution: How InterDigital’s Work in Multimodal Communications Brings Us Closer to Haptic and Emotional Wireless Internet
Over the years, wireless technology has evolved to communicate more of our human sensory information. The first cell phones leveraged our sense of hearing; subsequent generations added our sense of sight, letting us send and receive data and stream video. The next natural frontier for sensory connection is touch, and InterDigital teams leading Next Generation Networking research are committed to cutting-edge innovations that may one day equip wireless networks to transmit haptic and emotional data between users in real time.

Our modern cellular ecosystem is built upon the foundation of earlier generations of wireless. When cell phones first emerged in the era we now call 1G, they offered the functionality of landline phones, made mobile by analog radio technology. The advent of 2G made these calls digital and added SMS functionality and limited internet connectivity, while 3G added support for video calling, interactive gaming, and expanded wireless internet connectivity at comparatively low data rates. 4G increased data rates dramatically to support seamless high-definition video streaming nearly anywhere and to enable activities like interactive AR gaming. As we look toward 5G and 6G, wireless technology may evolve to engage human senses beyond sight and sound, adding tactile, haptic, and even emotional sensory capabilities.

Here's what that future might need to consider.

Getting in Touch

Our sense of touch is unique because it is capable of simultaneous input and output. While we can’t see our own eyes or smell our own nose, our sensory neurons can register being touched at the same time as they perceive and respond to the sensation of touching something, a phenomenon that makes replicating the haptic sense in technology very complex.

Touch is closely tied to human perception. Haptic perception refers to active touch: the human ability to manipulate things while receiving sensory feedback about the interaction, like searching for keys at the bottom of a bag in the dark. Haptic perception is shaped by two types of feedback: kinesthetic feedback, which provides information about force, torque, position, and velocity perceived by the body, and tactile feedback, which provides information about surface texture, friction, and more. Conversely, tactile perception is a more passive concept tied to the sensation of being touched, like when someone unexpectedly taps you on the shoulder to get your attention.

Haptic technology works to replicate the haptic sense by providing real-time tactile and force feedback to the user through a variety of sensor-enabled haptic devices, from gloves to body suits to other wearables. Rooted in industrial sensor technology, the sensing side of this technology is made possible by the emergence of 5G, with its extremely high availability, reliability, and ultra-low latency, and by the convergence of wireless sensor networks and cellular communications technology.

Because the human body reacts so quickly, the high reliability and ultra-low latency enabled by 5G are critical requirements for haptic technologies. The tactile internet demands latency of no more than one millisecond, roughly the time our nervous system takes to relay a sense of touch from the affected part of the body to the brain. Longer latencies can induce cybersickness or nausea in users, and high reliability is integral to mitigating information loss in sensitive activities like remote surgery.
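
To appreciate how demanding a one-millisecond budget is, consider a rough back-of-envelope calculation. The sketch below is illustrative: the radio, processing, and fiber-speed figures are assumptions rather than measured values, and it simply shows how little of the budget is left for distance.

    # Rough latency-budget sketch (illustrative assumptions, not measured values).
    # Light travels through optical fiber at roughly 200 km per millisecond,
    # so whatever remains of the 1 ms budget after radio access and processing
    # caps how far away the serving compute node can be.

    BUDGET_MS = 1.0          # tactile internet end-to-end latency target
    RADIO_MS = 0.5           # assumed radio access overhead (up + down)
    PROCESSING_MS = 0.2      # assumed sensing/actuation processing time
    FIBER_KM_PER_MS = 200.0  # approximate signal speed in optical fiber

    propagation_ms = BUDGET_MS - RADIO_MS - PROCESSING_MS      # time left in budget
    max_distance_km = (propagation_ms / 2) * FIBER_KM_PER_MS   # halve for round trip
    print(f"Serving node must be within ~{max_distance_km:.0f} km")  # -> ~30 km

Under these assumptions, the feedback loop must close within a few tens of kilometers of the user, which is why edge computing figures so prominently in the architecture discussion below.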

These capabilities unlock exciting new use cases, including several industrial/robotic control applications, interactive gaming, education and edutainment applications, automotive features, and more. Because neurons associated with touch respond more quickly than those associated with visual or auditory stimuli, haptics have proven especially effective at delivering alerts to guide visually- or hearing-impaired people.

Incorporating Emotional Feedback

As we look toward 6G use cases, emerging sensory capabilities might operate beyond the human-to-computer interactions we see today and expand into emotional sensing and feedback. Networks could be equipped to consider the emotional state of users based on feedback from ambient sensors, wearable devices, and information collected from other sources. Together, these devices can provide key data points that measure things like body position and posture, analyze user mood from social media engagement, or conduct biomolecular sensing to determine a person's emotional state. It’s no surprise that the next generation of sensory applications, extended reality (XR), and other intelligent systems will incorporate this kind of technology to enhance our immersive experiences.

For example, professional athletes can benefit from haptic feedback and emotional data to better implement and tailor virtual training. In this 5G-enabled scenario, a coach can remotely monitor an athlete’s training session based on data collected from wearable and ambient sensors, while emotional information collected during the session provides live feedback to adjust the training to each athlete’s needs or temperament.

In a practical setting, an athlete training for the Olympics might leverage multimodal sensory data collected from 5G-enabled smart garments, covering metrics like muscle fatigue, temperature, blood oxygen level, sweat, and heart rate, to enhance performance. To support these communications, a mobile network operator (MNO) must deliver service data flows for multimodal information to one common session over different devices or pieces of user equipment (UE), enabling the athlete to receive a haptic alert on their glove while receiving related audio via a headset.

Use Case: Multimodal feedback for remote athlete training
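
A minimal sketch of what such a common session might track is shown below. The class and field names are hypothetical, chosen for illustration; they do not correspond to an actual 3GPP interface.

    # Hypothetical model of one multimodal session spanning several UEs.
    from dataclasses import dataclass, field

    @dataclass
    class MediaFlow:
        modality: str          # "haptic", "audio", or "video"
        device_id: str         # the UE rendering this modality
        max_latency_ms: float  # per-modality latency bound
        reliability: float     # target packet delivery ratio

    @dataclass
    class MultimodalSession:
        session_id: str
        flows: list[MediaFlow] = field(default_factory=list)

        def sync_bound_ms(self) -> float:
            """Tightest latency bound across flows; keeping every flow
            within it keeps the modalities perceptually synchronized."""
            return min(f.max_latency_ms for f in self.flows)

    # One common session over two devices, as in the training scenario:
    session = MultimodalSession("athlete-session", [
        MediaFlow("haptic", "glove-ue", max_latency_ms=1.0, reliability=0.99999),
        MediaFlow("audio", "headset-ue", max_latency_ms=100.0, reliability=0.999),
    ])
    print(session.sync_bound_ms())  # -> 1.0

The key design point is that the network treats the glove and the headset as parts of one logical session, so their flows can be scheduled and synchronized together rather than handled as unrelated streams.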

To accommodate these types of use cases and experiences, both user equipment and system architecture models must evolve. To achieve the latency necessary to prevent cybersickness, innovations in edge computing architecture help bring data closer to the end user, while network slicing solutions help increase the resources available to these use cases. In the future, we hope to design network slices capable of delivering modalities to satisfy specific user requirements on the fly, combining resources for audio, video, and haptic feeds to increase the reliability and performance of the service.
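
A sketch of that on-the-fly composition might look like the following, where the per-modality requirement values are illustrative assumptions rather than standardized KPIs:

    # Illustrative per-modality requirements (assumed values, not standardized).
    MODALITY_REQUIREMENTS = {
        "haptic": {"latency_ms": 1.0,   "reliability": 0.99999, "mbps": 1.0},
        "audio":  {"latency_ms": 100.0, "reliability": 0.999,   "mbps": 0.3},
        "video":  {"latency_ms": 150.0, "reliability": 0.99,    "mbps": 25.0},
    }

    def compose_slice(modalities: list[str]) -> dict:
        """Derive one slice spec that satisfies every requested modality:
        tightest latency, strictest reliability, summed bandwidth."""
        reqs = [MODALITY_REQUIREMENTS[m] for m in modalities]
        return {
            "latency_ms": min(r["latency_ms"] for r in reqs),
            "reliability": max(r["reliability"] for r in reqs),
            "mbps": sum(r["mbps"] for r in reqs),
        }

    print(compose_slice(["haptic", "audio", "video"]))
    # -> {'latency_ms': 1.0, 'reliability': 0.99999, 'mbps': 26.3}

Note how the strictest modality dominates the slice’s latency and reliability targets, which is one reason a combined audio-video-haptics slice is so much harder to provision than a video-only one.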

Shaping Standards for our Haptic Sense

Still, there are several hurdles to overcome before we can enjoy this future. Interoperability challenges have arisen alongside these new capabilities and system architectures, and several industry standards groups have turned their attention to crafting solutions.

Haptics research remains in its early stages, and the standards development process has been spread across different standards bodies. For example, in 3GPP Release 18, the service requirements group (SA1) has introduced and is currently studying several multimodality communications features. In the 3GPP system architecture group, companies have begun introducing study items for multimodality communications, specifically exploring how the 3GPP system can evolve to deliver it and how that might impact the standard. Additional initiatives are underway in standards bodies like IEEE, IETF, ISO/IEC MPEG, and ITU-T. The first IEEE activities on the tactile internet began in 2016 with the IEEE 1918.1 working group, following the first ITU-T reference report on the topic in 2014, and ISO/IEC MPEG is currently exploring a haptics standard addressing applications like mobile communications, gaming, and safety.

In addition to shaping technology capabilities, standards bodies are also defining service requirements and key performance indicators (KPIs) with direct input from engineers and neuroscientists who intimately understand the minimum delay humans can perceive in a tactile or haptic scenario. For applications involving robotics, a separate set of experts is consulted, as the required latencies may drop to levels humans cannot perceive, as low as 125 microseconds.

Collaborating on What's Next

For haptics and multimodal communications, the standards development process must be an interdisciplinary effort. Networking and wireless technology companies will need to specify things like latencies in the core and radio networks, but tactile internet applications may require contributions and expertise beyond traditional communications, internet, and computing technology industries. For example, physicians and medical device manufacturers will need to be involved in technology and standards development for remote surgery applications, while behavioral psychologists should provide insights into emotional sensing technologies.

As we welcome the next generation of wireless and look toward 6G, we recognize that the wireless internet experience of the future will be much more complex than it is today. At InterDigital, we are excited to contribute to the emergence of technology that enables more autonomous intelligent systems and multi-sensory experiences, letting us experience and interact with more than the sights and sounds we sense today.