Posts by Roya Stephens

December 16, 2019 / IEEE / Posted By: Roya Stephens

This week, InterDigital became the latest donor to the IEEE Information Theory Goldsmith Lecture Program, an award created to highlight the achievements of exceptional female researchers early in their careers, while providing opportunities to acknowledge and publicize their work. With our donation to the program, established this year in honor of Dr. Andrea Goldsmith, InterDigital stands alongside Microsoft, Intel, Nokia, Google and others across the tech ecosystem in elevating female researchers and supporting diversity and inclusion both within and outside of our business.

InterDigital knows that, despite our modest size, a diversity of backgrounds and perspectives enables us to approach big problems, conduct complex research, and develop effective solutions that work across ecosystems and experiences. We feel a strong responsibility to acknowledge the benefits of diversity in engineering and innovation, while also encouraging gender diversity among researchers in engineering and the field of information theory.

As part of the Goldsmith Lecture Program, each yearly award recipient will deliver a lecture of her choice to students and postdoctoral researchers at one of the IEEE Information Theory Society’s (ITSoc’s) Schools of Information Theory. In addition to driving inclusion and diversity of thought within the IEEE and the ITSoc, the Goldsmith Lecture Program helps provide more visibility and acknowledgment of the contributions of female researchers to technology. The impact of this program extends beyond IEEE and InterDigital, giving female researchers a platform to share their work while inspiring and encouraging new, more diverse students to explore and innovate with technology.

InterDigital is proud to support the Goldsmith Lecture Program, and we congratulate the 2020 award recipient, Ayfer Özgür. Ayfer is currently an assistant professor in Stanford University’s Electrical Engineering Department, and conducted her postdoctoral research with the Algorithmic Research on Networked Information Group.

To learn more about the IEEE Information Theory Society and the Goldsmith Lecture Program, please visit:

October 22, 2019 / Posted By: Roya Stephens

As a company dedicated to advanced research and development in wireless and video, we know that our innovations are only enhanced by partnerships with industry leaders and respected academic institutions. That’s why InterDigital is so proud that Dr. Mohammed El-Hajjar has been awarded the Royal Academy of Engineering (RAEng) Industrial Fellowship to work alongside InterDigital to support the evolution of 6G technology.

Dr. El-Hajjar, a professor at the School of Electronics and Computer Science at the University of Southampton, was recently awarded the prestigious RAEng Industrial Fellowship to spearhead a joint project with InterDigital to advance the research and design of transmission receivers for 6G wireless systems. Just 19 professors and researchers were awarded Industrial Fellowships this year, based on the caliber of the research proposal, direct support from an industry partner, and a clear and significant impact on the industry. Other engineering fellowships were awarded for innovative research in plastic waste recycling, carbon dioxide capture, 3D-reconstructed human skin, and more.

The Royal Academy of Engineering Industrial Fellowship

Being awarded the coveted RAEng Industrial Fellowship is an excellent acknowledgment of the proposed benefit our research will bring to industry and validation of InterDigital’s industry leadership in developing the foundations for 5G and 6G networks.

“InterDigital is proud to work alongside Dr. Mohammed El-Hajjar on this project and join the more than 50 industrial partners that have supported RAEng Industrial Fellowship recipients over the past five years,” said Dr. Alain Mourad, Director Engineering R&D at InterDigital. “This is a nice validation of InterDigital’s industry leadership in developing the foundations for 5G and 6G networks alongside our global partners and top-of-class university professors.”

During his fellowship with InterDigital, Dr. El-Hajjar’s research will build upon several years of collaboration with InterDigital to advance the research and design of wireless transceivers for 6G systems. Specifically, Dr. El-Hajjar will design and develop new signal processing techniques based on the concept of Holographic Multiple-Input Multiple-Output (MIMO), which enables unprecedented data rates up to Terabits per second whilst mitigating the challenges of complexity, energy consumption and cost of large antenna arrays in Massive MIMO. The value of this collaborative research will be foundational for the long-term evolution of 5G into 6G.

“With mobile subscribers continuing to demonstrate an insatiable demand for data and billions of smart wireless devices predicted in future services for smart homes, cities, transport, healthcare and environments, the explosive demand for wireless access will soon surpass the data transfer capacity of existing mobile systems,” said Dr. El-Hajjar. “Achieving the vision of fiber-like wireless data rates relies on efficiently harnessing the benefits of massive MIMO and millimeter wave frequencies. A major challenge for achieving this vision is the design trade-off of the underlying cost, complexity and performance requirements of massive MIMO in future wireless communications.”

As a result, Dr. El-Hajjar’s research on Holographic MIMO will improve upon the current, state-of-the-art Massive MIMO framework. Today’s 5G New Radio (NR) networks have largely adopted Massive MIMO, a concept in which base stations are equipped with an array of antennas to simultaneously serve many terminals with the same time-frequency resource. Massive MIMO utilizes hybrid digital-analogue beamforming, in which the number of users, or streams, depends on the number of available radio frequency chains. While Massive MIMO has enabled high energy and spectral efficiency, scalability to the number of base station antennas, and the ability to employ simple data processing at the transmitter and receiver edge, this method faces several hardware impairments. Namely, hybrid beamforming requires a significant number of radio frequency chains and faces inaccuracies in the angular resolution of phase shifters in analogue beamforming.
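The two constraints described above can be made concrete with a short sketch. This is an illustration of the general principles only, not of Dr. El-Hajjar's or InterDigital's design: the phase-shifter weights of a uniform linear array steer a beam toward one direction, while the number of simultaneously served streams is bounded by the number of RF chains rather than the (much larger) number of antennas. All array sizes and angles below are assumed for illustration.

```python
import cmath
import math

def steering_vector(n_antennas, angle_deg, spacing=0.5):
    """Phase-shifter weights steering a uniform linear array toward
    angle_deg; spacing is element spacing in wavelengths (0.5 = half-wave)."""
    theta = math.radians(angle_deg)
    return [cmath.exp(-2j * math.pi * spacing * n * math.sin(theta))
            for n in range(n_antennas)]

def array_gain(weights, angle_deg, spacing=0.5):
    """Power gain of the steered array toward angle_deg, relative to a
    single antenna (reaches len(weights) in the steered direction)."""
    theta = math.radians(angle_deg)
    response = sum(w.conjugate() * cmath.exp(-2j * math.pi * spacing * n * math.sin(theta))
                   for n, w in enumerate(weights))
    return abs(response) ** 2 / len(weights)

w = steering_vector(64, angle_deg=30)
print(round(array_gain(w, 30), 1))   # full array gain of 64 toward the beam
print(array_gain(w, -10) < 1.0)      # strongly attenuated off-beam

# In hybrid digital-analogue beamforming, the stream count is limited by
# the RF chains, not the antenna count (illustrative numbers):
n_antennas, n_rf_chains, n_users = 64, 8, 12
max_streams = min(n_rf_chains, n_users)
print(max_streams)                   # -> 8
```

The gap between 64 antennas and 8 RF chains in this toy example is exactly the hardware bottleneck that Holographic MIMO, discussed below, aims to relax.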

Holographic MIMO

Dr. El-Hajjar and InterDigital’s joint project will center around the concept of Holographic MIMO, a new and dynamic beamforming technique that uses a software-defined antenna to help lower the costs, size, weight, and power requirements of wireless communications. In other words, the Holographic MIMO method implements a phased antenna array in a conformable and affordable way so that each antenna array has a single radio frequency input and a distribution network to vary the directivity of the beamforming. Utilizing machine learning tools within the Holographic MIMO design ensures a high level of adaptability and reduction of signal overhead at the transmitter and receiver levels, while enabling support for Massive MIMO that is 10 times greater than what is available in 5G NR today.

The Holographic MIMO technique will be foundational to the long-term evolution of 5G into 6G networks. Though this technique has a timeframe of five to ten years before it matures and can be implemented in future iterations of 5G NR, our collaborative research will enable unprecedented data rates while mitigating the challenges of cost, complexity, and energy consumption presented by large antenna arrays in Massive MIMO situations. This fellowship project also aligns with InterDigital’s ongoing research on meta-materials based large intelligent surfaces with the 6G Flagship program at the University of Oulu, as large intelligent surfaces include large antenna arrays that would require techniques like Holographic MIMO to support efficient and operational beamforming.

The year-long Industrial Fellowship will run until September 2020, and InterDigital’s collaboration with the University of Southampton on Beyond 5G intelligent holographic MIMO extends through 2022 as part of InterDigital’s sponsorship of a three-year PhD studentship program at the university.

September 26, 2019 / Posted By: Roya Stephens

InterDigital marked its debut at the IBC trade show in Amsterdam by showcasing five cutting-edge video demonstrations and taking home a Best of Show award for its Digital Double technology

A first impression is a lasting one. At last week’s International Broadcasting Convention (IBC) trade show in Amsterdam, InterDigital not only made its debut as a company with expertise and advanced research in wireless and video technologies, but also left a lasting impression with our award-winning technologies and contributions to immersive video.

Engineers from InterDigital's Home Experience, Imaging Science, and Immersive Labs at IBC 2019


Throughout the week, engineers from InterDigital R&I’s Home Experience, Immersive, and Imaging Science Labs in Rennes, France displayed their contributions to next-generation video coding standards, volumetric video frameworks, compression schemes, and streaming applications, as well as a cutting-edge tool to automate the creation of digital avatars in VFX, gaming, VR, and other video applications. At the end of the five-day convention, InterDigital received a prestigious prize and recognition of our significant work to enable immersive video and streaming capabilities of the future.


InterDigital Wins Best of Show for the Digital Double  

InterDigital received the IBC Best of Show award, presented by TVB Europe for innovations and outstanding products in media and entertainment, for our cutting-edge “Digital Double” technology. Developed in InterDigital’s Immersive Lab, the Digital Double tool improves upon the traditionally time- and labor-intensive 3D avatar creation process to automatically create a person’s digital avatar in less than 30 minutes! Although the Digital Double technology completely automates the avatar creation process, it also gives users the option to pause and manually fine-tune the avatar at each step. Using a rig of 14 cameras, the technology computes a full 3D mesh of a person’s face and upper body from the cameras’ images to create more human-like avatars and a precise set of facial expressions for animation.

Bernard Denis and Fabien Danieau hold the IBC Best of Show award for the digital double
As we enter the 5G era of ultra-low latency and high bandwidth, video viewers will desire, and be able to enjoy, more immersive video experiences, and our Digital Double tool will become increasingly important to content producers.    

The Best of Show award recognized the Digital Double’s potential to enhance immersive video opportunities of the future, where individuals could see themselves in real-time as a character in a film or on television, or even virtually participate in a game show alongside a presenter, contestants, and audience on screen. The Digital Double technology started at the highest end of the market, in Hollywood film production, and is likely to eventually make its way into the consumer mainstream.

The Digital Double’s foundational facial animation control for expression transfer (FACET) technology has already been used by production companies like Disney and Paramount in blockbuster films such as The Jungle Book remake and The Shape of Water. We are excited to explore this award-winning tech’s applications in virtual reality, gaming, and other immersive experiences where an individual’s digital avatar can be adapted to each context.



InterDigital’s Contributions to Tech Innovation in the Digital Domain

In addition to the Digital Double technology, InterDigital’s Research and Innovation teams displayed their advanced research to support next-generation and future video streaming capabilities. Laurent Depersin, Director of the InterDigital R&I Home Experience Lab, provided an overview of InterDigital’s contributions to video innovations during a panel discussion on “Technological Innovation in the Digital Domain.” Laurent spoke alongside peers from VoiceInteraction and Haivision to explore the innovations needed to support high resolution and intensive data applications for the video content of the future. You may view Laurent’s panel discussion here.

During his presentation, Laurent outlined new video applications that drive the need for technological innovation, as well as InterDigital’s Home Experience Lab’s commitment to develop technologies that both connect and improve user experience in the home. Laurent identified mass increases in video consumption, the popularity of interactive and immersive content like VR and gaming, and the trend towards ultra-high bandwidth and ultra-low latency content in the form of immersive communication and 8K video, as the key drivers of InterDigital’s innovative work in video technology.

Laurent Depersin outlines technological innovation in the digital domain

Versatile Video Coding: Improving on the High-Efficiency Video Coding (HEVC) Standard

Lionel Oisel demonstrates the enhanced capabilities of the VVC standard

InterDigital’s demonstration on Versatile Video Coding (VVC), presented by Michel Kerdranvat and Imaging Science Lab Director Lionel Oisel, reflects our work to develop cutting-edge tools that analyze, process, present, compress, and render content to improve the production and delivery of high-quality images.

The InterDigital R&I lab’s contribution to the VVC standard enhances the video compression efficiency of the existing High-Efficiency Video Coding (HEVC) standard published in 2013. Specifically, its demonstration compared the HEVC and VVC video standards and showed how VVC can compress and improve video delivery by lowering the bandwidth and bitrate required for Standard Dynamic Range (SDR), High Dynamic Range (HDR) and immersive, 360-degree video content.    
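The practical effect of a more efficient codec is easy to quantify. The sketch below is an illustration under stated assumptions, not a measurement from InterDigital's demo: the stream bitrate of 25 Mbps and the ~50% efficiency target (VVC's widely cited design goal over HEVC) are assumed for the example, and real savings vary with content, configuration, and encoder maturity.

```python
def bitrate_at_equal_quality(reference_bitrate_mbps, savings_fraction):
    """Bitrate a more efficient codec needs for comparable visual
    quality, given its fractional saving over the reference codec."""
    return reference_bitrate_mbps * (1.0 - savings_fraction)

# Illustrative only: a 4K HDR stream encoded at 25 Mbps with HEVC,
# assuming VVC achieves a 50% compression-efficiency gain.
vvc_mbps = bitrate_at_equal_quality(25.0, 0.5)
print(vvc_mbps)  # -> 12.5
```

Halving the bitrate at the same quality is what lets a network deliver SDR, HDR, and 360-degree content to more viewers over the same infrastructure.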


The Need for Point Cloud Compression for Immersive Video  

The InterDigital Imaging Science Lab’s demo on Point Cloud Compression, presented by Céline Guede and Ralf Schaefer, built upon the HEVC video coding standard to showcase the vital need for video compression mechanisms to enjoy increasingly immersive and interactive video experiences in VR, AR, and 3D imagery.        

Point Clouds are sets of tiny “points” grouped together to make a 3D image. Point Cloud has become a popular method for AR and VR video composition, 3D cultural heritage and modeling, and geographic maps for autonomous cars. While this method has many benefits, it is important to remember that each Point Cloud video frame typically has 800,000 points, which translates to roughly 1.5 Gbps uncompressed – a massive amount of video bandwidth. To address this challenge, our Imaging Science Lab has participated in the development of a Point Cloud Compression method being standardized in MPEG to support widespread industry adoption of the Point Cloud format for immersive video. InterDigital showcased its video-based Point Cloud Compression capabilities in a Point Cloud-created AR video demo streamed to a commercially available smartphone in real time. This technique will support the crisp, low-latency deployment of immersive video experiences through existing network infrastructure and devices.
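The raw bandwidth figure follows directly from the per-point cost. The sketch below uses the 800,000 points per frame cited above, but the bits per point (30 bits of geometry plus 24 bits of color) and the 30 fps frame rate are illustrative assumptions, which is why it lands near, rather than exactly on, the figure in the text.

```python
def uncompressed_pointcloud_mbps(points_per_frame, bits_per_point, fps):
    """Raw bandwidth of an uncompressed point-cloud stream, in megabits/s."""
    return points_per_frame * bits_per_point * fps / 1e6

# Assumed for illustration: 30-bit geometry + 24-bit color per point, 30 fps.
rate = uncompressed_pointcloud_mbps(800_000, 30 + 24, 30)
print(f"{rate:.0f} Mbps")  # on the order of a gigabit per second
```

A compression scheme like MPEG's video-based Point Cloud Compression exists precisely to shrink that gigabit-scale raw stream down to something deliverable over today's networks.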

Ralf Schaefer displays Point Cloud compression on a commercially available smartphone

The Challenges and Potential for Volumetric Video  

In concert with our efforts to compress and deliver high bandwidth video, InterDigital R&I’s Immersive Lab also demonstrated its innovative work to enhance immersive experiences that meet our interactive media demands. To give context to the importance of its technological contributions, Immersive Lab Technical Area Leader Valérie Allié delivered a presentation on the challenges and potential of volumetric video and the various applications in which it might be deployed.  


Valérie Allié delivers a presentation on the opportunities of volumetric video content


Volumetric video is hailed as the next generation of video content where users can feel the sensations of depth and parallax for more natural and immersive video experiences. As AR, VR, and 3D video become a more mainstream consumer demand, providers will require tools to deliver the metadata necessary to produce a fluid, immersive or mixed reality video experience from the perspective of each viewer. As a result, content providers may face challenges in maintaining high video quality while supporting user viewpoint adaptation and low latency.  

MPEG Metadata for Immersive Video: A Roadmap for Volumetric Video Distribution  

Valérie Allié and Julian Fleureau’s demo on MPEG Metadata for Immersive Video outlined both the steps to create volumetric video and the requisite format for its distribution. Unlike flat 2D video experiences, volumetric video is much larger and cannot be streamed over traditional networks. In addition, volumetric video requires the capture of real video through camera rigs, the development of computer-generated content, the creation of a composite film sequence using VFX tools, and the interpolation of a video’s view to create a smooth, unbroken rendering of immersive content from the user’s point of view.                    


Addressing the Challenges of Six Degrees of Freedom (6DoF) Streaming

Visitor experiences InterDigital's 6DoF streaming video capabilities on a VR Headset

The significance of the MPEG codec for immersive and volumetric video was put on display in the InterDigital R&I Home Experience Lab’s Six Degrees of Freedom (6DoF) streaming demo, presented by Charline Taibi and Rémi Houdaille. 6DoF refers to the six movements of a viewer in a 3D context, including heave for up and down movements, sway for left and right movements, surge for back and forward movements, yaw for rotation along the normal axis, pitch for rotation along the transverse axis, and roll for rotation along the longitudinal axis.
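The six movements enumerated above amount to a small, fixed data structure for a viewer's pose. The sketch below is an illustrative representation (the class name, field types, and units are assumptions, not part of any MPEG format):

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """A viewer pose with six degrees of freedom: three translational
    (surge, sway, heave) and three rotational (roll, pitch, yaw)."""
    surge: float  # back/forward translation (m)
    sway: float   # left/right translation (m)
    heave: float  # up/down translation (m)
    roll: float   # rotation about the longitudinal axis (deg)
    pitch: float  # rotation about the transverse axis (deg)
    yaw: float    # rotation about the normal axis (deg)

origin = Pose6DoF(0, 0, 0, 0, 0, 0)
# A viewer leaning slightly forward and looking down:
leaning = Pose6DoF(surge=0.2, sway=0.0, heave=-0.05, roll=0, pitch=10, yaw=0)
```

A 6DoF streaming system continually reads a pose like this from the headset and uses it to decide which parts of the volumetric scene to fetch and render.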

Using a computer-generated video streamed through a VR headset, the demonstration showed how the standards and codecs developed by InterDigital’s labs can be utilized to stream fully immersive volumetric video with six degrees of freedom over current network infrastructure.

The demonstration achieved a seamless and immersive experience by streaming only the content within the viewer’s point of view.
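The idea of streaming only what the viewer can see can be sketched as a simple visibility filter. This is a toy illustration of viewport-dependent streaming in general, not InterDigital's implementation: the tile layout, the yaw-only geometry, and the 110-degree field of view are all assumptions made for the example.

```python
def visible_tiles(tile_yaws_deg, view_yaw_deg, fov_deg=110):
    """Pick the tile directions (by yaw angle) inside the viewer's
    horizontal field of view; only these need to be streamed."""
    half = fov_deg / 2

    def angular_dist(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)

    return [t for t in tile_yaws_deg if angular_dist(t, view_yaw_deg) <= half]

# Eight tiles covering 360 degrees; the viewer looks toward yaw = 0.
tiles = [0, 45, 90, 135, 180, 225, 270, 315]
print(visible_tiles(tiles, view_yaw_deg=0))  # -> [0, 45, 315]
```

Streaming three tiles instead of eight in this toy layout is what keeps the bandwidth of a fully immersive scene within the reach of current network infrastructure.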

InterDigital left a lasting impression on all who visited our IBC booth and networking hub and experienced the Research and Innovation Labs’ innovative demos. We are excited to play a role in the pioneering compression solutions and streaming capabilities that will drive and enable the immersive video experiences of the future.