This paper proposes an adaptive volumetric video streaming platform comprising real-time tracking, adaptive source view selection, DASH streaming, GPU-accelerated view synthesis, HEVC decoding, and view-dependent rendering. The volumetric use case supports a three-degrees-of-freedom immersive experience with additional limited translational movement. The proposed adaptive streaming platform builds on advanced view synthesis technologies and DASH-based adaptive streaming to enhance the immersive experience while reducing the overall transmission bandwidth. GPU acceleration further enables real-time view synthesis, HEVC decoding, and view rendering. Simulation results verify that the proposed adaptive solution utilizes bandwidth more efficiently; as a result, a more consistent viewing experience with high-quality volumetric video content and low interaction latency can be delivered to the user. The platform was demonstrated at the 2019 Mobile World Congress and at the 2nd VR Ecosystems and Standards Workshop in April 2019.
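The two adaptation decisions described above, choosing which source views to fetch based on the tracked viewer position and choosing a DASH representation based on available bandwidth, can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function names, the nearest-camera selection heuristic, and the bitrate values are all assumptions for the example.

```python
import math

def select_source_views(viewer_pos, camera_positions, n_views=2):
    """Pick the n_views cameras closest to the tracked viewer position
    (hypothetical heuristic; the paper's selection logic may differ)."""
    ranked = sorted(camera_positions, key=lambda cam: math.dist(viewer_pos, cam))
    return ranked[:n_views]

def pick_dash_representation(bandwidth_kbps, representation_bitrates_kbps):
    """Choose the highest-bitrate DASH representation that fits the
    estimated bandwidth budget; fall back to the lowest if none fits."""
    feasible = [r for r in representation_bitrates_kbps if r <= bandwidth_kbps]
    return max(feasible) if feasible else min(representation_bitrates_kbps)

# Example: viewer near the origin, three candidate cameras, 4 Mbps budget.
views = select_source_views((0.0, 0.0, 0.0),
                            [(1.0, 0.0, 0.0), (5.0, 0.0, 0.0), (0.0, 2.0, 0.0)])
quality = pick_dash_representation(4000, [1000, 3000, 6000])
```

Only the selected source views need to be streamed at high quality, which is how view-adaptive fetching reduces overall bandwidth relative to transmitting every capture view.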
Adaptive Volumetric Video Streaming Platform
Research Paper / Nov 2019