In a relatively short time, edge computing has progressed from a novel idea to commercial reality, with great potential to transform 5G into a vastly more powerful technology. As 5G matures, it’s important to explore edge computing and the potential it can unlock for connected ecosystems.
For 50 years, InterDigital has been a consistent contributor throughout each generation of wireless evolution, and today, we are engaged in ETSI Multi-Access Edge Computing (MEC) and in the 3GPP edge standardization process, for example, helping to introduce an edge enablement service layer within Releases 17 and 18.
Edge computing is a critical technology that will both supplement 5G capabilities and complement 5G networks. As we examine the applications and use cases enabled by the edge, three key benefits come into clearer focus.
Benefit 1: Low Latency
The most apparent benefit of edge computing is its ultra-low latency as critical network and device processes take place much closer to the user at the network edge. Today, network operators and hyperscalers are embedding computing and storage resources at the network edge, and will soon offer additional capabilities including GPUs, AI resources, and other advanced services.
The low-latency benefits will vary across applications, depending on where the network edge is located (on-premises, cell site, network aggregation point, central office, etc.), which applications are prioritized, and which tier of edge computing is allocated to each use case.
For example, when weighing the location of the edge for optimal ultra-low-latency benefits against cost or capacity, the needs of immersive XR gaming applications may differ from those of industrial machine-control applications. In XR gaming, video rendering is a critical, compute-heavy function that can be realized at the edge to enable new devices, such as lightweight XR glasses. XR rendering demands end-to-end latency on the order of 10-20 milliseconds, where end-to-end latency includes both network delay and application compute processing. In contrast, industrial machine-control applications require end-to-end latency of less than 10 milliseconds. These more demanding applications, like the industrial example, will require edge compute capabilities deployed on-premises, and even on mobile devices, to work well. Other use cases can tolerate more than 20 milliseconds of latency and can thus leverage a hybrid edge deployment or even a public network slice.
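The tiering logic described above can be sketched in a few lines of code. This is purely illustrative: the latency thresholds follow the figures in the text, but the tier names and the function itself are hypothetical assumptions, not drawn from any standard.

```python
# Illustrative sketch: mapping an application's end-to-end latency budget
# to an edge deployment tier. Thresholds follow the examples in the text
# (XR rendering 10-20 ms, industrial control < 10 ms); tier names are
# invented for illustration.

def select_edge_tier(latency_budget_ms: float) -> str:
    """Pick a deployment tier for a given end-to-end latency budget (ms).

    End-to-end latency includes both network delay and application
    compute time, so tighter budgets push compute closer to the device.
    """
    if latency_budget_ms < 10:          # e.g. industrial machine control
        return "on-premises / on-device edge"
    elif latency_budget_ms <= 20:       # e.g. XR rendering for lightweight glasses
        return "cell-site or aggregation-point edge"
    else:                               # latency-tolerant applications
        return "hybrid edge or public network slice"

print(select_edge_tier(5))    # industrial control
print(select_edge_tier(15))   # XR gaming
print(select_edge_tier(50))   # latency-tolerant use case
```

The point of the sketch is the design choice it encodes: the latency budget, not the application name, drives where compute is placed.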
A tangible takeaway on the latency benefits provided by edge computing is that it eliminates the need for a one-size-fits-all approach to network provisioning for different use cases. As we hone the technology and customize the infrastructure to provision different edge computing designs for different cases, each edge deployment can leverage the aspects that best suit its specific needs.
Benefit 2: Local Processing
Local processing, or the ability to process huge volumes of data near the device rather than in a cloud resource, is one of edge computing's most beneficial models.
For example, an industrial IoT application may require a significant amount of sensing, largely through high-volume sensors such as video cameras or LiDAR. Whether it's a video sensor on a production line looking for manufacturing defects or a LiDAR sensor feeding AI processing, each additional sensor generates huge amounts of data, and it can very quickly become impractical (and expensive!) to send all of it to a cloud server.
Local processing provides a practical solution to this massive demand and ensures network operators don't have to transmit large volumes of data over long distances, delivering obvious security benefits along with improvements to overall system efficiency and cost.
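A back-of-envelope calculation shows why local processing pays off. Every figure below is an illustrative assumption (per-camera bitrate, camera count, and the fraction of footage worth forwarding), not measured data.

```python
# Back-of-envelope sketch: raw sensor volume at an industrial site versus
# what remains after edge-side filtering. All figures are assumptions
# chosen for illustration.

CAMERA_MBPS = 25          # one compressed 4K video stream (assumed)
NUM_CAMERAS = 40          # cameras on a production line (assumed)
EVENT_FRACTION = 0.01     # share of footage worth forwarding (assumed)

raw_mbps = CAMERA_MBPS * NUM_CAMERAS
forwarded_mbps = raw_mbps * EVENT_FRACTION   # only defect events leave the site

print(f"Raw sensor traffic at the site: {raw_mbps:.0f} Mbps")
print(f"Sent to the cloud after edge filtering: {forwarded_mbps:.0f} Mbps")
# Under these assumptions, local processing cuts backhaul traffic 100x,
# and the raw footage never leaves the premises.
```

With these (hypothetical) numbers, a single site would otherwise push a gigabit of raw video toward the cloud continuously; the edge reduces that to a trickle of relevant events.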
Benefit 3: Proximity
In an edge context, proximity refers less to physical closeness to the network edge for latency or efficiency purposes and more to regulatory compliance and data sovereignty.
As an example, the U.S. has recently experienced a proliferation of online gambling, with a growing number of virtual users and a patchwork of vastly different state and local regulations governing the activity. In this instance, operators' data and computing may be required to be physically located in a specific state to comply with local regulations. Similarly, in the EU, the GDPR restricts the transfer of personal data outside the EU. In some use cases, such as healthcare, privacy and security requirements extend further and necessitate local, on-premises edge processing and data storage.
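A sovereignty-aware scheduler might run a placement check like the one below before deploying a workload. The site names, jurisdiction labels, and rules are all invented for illustration; real residency requirements are far more nuanced.

```python
# Hypothetical sketch: filter candidate edge sites by the jurisdictions a
# data-residency rule allows. Site names and jurisdiction codes are invented.

EDGE_SITES = {
    "frankfurt-edge-1": "EU",     # EU region site
    "virginia-edge-2": "US",      # US region site
    "nj-onprem-3": "US-NJ",       # on-premises site in New Jersey
}

def compliant_sites(allowed_jurisdictions: set) -> list:
    """Return edge sites whose jurisdiction satisfies the residency rule."""
    return [site for site, region in EDGE_SITES.items()
            if region in allowed_jurisdictions]

# GDPR-style rule: personal data must remain in the EU.
print(compliant_sites({"EU"}))
# State-level gambling rule: data and compute must sit in New Jersey.
print(compliant_sites({"US-NJ"}))
```

The takeaway is that proximity here is a constraint on *where* compute may run at all, not an optimization of how close it is.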
Edge Intelligence, the intersection of Edge Computing and AI, is especially useful in the context of proximity. Private and sensitive data is stored and maintained at the edge, perhaps in a hospital. Edge Intelligence models execute localized training and inference on the private edge data, keeping it anonymized, while exchanging derived knowledge with the cloud.
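The Edge Intelligence pattern above, local training on private data with only derived knowledge sent upstream, can be sketched with a toy model. The federated-averaging-style aggregation and the 1-D linear model are illustrative assumptions, not a description of any specific deployment.

```python
# Minimal sketch of Edge Intelligence: each edge site (e.g. a hospital)
# trains on its private data locally; only the resulting model weight
# travels to the cloud, which averages the contributions. The toy y = w*x
# model and the averaging scheme are illustrative assumptions.

def local_train(weight, data, lr=0.01, epochs=100):
    """Gradient descent on y = w*x using only this site's private data."""
    for _ in range(epochs):
        grad = sum(2 * (weight * x - y) * x for x, y in data) / len(data)
        weight -= lr * grad
    return weight

# Private datasets never leave their sites; only weights are exchanged.
site_a = [(1.0, 2.1), (2.0, 3.9)]
site_b = [(1.0, 1.9), (3.0, 6.2)]

w_a = local_train(0.0, site_a)
w_b = local_train(0.0, site_b)
global_w = (w_a + w_b) / 2   # cloud aggregates derived knowledge only
print(f"Aggregated model weight: {global_w:.2f}")
# Roughly 2, matching the y ≈ 2x trend present in both private datasets.
```

What matters here is the data flow, not the model: raw records stay at the edge, and only the learned parameters cross into the cloud.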
The Edge and 5G: Points of Intersection
To better predict how 5G and edge computing will collaborate and entwine in the future, it’s important to explore how application developers are leveraging the edge computing ecosystem today.
In fact, the most significant traction today is happening inside private 5G networks, often deployed in enterprise settings for a specific application or vertical. 5G private networks are often seen as "silos" because they aren't interconnected in the same way as public networks. Because they are localized within enterprises, they may hold the greatest near-term potential for edge computing.
Another emerging example can be found in the ways edge computing is leveraging public telecom networks. A prominent example is AWS Wavelength, which places Amazon's compute, storage, and developer services within 5G networks. In partnership with Verizon, the Wavelength platform allows application developers to use Verizon's 5G network to deploy low-latency edge applications. Though the AWS Wavelength service is available in limited locations, this is an exciting development for the industry.
As industry progresses toward 5G Advanced and 6G, the fabric of edge computing will become more intertwined and closely knit with communications technology.
A critical opportunity remains in encouraging application developers to imagine and implement the use cases that will make edge networks most valuable. InterDigital is proud to have been awarded contracts to help the European Telecommunications Standards Institute (ETSI) develop, launch, and maintain the ETSI MEC Sandbox, which helps app developers interact with edge computing APIs and experiment with edge-native applications. As an example, application developer teams utilized the MEC Sandbox to compete in the ETSI & LINUX Foundation Edge Hackathon - 2022, hosted at the Edge Computing World Conference in Silicon Valley. This environment is driven by InterDigital's open-source mobile edge emulator, AdvantEDGE.
Over time, the edge ecosystem will continue to grow into a large, multi-domain ecosystem with many platforms, providers, and operators contributing solutions to specific and local network needs. As we approach this future, InterDigital remains engaged with industry partners, academia, and standards bodies to achieve the greatest potential for the edge under 5G and in the future 6G.