How edge will impact cloud gaming


The number of people playing digital and online video games boomed in 2020. For 2021, the IDC Worldwide Digital PC and Mac Gaming Forecast expects about 870.6 million people across the world to play online PC games (including gamers on Mac/OS X and Linux systems) for at least one hour a week on average. No wonder worldwide spending on cloud gaming services is expected to rise to nearly 36.9 billion dollars over the course of this year. It’s a huge market, with most of the revenue coming from banners, video ads, and the investments and sponsorships generated by the ever-growing number of eSports fans.


As the gaming market expands with new games and offerings, the related technological and infrastructural requirements also grow: more sophisticated video game visuals require more powerful GPUs, high-quality monitors, excellent routers, and low-latency connections (that is, a very small time delay between the user’s action and the application’s response). Of course, these needs vary depending on the kind of game and its associated reaction time: fighting games or first-person shooters demand much lower latency than a chess match, a puzzle, or a strategy game. However, many of today’s most popular titles involve racing or fighting, which require a very short reaction time and consequently a high connection speed. And these requirements apply to a large number of users, including hobbyists and casual gamers.


Therefore, to improve the cloud gaming experience, there is a constant push to reduce the delays introduced both by local devices and by the network infrastructure. As illustrated in a graphic prepared by Fragkiskos Sardis of Konica Minolta Digital Services R&D for one of his recent talks, cloud gaming involves several infrastructural segments. Let’s investigate the typical cloud gaming datapath by following a single signal.

When a gamer pushes a button, the associated signal or data package passes through the following steps:

  1. First, the data enters the controller, picking up a first (quite small and largely negligible) amount of delay.
  2. Before processing the user input in the cloud, the following delays are added to the loop:
    • The delay incurred by the data navigating through your home network (wireless network delay and router processing).
    • The transport delay from the Internet Service Provider (ISP) to the cloud infrastructure – often known as network delay (routing and propagation).
    • The delay added by the cloud network infrastructure, between the node processing the game execution and the datacentre gateway to the ISP.
    • The processing delays from the AI and graphics rendering on the node running the game.
    • The latency added by video encoding running in the cloud to compress image output and save bandwidth.
  3. At this point, our signal has only travelled half of its path. It now starts travelling back towards the gamer (specifically, to their monitor), collecting further delays from the cloud network, the devices, and the video decoder on the user’s device.

As you may imagine, even though each of these delays is quite small, the whole process can add up to hundreds of milliseconds, an amount of time that can make a huge difference when you’re playing a game that requires a short reaction time.
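
The round trip described above can be sketched as a simple sum of delay components. The component names and millisecond values below are illustrative assumptions for a hypothetical session, not measurements:

```python
# Hypothetical per-hop delay budget for one cloud-gaming round trip,
# in milliseconds. All names and values are illustrative assumptions.
DELAY_BUDGET_MS = {
    "controller input": 1,
    "home network (Wi-Fi + router)": 5,
    "ISP transport (routing + propagation)": 20,
    "cloud network (gateway to game node)": 5,
    "game processing (AI + rendering)": 16,
    "video encoding": 5,
    "return path (cloud + ISP + home network)": 30,
    "video decoding on the user's device": 8,
}

def total_latency_ms(budget):
    """Sum the individual delays into one end-to-end figure."""
    return sum(budget.values())

if __name__ == "__main__":
    for component, ms in DELAY_BUDGET_MS.items():
        print(f"{component}: {ms} ms")
    print(f"End-to-end latency: {total_latency_ms(DELAY_BUDGET_MS)} ms")
```

Even with these modest per-hop figures, the hypothetical budget already reaches 90 ms; in a real deployment, congestion, retransmissions and frame buffering can push the total much higher.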


That’s why, in a case like this, edge computing may offer a way to reduce signal latency. Rather than reaching a faraway datacentre, with edge computing the gaming processes can be brought to a location closer to the gamer. Edge computing for cloud gaming would be able to:

  • Reduce the network distance, improving latency and jitter, localising network slicing rules, and decongesting core links for the ISP. This also helps localise the high volume of video data transmitted by the game.
  • Disaggregate processing: processes can move to local micro-datacentres, reducing scale, enabling application-optimised datacentres, and thus reducing resource contention through fine-grained optimisation per location.
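
The network-distance benefit can be illustrated with a back-of-the-envelope propagation calculation. Signals in optical fibre travel at roughly two thirds of the speed of light (about 200,000 km/s); the two distances compared below are illustrative assumptions, not figures from any particular deployment:

```python
# Rough illustration of how network distance alone affects round-trip
# propagation delay. Fibre speed and distances are assumed values.
FIBRE_SPEED_KM_PER_MS = 200.0  # ~2/3 of the speed of light, in km per ms

def round_trip_propagation_ms(distance_km):
    """Round-trip propagation delay over fibre for a one-way distance."""
    return 2 * distance_km / FIBRE_SPEED_KM_PER_MS

# A distant regional datacentre versus a nearby edge site:
far_dc_ms = round_trip_propagation_ms(1500)  # 1,500 km away
edge_ms = round_trip_propagation_ms(50)      # 50 km away
print(f"Distant datacentre: {far_dc_ms} ms, edge site: {edge_ms} ms")
```

This only accounts for propagation; fewer routing hops to an edge site also cut queuing delay and, just as importantly, reduce jitter, since the signal crosses fewer points of congestion.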

Achieving an edge-computing-based service for cloud gaming requires different stakeholders to collaborate on developing a full ecosystem that supports this process.


Cloud gaming represents a context in which, as the path of the signal through the cloud network shows, every millisecond counts. But there are many other circumstances in which reducing signal latency is of utmost importance to an application’s success. Over the last year at Konica Minolta Digital Services R&D, we have been working on Distributed Cloud Intelligence (DCI), a platform-as-a-service (PaaS) solution that locates data analysis and solution development where the data is generated. In this way, the intelligent edge reduces latency, costs, and security risks, making the organizations that adopt it more efficient.

With so many edge use cases identified, DCI represents a possible answer to the fast-growing need to exploit the increasing number of connected devices. With huge amounts of data produced by medical devices (such as wearable sensors, blood glucose monitors, and healthcare apps), digital healthcare could benefit from Distributed Cloud Intelligence: when exchanging data between healthcare institutes, DCI minimizes regulatory compliance risks by eliminating raw data transfer. For video analytics, on the other hand, DCI enables on-demand scaling and resilience by offloading computer vision algorithms to the cloud, exploiting the increased bandwidth that 5G networks offer.

To address the many challenges related to edge, Konica Minolta Digital Services R&D is innovating in three key areas:

  • We’re exploring virtualisation techniques for domain-specific accelerators, to support sharing and multi-tenant use of specialised hardware.
  • With DCI, we’re working on zero-touch orchestration for hardware accelerators, which includes exposing and aggregating hardware capabilities for the orchestration system, as well as automated mechanisms to design and assign service instances.
  • Finally, we’re using artificial intelligence and cognitive technologies to address the technical complexity and to optimize for business value.

To find out how these solutions could apply to your industry, get in touch with our researchers today.