How B2B and B2C cloud gaming platforms work: solutions for excellent image quality and for tackling the “last mile” challenge effectively.

Cloud gaming has been hailed as one of the technology trends to be most excited about in 2019.
Over the next six years, the cloud gaming market is expected to grow tenfold, from $45 million in 2018 to $450 million in 2024. Tech giants are already rushing in to develop the niche: Google and Nvidia have launched beta versions of their cloud gaming services, and Microsoft, EA, Ubisoft, Amazon, and Verizon are preparing to storm the market.
This is great news for gamers. Very soon, they will no longer need to sink ridiculous sums of money into the bottomless pit of hardware upgrades. They will finally be able to play the most power-hungry games on low-spec machines and achieve the same performance they would get from a super expensive system.
But what about the other players in the ecosystem?
Do publishers, developers, TV manufacturers, and telecom operators have anything to gain from this revolution? The answer is yes, and in this article, we’ll be revealing a powerful tool that will allow them to penetrate this promising market and boost their revenue with minimal investment and negligible risk.
What cloud gaming can do for publishers, developers, TV manufacturers, and telecom operators
Game publishers and developers want to deliver their product to as many players as possible, as quickly as possible. But our data shows that games currently fail to reach up to 70% of their potential buyers: players simply do not want to wait for a client and an installation file weighing tens of gigabytes to download. Additionally, the results of this survey show that 60% of users do not have a computer that can handle demanding (AAA-level) games at acceptable quality. Cloud gaming can solve both problems. More importantly, it will not reduce the revenue of publishers and developers; on the contrary, it will help them build a wider paying audience.
The field of cloud gaming is also becoming attractive to TV and set-top box manufacturers. In the era of smart homes and voice assistants, they increasingly have to compete for the user’s attention, and offering game functionality is a key way to win it. With integrated cloud gaming, they can offer their customers the ability to run modern games directly on their TV, with the subscription fee for the service paid directly to the manufacturer.

Another potentially active member of the ecosystem is the telecom operator. Operators increase revenue by providing additional services, gaming being one of them. Rostelecom has launched its “Game” tariff, while Akado sells access to our Playkey service. Nor is this just about broadband Internet operators: with the active spread of 5G, mobile operators will also be able to take advantage of cloud gaming as an additional revenue stream.
Despite the bright prospects, the market is not easy to enter. No existing service, including the tech giants’ products, has yet managed to completely overcome the “last mile” problem: because of network imperfections, specifically inside the user’s house or apartment, the connection is simply not fast enough for cloud gaming to work correctly.

Stakeholders who have been on the market for a long time and possess powerful resources are gradually moving towards solving this problem. But starting a cloud gaming service from scratch in 2019 means a massive investment, both financially and in opportunity cost. More importantly, there is no guarantee of ending up with a working solution.
But now you have the opportunity to penetrate this rapidly growing market, thanks to our new technology that lets you launch your own cloud gaming service quickly and at a fraction of the expense.
How we created a technology that makes it easy to start your cloud gaming service
Playkey began developing its cloud gaming technology back in 2012. The product launched commercially in 2014, and by 2016, 2.5 million players had used the service at least once. Throughout development, we saw interest not only from gamers but also from TV set-top box manufacturers and telecom operators, and we even launched several pilot projects with NetByNet and Er-Telecom. In 2018, we decided that our product might have a B2B future.
Developing a bespoke cloud gaming integration for each company (as we did in the pilot projects) poses a real challenge. Each such implementation takes three to six months. Why? Because every company has different hardware, operating systems, and requirements. One might need cloud gaming on an Android set-top box, while another needs it embedded as an iframe in the web interface of a personal account for streaming to computers. Companies also differ in their design and billing requirements and in other feature demands. It became clear that we either had to grow the development team tenfold or build the ultimate out-of-the-box B2B solution.
In March 2019, we launched Remote Click: software that companies can install on their own servers to get a working cloud gaming service. On the client side, users see a button on a familiar website that lets them launch a game in the cloud. When the button is clicked, the game starts on the company’s server, and the user receives the audio and video stream and can play remotely. Here is what it might look like in popular digital game distribution services.


An active fight for quality. And a passive one as well.
Remote Click overcomes numerous technical barriers. The first generation of cloud gaming services (such as OnLive) was destroyed by the poor quality of its users’ Internet connections: back in 2010, the average connection speed in the USA was only 4.7 Mbit/s. By 2017 it had grown to 18.7 Mbit/s, and 5G is set to usher in a new era. Yet even though the infrastructure is, in general, now ready for cloud gaming, the above-mentioned “last mile” problem remains.
First, there’s the “external” element, where the user has problems with the network itself. For instance, the operator might not actually be providing the advertised maximum speed, or the connection might run over 2.4 GHz WiFi that picks up interference from a microwave oven or a wireless mouse.
Then there’s the “internal” element, where the user doesn’t even suspect that there might be a problem with the network (they don’t know what they don’t know)! At best, they are sure that if the operator is selling a 100 Mbit/s connection, then their Internet connection really runs at 100 Mbit/s. At worst, they have no idea what a router is and believe there is such a thing as “blue internet” and “colored internet”. That last one is an actual case from our customer development interviews.

But both parts of this “last mile” problem can be overcome. Remote Click uses active and passive mechanisms to achieve that. Here’s how those mechanisms work.
Active mechanisms
1. Effective Forward Error Correction (FEC) of transmitted data, a.k.a. redundancy
When transmitting video data from the server to the client, we use forward error correction. This allows us to restore the original data in cases where it is partially lost due to network problems. What makes our solution effective?
- Speed. Encoding and decoding are fast: even on “weak” computers, the operation takes no more than 1 ms per 0.5 MB of data, so it adds almost no delay when playing through the cloud. The importance of this is hard to overstate.
- Maximum data recovery potential, i.e. the ratio of the excess data volume to the potentially recoverable volume. In our case, the ratio is 1. Suppose you need to transfer 1 MB of video. If we add 300 KB of additional data during encoding (this is called redundancy), then to restore the original megabyte during decoding, we need only any 1 MB out of the total 1.3 MB sent by the server. In other words, we can lose up to 300 KB and still recover the original data. As you can see, 300/300 = 1, the maximum possible efficiency (see the toy sketch after this list).
- Flexibility in setting the amount of additional data during encoding. We can configure a separate level of redundancy for each video frame that needs to be transmitted over the network; for example, when we notice network problems, we can raise or lower the redundancy.
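To make that 1:1 recovery ratio concrete, here is a minimal, hypothetical Python sketch of packet-level FEC. It uses a single XOR parity packet, which can rebuild any one lost packet in a group; production systems use Reed-Solomon-style MDS codes that generalize this to any m lost packets per m parity packets. None of the names here come from the Remote Click codebase.

```python
# Toy packet-level FEC: one XOR parity packet protects a group of k
# equally sized data packets, so ANY single lost packet in the group
# can be rebuilt. MDS codes (e.g. Reed-Solomon) extend this to m
# parity packets recovering any m losses: the 1:1 ratio from the text.

def make_parity(packets: list[bytes]) -> bytes:
    """XOR all packets together into one parity packet."""
    parity = bytearray(len(packets[0]))
    for pkt in packets:
        for i, b in enumerate(pkt):
            parity[i] ^= b
    return bytes(parity)

def recover(received: dict[int, bytes], parity: bytes, k: int) -> dict[int, bytes]:
    """Rebuild a single missing packet of the k-packet group."""
    missing = [i for i in range(k) if i not in received]
    if len(missing) > 1:
        raise ValueError("XOR parity can only repair one loss per group")
    if missing:
        rebuilt = bytearray(parity)
        for pkt in received.values():
            for i, b in enumerate(pkt):
                rebuilt[i] ^= b
        received[missing[0]] = bytes(rebuilt)
    return received

# Example: 3 data packets + 1 parity packet; packet 1 is lost in transit.
data = [b"AAAA", b"BBBB", b"CCCC"]
parity = make_parity(data)
delivered = {0: data[0], 2: data[2]}           # packet 1 never arrived
restored = recover(delivered, parity, k=3)
assert restored[1] == b"BBBB"                  # original data recovered
```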
2. Data reforwarding
An alternative method of dealing with losses is to request the data again. For example, if the server and the user are both in Moscow, the transmission delay will not exceed 5 ms. At that value, the client application has time to request and receive the lost piece of data from the server without the user ever noticing. Our system decides by itself when to apply redundancy and when to apply reforwarding.
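As an illustration of that decision, here is a hypothetical sketch (not Playkey’s actual logic): retransmission pays off only when a lost packet can make an extra round trip before its frame is due on screen.

```python
# Hypothetical decision rule: if a lost packet can be re-requested and
# re-delivered before its frame must be displayed, retransmission is
# cheaper than redundancy; otherwise lean on FEC.

def choose_repair(rtt_ms: float, frame_budget_ms: float = 16.7) -> str:
    """Pick a loss-repair strategy for the current network conditions.

    rtt_ms          -- measured round-trip time to the server
    frame_budget_ms -- time before the frame is due on screen (60 FPS)
    """
    # A retransmit costs roughly one extra round trip.
    if rtt_ms < frame_budget_ms / 2:
        return "reforward"     # e.g. server in the same city, rtt ~5 ms
    return "fec"               # too far away: pay for redundancy up front

assert choose_repair(rtt_ms=5.0) == "reforward"   # Moscow-to-Moscow case
assert choose_repair(rtt_ms=40.0) == "fec"
```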
3. Customized data transfer
To choose the best way to deal with losses, our algorithm analyzes the user’s network connection and sets up the data transmission system individually in each case.
It looks at:
- connection type (Ethernet, WiFi, 3G, etc.);
- WiFi band used — 2.4 GHz or 5 GHz;
- WiFi signal strength.
If you rank connection types by losses and delays, cable is obviously the most reliable: over Ethernet, losses are rare and “last mile” delays are extremely unlikely. Next comes 5 GHz WiFi, and then 2.4 GHz WiFi. Mobile connections are of very poor quality, which is why we are waiting for 5G.

When using WiFi, the system automatically configures the user’s adapter, putting it into the most suitable mode for use in the cloud (for example, turning off power saving).
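Here is a rough sketch of how such per-connection tuning could look; the thresholds and redundancy levels are illustrative assumptions, not our production values.

```python
# Illustrative per-connection tuning: rank the link, then pick an FEC
# redundancy level for it. All numbers are assumptions for the sketch.
from dataclasses import dataclass

@dataclass
class Link:
    kind: str                    # "ethernet", "wifi" or "mobile"
    wifi_band_ghz: float = 0.0   # 2.4 or 5.0 when kind == "wifi"
    signal_dbm: int = 0          # WiFi signal strength, e.g. -45

def redundancy_level(link: Link) -> float:
    """Fraction of extra FEC data to attach to each frame."""
    if link.kind == "ethernet":
        return 0.05                       # losses are rare on cable
    if link.kind == "wifi":
        base = 0.10 if link.wifi_band_ghz == 5.0 else 0.20
        if link.signal_dbm < -70:         # weak signal: expect more loss
            base += 0.10
        return base
    return 0.30                           # mobile: worst case until 5G

print(redundancy_level(Link("wifi", wifi_band_ghz=2.4, signal_dbm=-75)))  # ~0.3
```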
4. Customized encoding
Video streaming exists thanks to codecs: programs for video compression and recovery. Uncompressed, one second of video easily exceeds a hundred megabytes; the codec reduces this value by an order of magnitude. We are armed with the H264 and H265 codecs.
H264 is the most popular: all major video card manufacturers have supported it in hardware for the past decade. H265 is a daring newcomer, with hardware support arriving about five years ago. Encoding and decoding H265 requires more resources, but the quality of the compressed frame is much higher than with H264, and without any increase in data volume.

But the big dilemma is which codec to choose, and which encoding parameters to set, for a specific user and their hardware. It is a sophisticated problem that we solve automatically: the smart system analyzes the capabilities of the equipment, sets the optimal encoder parameters, and selects a decoder on the client side.
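As a simplified illustration of that choice, consider a sketch like the following; the capability sets and names are assumptions for the example, not our actual selection logic.

```python
# Illustrative selection rule: prefer H265 when both ends can handle it
# in hardware, otherwise fall back to universally supported H264.

def pick_codec(server_hw: set[str], client_hw: set[str]) -> str:
    """Choose a codec both the server encoder and client decoder support.

    server_hw / client_hw -- hardware-accelerated codecs on each side,
    e.g. {"h264", "h265"}; the names are assumptions for this sketch.
    """
    if "h265" in server_hw and "h265" in client_hw:
        return "h265"     # better quality at the same data volume
    return "h264"         # a decade of hardware support everywhere

assert pick_codec({"h264", "h265"}, {"h264"}) == "h264"
assert pick_codec({"h264", "h265"}, {"h264", "h265"}) == "h265"
```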
5. Loss compensation
Though it’s hard to admit, even our system is not perfect. Some of the data lost in the depths of the network cannot be recovered, and we don’t have time to reforward it.
But even in cases like this there are solutions, one of which is adjusting the bitrate.
Our algorithm constantly monitors the amount of data transmitted from the server to the client. It records every loss and even predicts possible future losses. Its task is to watch for, and ideally predict, the moment when losses reach a critical level that would produce flaws on the screen noticeable to the user, and to head this off by adjusting the amount of data transmitted (the bitrate) at that moment.
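A minimal sketch of such a controller, with illustrative thresholds (the real algorithm is more sophisticated): smooth the measured loss rate as a crude predictor, and back the bitrate off before losses become visible.

```python
# Toy bitrate controller: watch the packet-loss rate, predict its
# short-term trend, and cut the bitrate before losses become visible
# on screen. All thresholds here are assumptions for illustration.

class BitrateController:
    def __init__(self, bitrate_kbps: int = 15_000,
                 critical_loss: float = 0.05) -> None:
        self.bitrate_kbps = bitrate_kbps
        self.critical_loss = critical_loss
        self.smoothed_loss = 0.0

    def on_stats(self, loss_rate: float) -> int:
        """Feed one measurement interval's packet-loss rate (0.0-1.0)."""
        # Exponential smoothing doubles as a crude one-step predictor.
        self.smoothed_loss = 0.7 * self.smoothed_loss + 0.3 * loss_rate
        if self.smoothed_loss > self.critical_loss:
            self.bitrate_kbps = int(self.bitrate_kbps * 0.8)   # back off
        elif self.smoothed_loss < self.critical_loss / 4:
            self.bitrate_kbps = int(self.bitrate_kbps * 1.05)  # probe up
        return self.bitrate_kbps

ctrl = BitrateController()
for loss in (0.0, 0.02, 0.12, 0.15):   # losses start climbing
    encoder_target = ctrl.on_stats(loss)
print(encoder_target)                  # the bitrate has been reduced
```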

Additionally, we use invalidation of undelivered frames and the reference-frame mechanism of the video stream. Both tools reduce the number of noticeable artifacts: even with serious disruptions in data transmission, the image on the screen remains acceptable and the game stays playable.
6. Distributed transmission
Distributing data transmission over time also improves streaming quality. How the data is spread out depends on specific network indicators, such as losses, ping, and other factors. Our algorithm analyzes them and selects the best option; sometimes spreading transmission across just a few milliseconds reduces losses severalfold.
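Conceptually, this is packet pacing. A toy sketch under stated assumptions (the `send` callable stands in for a real socket, and a real sender would use precise timers rather than `sleep`):

```python
# Pacing sketch: instead of bursting all of a frame's packets at once
# (which can overflow shallow router buffers and cause loss), spread
# them evenly across a few milliseconds.
import time

def send_paced(packets: list[bytes], window_ms: float, send) -> None:
    """Transmit `packets` evenly over a `window_ms`-long interval."""
    gap = (window_ms / 1000.0) / max(len(packets) - 1, 1)
    for i, pkt in enumerate(packets):
        send(pkt)
        if i < len(packets) - 1:
            time.sleep(gap)    # real code: timer wheel, not sleep()

# Example: spread a 20-packet frame over 4 ms instead of one burst.
sent = []
send_paced([b"x" * 1200] * 20, window_ms=4.0, send=sent.append)
assert len(sent) == 20
```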
7. Reducing delay
One of the key characteristics of playing through the cloud is delay: the smaller the delay, the better the gaming experience. Delay can be roughly divided into two parts:
- network delay or data transfer delay;
- system delay (capturing control commands on the client side, capturing the image on the server, encoding the picture, the adaptation mechanisms described above, assembling the data on the client, decoding the image, and rendering it).
Network delay depends on the infrastructure, and there is little we can do about it: if a mouse has gnawed through the wire, waving a dead chicken over it will not help. System delay, on the other hand, can be cut severalfold, which dramatically changes the quality of cloud gaming for the player. In addition to the forward error correction and personalized settings described above, we use two more mechanisms.
- Fast data acquisition from control devices (keyboard, mouse) on the client side. Even on weak computers, 1–2 ms is enough for this.
- Rendering of the system cursor on the client. The mouse pointer is processed not on the remote server, but in the Playkey client on the user’s computer, that is, without the slightest delay. Of course, this does not affect the actual control in the game, but the whole point here is to aid human perception.
With our technology, given a network delay of 0 ms and a 60 FPS video stream, the delay of the entire system does not exceed 35 ms.
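To show how such a budget can add up, here is a back-of-the-envelope breakdown; the per-stage numbers are assumptions chosen to be consistent with the 35 ms bound, not measured Playkey values.

```python
# Hypothetical latency budget: system delay is a sum of stages you can
# attack one by one. Stage durations below are illustrative assumptions.

stages_ms = {
    "input capture (client)":  2,   # "1-2 ms even on weak computers"
    "frame wait @ 60 FPS":     8,   # ~half of a 16.7 ms frame interval
    "image capture (server)":  4,
    "encode":                  5,
    "packetize + FEC":         1,
    "jitter buffer (client)":  5,
    "decode":                  5,
    "render":                  4,
}

total = sum(stages_ms.values())
print(f"system delay: {total} ms")   # 34 ms, under the 35 ms bound
assert total <= 35
```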
Passive mechanisms
In our experience, many users have little idea of how their devices connect to the Internet. In interviews with players, we discovered that some do not know what a router is. And that’s fine! You don’t have to know the structure of an internal combustion engine to drive a car, and a user shouldn’t need a system administrator’s knowledge to play a game.
However, some technical points are still important to convey, so that players can independently remove barriers on their side. And we help them with this.
1. Indication of WiFi 5 GHz support
Above, we wrote that we can see which WiFi band is in use: 5 GHz or 2.4 GHz. We also know whether the network adapter of the user’s device supports operating at 5 GHz. If it does, we recommend using that band. We cannot switch the frequency on our own yet, as we are unable to see the characteristics of the router.
2. Indication of WiFi signal strength
Some users have a weak WiFi signal even when the Internet works well and the speed seems acceptable. The problem reveals itself precisely during cloud gaming, which puts the network to a real test.
Signal strength is affected by obstacles, such as walls, and by interference from other devices; microwave ovens are a common source. The result is losses that are imperceptible in everyday Internet use but critical for cloud gaming. In such cases, we warn the user about the interference and suggest moving closer to the router and turning off “noisy” devices.
3. Indication of traffic leeches
Even if everything is fine with the network, other applications might be consuming too much traffic: for instance, a YouTube video playing or torrents downloading while a game is running in the cloud. Our application detects these traffic leeches and warns the player about them.
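Put together, the three passive hints could look roughly like this sketch; the thresholds, wording, and function names are made up for illustration and are not the real client’s checks.

```python
# Illustrative client-side diagnostics covering all three passive hints:
# band support, signal strength, and "traffic leech" detection.

def network_hints(adapter_supports_5ghz: bool, band_ghz: float,
                  signal_dbm: int, other_apps_mbps: float) -> list[str]:
    hints = []
    if adapter_supports_5ghz and band_ghz == 2.4:
        hints.append("Your adapter supports 5 GHz WiFi: switch to it.")
    if signal_dbm < -70:                       # weak-signal threshold
        hints.append("Weak WiFi signal: move closer to the router and "
                     "turn off noisy devices (e.g. a microwave oven).")
    if other_apps_mbps > 5.0:                  # traffic-leech threshold
        hints.append("Other apps (YouTube, torrents) are eating "
                     "bandwidth: pause them while playing.")
    return hints

for hint in network_hints(True, 2.4, -76, other_apps_mbps=12.0):
    print("!", hint)
```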

Fears from the past — dispelling myths about cloud gaming
Cloud gaming, as a fundamentally new way of consuming game content, has been trying to break into the market for almost a decade. As with any innovation, its story is a series of small victories and resounding defeats. It is not surprising that over the years cloud gaming has become overgrown with myths and prejudices. At the dawn of the technology, they were reasonable; today, they are completely groundless.
Myth 1. The image in the cloud is worse than in the original — as if playing on YouTube
Today, in a technically advanced cloud solution, the original image and the cloud image are almost identical; there is no difference visible to the naked eye. Encoder customization based on the player’s equipment, together with the complex of loss-handling mechanisms, takes this issue off the table: on a high-quality network there is no frame blurring and there are no graphics artifacts. We even take resolution into account, since it makes no sense to stream a 1080p picture if the player uses 720p.
Below are two Apex Legends videos from our channel. One video shows the game being played on a PC. The other shows the game being played in the cloud on Playkey.
Apex Legends on PC
Apex Legends on Playkey
Myth 2. Unstable quality
Network conditions are indeed inconsistent, but this problem has been solved. We dynamically change the encoder settings based on the quality of the user’s network, and we maintain a consistently acceptable FPS through special methods of image capture.
How does it work? The game has a 3D engine that builds a 3D world, but the user is shown a flat image. To produce it, a picture is created in memory for each frame: a kind of photograph of how the 3D world looks from a certain point. This picture is stored in the video memory buffer. We capture it from video memory and hand it to the encoder, which compresses it. And so on, frame after frame.
Our technology captures and encodes the image in a single thread, which increases the FPS. If these processes are instead run in parallel (a solution that is quite popular on the cloud gaming market), the encoder constantly contends with capture, picks up new frames with a delay, and consequently sends them with a delay.
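A toy timing model of the latency side of this claim, under assumed stage durations (capture 4 ms, encode 6 ms); it is a sketch of the trade-off, not a measurement of either pipeline.

```python
# Sequential capture+encode ships the frame it just grabbed; a pipelined
# design ships a frame captured one stage earlier, so every frame
# reaches the network older. Stage durations are assumptions.

CAPTURE_MS, ENCODE_MS = 4, 6

def sequential_frame_age_ms() -> int:
    """One thread: grab a frame, encode it right away, send it."""
    return CAPTURE_MS + ENCODE_MS          # 10 ms from framebuffer to wire

def pipelined_frame_age_ms() -> int:
    """Two threads: the encoder always works on the PREVIOUS capture."""
    stage = max(CAPTURE_MS, ENCODE_MS)     # pipeline ticks at the slowest stage
    return CAPTURE_MS + stage + ENCODE_MS  # extra queueing before encoding

print(sequential_frame_age_ms(), "ms vs", pipelined_frame_age_ms(), "ms")  # 10 vs 16
```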
Myth 3. Due to lags in the controls, I will be a “snail” in multiplayer
The normal control delay amounts to a few milliseconds and is usually imperceptible to the end user. Sometimes, though, there is a tiny delay between the movement of the mouse and the resulting cursor movement; it does not affect anything, but it creates a negative impression. The technique mentioned earlier, rendering the cursor directly on the user’s device, eliminates this drawback. Beyond that, the total system delay of 30–35 ms is so small that neither the player nor their rivals in the match will notice anything; the outcome of the battle depends purely on skill. Here’s a clear example of this.
What’s next
Cloud gaming is already a reality. Playkey, PlayStation Now, and Shadow are working services, each with its own audience and place in the market. And like many young markets, cloud gaming will grow rapidly in the coming years.
One of the scenarios that seems most likely to us is the emergence of private services owned by game publishers and telecom operators. Some of them will develop their own solutions; others will use ready-made out-of-the-box solutions like RemoteClick.net. The more players there are on the market, the faster the cloud-based way of consuming game content will become mainstream.
P.S. Hey, if you liked the article, please give us a dozen or so claps for good luck!