But you ignore all of the headwinds that it faces. We live in a world where players will shell out big bucks for ultra-fast monitors with <1ms lag, where gamers measure their abilities in actions per second, and where first-person shooters like COD are the most popular games and demand split-second reactions. How do you expect them to be OK with >100ms ping and dropped packets? I have Google Fiber with 4ms ping and I still hate the experience when cloud gaming.
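For scale, here's a quick back-of-the-envelope sketch of how a network round trip compares to display frame times; the refresh rates and RTTs are illustrative numbers, not measurements:

```python
# Back-of-the-envelope: how many frames of delay does network RTT add?
# All numbers are illustrative assumptions, not measurements.
for hz in (60, 144, 240):
    frame_ms = 1000 / hz                 # time per frame at this refresh rate
    for rtt_ms in (4, 100):              # "good fiber" vs. "bad cloud" round trip
        frames = rtt_ms / frame_ms       # extra frames of delay the RTT adds
        print(f"{hz:>3} Hz: {frame_ms:5.2f} ms/frame, "
              f"{rtt_ms:>3} ms RTT ≈ {frames:5.1f} frames of lag")
```

At 240Hz, a 100ms round trip is roughly 24 frames of lag, which is why the competitive crowd balks at it.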
I'm not sure if the niche of hardcore competitive guys buying expensive low-latency monitors is really something to be concerned about.
But dropped packets/lag spikes are a real issue. I gave the Stadia trial a shot and never had a play session without several instances where the game would start stuttering or straight-up freeze for a few seconds, and I'd have to wait until it sorted itself out before I could continue playing. Maybe that's fine if you're playing a turn-based RPG or visual novel, but it's a complete dealbreaker for any game with real-time action (i.e., most of them).
And I don't know how you fix that issue without making internet infrastructure 100% perfect and reliable. Good luck with that.
> without making internet infrastructure 100% perfect and reliable
If this really takes off, I'd expect we'd see a further stratification of ISP offerings with QoS guarantees.
Currently, there's little incentive for ISPs to optimize for low jitter, and yet they already often market their highest-bandwidth offerings as "good for gaming."
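As a rough illustration of what an ISP would actually have to optimize for, here's a minimal sketch that estimates RTT and jitter by timing TCP connects; the target host, port, and sample count are arbitrary assumptions, and connect time is only a crude RTT proxy:

```python
import socket
import statistics
import time

def sample_rtts(host: str, port: int = 443, samples: int = 20) -> list[float]:
    """Time TCP connects (ms) as a crude round-trip proxy."""
    rtts = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass                                 # connect() completes the handshake, ~1 RTT
        rtts.append((time.perf_counter() - start) * 1000)
        time.sleep(0.1)                          # space out samples
    return rtts

rtts = sample_rtts("example.com")                # hypothetical target host
print(f"mean RTT: {statistics.mean(rtts):.1f} ms")
# Jitter here = stddev across samples; for gaming this matters more than the mean.
print(f"jitter (stddev): {statistics.stdev(rtts):.1f} ms")
```

A QoS guarantee worth paying for would bound that stddev, not just the headline bandwidth.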
The players buying <1ms monitors will continue to buy them. It's the other 95% of gamers who are the target audience. An example: as a dad of two 3-year-olds, I don't have time to game more than a couple of hours a week. It's not worth it for me to keep up with hardware trends for that level of commitment. However, using Shadow to stream my games, I get between 30 and 50ms ping on WiFi. Perfectly fine for the type of games I play.
I had a similar experience with GeForce Now and Stadia (40 to 60ms for them, as their servers are farther away).
> How do you expect them to be ok with >100ms ping and dropped packets?
You don't. Cloud gaming is not aimed at those people. People like that, who want to compete at higher levels (that includes me), will not play competitively in the cloud. But 99.9% of other gamers won't care if the tech gets to sub-100ms delay. That's an insanely big market. You could run any game you want without worrying about your PC/console specs; there's huge value in that.
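To make the sub-100ms target concrete, here's a hypothetical input-to-photon budget; every stage number is an assumption for illustration, not a measurement from any actual service:

```python
# Hypothetical cloud-gaming latency budget (all numbers are illustrative assumptions).
budget_ms = {
    "input capture + client send": 5,
    "network uplink": 15,
    "server game tick + render": 20,
    "video encode": 8,
    "network downlink": 15,
    "video decode": 8,
    "display scanout": 10,
}
total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:<30} {ms:>3} ms")
print(f"{'total':<30} {total:>3} ms  "
      f"({'within' if total <= 100 else 'over'} the 100 ms target)")
```

The point is that the network legs are only part of the budget; encode, decode, and scanout eat a big chunk even on a perfect connection.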
It's not the CMA's job to make that judgement; that's what expert testimony is for.
But:
- Residential broadband quality is stagnating in large parts of the world: theoretical peak speeds keep increasing, but so does overprovisioning, and latency is worsening due to increased reliance on 4G/5G for residential connections. That leaves vast demographics incapable of getting the required stable, low-jitter, high-bandwidth connection during the typical "prime entertainment" time slots. Non-interactive media can compensate for that easily; interactive… not so much.
- Externalizing energy and cooling costs to customers can be attractive, especially in high-cost locales like Western Europe. Additionally, space for data centres close to customers is limited, and competition is high. Microsoft has enough synergies with its other cloud offerings to make this feasible as long as demand stays low-ish, but if it dramatically increases, they'll have to make tough decisions on pricing. And even if it comes out ahead… this is exactly the sort of monopolist advantage the CMA is worried about. Sony et al. don't have this advantage. (Presumably one of the reasons why Sony's current cloud gaming offerings are noticeably worse in quality.)
- Game console manufacturers aren't doing much R&D these days; the machines are standard PCs/tablets with mildly modified off-the-shelf software on top. Customers, meanwhile, are much more likely to be loyal to, and seek to justify their investment in, the $500 box in the living room than a service they can subscribe to when a new game comes out and unsubscribe from at the end of the month when they're done with it.
- Leasing has been the dominant model for 10+ years. Steam and other game launchers do not let their customers own anything, and "subscribe to play the 5 most advertised games" services are highly popular. Cloud gaming is not required for this business model. Indeed, freeing these services from the requirement to own a $500 box would make it easier for users to keep switching to whoever currently has the best offering, which reduces profits.
- See above: game consoles are already highly standardized, with little effort needed to port between them. And cloud gaming just adds more platforms, since each vendor will have its own lock-in-optimized image formats/APIs/whatnot.
It's bad value if you only care about a few games. It's bad value if you have bad internet. Another issue: if people no longer have their own hardware, there will be much less incentive for providers to keep cloud options cheap.
- Residential bandwidth will stagnate or decline in the future (and, to a lesser extent, latency will worsen)
- It will be cheaper to have hardware at home than in a datacenter
- Game console manufacturers will prefer designing, manufacturing, and shipping systems vs data center upgrades
- Game companies will prefer allowing end users to own bits vs leasing access a la SaaS
- Game companies will prefer targeting custom console architectures vs standardized data center architectures
None of these seem plausibly likely.