> The current implementation only works in Chrome. Before you assume we’re kool aid drinking Google evangelists, or simply just lazy, there is good reason for this — likely due in part to Google’s work on Project Stream, they have introduced “low delay” mode for MSE that sets up a push model for video frames rather than the traditional buffered pull model. This is also good for any kind of low latency video stream, not just game streaming. When in low delay mode, Chrome begins to break the rules of MSE and no longer requires buffered playback. It also starts to ignore certain timing information and keyframe requirements.
> ...
> This is not to say that one couldn’t get a decent working implementation in Firefox, but Chrome’s low delay mode works in the ideal way without having to complicate the implementation or diverge too heavily from the way Parsec’s native applications behave. And while we love Firefox, the harsh reality is that 82% of Parsec users are using Chrome, with only 5% using Firefox, which made us more comfortable starting with a Chrome only implementation. For Firefox users, you can always use the Parsec native applications, which will probably perform better anyway.
I think this is quite sad to see. It’s not really game streaming tech in a web browser, it’s game streaming tech in Google Chrome.
What happens if Google decides they don’t want to play with other streaming services? They’re only one quick auto-update away from locking Parsec out.
I know people from both the MSE team and the Project Stream team.
Project Stream doesn't use MSE (it uses WebRTC), and the code in Chrome to detect live streams and optimize for them has been around for a long time (see, for example, this change from almost 4 years ago: https://codereview.chromium.org/692323002); they've steadily added better support for live streams over the years. It's not "game-streaming tech in a web browser". It's just a few simple heuristics to function better for low-latency streams. You can read the code for yourself if you'd like:
Coincidentally, we're at FOMS right now talking to other browsers about standardizing an explicit signal that would influence MSE buffering and audio preroll size.
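For what it's worth, the MSE side of a low-latency player is plain web API. Here's a minimal TypeScript sketch (this is not the Chromium code referenced above, and `receiveSegment()` is a placeholder for however the app obtains fragmented-MP4 data):

```typescript
// Minimal MSE live-append loop. Nothing here is a special Chrome API,
// just standard Media Source Extensions; Chrome's low-delay heuristics
// kick in on their own when they detect a live stream.
declare function receiveSegment(): Promise<ArrayBuffer>; // placeholder

const video = document.querySelector("video")!;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
  sb.mode = "sequence"; // play segments in arrival order

  async function pump(): Promise<void> {
    const segment = await receiveSegment();
    sb.appendBuffer(segment);
    // Wait for this append to finish before queuing the next one.
    sb.addEventListener("updateend", pump, { once: true });
  }
  pump();
});
```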
Other than escaping the bloodthirsty data-slurping of the Google ecosystem, I don’t see that solution being much better.
If they have to make their own proprietary browser, the situation will just be streaming “with a client you need to download” rather than “in the browser” generally.
(Disregard this if I’m misunderstanding and by “their own browser” you meant an existing project like Firefox.)
I've been playing Odyssey on Project Stream all weekend and it's blown away my expectations. We're about two to three years away from seeing consoles move to the cloud. It runs flawlessly at 1080p and I've only experienced three or four five-second lag spikes in my six hours of play. The game feels just as responsive and looks just as good as on my GTX 1080 at home.
My Project Stream was perfect for 2 hours, then was completely unplayable for the 10 minutes until I quit. I'm not sure what happened, but I've gone back once or twice and have horrible input lag. I'm using Ethernet and get 350 MB/s, but Google says the issue is packet loss. I don't know how to remedy it, but I know that game streaming is, and seemingly always will be, just around the corner from perfect.
Edit - it was absolutely stunning to see my acceptance email, log in to Google and Ubisoft, and be playing within 5 minutes. Zero download time, zero load time, just instant gaming.
It doesn't have to be perfect- it just has to be profitable. And the increased control a streaming service will have over users guarantees that game streaming will be outrageously profitable.
Think of all the basic ways we already see services make money- from the subscription, to metered access, to congestion pricing, early access fees- and those are only the basics. This is an industry that figured out the loot crate, after all.
On a different note, once the profitability is figured out, publishers can give direction to developers that games need to be designed around the latency, or whatever issue appears. Games can also be designed around the assumption that they'll be streamed, allowing things to be a lot more cross platform.
>It doesn't have to be perfect- it just has to be profitable.
There have been quite a few streaming services that have already folded, such as OnLive, Gaikai, and GameFly.
The other part of the problem is that digital renting for video games has also failed to take off in any meaningful way, since the pricing structure is just too restrictive; especially for modern games that can last 40+ hours.
Netflix and Redbox were able to build a business model out of Blockbuster's, but that hasn't translated as well to video games. Maybe the problem is that people don't play enough games to warrant a high subscription price, or play so much that they might as well buy outright.
I think the problem is the product is poor due to technical issues so no one will adopt it. If I can only play a few hours before getting game-breaking input lag, I'm not going to come back to the service.
I have a decent computer and hard-wired gigabit internet; if game streaming can't work there, it won't work for the 99% of Americans who have worse internet than me.
(Edit: to clarify, the below is about their cloud offering, since I'm on a Mac and don't have my own PC to stream from)
I've been using Parsec to play games like No Man's Sky and it's really impressive technologically speaking, but the pricing model is a bit obtuse. It's made worse by the fact that the big cost for someone who doesn't game all that often is storage.
Basically, you sign up for a VM with an impressive GPU and it runs games really well, and the streaming tech is top notch. But to run it at all you need to pay for at least 50 GB of storage, and you need to pay for it on a time basis, which means you're left with the anxiety of leaving it turned off, as well as the sudden deletion of your cloud PC if you're not paying attention.
I wish they would come up with some kind of clever storage solution so that the basic Windows image is a shared (read-only) storage component and the additional games are additional pluggable hard drives, also shared between users. This would mean you as the end user lose control of modding/patches for some games, but it could save so much storage. The save games etc. would go on a tiny hard drive that you ultimately pay for (but I'd say Parsec should eat that cost to reduce anxiety and instead encourage a monthly subscription for any-time access).
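As a hedged sketch of what that layering could look like (purely illustrative; none of this is Parsec's actual storage model, and all names and sizes are made up):

```typescript
// Illustrative data model for the layered idea: a shared read-only OS
// image, shared read-only game layers, and a tiny per-user writable
// layer for saves, so users only pay for the slice that is theirs.
interface StorageLayer {
  id: string;
  sizeGb: number;
  sharedAcrossUsers: boolean; // shared layers are read-only, deduplicated
}

const layers: StorageLayer[] = [
  { id: "win10-base", sizeGb: 40, sharedAcrossUsers: true },
  { id: "no-mans-sky", sizeGb: 15, sharedAcrossUsers: true },
  { id: "user-1234-saves", sizeGb: 1, sharedAcrossUsers: false },
];

// Bill the user only for storage that is theirs alone.
const billableGb = layers
  .filter((l) => !l.sharedAcrossUsers)
  .reduce((sum, l) => sum + l.sizeGb, 0);

console.log(`billable storage: ${billableGb} GB`); // 1 GB instead of 56 GB
```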
TL;DR: Parsec is awesome. I want to pay a fixed price and not be exposed to their tech stack costs, so I don't get FOMO when I'm not using it every day.
This reminds me of a huge problem with Steam, which is the size and frequency of updates. For the very small number of first-party games, like Half-Life or Counter-Strike, updates are cumulative smaller files that contain changes. So if there was an update to a game, it might be 1-2 GB. For the vast majority of titles on Steam, however, an update requires a full download of the latest version. So for most games it is a 25 GB or 50 GB download. Developers want to stay on top of bugs, so for an infrequent gamer playing on a weekly or monthly basis, opening Steam means queuing up 100 GB of downloads and a long wait to actually play games.
The fact that this hasn't been solved yet shows that Valve isn't willing to invest in fixing it either through new technology or cooperation with developers, which is really sad.
Optimistically, something like what you're describing for Parsec would be possible, but maybe it isn't realistic.
I don't think this is correct. I use Steam and have a lot of games installed. When it does updates, they are not the size of the original downloads. Usually, much, much smaller.
Same here. I think what's being seen is that some things are easier to diff and update than others. If a game has a large file for map data that's indexed into, an update to any portion of it is delivered most easily (where easily includes reliably) by updating the whole file. If that file is 2 GB, then you have a 2 GB download. If a game has lots of relatively little files, you could update multiple maps with a few tens of megabytes of downloads.
An actual patch utility that goes in and applies binary diffs would be much more economical in terms of bandwidth, but that complicates Steam's ability to confirm a game is in a good state with file checksums, and is much more likely to have odd errors when a diff applies cleanly to a file with an incorrect starting state.
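As a rough illustration of how checksums and binary diffs could be made to cooperate, here's a hedged sketch in TypeScript (Node). `applyPatch()` is a stand-in for a real binary-diff tool like bsdiff; nothing here is Steam's actual mechanism:

```typescript
// Verify a file's starting state before applying a binary diff, so a
// patch never lands on an unexpected base version.
import { createHash } from "crypto";
import { readFile, writeFile } from "fs/promises";

declare function applyPatch(base: Buffer, patch: Buffer): Buffer; // stand-in

async function sha256(path: string): Promise<string> {
  return createHash("sha256").update(await readFile(path)).digest("hex");
}

async function patchFile(
  path: string,
  patchPath: string,
  expectedBaseHash: string,
  expectedResultHash: string,
): Promise<void> {
  // Refuse to patch a file that isn't the exact version the diff was
  // generated against: the "incorrect starting state" failure mode.
  if ((await sha256(path)) !== expectedBaseHash) {
    throw new Error(`${path}: unexpected base; fall back to full download`);
  }
  const patched = applyPatch(await readFile(path), await readFile(patchPath));
  await writeFile(path, patched);
  // Verify the result too, so silent corruption is caught immediately.
  if ((await sha256(path)) !== expectedResultHash) {
    throw new Error(`${path}: patch produced a corrupt result`);
  }
}
```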
Maybe I should clarify my issues with updates. I know for certain that some games have had massive updates on the order of the original title's size, including Grand Theft Auto, Metal Gear Solid, and Mortal Kombat (I particularly know the last one because I have a credit in that boondoggle of a game).
Secondly, at home I have a modest internet connection and a busy home life that leaves me at most an hour to play games at a time, and usually less than that. If I open Steam infrequently and there are multiple patches on the order of gigabytes, then that eats into my play time and makes my PC gaming experience feel like a PC update experience.
When you consider how large some of these games are like The Witcher or Final Fantasy or MGS5, the number of games in my library, and the shortage of time I have, I easily spend more time installing or updating any of my games than playing them.
Any way you cut it, it's a bad experience.
Steam doesn't allow players to play out-of-date games in online mode and doesn't allow multiplayer games in offline mode. When going between online and offline mode it keeps track of which games are out-of-date as well. Whether using on-demand updates or leaving my computer on with Steam running I'm either wasting my time or wasting power and bandwidth (with a monthly cap).
I didn't make my post to complain about the state of things (which I found poor). What I am saying is that I think there are creative engineering solutions to this problem that would make experiences like mine better. Google found a creative way to compress Chrome updates by using disassembly changes rather than file diffs to push small critical updates. I'm sure Valve could spend any amount of resources looking for solutions like that tailored to game assets.
What's more, you can get the exact experience I described by playing Fortnite on an iPad on a weekly basis.
I haven't noticed the need to re-download every game over and over (most patches seem to be ~1 GB), but I have noticed Steam is bad about having old versions of MMOs: typically you install the MMO and then it basically re-installs itself with what's actually the latest version of the game.
I don’t see the problem you’re describing and my Steam library has over a hundred games. I think in general that as long as the game developers keep their individual files relatively small it all patches efficiently. I remember Starbound (indie 2D game) describing how they restructured their game a little to facilitate smaller patches. So yes, there is a worst case scenario of gigabyte patches, but it doesn’t appear to be the norm.
Anyway, if you could get individual games on disk images, they aren't patched every hour anyway, so it's a non-issue. The real issue will be dealing with Windows' file system and Steam's DRM.
I think it's the games you're playing more than it's Steam. Steam definitely does do partial / cumulative updates for a huge library of games. I cannot think of a single game I own on Steam that has had an update even approach the size of the original download.
Imagine waiting for a game to download, playing it, then having to download it again the next day. If you've experienced it once you will remember. If you've never seen it happen, then maybe it seems like a mystery to you. Some people have slower internet, bandwidth caps, busier schedules or different game libraries.
> just set it so games only update when u decide to play?
When I have time to play games, I have maybe 20 or 30 minutes to do so. If I delay updates until I want to play, and the updates amount to more than a few GB, then it will take that entire time to update.
This is anecdotal of course, but I can't think of a single game that behaves like this, and my library is pretty sizable. Would be interested in an example.
I also cannot see the justification on a technical level for why this behavior would exist. All in all I'm pretty confused by this comment, tbh
If you're a frequent gamer, in an always online situation, have faster internet, don't have bandwidth caps, or you are a collector of indie games then it might not be your experience.
If you’re using Steam most games support Steam Cloud for your save games, and with the server being on a gigabit link in a data centre (possibly the same one as Steam’s mirrors) it doesn’t take long to install.
Given that, you don't have to worry too much about keeping storage you're not using; just make sure to dump your non-cloud save games in Dropbox or something now and again.
Sure but now we are talking even more time and effort to spin up/down a computer whenever I want to game for a little while. Installing a Windows VM from scratch and then No Man’s Sky is still probably 20-30 minutes from start to gameplay. And specifically in the case of NMS it’s terribly slow at first launch (not sure if it’s just compiling shaders or what).
Parsec is a tech/app; it's not a service in the way you think it is. They're basically making a bridge between cloud renting and their app, like an integration. You can't really say that Parsec is expensive; they don't rent computing power at all :)
Parsec as a tech/app is indeed awesome! Together with the Steam Link app, they're the best we currently have.
I’m not saying it’s expensive, but that the service they’re offering is obtuse and anxiety-inducing, because you’re economically “punished” as soon as you stop playing.
And they offer this service under their own brand, without you ever leaving their site, so I don’t agree with your dismissal that this is just an integration. It’s a service they offer and they are taking my money for it. Money I gladly paid, I just wish the service I got was simplified.
Anyway, what you see in their app is Amazon and Paperspace; the prices are not decided by them. I'm not saying it's cheap, but Paperspace is the cheapest I've seen for cloud gaming.
Anyway, my use of Parsec is with my home computer, so I don't really have the difficulties you're facing.
Depends on your connection to their datacenter. They have one on the West Coast and one on the East Coast. I live near NYC, so I connect to their NJ datacenter. With my fiber connection, I get about 4-6 ms.
It could work for some games, but in my experience the latency has been pretty noticeable; and Assassin's Creed isn't a very latency-sensitive game.
I think any kind of shooter, platformer, or fighter game would be nearly unplayable, at least on my internet connection. For RPG-type games like Assassin's Creed though it does have potential.
There's a big difference between network latency and input lag. Taking 100 ms for the server to register a hit on an opponent is annoying, but acceptable. Taking 100 ms for the game to notice me moving my mouse to the left and to shift my view is not.
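To make that concrete, here's a toy latency budget with made-up round numbers (none of these figures come from Parsec, Google, or any measurement); the point is that a streamed view-shift pays every hop on every frame:

```typescript
// Toy latency budget for one streamed camera movement. Unlike
// server-side hit registration (which clients can mask with local
// prediction), the view response pays every stage below, every frame.
const budgetMs: Record<string, number> = {
  inputCapture: 2,      // mouse event captured and sent by the client
  uplink: 15,           // client -> streaming server
  gameFrame: 16,        // one rendered frame at ~60 fps
  encode: 5,            // hardware H.264 encode
  downlink: 15,         // server -> client
  decodeAndPresent: 10, // decode plus vsync to the display
};

const total = Object.values(budgetMs).reduce((a, b) => a + b, 0);
console.log(`view responds in ~${total} ms`); // ~63 ms in this toy example
```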
Google has properly opened the path for game streaming in the web browser.
Having been in the cloud gaming industry since 2012, we saw this path becoming good enough and decided to start building a cooperative game streaming platform at Dixper. We now have it up and running at https://dixper.gg, and we are seriously building not only streaming tech but also a great community, and optimizing how games use your hardware resources to enable smooth gaming. Great news for the whole industry!
I mean if it's built in the browser it's likely using WebRTC under the hood. Might not be the same techniques they described in their article but all the negotiation and transport is built on open source then. Likely also using MS Desktop Duplication API and Media Foundation.
We use multiple connection methods actually. WebRTC data channels can fail to connect a lot and we don't want people to be left out in the rain. So we built our own PKI and global dynamic DNS on top of Let's Encrypt to ensure connectivity https://blog.rainway.io/encryption-for-all-3383217e4194 (we made sure to sponsor them as well)
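For the curious, the general shape of that fallback pattern looks something like the sketch below. This is not Rainway's actual code; the relay endpoint and the URL rewrite are hypothetical, and the offer/answer exchange is omitted:

```typescript
// Try a WebRTC data channel first; fall back to a relay over WebSocket
// if ICE never connects (symmetric NATs, blocked UDP, etc.).
function connect(signalingUrl: string): Promise<RTCDataChannel | WebSocket> {
  return new Promise((resolve, reject) => {
    const pc = new RTCPeerConnection({
      iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
    });
    const channel = pc.createDataChannel("stream", { ordered: false });

    // If the data channel opens, use it: the lowest-latency path.
    channel.onopen = () => resolve(channel);

    // Otherwise fall back so the user isn't left without a connection.
    pc.oniceconnectionstatechange = () => {
      if (pc.iceConnectionState === "failed") {
        const ws = new WebSocket(signalingUrl.replace("https", "wss"));
        ws.onopen = () => resolve(ws);
        ws.onerror = () => reject(new Error("all transports failed"));
      }
    };
    // Offer/answer exchange via the signaling server omitted for brevity.
  });
}
```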
>MS Desktop Duplication API & Media Foundation
Are too much of a black box and we needed to be able to fine-tune both our capturing and encoding.
>moonlight-chrome
Is a Chrome App (discontinued) which relies on native code, whereas Rainway is HTML5, requires no plugins, and works in all spec-compliant browsers.
Your comment reads like an incredibly passive-aggressive cease & desist letter. Yes, you're filing a patent, and yes, you started doing this over a year ago. Good for you! This marking of territory is an immature look you really don't want to put on in a public forum.