
Carmack's dotplans are always interesting history to read.

If you weren't a gamer (or alive) at the time QuakeWorld came around, you might not appreciate how amazing it was for multiplayer games on the internet. On dial-up, you were lucky to have 150ms latency. Before client-side prediction, that latency applied to every action you took in the game, including player movement. Hit the up arrow, and you wait 150-300ms before the game responds and moves your character forward. CSP really was an amazing breakthrough, and it made multiplayer action games feasible on the internet.
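To make that concrete, here is a minimal sketch of client-side prediction, assuming a fixed tick rate, a simple 1D move speed, and a send callback; none of this is QuakeWorld's actual code, just an illustration of the idea: apply your own inputs immediately, remember them, and replay whatever the server hasn't acknowledged once its authoritative state arrives.

    # Minimal client-side prediction sketch (illustrative; not QuakeWorld's code).
    # Tick rate, run speed, and message shapes are assumptions for the example.
    from dataclasses import dataclass

    TICK = 1 / 60          # assumed fixed simulation step (seconds)
    RUN_SPEED = 320.0      # assumed movement speed (units per second)

    @dataclass
    class Input:
        move: float        # -1, 0, or +1 on one axis
        seq: int = 0

    def simulate(pos: float, inp: Input, dt: float = TICK) -> float:
        """Advance the player position by one input command."""
        return pos + inp.move * RUN_SPEED * dt

    class PredictingClient:
        def __init__(self, send):
            self.send = send       # callable that ships an Input to the server
            self.pos = 0.0
            self.pending = []      # inputs sent but not yet acknowledged
            self.seq = 0

        def local_input(self, inp: Input):
            """Apply the input immediately (prediction) instead of waiting a round trip."""
            self.seq += 1
            inp.seq = self.seq
            self.pos = simulate(self.pos, inp)
            self.pending.append(inp)
            self.send(inp)

        def on_server_state(self, ack_seq: int, server_pos: float):
            """The server is authoritative: adopt its state, then replay unacked inputs."""
            self.pending = [i for i in self.pending if i.seq > ack_seq]
            self.pos = server_pos
            for inp in self.pending:   # reconciliation keeps local motion smooth
                self.pos = simulate(self.pos, inp)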

This is particularly relevant now that we are entering the era of cloud-based streaming game platforms, like Stadia. The latency problems of the pre-CSP '90s will be rearing their heads again. It's going to be interesting to see how these same problems will be tackled in this new context. Internet speeds are higher now, but so are our expectations.

Sadly, I doubt the SREs at Google will leave us the kind of nice, simple dotplan files that Carmack left for us to read and remember.




Sadly, we live in a world today where some local code editors could really benefit from CSP, because they add 100-200ms of latency to every keystroke.

I'm not sure if it's funny or sad that there's more key press latency typing into most local Electron apps than connecting to a Quake 3 server 200 miles away back when I had 56k dial-up in 2000.

If you want to fast-forward to today's internet: with an average internet connection it takes around 150ms to ping a server in the Netherlands from California. That's over 5,000 miles (8,000 kilometers). Somehow a local key press has the same latency in certain code editors. What have we gotten ourselves into?
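Quick back-of-the-envelope check on that figure (the distance and fiber speed below are rough approximations, not measurements): the physics alone already accounts for more than half of that 150ms.

    # Rough minimum round-trip time, California to the Netherlands (approximate).
    distance_km = 8_000          # "over 5,000 miles" from the comment above
    fiber_km_per_s = 200_000     # light in fiber travels at roughly 2/3 of c
    one_way_ms = distance_km / fiber_km_per_s * 1000    # ~40 ms
    rtt_ms = 2 * one_way_ms                             # ~80 ms before any routing,
    print(round(rtt_ms), "ms minimum")                  # queuing, or last-mile overhead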


Not just Electron either: writing an email in Outlook frequently gives me half-second pauses before the typing catches up with the cursor.


Ah, so the Outlook lag isn't just me. This is incredibly frustrating when typing emails. I even have that "intelligent" predictive service turned off, and yet it constantly seizes up as if it is trying to work out what I'm saying in an email... Just let me type, dammit!


Actually, there's nothing sad about the state of code editors in the world today. I'm not sure which code editors you're regularly using that give you that kind of latency, but there are plenty -- more than ever before, in fact -- that definitely don't have this problem (some of which are even Electron-based!).


My iPhone freezes for about 2 seconds roughly twice a day whenever I type text. I remember playing with the keyboard-buffer-overflow sound on my CPC464 (as in 64K of RAM and a 4MHz CPU) in the '80s, and it took me longer than that to trigger it.

"The mess we're in" famous talk by joe amstrong should be transformed into a website listing all of those absurdities, as a way to public shame the culprits.


> "The mess we're in" famous talk by joe amstrong

For anyone else: https://www.youtube.com/watch?v=lKXe3HUG2l4

Fantastic watch, thanks for recommending. Sad to hear that Joe Armstrong passed away a few weeks ago.


I discovered recently that there's still some progress to be made w.r.t. latency in editors: https://makepad.github.io/makepad/

> This is Makepad, a work-in-progress livecoding IDE for 2D Design. This application is nearly 100% Wasm running on WebGL.


Typing delay is a pet peeve of mine. This is why I have stuck with Sublime and Vim, even though there are more powerful editors out there like VS Code or PyCharm.

If you want a fast editor, switch to Sublime 3.


I find VS Code to be just about the only electron-based editor I can use without getting frustrated with typing latency. It's usually not noticeable unless the process is chugging for unrelated reasons.


VS Code is Electron-based, but not Atom-based, thankfully.


VS Code typing latency is on the order of 50ms.

It's more than necessary, but not as bad as you're implying.


VS Code is pretty good for an Electron app


Usually that kind of typing lag is caused by something running amok on the machine. Often for me it has been company-installed backup software, to which I send SIGSTOP (not SIGKILL or SIGQUIT, since those cause a restart).
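For anyone who wants to do the same, a small sketch (POSIX only; the PID is a placeholder you'd look up with pgrep or ps, not a real value):

    import os
    import signal

    # SIGSTOP cannot be caught or ignored, and unlike SIGKILL/SIGQUIT it won't
    # trip a watchdog that restarts the service; resume later with SIGCONT.
    backup_pid = 12345                     # placeholder PID; find it with pgrep/ps
    os.kill(backup_pid, signal.SIGSTOP)    # freeze the backup process
    # ... do your latency-sensitive work ...
    os.kill(backup_pid, signal.SIGCONT)    # let it run again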


You can't really do prediction if you are not rendering the game locally, so all you can do is have a lot of servers all over the world and rely on most customers having a low latency fiber link.


For the camera alone, one could do a few tricks on the client, like VR's timewarp/reprojection, but that doesn't work for gameplay actions (like pressing the fire/jump button).

Theoretically the server could speculatively render and transmit a number of different "potential future frames", and the client throws the wrongly predicted frames away.

That's a nice way to burn even more energy and bandwidth though ;)
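For what it's worth, here's a toy sketch of that speculative-frame idea, with an assumed input set and a stubbed-out renderer (purely illustrative): the server renders one candidate frame per plausible next input, ships them all, and the client keeps only the one matching what the player actually pressed.

    # Toy sketch of server-side speculative rendering (illustrative only).
    # The input set, render stub, and state labels are assumptions.
    POSSIBLE_INPUTS = ["none", "forward", "back", "left", "right", "fire", "jump"]

    def render_frame(state, inp):
        """Stand-in for a real renderer: return an encoded frame for one guess."""
        return f"<frame for state={state} after input={inp}>"

    def server_tick(state):
        """Render one candidate frame per plausible next input and send them all."""
        return {inp: render_frame(state, inp) for inp in POSSIBLE_INPUTS}

    def client_pick(candidates, actual_input):
        """Keep the frame matching what the player really pressed; drop the rest."""
        return candidates.get(actual_input, candidates["none"])

    # The cost is the point of the joke above: bandwidth and GPU time scale with
    # the number of candidates, so 7 guesses means ~7x the work per shown frame.
    frames = server_tick(state="tick-42")
    shown = client_pick(frames, actual_input="jump")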


The Stadia controller connects over WiFi directly to the remote server, though, so it sounds like the Chromecast (or whatever) isn't even seeing the local inputs needed to do such tricks.


TVs have been so slow for so long anyway. I suspect we will just get a ton of point-and-click style games, games that are mostly simulations with relatively uninteresting inputs but potentially bigger visuals, or casual games where even 300ms+ latency doesn't make much of a difference. For better or worse, reflex- and timing-based games will just not be played on Stadia by serious gamers. We have VR for that now.


> We have VR for that now.

VR actually introduces further latency problems. With a TV, your typical PS4 can cover up latency issues with spectacle, as you mentioned. That's a big reason I suspect so many console games feel like a movie today, with tons of cutscenes and quick time events. I've been playing a lot of Bloodborne lately, and while it's an action game, it's still incredibly slow compared to something like Quake.

But with VR, when you move your head you expect the world to feel as if it is real. A TV is artificial, and latency is not an intrusion into the experience. With VR, latency is felt on a deeper level, potentially resulting in headaches and nausea.

The funny thing is that John Carmack is riding the state-of-the-art decades later on this front as well: https://www.wired.com/2013/02/john-carmacks-latency-mitigati...


> For better or worse, reflex- and timing-based games will just not be played on Stadia by serious gamers.

True, but don't overestimate the importance of "serious gamers" to the industry's bottom line. IIRC, mobile gaming is now making more money than all PC and home systems combined.


The servers could predict what the user does in the next 100ms. Not the same kind of CSP, but fits well into "powered by AI" marketing...


They probably would, but the creative players would suffer from it. AI never predicts creativity.


I am now imagining a (probably short) story in which the AI does learn to predict players perfectly, even the creative ones, and ends with a gamer taking his hands off the controller and allowing the AI to play exactly as he would have and wondering what was ever the point.


I think prediction failures are more likely to punish the opponents of the unpredictable guy. In a lot of online shooters, people with 300ms+ ping blink around unpredictably and appear to suddenly murder you out of nowhere, but they don't seem to have any trouble themselves.


No offense but I think you underestimate the predictive potential of millions of hours of game state


None taken. But your claim is that creativity is not possible anymore, since it was all done in the "millions of hours of game state". However, if creativity is possible, then my argument is correct.

Another issue is that machine learning/AI doesn't predict rare events, like earthquakes. So even with all the knowledge in the world, it won't predict a rare, creative move by a player.


But every event, creative or otherwise, is made up of hundreds of smaller events. That complicated wall-jump 360 kill you just did used several input signals. Even if the server-side AI can't predict the exact final outcome, it can definitely help with the intermediate, well-known states for at least some of the input systems.

I say "some", but I do believe a large enough volume of data can improve prediction for this class of inputs/states.


Yes, and then you predict something, broadcast it to your clients, and it ends up being wrong so the clients have to roll back. Would not be a good experience.


Perhaps, if you train the model on the existing movement and action history of the particular player.


Oh, that's a great idea!

Stadia could sell it as an add-on for players who not only don't want to play their games themselves, but also aren't satisfied by watching other people play through their games on YouTube or Twitch. With this add-on, they can finally watch themselves play through their games without having to lift a finger to, you know, actually play!


They would make a killing on Twitch, where a streamer could just buy someone else's training data and use that instead of playing themselves.


Black-market dealers would swap terabytes of hot RAM loaded with manually inputted data from the best e-sports players in the world. Corporate enforcers would hack those dealers to delete the data.

Call it Sonic Mnemonic.


I thought about it. But first, you need a lot of information to train a model, so it would only work for very heavy players. Second, creativity isn't just doing something that others have rarely or never done before; it's also doing something that you yourself have rarely or never done before.


The problem there seems twofold: first, you need even beefier hardware to predict, simulate, and render ahead of the player's input, particularly when it has to catch up after a misprediction; second, mispredictions are going to make the game feel imprecise at best and jarring at worst.

I'm also skeptical that a model could generate predictions with few enough mispredictions to be viable.


I think the problem here is, the game is made by Studio A, but it's run by Google in the cloud. Studio A probably doesn't have enough of an incentive to put this in the binary, but Google doesn't have the source, so they can't change the game loop.


...what's the point of playing competitive first-person shooters when an AI is at the controls?

And those are exactly the types of games that suffer most from input latency.


I actually kind of find the thought hilarious.

In a twitch shooter, you mouse over a visible opponent. Would the AI be more likely to pre-emptively pull the trigger for you? Or more likely to delay and swing the aim past the opponent before shooting at air? In other words: is the training set for predicted user actions based on players better or worse at the game than you?


That would make the delay much worse if you mispredict the user's action. Games would also have to add an additional 100ms of lag to all important events.


If you had a big enough server, you could render multiple frames for each possible user action...


True, that could work, and the two frames are probably very similar, so wouldn't even require that much more bandwidth. As someone here pointed out, however, the game controller is connected to the cloud directly, so the display doesn't even know of the inputs until the roundtrip is already done.


> we are entering the era of cloud-based streaming game platforms, like Stadia. The latency problems of the pre-CSP '90s will be rearing their heads again. It's going to be interesting to see how these same problems will be tackled

My take is that it'll be a primary competitive point, nearly as important as the library available. Companies that can deliver the service without introducing these issues will succeed, and ones that cannot will fail. If nobody can reliably crack it, cloud gaming won't take off.

Back in the 90s there was no viable competitor aside from LAN parties, and those weren't available to you every evening.


There was modem dialup between two players, though. Among my circle of friends in '94/'95 there was someone who wanted to play DOOM every night. The games were often coordinated during the day at school, or later by phone.

I also agree that it's not a given that "cloud gaming" will take off. We now have VR emerging, where latency is absolutely critical, even more so than for FPS e-sports.


Yeah, I remember that. There were also some gaming-specific, low-latency, premium-rate dialup services (e.g. Wireplay in the UK) that hosted game servers on-net. With the right TCP/IP settings and modem firmware, you could get very close to the theoretical minimum modem latencies with barely any jitter, and it made a big difference.


Now, as it was then, the game maker controls the viability of its game's ecosystem. Some game companies think it's a market advantage to have open hosting and moddability, and some don't.


It's funny. Sure, DSL brought the (much) bigger bandwidth, but when moving from ISDN to DSL, ping times increased again and it made total sense to play ESL matches over telephone dialup instead of DSL. But either things improved, or it just became the accepted normal. Anyway, I stopped playing shooters in the early '00s, so I don't really care anymore.

source: German who never had a 56k modem but started with ISDN in 1998 and can't really remember a ping > 100 on EU servers ;)


> but when moving from ISDN to DSL, ping times increased again and it made total sense to play ESL matches over telephone dialup instead of DSL.

Telekom used "interleave" by default, which provided very slightly faster download speeds but added about 70ms of latency to the first hop.

I had to contact them and ask them to change my ADSL to "fast path". This dropped the latency to maybe 20 ms (IIRC, my memory might fail me on this number).

I think ISDN was about 40 ms to first hop, but again, it's been a long time.


Thanks for the reminder! Yes, I remember fast path. But if memory serves, it wasn't immediately available with the 768kbit plan, only a little later. Or at least the knowledge hadn't spread widely.


I guess it became available within a few months after launch? Yeah, you had to know about it and request it from the service number. AFAIK, it wasn't mentioned in any instructions etc.


I was thinking the other day how simple and elegant dotplans were. They were truly the original social media.


The worst part was having a 200ms ping and getting constantly smoked by the guy with the sub-100ms ping. Hence the acronym LPB (low ping bastard).


I dunno, I actually hated the HPBs more, because at least I could rationalize getting beat by an LPB, i.e. someone with a technological advantage over me. ;-)


I remember when I was a kid in the '90s and my dad connected two computers to play Doom. The connection was so slow it was almost impossible to play, but when my brother and I saw each other's characters walking around, my mind was blown. Really a great memory. Wait, you can play the same game from two computers?


I recall staying late in the office at British Telecom and using our super-high-end Oracle Forms dev PCs (£4k) and 20-inch monitors (£2k) to play Doom.


What is CSP?



