
Googlenet is not internet 2.0 and barely anyone in the world beyond a couple of megacorps can benefit from HTTP/2, HTTP/3, HTTP/4, etc. It feels more like the web is dead, completely captured by megacorps.



So, presumably organisations like the IRS and Wikipedia are "barely anyone" and all of the big technology companies are "a couple of megacorps", but can you explain why you believe the _users_ don't benefit?

[About a third of all "popular" (i.e. top 10 million) web sites are HTTP/2 today]

Or did you just mean "I don't care about the facts, I'm angry about changes in the world that I don't really understand, so I just make things up and call that truth because it's easier"?


>[About a third of all "popular" (i.e. top 10 million) web sites are HTTP/2 today]

Don't forget that a huge chunk of them are hosted on megacorp cloud platforms.

Everything became so "simple" and "streamlined" that companies are forced to outsource all their hardware and platform management and then hire a small army of AWS-certified DevOps engineers.


> companies are forced to outsource all their hardware and platform management

Nothing is being forced. You can still set up a server in your basement, or rent/build a data center, and run nginx to get all of the benefits of H2, TLS 1.3, etc. You can even get "megacorp-quality platform management" with things like Outposts, GKE On-Prem, Azure Stack, etc.
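
As an illustration (a sketch, not anyone's actual setup): Go's standard library negotiates HTTP/2 automatically for any TLS server, so the "server in your basement" case really is a few lines. cert.pem/key.pem are placeholder paths (e.g. from Let's Encrypt):

    package main

    import (
        "fmt"
        "log"
        "net/http"
    )

    func main() {
        http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
            // r.Proto reports the negotiated protocol, e.g. "HTTP/2.0".
            fmt.Fprintf(w, "served over %s\n", r.Proto)
        })
        // ListenAndServeTLS enables HTTP/2 via ALPN automatically;
        // no cloud platform involved. Cert paths are placeholders.
        log.Fatal(http.ListenAndServeTLS(":443", "cert.pem", "key.pem", nil))
    }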


>Nothing is being forced.

Not directly, but it's forced by the complexity of the dominant technology stacks, protocols, and standards, which are shaped by the ubercorps.


The web is captured by megacorps, but it's not captured because of HTTP/2. It's captured because of network effects or whatever. And you are wrong: there is a benefit to using HTTP/2 on any website that has more than one resource to download.
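
To make that concrete, here's a rough Go sketch (example.com and the asset paths are hypothetical): over HTTP/2 these concurrent fetches are multiplexed as streams on a single TCP+TLS connection, where an HTTP/1.1 client would open several connections or queue the requests instead:

    package main

    import (
        "io"
        "log"
        "net/http"
        "sync"
    )

    func main() {
        var wg sync.WaitGroup
        // Multiple page resources fetched concurrently; Go's default
        // transport uses HTTP/2 for https hosts that support it.
        for _, path := range []string{"/style.css", "/app.js", "/logo.png"} {
            wg.Add(1)
            go func(p string) {
                defer wg.Done()
                resp, err := http.Get("https://example.com" + p)
                if err != nil {
                    log.Println(err)
                    return
                }
                defer resp.Body.Close()
                io.Copy(io.Discard, resp.Body)
                log.Printf("%s via %s", p, resp.Proto)
            }(path)
        }
        wg.Wait()
    }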


QUIC is developed by an IETF working group where anyone can participate, and there are definitely some productive participants who don't work for Google (or any of the other big companies).


Like they listened to the Varnish author about making actually useful changes? For example, implementing proper sessions so cookies could go away.


Just because they didn't agree with his suggestion to make major changes to the protocol semantics doesn't mean they didn't listen to him.


TLS Token Binding provides exactly that.


Yeah, and who is using it? Why are cookies still there?


Cookies can do more than sessions.


Sure they can, but sessions are pretty much the only reason why end users have cookies enabled.


That's not really true. Independent participants don't have any power to affect any of it.


Running nginx as a reverse proxy on an internal system. HTTP/2 happens automagically if a client requests it.

It definitely has an impact on our system, which requires sub-50ms response times on 2000+ concurrent requests.

It's a PITA if you want to debug the streams because they're not plain text, but given that we're over TLS, that's not really possible anyway.

In testing, we use ye olde HTTP/1.1 with no TLS, but even over HTTP/2 and TLS the browser will still happily display a JSON request/response. It's rare that we have to go lower in the stack.
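
For what it's worth, that split is easy to reproduce in Go (the JSON handler here is hypothetical): stand up the same handler once over plain-text HTTP/1.1 for debugging and once over TLS with H2 negotiated, and check r.Proto:

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "net/http/httptest"
    )

    func handler(w http.ResponseWriter, r *http.Request) {
        fmt.Fprintf(w, `{"proto": %q}`, r.Proto)
    }

    func main() {
        // Plain-text HTTP/1.1 server, easy to debug on the wire.
        plain := httptest.NewServer(http.HandlerFunc(handler))
        defer plain.Close()

        // The same handler behind TLS; EnableHTTP2 makes the test
        // server negotiate H2, as the production proxy would.
        h2 := httptest.NewUnstartedServer(http.HandlerFunc(handler))
        h2.EnableHTTP2 = true
        h2.StartTLS()
        defer h2.Close()

        for _, srv := range []*httptest.Server{plain, h2} {
            resp, err := srv.Client().Get(srv.URL)
            if err != nil {
                fmt.Println(err)
                continue
            }
            body, _ := io.ReadAll(resp.Body)
            resp.Body.Close()
            fmt.Println(string(body)) // {"proto": "HTTP/1.1"} then {"proto": "HTTP/2.0"}
        }
    }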



