> You could have some hash check to prevent hijacking. The old method would be naive today.

But how do you know that the cached site is up to date? How does the ISP know that? What about dynamic content? What about consistency between different requests that are part of the same page load?

> Sure but they are some switches away.

My point is that this does not matter much. Usually, at least in parts of the world that aren't sparsely populated and have modern infrastructure, these switches are close by and there is plenty of bandwidth capacity between them.

I just don't think it makes sense for ISPs to build their own local data centers to save bandwidth on those links when they already peer with a CDN data center anyway.

The root HTML itself would not be cached; it would carry the hashes for the non-dynamic payloads, so a client can check whether a cached copy is up to date. Etc.
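Browsers already do something close to this with Subresource Integrity. A minimal sketch of the verification step in Python (the names and the hash-in-root-HTML wiring are my assumptions, not a spec):

    import hashlib

    # Sketch (names hypothetical): the uncached root HTML publishes a
    # SHA-256 digest per static payload; whoever fetches a copy from an
    # intermediate cache can verify it before use.

    def verify_cached_payload(payload: bytes, expected_sha256_hex: str) -> bool:
        """True iff the cached bytes match the digest from the root HTML."""
        return hashlib.sha256(payload).hexdigest() == expected_sha256_hex

    # The digest travels in the trusted, uncached root document...
    expected = hashlib.sha256(b"static asset bytes").hexdigest()
    # ...and guards whatever bytes the cache hands back.
    assert verify_cached_payload(b"static asset bytes", expected)
    assert not verify_cached_payload(b"tampered bytes", expected)

A hijacked or stale cache entry then fails closed: the client falls back to fetching the asset from origin instead of trusting the cache.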

It would be interesting to know how much bandwidth would be saved by caching X GB of the most-downloaded films, pictures, and text files at a neighbourhood level.
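One way to ballpark it: assume request popularity follows a Zipf distribution (a common modelling assumption in the caching literature) and ask what fraction of requests the top-N cached items would absorb. A rough sketch in Python, every number hypothetical:

    import numpy as np

    def zipf_hit_rate(catalog_size: int, cached_items: int, alpha: float = 0.8) -> float:
        """Fraction of requests served by a cache holding the `cached_items`
        most popular of `catalog_size` items, under Zipf(alpha) popularity."""
        weights = 1.0 / np.arange(1, catalog_size + 1) ** alpha
        return float(weights[:cached_items].sum() / weights.sum())

    # Hypothetical: a 1M-item catalog, caching only the top 10k (~1% of items)
    print(f"hit rate: {zipf_hit_rate(10**6, 10**4):.0%}")  # ~36% with these toy numbers

A larger alpha means more top-heavy popularity, and a better hit rate for the same cache size.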

In the '90s and early '00s, I think that share was way bigger than it would be now.
