
From the website:

    "Doesn't use analytics = respects your privacy"

Meanwhile, Brave stopped this tracker:

    https://static.cloudflareinsights.com/beacon.min.js/vef91dfe02fce4ee0ad053f6de4f175db1715022073587



While you might assume that Cloudflare "insights" is an advertising/analytics system, this is actually part of Cloudflare's anti-DDoS infrastructure. This "beacon" gets injected at random on Cloudflare-served HTML pages, to track you throughout your use of all Cloudflare-proxied sites, as an alternative to an evercookie in building a long-term reputation profile of "human browsing" for your browser.

This reputation profile is then used as part of the heuristic behind Cloudflare Turnstile's "Are you human?" checkbox.

This is why browsers that have NoScript enabled by default for all sites (e.g. Tor Browser), cause Cloudflare-proxied sites to throw endless security interstitials and never let you through, even when you disable NoScript for the protected website. Without reputation-profile data gathered from other sites, Cloudflare just sees a fresh browser profile making its first connection ever to some obscure site that nobody would ever actually visit as the first thing they do on a new computer. And so it thinks your browser is a (not-very-clever) bot.

I don't think it's possible for a site owner to opt out of this reputation-profile data gathering, while still relying on Cloudflare's DDoS protection.

However, I also don't believe that the data Cloudflare gathers via this route is sold to third parties. (Someone please correct me if that's wrong.)


Builds a persistent profile of you across the web… this is directly at odds with "Doesn't use analytics = respects your privacy"

I’m sure the author isn’t aware of it and it’s just an oversight, but still.

Why does a single static HTML file even need a CDN?


Probably not a CDN per se, but it's likely using Cloudflare Pages as a host, which serves static HTML for free from CF's CDN. I use it for all my sites.


"need", probably not.

But anything that can be served from a CDN is better off there, from a latency and bandwidth-efficiency perspective.


Why this static site needs DDoS protection is the important question.

Hint: it bloody doesn't.


It doesn't matter that this is a static site; it matters what it's hosted on.

If this static site is sitting on a CDN or GitHub Pages or something, then sure, there's no need to mask its IP address.

But if this static site is hosted on a cheap VPS or on a home PC with a residential Internet connection — or generally, anything with a monthly bandwidth usage cap — then any teenager who learns its true IP address (and then checks out that IP address's provenance with a whois(1)) could decide to pay $5 to throw a botnet at it for an hour — just because they know they can take it down by spending enough of its bandwidth, and want to try it, to be able to brag to their friends that they took something down.

(Yes, teenagers today do that. The most DDoS-ed things in the world today are Minecraft servers — because teens like messing with other teens.)

---

Also, half of what makes Cloudflare useful for "DDoS protection" isn't actually its "bot fight" security system, but rather its caching layer combined with its lack of egress costs (at least until you get forced into their Enterprise billing.)

If you are hosting your content on e.g. a public S3 bucket, where you're billed for egress bandwidth, but where S3 also sends sensible long-expiry Cache-Control headers; and you put Cloudflare in front of that S3 bucket (even just Cloudflare's free-tier offering!); then suddenly your S3 bucket will only be serving requests for each resource a few times a day, rather than for every single request. 99.999% of the traffic to your bucket will be a cache hit at the Cloudflare level, and so will be only a conversation between Cloudflare and the customer, not between Cloudflare and S3. So, even in the face of a DDoS, your billing won't explode.
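To make that concrete, here are hypothetical response headers (the domain and object are made up; `cf-cache-status` is the header Cloudflare actually adds to proxied responses) showing the difference between the first request and the repeats:

```http
# First request: Cloudflare misses its edge cache and fetches from the
# S3 origin -- this is the request S3 bills egress for.
HTTP/2 200
cache-control: public, max-age=86400
cf-cache-status: MISS

# Every repeat request until max-age expires: answered entirely from
# Cloudflare's edge. S3 never sees it, so no egress charge.
HTTP/2 200
cache-control: public, max-age=86400
cf-cache-status: HIT
```

So during a DDoS, nearly every request lands on the `HIT` path, and the origin bill stays flat.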


And the data that's collected here includes the full page URL, which, in this case, includes the fragment, and therefore whatever data is being "stored" at the time of capture.

This is probably beyond the author's control, but they shouldn't host it somewhere that can inject scripts outside their control (like Cloudflare) and then claim "privacy".

(The Cloudflare script makes a request to `/cdn-cgi/rum`, with the full page URL in its JSON payload at `timingsV2.name`.)
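To illustrate the leak (this is a sketch, not Cloudflare's actual beacon source, and `buildRumPayload` is a hypothetical name): a RUM payload that reports the page's navigation-timing entry captures the fragment too, because in the browser the entry's `name` is the complete URL, `#` part included.

```javascript
// Illustrative sketch of how a beacon payload like the one POSTed to
// /cdn-cgi/rum ends up carrying the fragment. In the browser, the full
// URL would come from performance.getEntriesByType('navigation')[0].name.
function buildRumPayload(fullPageUrl) {
  return {
    timingsV2: {
      name: fullPageUrl, // full URL -- everything after '#' included
    },
  };
}

// A notepad that stores its contents in the fragment leaks them here:
const body = JSON.stringify(
  buildRumPayload("https://notepadtab.com/#SGVsbG8sIHdvcmxkIQ==")
);
console.log(body.includes("SGVsbG8sIHdvcmxkIQ==")); // true
```

The point being: from the beacon's perspective the fragment is just part of the URL string, even though fragments are normally never sent to servers.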


In a similar vein:

> Doesn't use a server = no downtimes

Except there is a server, whatever and wherever it is behind notepadtab.com.


Also

> - Doesn't need cookies = immune to data loss by accident

How is this immune if you have to remember to save it manually? That seems much worse than relying on cookies. Sure, you can maybe restore it from the browser history, but if cookies aren't considered reliable, history is even less so: it's easier to delete history than cookies.


Is there anything in the HTML spec that tells browsers to always show a cached version of a page if it can't be loaded the next time you try to access it?

I think PWAs might have something like that, but haven't tested it in a normal browser or tried building one.


Maybe the entire page could be a self-updating data-url?

edit: I tried this and common browser security no longer allows this type of thing. 10 years ago it may have worked.


Here:

  data:text/html,<body contenteditable oninput="history.replaceState(0,0,'%23'+btoa(document.body.innerHTML))" onload="if(location.hash)document.body.innerHTML=atob(location.hash.substring(1))">#SGVsbG8sIHdvcmxkIQ==
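For readability, here is the same logic unpacked into two helper functions (hypothetical names, same behavior as the inline handlers in the data: URL above): every input event re-encodes the editable body into the URL fragment, and on load the fragment is decoded back into the body.

```javascript
// What the oninput handler does: encode the note into a URL fragment
// via base64, so the whole document lives in the address bar.
function noteToFragment(html) {
  return '#' + btoa(html);
}

// What the onload handler does: decode the fragment back into the note.
function fragmentToNote(hash) {
  return atob(hash.substring(1));
}

const fragment = noteToFragment('Hello, world!');
console.log(fragment);                 // prints "#SGVsbG8sIHdvcmxkIQ=="
console.log(fragmentToNote(fragment)); // prints "Hello, world!"
```

Note `btoa`/`atob` only handle Latin-1; notes with non-ASCII characters would need a `TextEncoder` round-trip first.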


Unfortunately this doesn't work in FF; it throws a security error: Uncaught NS_ERROR_FAILURE


I'm not getting anything like that, for some reason. I assume I changed a setting and forgot (probably something to make bookmarklets work). Pasted it in the URL bar in a new tab, private browsing, FF version 126.0, up-to-date Fedora, history doesn't save.


Yeah, probably (or a corp profile); it works in Chrome...

Anyway, this one works in FF with my settings:

https://gist.github.com/joakin/f05fd565e8df77a805e21d2d3469d...


Beautiful. Beat me to it.

Much better than relying on an HTTP response from someone else's computer.


Wonderful!

How can I silence the Firefox security error messages?


HTML dictates how webpages should be structured and then rendered.

You're probably asking about HTTP caching, in which case: no, not reliably. Browsers do cache HTTPS responses according to their Cache-Control headers, but a normal page load won't fall back to a stale cached copy when the network fails. That offline-fallback behavior needs a service worker.


Yeah, you can do something like that with a ServiceWorker - it does require some JavaScript though.

https://developer.mozilla.org/en-US/docs/Web/Progressive_web...



