Hey, I'm the main author of this. Happy to see that people like it!
As I wrote elsewhere in the comments we're planning to redesign the results page and rewrite all the text later this year. The current page hasn't changed much since mid-2016. Need to refresh/expand technical advice, be more clear about the limitations of the service, etc. I'll note down all suggestions here.
I'd also like to recommend a similar project that was inspired by Webbkoll: https://privacyscore.org/ -- it's slower, but also integrates things like testssl.sh, and most importantly lets you make lists of URLs to check. It uses OpenWPM [0] which has been used for many interesting studies,[1] such as "Online Tracking: A 1-million-site Measurement and Analysis".[2] (Webbkoll uses Phoenix+Puppeteer)
Great tool. But if you had a list of the top 10 worst-scoring checked sites at the top, say ranked by the sum of all three counters, it could actually prompt some action; no one would want to be on that list of shame.
Really nice! One note: positive and negative results are marked green and red. Colorblind users will need additional visual cues (e.g. a checkmark vs. a cross) to differentiate between the sections.
<meta name=referrer> is the older of the two, and has IE/Edge support if you use one of the values never, always, origin, or default, while the Referrer-Policy header is still in development for Edge.
Using the header form will be more efficient on HTTP/2 due to HPACK reducing it to roughly one byte.
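To make the two mechanisms concrete, here is a minimal sketch using Python's standard http.server that sends the policy both ways: as a Referrer-Policy response header and as the equivalent meta tag in the body. The handler class, port, and the no-referrer policy value are my own illustrative choices, not anything from the comments above.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class PolicyHandler(BaseHTTPRequestHandler):
    """Minimal handler demonstrating both forms of Referrer Policy."""

    def do_GET(self):
        # Meta-tag form, useful when you can't touch server config:
        body = b'<meta name="referrer" content="no-referrer">Hello'
        self.send_response(200)
        # Header form: on HTTP/2, HPACK compresses this repeated
        # static header down to roughly a byte per response.
        self.send_header("Referrer-Policy", "no-referrer")
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

def serve(port: int = 8000):
    """Run the demo server (blocks)."""
    HTTPServer(("127.0.0.1", port), PolicyHandler).serve_forever()
```

In practice you would normally set the header in your web server or framework config rather than in application code; the meta tag is mainly a fallback for static hosting you don't control.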
Pretty happy with the results. I have to have access to referrer headers for it to work.
Beyond that, there are some things I can tighten up in regards to some other xss headers so I’ll take care of those.
Also, I don’t keep traffic logs. Sometimes (maybe a couple times a year) I will do a small log capture for a few minutes if I need data to test an upgrade or feature experiment but that’s it.
> While still a work in progress, Referrer Policy is now supported by all major browsers (except Internet Explorer, although it is supported by Edge, the new browser in Windows 10).
Wouldn't it be fair to say that third-party requests aren't automatically a problem? For example, mozilla.org sends third-party requests to their CDN at mozilla.net, but it's the same parent company, so not really a third-party in the privacy sense.
For sure, it's just hard to tell them apart from a technical point of view. Tracking Preference Expression (DNT) [0] does make it possible with a Tracking Status Resource's same-party property [1], but it seems like almost nobody uses (or talks about) this. Medium.com is the only major site where I've seen it implemented: https://medium.com/.well-known/dnt/
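For reference, a tracking status resource is just a JSON document served at /.well-known/dnt/. A minimal sketch of what one with a same-party property looks like; the field names follow the TPE draft, but the domains (and the "N", i.e. "not tracking", status) are purely illustrative:

```json
{
  "tracking": "N",
  "same-party": [
    "example.com",
    "example-cdn.net"
  ]
}
```

A tool like Webbkoll could in principle use the same-party array to avoid flagging a site's own CDN domains as third parties.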
This is a great tool! I have used it to check government websites. Government websites typically have a monopoly on the service they provide. As a citizen you cannot easily choose to take your business elsewhere.
For a government website you could argue that there should be no third parties using the information about the visitor's interaction with the government. But a lot of government websites still use third-party scripts that specifically use information about you for ad targeting.
Thank you kindly. I used this + one more resource to get my startup's site all green (I thought the HTTPS redirect was enough). Hopefully it doesn't cause any bugs/issues this busy week, but it's now done :) Cheers
Nice! It seems there are some things I overlooked on the websites I manage — HTTP headers for XSS, HSTS and such, which are going to be easy enough to add.
Kudos to the authors, thank you for helping make the web a better place.
Would like to thank the people who made this; I found a few things that I can improve for my company site. It's also really good to have a resource that can be shared with other people via a link!
While I don't doubt dataskydd's good intentions, their advice about referrers is a sign that we live in Clown World.
Yes, your browser's tendency to provide a referrer might well give away information you would prefer it didn't. Unfortunately for you, the browser vendors have chosen to provide browsers that do that.
In a parallel universe it would be obvious that this is a problem (among many) for the browser vendors to address. In Clown World, you are supposed to rely on each and every site providing a special response header.
Just a historical note that I found interesting - it was in fact obvious (to some) already 22 years ago. From RFC 1945 (HTTP/1.0), May 1996, 10.13 Referer [sic]:
"Note: Because the source of a link may be private information or may reveal an otherwise private information source, it is strongly recommended that the user be able to select whether or not the Referer field is sent. For example, a browser client could have a toggle switch for browsing openly/anonymously, which would respectively enable/disable the sending of Referer and From information."
This recommendation was not followed in any meaningful way, but Referrer Policy (https://www.w3.org/TR/referrer-policy/), which supports a whole bunch of different policies and is very easy to implement (and now widely supported), at least makes things slightly better.
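To make the policy semantics concrete, here is a small model, my own sketch rather than anything taken from the spec text, of what a browser would send as Referer under a handful of common policy values. Real browsers also strip credentials and fragments and handle more edge cases than this:

```python
from urllib.parse import urlsplit, urlunsplit

def referrer_for(policy: str, source: str, destination: str):
    """Simplified model of the Referer value a browser sends when
    navigating from `source` to `destination` under `policy`."""
    src, dst = urlsplit(source), urlsplit(destination)
    # Full URL minus the fragment:
    full = urlunsplit((src.scheme, src.netloc, src.path, src.query, ""))
    origin = f"{src.scheme}://{src.netloc}/"
    same_origin = (src.scheme, src.netloc) == (dst.scheme, dst.netloc)
    downgrade = src.scheme == "https" and dst.scheme == "http"

    if policy == "no-referrer":
        return None
    if policy == "origin":
        return origin
    if policy == "same-origin":
        return full if same_origin else None
    if policy == "no-referrer-when-downgrade":
        return None if downgrade else full
    if policy == "strict-origin-when-cross-origin":
        if downgrade:
            return None
        return full if same_origin else origin
    raise ValueError(f"unhandled policy: {policy}")
```

For example, under strict-origin-when-cross-origin a link from https://a.com/secret?id=1 to another site leaks only "https://a.com/", and nothing at all if the destination is plain HTTP.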
It's not an either/or thing at all. The browsers are slowly trying to tighten policies on Referer, but it's a process, because there are still some webpages that unfortunately rely on it being sent. (You can force it off, at least in Firefox, if you want to.) The point of this header is that sites that don't need Referer to be sent can explicitly tell your browser so, protecting you from snooping third parties.
> this is a problem (among many) for the browser vendors to address
I'm guessing the reason others are downvoting you is that Referrer Policy is exactly that: it's the attempt of modern browsers to address this problem (a problem that yes, they did create, but the fact that they're supporting Referrer Policy at all at least shows the problem was created out of incompetence rather than malice).
If I enter https:// for a site I manage, the tool automatically changes it to http://, claims that's what I entered, and then tells me the connection is not secure.
I think this means that the tool is not automatically redirected to https:// when it visits your website at http://, so the site is marked as insecure (see point 2 on the website).
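A quick way to see what the tool presumably sees: make a plain-HTTP request and inspect the response without following redirects. The helper below is a hypothetical sketch using only the standard library; the function name and status-code list are my own choices.

```python
import http.client

def redirects_to_https(host: str, path: str = "/") -> bool:
    """Return True if a plain-HTTP GET for `path` on `host` is
    answered with a redirect whose Location is an https:// URL."""
    conn = http.client.HTTPConnection(host, timeout=10)
    try:
        conn.request("GET", path)
        resp = conn.getresponse()
        location = resp.getheader("Location", "")
        return resp.status in (301, 302, 307, 308) and \
            location.startswith("https://")
    finally:
        conn.close()
```

If this returns False for your site, a visitor who types the bare domain lands on (and stays on) the insecure version, which matches the tool's complaint.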
[0] https://github.com/citp/OpenWPM
[1] https://webtransparency.cs.princeton.edu/webcensus/index.htm...
[2] http://senglehardt.com/papers/ccs16_online_tracking.pdf