
Is this more or less effective than a large /etc/hosts blacklist? It doesn't seem to be doing anything special like Privacy Badger does. The only relevant sections I saw in the paper were:

  We implement an API based on Google Safe Browsing, a 
  mechanism for efficient URL-based blocklist updates and 
  lookups [9]. We use a subset of approximately 1500 domains 
  from Disconnect’s privacy-oriented blocklist to identify 
  these unsafe origins [10]. We update the blocklist every 45 
  minutes to minimize the effects of incorrect blocklist 
  entries.

  Another open challenge is applying Tracking Protection only 
  to third-party content. Can we avoid cross-site tracking 
  by blocking content from high-volume sites such as 
  facebook.com without breaking them when visited directly? 
  Heuristics such as the Public Suffix List can help better 
  determine the set of domains that are considered 
  first-party.
Would it hurt to copy their blacklist into /etc/hosts? I'd rather do it at the OS level so I can use any browser I want, anyway.
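For reference, here's a minimal sketch of the third-party check the paper hints at, assuming the tldextract package as a stand-in for a Public Suffix List lookup (the paper doesn't name an implementation):

  # Decide whether a request is third-party by comparing
  # registrable domains (eTLD+1) via the Public Suffix List.
  # Assumes: pip install tldextract
  import tldextract

  def is_third_party(page_url, request_url):
      page = tldextract.extract(page_url).registered_domain
      req = tldextract.extract(request_url).registered_domain
      return page != req

  # facebook.com content embedded on another site is third-party:
  is_third_party("https://example.com/", "https://www.facebook.com/tr")       # True
  # ...but not when facebook.com is visited directly:
  is_third_party("https://www.facebook.com/", "https://graph.facebook.com/")  # False

An /etc/hosts blacklist can't make that distinction: a blocked hostname is blocked everywhere.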



I "did the /etc/hosts" thing for a long time. But it seemed like some things hung and tool longer sometimes. Overall /etc/hosts was a big improvement. Where are the good lists and what is the current thing you map all the bad hosts to (is it localhost, or something else?)?



It hangs because it waits to time out. It will fail faster if you set the IP to something impossible instead of merely unavailable.


0.0.0.0 works well for that purpose since it's invalid as a remote address
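So a blocking entry looks like this (hostnames here are just illustrative examples):

  0.0.0.0 tracker.example.com
  0.0.0.0 ads.example.net

Unlike 127.0.0.1, 0.0.0.0 also won't accidentally hit a web server you happen to be running locally.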


This will generate an /etc/hosts or /etc/dnsmasq.conf file from the two common blocklists: https://github.com/jakeogh/dnsmasq-blacklist
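The practical difference between the two output formats: /etc/hosts needs every hostname listed explicitly, while a dnsmasq address= line covers a domain and all of its subdomains (domain below is illustrative):

  # /etc/hosts: exact hostnames only
  0.0.0.0 tracker.example.com
  0.0.0.0 cdn.tracker.example.com

  # dnsmasq.conf: one line covers the domain and every subdomain
  address=/tracker.example.com/0.0.0.0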



