Hacker News

> I've heard that fake data, like from AdNauseam, just becomes noise as the advertisers know the patterns to filter them out.

It's actually much worse. That fake data is dangerous because data brokers don't really care how accurate their data is. Even the fake data AdNauseam stuffs into your dossier will eventually be used against you, just like the real data will be. If you get turned down for a job, or your health insurance rates go up, or you pay more for something than you otherwise would have, you won't even be told that it was because of data someone collected, sold, or bought. You certainly won't be told whether it was fake or real data, and you won't be given any opportunity to correct it.




> If you get turned down for a job, or your health insurance rates go up, or you have to pay more for something than you would have otherwise

It must suck to live in a capitalist dystopia. Dunno why Americans put up with it.


We don’t. Individualized health insurance rates like that are illegal.


We do.

> Insurers contend that they use the information to spot health issues in their clients — and flag them so they get services they need. And companies like LexisNexis say the data shouldn't be used to set prices. But as a research scientist from one company told me: "I can't say it hasn't happened." source: https://www.propublica.org/article/health-insurers-are-vacuu...

See also:

> Is it legal? As explained by William McGeveran, University of Minnesota professor of law, and Craig Konnoth, University of Colorado associate professor of law, it is — largely because federal law hasn’t kept pace with the modern, technological world in which we live. source: https://www.chicagotribune.com/2018/08/29/help-squad-health-...

Another important takeaway from that second article is that none of your "protected" HIPAA data is prevented from being sold as long as it's "anonymized," which is a total joke, since it's often trivial to re-identify anonymized data. It's about as secure as requiring companies to ROT13 your data before they sell it. It will still be used to identify and target you individually.


> which is a total joke since it's often trivial to re-identify anonymized data

HIPAA doesn't say ROT13 or anything else in particular counts as "anonymized". It's an after-the-fact assessment. If your "encrypted" data is accidentally released, and there's any reasonable suspicion, inside or outside the company, that it's crackable, then it's a YOU problem: you have to notify a bajillion people by mail, issue per-state press releases, and face large fines.

I think you're being overly pessimistic about the strength of US regulations here with regard to preventing deliberate malfeasance, and that most of the stupidity we see in these stories is really just accident or individual bad actors.


> HIPAA doesn't say ROT13 or anything else in particular counts as "anonymized".

ROT13 was only an example of a step that makes data look "protected" in some way when it really isn't, just like the ineffective means used to anonymize data makes it look safe to sell that data when it really isn't.

There is a lot of research showing how easy it can be to identify an individual using data that has been anonymized. (https://www.technologyreview.com/2019/07/23/134090/youre-ver...)
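As a toy illustration of the kind of linkage attack that research describes (all names, ZIP codes, and diagnoses below are fabricated for illustration), joining an "anonymized" release against a public dataset on shared quasi-identifiers like ZIP, birth date, and sex is often enough:

```python
# Toy linkage attack: re-identify "anonymized" health records by joining
# them against a public dataset on quasi-identifiers. All data is made up.

anonymized_health = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "60601", "dob": "1980-02-14", "sex": "M", "diagnosis": "asthma"},
]

public_voter_roll = [
    {"name": "Jane Doe", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    {"name": "John Roe", "zip": "60601", "dob": "1980-02-14", "sex": "M"},
]

def reidentify(health_rows, public_rows):
    """Match records on (zip, dob, sex); a unique match re-identifies the patient."""
    index = {}
    for p in public_rows:
        index.setdefault((p["zip"], p["dob"], p["sex"]), []).append(p["name"])
    matches = []
    for h in health_rows:
        names = index.get((h["zip"], h["dob"], h["sex"]), [])
        if len(names) == 1:  # quasi-identifier combination is unique -> name recovered
            matches.append((names[0], h["diagnosis"]))
    return matches

print(reidentify(anonymized_health, public_voter_roll))
```

Neither dataset contains anything "identifying" on its own; the join does all the work, which is exactly the point of that research.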

HIPAA does provide a standard and guidelines for what it calls the "de-identification of protected health information" (https://www.hhs.gov/hipaa/for-professionals/special-topics/d...). It includes, for example, a list of specific identifying information that must be removed from records before they can be sold or otherwise passed around in order to get safe harbor protections. It also includes an option where an "expert" ("There is no specific professional degree or certification program for designating who is an expert") can simply say "Trust me bro, it's anonymized".
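A minimal sketch of what the safe harbor approach amounts to in practice (the field names are hypothetical, and this covers only a few of the 18 identifier categories HHS lists; the real rules also constrain dates, small ZIP populations, and "any other unique identifying" data):

```python
# Sketch of safe-harbor-style de-identification: drop fields that fall into
# listed identifier categories. Field names are hypothetical and the list is
# deliberately partial -- this is an illustration, not the HHS rule.

SAFE_HARBOR_FIELDS = {
    "name", "ssn", "phone", "email", "medical_record_number",
    "street_address", "full_dob", "ip_address",
}

def deidentify(record):
    """Return a copy of the record with listed identifier fields removed."""
    return {k: v for k, v in record.items() if k not in SAFE_HARBOR_FIELDS}

record = {"name": "Jane Doe", "ssn": "000-00-0000", "zip3": "021",
          "birth_year": 1945, "diagnosis": "hypertension"}
print(deidentify(record))
# {'zip3': '021', 'birth_year': 1945, 'diagnosis': 'hypertension'}
```

Note what survives: a 3-digit ZIP, a birth year, and a diagnosis. Those leftover quasi-identifiers are precisely what linkage attacks exploit.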

If somebody were able to buy their re-identified data from a broker, and could prove it was sold by a health provider bound by HIPAA, they would still have to prove that the provider had "actual knowledge" that the broker would be able to re-identify the individual, where:

> actual knowledge means clear and direct knowledge that the remaining information could be used, either alone or in combination with other information, to identify an individual who is a subject of the information.

Which all seems like it would be almost impossible to prove, unless the provider left obvious identifying information in the data, or a whistleblower came forward with records of direct communication in which the buyer was reassured that the data being sold could later be re-identified.

Awareness of the fact that we have mountains of research showing that individuals are easy to re-identify from anonymized data doesn't count as "actual knowledge":

> Much has been written about the capabilities of researchers with certain analytic and quantitative capacities to combine information in particular ways to identify health information.32,33,34,35 A covered entity may be aware of studies about methods to identify remaining information or using de-identified information alone or in combination with other information to identify an individual. However, a covered entity’s mere knowledge of these studies and methods, by itself, does not mean it has “actual knowledge”

Which leaves us with healthcare providers who can "anonymize" data using methods proven vulnerable to re-identification, then freely sell that "anonymized" data to third parties with a nudge and a wink.

I'll admit to being pessimistic. The regulations we have in the US have done little to slow down the buying and selling of our healthcare data.

We've also already seen a lot of very shady behavior by health care providers and companies such as tricking or coercing people into giving up their rights so that they don't even have to pretend to protect their data with anonymization before selling it. (see https://www.washingtonpost.com/technology/2022/06/13/health-... and https://www.washingtonpost.com/technology/2023/05/01/amazon-... and https://news.ycombinator.com/item?id=22177812 and https://www.12onyourside.com/story/23852025/on-your-side-ale...)


> Dunno why Americans put up with it.

Have you seen the guns that enforce it?


Where do you live, that sucks less?


Australia seems significantly better in most quality of life metrics. Many EU countries as well.

The UK doesn't seem so good any more from recent reports though. :(


It's the democracy. The big capital one.

/s


> That fake data is dangerous because data brokers don't really care how accurate their data is.

This makes me think that people could make bank by doing nothing at all but generating 100% fabricated data to sell to brokers. Why bother even collecting it? Just have some GPT clone hallucinate gigabytes of formatted BS. xD
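For what it's worth, you wouldn't even need an LLM. A few lines of stdlib Python can emit arbitrarily many plausible-looking records (every name, field, and value below is fabricated by construction):

```python
import random

# Generate wholly fabricated "consumer dossier" records. Every field and
# value is made up; the point is that nothing in the output distinguishes
# it from real broker data.

FIRST = ["Alex", "Sam", "Jordan", "Taylor"]
LAST = ["Smith", "Garcia", "Chen", "Patel"]
INTERESTS = ["fishing", "crypto", "yoga", "true crime", "vaping"]

def fake_record(rng):
    return {
        "name": f"{rng.choice(FIRST)} {rng.choice(LAST)}",
        "zip": f"{rng.randint(10000, 99999)}",
        "age": rng.randint(18, 90),
        "interests": rng.sample(INTERESTS, 2),
        "credit_band": rng.choice(["poor", "fair", "good", "excellent"]),
    }

rng = random.Random(0)  # seeded so the nonsense is at least reproducible
for record in (fake_record(rng) for _ in range(3)):
    print(record)
```

Scale the loop up and you have your "gigabytes of formatted BS".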



