
Let's talk about the individual and collective consequences of mass data collection. Are you saying you know something about those? Do tell!

Maybe I'm just terribly dense, but I seriously can't think of any reasonable objection to what Google does. The best I personally can come up with is that most people don't understand what Google is doing, and if they did know, some of them might object.

When I google "the individual and collective consequences of mass data collection" I get results that talk about the NSA and human rights -- this doesn't seem to have much to do with what Google is doing, though. When I add "google" to that search I get a rambling article on "How surveillance changes people's behavior".

Please help me out here -- how am I or anyone else being harmed by Google knowing what sites I visit?

I don't think the author is being disingenuous. I do think there is a sizable subset of privacy advocates who have become so stringently ideological about this issue that they downvote even thoughtful replies, and who are so caught up in their bubble that they seemingly can't have a conversation with anyone outside it.




>Let's talk about the individual and collective consequences of mass data collection. Are you saying you know something about those? Do tell!

Yes, let's. I'm not an expert, so I don't have any insight. Do you?

Still, this question seems like an essential part of an essay about the ethics of working in the ad industry.

Perhaps you meant to direct your sarcasm towards the author?


Are you being sarcastic? You can't see any consequences of a gigantic tech company having access to: searches, emails, attachments, photos, videos, location, messages, calls, installed apps and their usage, sleep schedules, driving styles, medical records, and about 1,001 other things I forgot to mention?


As long as they're not doing anything illegal? No.


genuinely cannot tell if trolling or not -- do you not see a security implication to centralizing PII (or similar data)?

to me it seems really bad to mine and centralize PII (note: this PII is also arbitrarily shared with third parties, usually without explicit or informed consent from the user).

---

i.e.: this data is mined in a way whose depth and scope of implication is usually abstracted away from the user, hand-waved away in legalese, or presented so annoyingly that users have become conditioned to unconditionally accept what they do not understand. as a result, they usually remain uninformed/ignorant/naive of any implication, security or otherwise, just to get to the service asap. and this is something these companies absolutely exploit.

---

do we really want companies -- companies that have demonstrated they are not immune to simple mistakes leading to vulnerabilities or leaked PII, mind you -- to be in a position where we the users have no choice but to trust that they won't leak PII to nefarious persons (persons who can then do meaningfully harmful things with even basic PII)?

we are already bleeding enough PII as it is -- when should we truly be concerned with stopping it? if never, and there is no concern as you seem to indicate, then let us arbitrarily share medical information, too.

on that subject, there are also many instances of arbitrarily collected and shared PII, for the sake of ads, that would almost unequivocally be a HIPAA violation in other contexts. to me it seems asinine to have such well-defined understandings of PII for the protection of the person in some contexts, yet in the context of ads suddenly *anything* goes, and the spiel we always get from the ads advocates is: but it is good for the user, and the content creators cannot exist without it, so it must exist as is, unchained.

(un)ironically, this is also the psychology of abusive relationships, where the abuser keeps the abused thinking they need them and establishes themselves as a (survival) dependency in the abused's mind.

idk, i'm pretty skeptical of the claim that ad tech and the ad industry have good intentions, and i am becoming increasingly of the opinion that most of the advertising models advocates are trying so hard to convince users to enable are just fucking profit-driven-at-all-costs cancer.





