
An Apple recruiter recently reached out to me. I am in a fortunate position to turn down opportunities, so I made sure to explain that I am not interested in working for a company that is at the forefront of enabling further infringement on people's privacy. If you are able to push back, do it in any small way that you can.



In many ways Apple is also the world leader on consumer privacy, pushing for changes when the rest of the industry is walking in the opposite direction. Paying with Apple Pay makes you safer because it gives out minimal payment information; the Target fiasco would've been avoided.

Sign in with Apple allows users to provide minimal information when signing up for accounts; the idea that casual users should know how to set up email aliases is a joke. Apple Private Relay is the closest thing to getting grandma to use Tor. Apple is working on stopping pixel tracking in email.

Apple is also leading on the problem of user permissions, which today is a broken model in which users get blamed for accepting all the snooping in their lives, for not reading the TOS, and for their failure to negotiate against Walmart.


As always when talking about security and privacy, you need to understand the threat model. Apple protects users from some threats while also becoming itself the biggest threat to users. And this is exactly what Apple wants. This is how you use Stockholm syndrome to entrench a feudal system.

The relationship is not 3-way as Apple wants users to believe (Apple the defender, users the victim, third-parties the aggressor). The map of the territory is a lot more complex.


Framing relationships in this "triangle" is not new. As parent says, reality is a lot more complex, and this kind of frame-of-mind can be very detrimental:

https://en.wikipedia.org/wiki/Karpman_drama_triangle


In what case is the software not the biggest threat, though? Software is completely malleable. How many people update their systems by downloading the source diffs of all the libraries and apps, reviewing them line by line, and compiling locally? For the vast majority, the underlying software is considered a trusted system out of necessity.

The surprising thing to me is that there have been so few critical reviews of the system as it exists today - they are 99.9% "what if" scenarios.

To put it a different way, the possibility has always existed that your trust could turn out to suddenly be misplaced in a single, near-instantaneous policy change. So most of the discussions are actually about reevaluating whether they should consider Apple devices to be a trusted system or not based on this policy change, and trying to predict future policy changes based on it.

The reality is that Apple's spat with the FBI was possible because the US legal system allows it. Other countries can demand anything they want, and Apple has to negotiate with them or decide whether they have to leave that market. The scanning is a US-only feature to comply with US regulations.

If, say, China adopts a policy Apple does not want to abide by, their only choice is to leave the Chinese market and potentially adapt to no Chinese manufacturing or even no Chinese suppliers. But this is no more or less true than last week.


> is also the world leader on consumer privacy

is also an early PRISM adopter and well known for cooperating with totalitarian regimes. The censoring of Belarus protesters happened several months ago.


Apple also dropped the idea of E2EE iCloud backups at the FBI's behest. They've also handed over iCloud in China to GCBD, creating an incredible apparatus for the surveillance state.

And now we are supposed to believe they can resist adding a hash to a list if the government asks? Are they serious?


They never said they would resist adding a hash. They just implied it with weasel words. They also said they will accept hashes from “other child protection organizations.” Which clearly means each country can define, with its own horrendous laws, what it thinks children need to be protected from… content showing children being put in un-Islamic situations (Saudi Arabia), content showing children living happily in a family with gay parents (Russia), content showing a child being abused by being forced to hold a Taiwanese flag (China), etc.


Apple isn't the Dutch East India Company. It is better to ask the citizens of the American democracy to keep their government accountable than it is to ask Apple to be so powerful that it can shrug off national governments.

You don't want CSAM scanning in the USA? Then ask your national government to change its laws requiring companies to be so vigilant against CSAM. Or we can ask that Apple become so powerful that they can shrug off the government.

That Apple cannot resist the government does not negate the observation that Apple is also a leader in consumer privacy.


The US government is NOT forcing Apple to scan customer data, in fact the entire legal framework of this scanning in the US critically depends on Apple performing the scanning without Government coercion. If the Government forced the scanning or even substantially incentivized it, then it would require a warrant per the fourth amendment.


> The US government is NOT forcing Apple to scan customer data

Unfortunately we do not know that for certain. Companies were compelled to sign up to PRISM against their will. Yahoo in particular tried to resist and were threatened with financial destruction [1]. The Feds can essentially levy any sum of daily fines on a company like Yahoo or Apple any time they see fit, with a rubber stamp from the FISA court; they effectively demonstrated that with their confrontation with Yahoo. And it clearly terrified Yahoo.

We have no idea whether what's being put into place by Apple is the start of a new program by the powers that be; realistically, Apple has little choice but to comply, or be hammered financially (or worse, the Feds might just get dirty and target executives personally). If they were working on PRISM 3.0 and attempting to implement it, we would never know it at this juncture.

It's worth being suspicious of what's going on, given the one certainty is that we know very clearly what the authorities want, what they'd like to see happen, and that they never stop trying to prod things in that direction. They're always up to something shady, always looking for ways to advance the surveillance state. The Biden Admin years will see that effort turbocharged once again, as with the Bush & Obama years. Whatever they're up to right now (again, they're always working on new programs like PRISM, always), you can be sure it's big, likely illegal and a gross violation of human rights.

[1] https://www.wired.com/2014/09/feds-yahoo-fine-prism/


> Unfortunately we do not know that for certain.

If they were forced, the searches would be an unlawful violation of the constitution. Various tech companies have repeatedly testified under oath that they are performing the searches of customer data of their own free will and for their own benefit.

I wouldn't argue that it's an impossibility-- but if true it would be a shocking revelation that would result in hundreds or even thousands of cases being overturned once it was exposed. And if it were true it shouldn't improve our opinion of Apple's actions in the slightest: instead of being just an unethical invasion of privacy for commercial gain, they'd instead be complicit in a secret conspiracy to illegally violate the rights of hundreds of millions of Americans.


FSF is a world leader in consumer privacy, Apple is a world leader in replacing things that could be used privately with things that cannot be.


Apple failed to deploy E2EE iCloud because the FBI asked nicely (with a wrench in hand probably).

The leading narrative on /r/apple is now that "oh this invasive spyware has to exist so that apple can do E2EE iCloud", which is nonsense.

If they had done their job and deployed E2EE iCloud we wouldn't "need" this system in the first place.

It's a classic government pattern:

1. Create the problem (blocking E2EE such that providers have unencrypted copies of your content)

2. Screech and complain about this

3. Demand they do the thing you really wanted in the first place to solve the problem you created


Steve Jobs would have told the FBI he was doing E2EE and he'd see them in court.


Only because they found a common enemy in the likes of FB, et al. If privacy didn't benefit them in some way, they wouldn't care.


You get the feeling, though, that this held water until the recent news, right?


As a long time Apple user and developer (since the black-and-white Apple II), I have witnessed the rise/fall/rise again/and now fall again of Apple. My future computer WILL be Linux/FreeBSD for all important stuff.


Well done, I'm with you. I've been doing similar in the games industry for years. If we, the actual makers, stand against immoral action, it will build both pressure and incentive for alternative ways of doing business.


Also, after you're hired, a group of Apple employees can sign a petition saying something you wrote 10 years ago--from which they saw an excerpt--bothers them... and Apple will fire you.


Being an employee of such a big company is close to being a public servant, which may not suit everybody, anyway.


> I am not interested in working for a company that is at the forefront of enabling further infringement on people's privacy

Conducting scans on device instead of on server is your idea of infringement of privacy?

Apple's system keeps everything off their servers until there is an instance where many images on device match known examples of child porn and a human review is triggered.

Google's system scans everything on server, so a single false positive is open to misuse by anyone who can get a subpoena.

We've seen Google data misused to persecute the innocent before.

>Innocent man, 23, sues Arizona police for $1.5million after being arrested for murder and jailed for six days when Google's GPS tracker wrongly placed him at the scene of the 2018 crime

https://www.dailymail.co.uk/news/article-7897319/Police-arre...


>Conducting scans on device instead of on server is your idea of infringement of privacy?

Why are you asking if the poster still beats their wife?

(More specifically, you're pre-supposing scanning must happen, which by itself is a highly debatable assertion)

Your point with Google is absolutely sound, but you seem to stop short of accepting that actual privacy (no peeking, damnit) is dead on arrival. This is a case of rhetorical stealth goalpost-moving, whether you intended that or not.


> you're pre-supposing scanning must happen

No, I'm relaying the fact that scanning does happen, and has been happening for the past decade.

>The system that scans cloud drives for illegal images was created by Microsoft and Dartmouth College and donated to NCMEC. The organization creates signatures of the worst known images of child pornography, approximately 16,000 files at present. These file signatures are given to service providers who then try to match them to user files in order to prevent further distribution of the images themselves, a Microsoft spokesperson told NBC News. (Microsoft implemented image-matching technology in its own services, such as Bing and SkyDrive.)

https://www.nbcnews.com/technolog/your-cloud-drive-really-pr...

>a man [was] arrested on child pornography charges, after Google tipped off authorities about illegal images found in the Houston suspect's Gmail account

https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-le...

Apple refused to implement this until they found a more private method to handle things.

Only photos you upload to iCloud are scanned and nothing happens unless multiple images match known examples of kiddie porn. In that case, a human review is triggered to make sure you didn't just have several false positives at once.
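To make that concrete, here is a toy sketch of signature matching gated by a threshold and a human-review flag. Everything in it is invented for illustration (the stand-in hash function, the signature list, and the threshold of 3); the real system uses a perceptual hash (NeuralHash) with blinded matching, and its actual parameters differ:

    import hashlib

    # Stand-in for a perceptual hash such as NeuralHash (illustration only;
    # a real perceptual hash maps visually similar images to the same value).
    def toy_image_hash(image_bytes):
        return hashlib.sha256(image_bytes).hexdigest()

    # Hypothetical list of known-bad signatures provided by a third party.
    KNOWN_SIGNATURES = {
        toy_image_hash(b"known-bad-image-1"),
        toy_image_hash(b"known-bad-image-2"),
    }

    MATCH_THRESHOLD = 3  # made-up number; gates the human-review step

    def needs_human_review(uploaded_images):
        # Flag an account only when several uploads match known signatures,
        # so a single false positive triggers nothing.
        matches = sum(toy_image_hash(img) in KNOWN_SIGNATURES
                      for img in uploaded_images)
        return matches >= MATCH_THRESHOLD

The threshold is the whole point of the false-positive argument: one stray match surfaces nothing; only repeated matches against the list trigger the review described above.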


>Conducting scans on device instead of on server is your idea of infringement of privacy?

It's an infringement on my right to freedom of speech. Client-side scanning merely opens the door for my device to censor me from sending any message of my choosing and impacts my ability to freely communicate. What is today child abuse, tomorrow is health information and further descends to political and religious memes, or whatever other content is deemed problematic.


> Client-side scanning merely opens the door for my device to censor me from sending any message of my choosing and impacts my ability to freely communicate.

Nonsense.

Only photos you attempt to upload to Apple's iCloud are scanned. If you don't like it, turn off iCloud Photos.

>Q: So if iCloud Photos is disabled, the system does not work, which is the public language in the FAQ. I just wanted to ask specifically, when you disable iCloud Photos, does this system continue to create hashes of your photos on device, or is it completely inactive at that point?

A: If users are not using iCloud Photos, NeuralHash will not run

https://techcrunch.com/2021/08/10/interview-apples-head-of-p...

All the files you upload to your Google/Microsoft account are already being scanned today.


I stand behind my comments for any type of client-side scanning.

There are two upcoming changes from Apple that are often conflated. First, indiscriminate server-side scanning of photos in iCloud against a non-public source database. Second, client-side scanning of messages for child accounts looking for nudity.

Again, client-side scanning is part of the changes that Apple is implementing and I'm projecting on how the terms and conditions of this client-side behavior can and almost certainly will change over the coming years and iOS versions. It's a slow yet accelerating descent to hell.


> I'm projecting on how the terms and conditions of this client-side behavior can and almost certainly will change

The same gloom and doom imaginary projections can be made about what Google might do to Android in the future, and are just as accurate.

For instance, discovery from one of the antitrust suits shows that Google pressured device makers to hide Android privacy settings.

From this, I can project that Google will be completely removing privacy options in their entire ecosystem.


I don't want to be that guy, but 300 more people were lining up for this job.

Nobody except a tiny group of nerdy guys (including myself, of course) is against this Apple CSAM move.

Just ask your parents or your non-tech friends if it's "ok" to scan people's phones to find those "bad pedophiles" in order to jail them for the rest of their lives. You will be surprised how much support Apple's initiative has in the broad public.

And that's why Apple made this move. They don't really care for the 3% of people who we belong to. They do it because they know they will have the public and political support.


> Just ask your parents or your non-tech friends if it's "ok" to scan people's phones to find those "bad pedophiles" in order to jail them for the rest of their lives. You will be surprised how much support Apple's initiative has in the broad public.

My Dad is an old teacher today and was formerly a farmer.

My view is he clearly understands these issues and has done since I was a teenager sometime in the last millennium, when I followed him around the farm and we talked about stuff.

Maybe your parents are like what you describe, but don't underestimate other people's parents. They might not agree immediately, but if one is careful many actually aren't unreasonable.

Also everyone: stop this defeatist attitude. Instead of asking leading questions, talk about it calmly and politely.

Just explain that once this system is in place it will be used for anything, not just photos (or otherwise the bad guys could just zip the files). And when everything is scanned, some people will add terrorist material (e.g. history and chemistry books), others will add extremist material (religious writings), blasphemous material (Christian or Atheist teachings in Saudi Arabia), and other illegal content (Winnie the Pooh, man against tank, etc. in China).

In the paragraph above there should be something to make everyone from Atheists through Christians, Muslims, nerds, art lovers and Winnie the Pooh fans see why this is a bad idea.


> some people will add terrorist material (e.g. history and chemistry books), others will add extremist material (religious writings), blasphemous material (Christian or Atheist teachings in Saudi Arabia), and other illegal content

Apple doesn’t seem to be doing any of that covertly, so what is wrong with blaming them when they actually do the above things? This is a genuine question, because right now I see no issue with their automated searching for CP on my iPhone.

In theory, Apple could have silently pushed such updates without any of this prior practice. So why did this become an issue only now?


You know how in history, occasionally the good old king would die and his crazy nephew would take over and burn all the peasants? This was possible because the king had absolute power.

Now, you might argue that this absolute power is in this case being used for good - but given the brush the US just had with an accidental Nero, it’s worth being wary of how tools and powers might be used not only by current powers, but by future ones too.


Exactly. If you don't want weapons of mass destruction to exist, don't create them and definitely don't tell anyone (too late).

If you don't want tools of mass oppression to exist, don't create them. (We are here now.)

The fight against Japan in 1945 was important, as is the fight against child abuse today.

But we will have to live with the choices we make, in the short run as I wrote above, and also in the long run when a crazy president is elected, as you write.

I actually trust local police and courts. But I don't blindly trust future police, future courts and future politicians.

And when it comes to multinational companies, I trust them to maximize shareholder value, even if that means doing what China or Saudi Arabia wishes.


> We are here now

As a sibling commenter noted, we are long past that. I see the points all of you are making in this subthread and I agree, but this doesn’t answer my last question. Tools of mass everything are already here for more than a decade, ready to deploy and use. And when these are used to do an actually good thing (stopping dickpics to minors), everyone wakes up and blames them for the possibility that could always be deployed overnight without any prior notice.


> Tools of mass everything are already here for more than a decade, ready to deploy and use.

You are forgiven if you have missed it, but in the wake of Snowden, Google and others have hardened their systems massively.

Signal, Matrix and others are actually making it hard to do dragnet surveillance.

> And when these are used to do an actually good thing (stopping dickpics to minors), everyone wakes up and blames them for the possibility that could always be deployed overnight without any prior notice.

Because boundaries have been overstepped again. This is a constant battle that we software people have with authorities :-)

There has been an informal truce that they leave our devices alone and we accept that they scan the cloud.

Now things are about to change and we'll respond. We've won before and I think we can do it again.

PS: There are always good reasons.

PPS: We won the last big one: cryptography software was "munitions" and couldn't be exported, until someone took it upon themselves to make a book out of it, ship it to Europe, and let cryptography people here scan it.

So according to the argument up front, the terrorists won, and I guess we should have a lot of problems now, but we don't.


If you've never seen a slippery slope in action, ask someone else if they have. It always starts with something everyone can agree on. It's the inevitable slippage over time, as the population replaces itself with people who aren't intrinsically cognizant of the "before" state, and the implicit normalization of deviance that represents. We may only live for about 100 years, but I challenge you to look at the size of the United States Code and what has been specifically carved out as illegal, or what aberrancies have been normalized, in just 200 years.

It all adds up.


If you don't want the atmosphere full of toxic corrosive oxygen, don't breathe it out during photosynthesis. (we were here 2.5 billion years ago, worked great for methanogens)

If you don't want to live in an oppressive, stratified, sedentary society, don't invent agriculture. (we were here about a dozen millennia ago, worked great for hunter-gatherers)

etc.


Nero was an OK emperor, btw, and certainly wasn't actively malicious. The popular image is (Christian?) propaganda.


Take Hitler then as someone malicious who took advantage of existing laws and systems to abuse the population.


Because Apple has to change its terms of service to permit this.

But also, this is a terrible argument. As a frog, I will not wait until I am boiled to raise complaints.


> Because Apple has to change its terms of service to permit this

Apple's ToS change with every iOS release. You have to accept to continue. Do you ever read them?


No, because they are void where I live.

(More specifically, in the EU any terms/conditions imposed after the sale of goods are non-enforceable.)


Of course, you are not forced to accept the new ToS, but you also don’t get the newest iOS version if you decline.

So sooner or later, you probably need to accept them if you don’t want to be stuck with iOS 14 for the next 7 years.


I used to read the terms when I was younger and even more stupid than today.

I fully expect someone else does it now, and I even think there exist GitHub repos or some SaaS or something for it (some tl;dr for EULAs?).

(Today I more or less consistently don't read them because 1. as a European, they aren't valid if they go beyond what European law allows, and 2. nobody can be expected to read them anyway, and if I admit to reading them I just make my life harder.)


> ... what is wrong with blaming them when they actually do the above things?

It is unfortunate because, although the consequences are seismic, the actual problem is subtle and abstract, and difficult to get people excited about. Basically, civilisations that protect the weak against the strong (and in this case, the strongest party is by far the government and the police) are more prosperous. The more the strong are empowered to act against the weak, the worse the actual outcome gets, although not on any one easy-to-measure axis.

I suspect part of it is that every political movement hinges on a small network of people organising it. These systems are fantastic plausible-deniability screens for powerful people to disrupt and destroy those networks to preserve the status quo. Like, for example, how China tries to operate.

You can see signs of similar systems developing in the US. Note that Trump and then Biden were both the targets of official investigations (Trump-Russia, Biden's son & Ukraine). That isn't going to go away; it'll probably be a long time before we see a president who isn't being investigated for something. The tools that people like Apple are building will be drawn into the struggle, and not to promote truth or fairness but to destroy their support networks if they aren't friends of the Tim Cooks and Susan Wojcickis of the world. And make no mistake, powerful people aren't looking after your interests because you like the companies they run.

Plus on the way through they are going to be used to target minorities. That part is just sort of traditional, though incidental. Like when they decide to search people's phones for marijuana use but not cocaine and it turns out different racial groups use different drugs.

The only defence is blanket bans on activity that could be used to target people.


> once this system is in place it will be used for anything, not just photos

The system seems designed to make it hard to use for anything else. How would a hash driven by a visual-perception-based neural net be used on ZIP files? How can you add ZIP files to your iCloud Photo Library?

Sure, Apple could possibly do any number of bad and worse things. It’s a matter of trust, every time we update our iPhones, that the update doesn’t include a ZIP file scanner or a blasphemy scanner. This has always been the case, even before the introduction of the CSAM voucher mechanism.
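To illustrate the ZIP-file point: a perceptual hash operates on decoded pixels, not on arbitrary bytes. Here is a toy average-hash (nothing like NeuralHash itself; the pixel values and "archives" are made up) next to a byte-level hash, just to show the difference in what each one consumes:

    import hashlib

    def toy_average_hash(pixels):
        # Toy perceptual hash: one bit per pixel, set when the grayscale
        # value is above the image's mean. It needs decoded pixels as input
        # and is meaningless for opaque bytes such as a ZIP archive.
        flat = [p for row in pixels for p in row]
        mean = sum(flat) / len(flat)
        bits = 0
        for p in flat:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    # A photo and a slightly brightened copy hash to the same value...
    photo = [[10, 12, 200, 210], [11, 13, 205, 215]]
    edited = [[11, 13, 201, 211], [12, 14, 206, 216]]
    assert toy_average_hash(photo) == toy_average_hash(edited)

    # ...while a byte-level hash of a ZIP flips completely on any change.
    zip_a = b"PK\x03\x04 original archive"
    zip_b = b"PK\x03\x04 original archive."
    assert hashlib.sha256(zip_a).digest() != hashlib.sha256(zip_b).digest()

Repurposing the photo pipeline for ZIP files would mean replacing both the hash and the ingestion path, which is the point above about the design making other uses hard.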


I think the main issue is just that Pandora's box is now open. Once you make this move, you can't go backwards.

If you'll indulge me, would you bet $1000 that this style of scanning isn't expanded in the next 3 years to include non-CSAM content?


Not in three years, but in 10 I am almost sure it will. After making it completely impossible for a citizen to survive without a device.

They are winning the battle in incremental steps, they have no need to rush.


I will take you up on that bet. Feel free to contact me and we can set it up.


Would you take the 10 year version of that bet?

I'm less interested in the actual bet. I'm more interested in trying to understand if we both think this system will eventually be abused.

If you WOULD take the 10 year version, would you take an open ended "some point in the future, this will be abused" bet?


I would probably take a 10 year version of that bet.

I would probably also take a more broad version of that bet, if we agreed upon a good definition of "abuse".

10 years is tricky though, because this topic has a political angle to it.

I look at this stuff as a political move by Apple as much as anything else. There's a lot of political pressure around encryption, and the "think of the children" angle is very compelling for a lot of people. This CSAM voucher system is cleverly designed to handle that concern without compromising privacy or security for anyone who isn't uploading multiple previously-known CSAM images to their iCloud Photo Library.

How this political situation will unfold over the next 10 years is hard to say. I hope for the best. But it's important for threats to privacy and security to be challenged.

I have wished for more legitimate and valid criticism of this system. Almost every criticism that I've seen is based on plain misunderstandings of how the system works, which isn't helpful.
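For what it's worth, the most-misunderstood piece seems to be the threshold itself. Apple describes the vouchers as built on threshold secret sharing, so the server cannot reconstruct the relevant decryption material until more than a set number of vouchers match. Here is a toy sketch of that primitive alone (plain Shamir sharing over a prime field; the key, threshold, and share count are made up, and this is not Apple's actual construction, which also involves private set intersection):

    import random

    PRIME = 2**61 - 1  # prime modulus for the toy finite field

    def make_shares(secret, threshold, count):
        # Split `secret` into `count` shares; any `threshold` of them
        # reconstruct it, while fewer reveal essentially nothing.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        def poly(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, poly(x)) for x in range(1, count + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 recovers the secret.
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
        return secret

    key = 123456789  # stands in for per-account key material
    shares = make_shares(key, threshold=3, count=10)
    assert reconstruct(shares[:3]) == key  # at the threshold: recovered
    assert reconstruct(shares[:2]) != key  # below it: stays hidden

As Apple describes it, shares of that key material ride along with the safety vouchers, which is why a handful of matches below the threshold don't hand the server anything readable.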


> once this system is in place it will be used for anything, not just photos

There is a system to make a 'backup' of the entire device to a remote server, and it has been in place on every iPhone since October 12, 2011. The entire device is covered: logs, calls, messages, files, photos, etc. Pandora's box has been open for 10 years.

It is called iCloud backup. If they want to repurpose an existing function against the expressed permission of the user to exfiltrate their data and use it against them, why not just use that instead?


> Nobody except a tiny group of nerdy guys

I think you oversimplify this by a lot. No, Apple's reputation won't be severely damaged by this move immediately. But I do believe that those "nerdy guys" did a lot to push the Apple brand, and a big part of that push was due to security and privacy. Until recently Apple was always the "privacy brand" and it was hard to argue against it without going the full FSF route of argumentation.

This is no longer the case and I'm sure this will deal some damage over time, even if it only starts with the "screeching voices" of the (nerdy) minority. Maybe not directly to their revenue, but certainly to their reputation. Nothing wrong with shaving off a bit of the prestige of working at Apple ;)

> I don't want to be that guy, but 300 more people were lining up for this job.

This is the case for many jobs that don't come close to the holy "working at Apple".


> But I do believe that those "nerdy guys" did a lot to push the Apple brand

We sure did. We're also ignoring all the celebrities that helped to push the Apple brand (remember the iPod and its terrible but iconic white headphones?).

Popular culture has a much stronger influence on Apple's standing with non-nerds than nerds do. We like to think we're important, and that people value our opinions, but increasingly that's not true. As the differences between the options diminish to 'probably bad for the world long term' (big tech) and 'probably bad for you today' (open source / PinePhone / etc.; they're just not usable enough yet), the value of our opinions drops, because there's no real choice between bad and badder.


I was the reason my family switched from Windows and Linux to OS X back when it was popular among hackers. They had seen the ads with celebrities for years and those didn't move them the way I did. I thought we finally had a Unix-like OS that we could all standardize on without my mom freaking out at GNOME, and for a time that was true.

They remind me every time I bring up stuff like this that I was the one who pushed them to use Apple. In retrospect I wish I had taken FOSS more seriously.


I dunno. Isn't this how Chrome became so popular, and Firefox before that?

"We" are the people that support, recommend and install this stuff. Hardware just has longer life cycles.


Not to mention, the "nerdy guys" play a big role in deciding what hardware your company buys, what OS and applications are installed, etc. Friends and family reach out to the nerdy guys for recommendations, etc. The impact is slow but massive.


No nerdy guy pushes Apple for security and privacy. It is all closed source, with data leaking by default - some of it not even possible to turn off.

They rather push devices like the Librem 5, the PinePhone, or a Fairphone with /e/OS, but not Apple...


You can't seriously be recommending a Librem 5 for mom and grandpa? That's absurd. The thing barely functions.

As the parent commenter noted, Apple was really one of the only reasonable recommendations from privacy-oriented nerdy guys to their friends and family, if they didn't want to go the out-of-touch-with-reality route by recommending the phones you listed.


As I mentioned elsewhere in this discussion, when talking about privacy and security you need to talk about threat models. Privacy-oriented is not a single direction. Those recommending iPhones to parents have a different threat model than those choosing FLOSS devices.


If you think that, you’re completely wrong. I have been asked about it by four non-technical people so far, after it made national news in the UK. There is a lot of anti-surveillance sentiment here, and it’s appearing in the general public regularly.

I regularly go out with groups of random people on Meetup with no shared technical interest as well, and I’m surprised at how much anti-tracking and anti-surveillance sentiment there is. It got to the point that, out of 25 people on a trip out, no one used NHS track and trace, because they don't trust it or don't own a smartphone. This is across the 20-50 age group.


>no one used NHS track and trace

Why would they? So they can get to stay quarantined for 2 weeks?


> but 300 more people were lining up for this job

Let them take it then. I try to minimize the blood on my hands.

>Just ask your parents or your non-tech friends

My parents were unhappy with it - they're non-technical and not particularly concerned with privacy. I don't think they'll switch, but they did ask how to mitigate it. I'm currently scrambling for a (friendly) alternative to iCloud Photos.

> They don't really care for the 3% of people who we belong to

Welcome to the cyberpunk dystopia! Grab a DevTerm by Clockwork (no affiliation) and log in, cowboy.


I agree with you, the general public doesn't give a shit. There will be headlines a few times, some people will change their phones, and that'll be it. The biggest of these movements, I think, is the "de-googling" one. There's a myriad of articles, subreddits, guides, even websites, listing alternatives. And look what happened to Google. Nothing.

When the alternative to Apple's surveillance is to smash the phone against a wall and buy something that's much less convenient, suddenly surveillance is not that big of a problem. And this is very important to note, because many of the world's powerful entities are moving in this direction.


But it doesn't happen overnight. The cracks are forming, though.

Duckduckgo.com is now at 93,533,476 searches daily.

My non-technical brother just purchased a 3 year Fastmail account and switched from Chrome to Firefox. For added effect, he bought a subscription to Bitwarden. I didn't push him to do any of this, I just told him what I'm using.

His wife refused to put an internet-enabled webcam in their new baby's room, citing security concerns.

It's happening. But Rome wasn't built in a day.


I visited my parents' church earlier this year and they're all freaking out. The general consensus is that everyone should stop using smartphones.


Everyone I asked understood the potential for misuse it introduces.


I don't think this is true. If you tell people they scan your stuff if you upload it to the cloud, a lot of people ask for alternatives.

I tell them that they can have a reasonable backup and network storage for a small amount of money and I believe their data is much safer on there.


And so what? People are well within their rights not to work for companies they find unethical.

Yes, it won't make a dent in Apple's finances, but at least that person can sleep better not supporting a company they find immoral.


We are a part of the public and we can influence politicians. Spread the word.


> You will be surprised how much support Apple's initiative has in the broad public.

I wouldn't be, but that's not the issue; the broad public is gullible. The overwhelming majority probably still believes that Iraq had WMDs before the invasion.


> I don't want to be that guy, but 300 more people were lining up for this job.

All of whom could decide to stand up for individual rights, but won't, with excuses similar to the one you formulated.

I know that in US culture some see it as a strength to be selfish, and yet they complain about the society and the politics this kind of mentality necessarily leads to. If all the others are selfish, why should I be the sucker who pays for having principles?

Because suckers with principles shape a society until they don't.


>Nobody except a tiny group ...

were against Hitler, the USSR (inside the USSR), unlimited kings' powers, religious fanaticism, witch hunting, etc. ... in the beginning

Today it's surveillance, and attempts by Apple to legalize such abuses using some BS cover story intended to create an emotional response and in this way fog the real issue: Spyware Engine installation/legalization.




