Apple is suing smartphone emulation software startup Corellium (technologyreview.com)
602 points by webmobdev on Aug 18, 2021 | 107 comments



Headline implies that Apple sued Corellium because of their CSAM research, but that's not at all related. This is quite clickbaity.


Ok, we've attempted to change the title to something accurate and neutral, in keeping with the site guidelines.

https://news.ycombinator.com/newsguidelines.html

(Submitted title was "Apple says researchers can vet its CSAM tools. But sues a startup for it.")


The current title "Apple is suing smartphone emulation software startup Corellium" takes away a lot of the context of what the actual article is about. In my opinion, the article title (and the HN title I posted and edited to "Apple says researchers can vet its CSAM tools, but sues a startup that can") is not "clickbaity" when it is actually factual:

> On Monday, Corellium announced a $15,000 grant for a program it is specifically promoting as a way to look at iPhones under a microscope and hold Apple accountable. On Tuesday, Apple filed an appeal continuing the lawsuit.

> ... “With their left hand, they make jail-breaking difficult and sue companies like Corellium to prevent them from existing. Now with their right hand, they say, ‘Oh, we built this really complicated system and it turns out that some people don’t trust that Apple has done it honestly—but it’s okay because any security researcher can go ahead and prove it to themselves.’”


I didn’t get that impression from the title, but I also knew they were referring to Corellium before clicking the link. I interpreted it as, “Apple says researchers can vet its CSAM tools despite aggressively suing the one company that makes such research feasible.”


I heard nothing about Corellium before, so my reading of the headline was that Apple is retaliating against a company for trying to audit their CSAM tools despite saying that it's totally fine if people want to do those audits.


I have a passing awareness of the Corellium situation and interpreted the headline just like you.


That’s completely incorrect. This entire article is fluff. The suit against Corellium has been going on for years.


On one level, I'm not surprised and can understand why Apple might be pissed off that Corellium is still making news.

"Researchers can audit our CSAM process...except for you, who we just handed a pile of money over to and are still on our shit list."


Handing Corellium the underdog story on a silver platter again is not how you avoid them making news, though.


As I see it, Apple had 3 options for how to deal with this case - in decreasing order of hostility:

1) Sue the company out of existence

2) Buy the company

3) Settle

They chose option #3, which from what I've seen is solely related to DMCA claims. With this in mind, why should Corellium be allowed to continue this work having just gotten out of the proverbial "doghouse"?

Apple's already marked them as a bad actor in this regard and continuing to make noise seems ill-advised.


Actually, Apple chose option #1, and when their ass was handed to them by the judge in the case, they opted for #3 to get out from under it.

No matter how they handled it though, the fact remains that they are providing lip service to the public ("security researchers can easily vet our methods") while continuing to fight against the very same researchers who might want to investigate this new scanning technology.


I agree that this is more damage control on Apple's side; no amount of spin will make on-device scanning a good idea.

Apple should realize that users just want the privacy they have heard so much about over the years.


And they only went for #1 after #2 didn't work out.


Or: Apple says researchers can vet its CSAM tools, as long as their results agree with Apple's own.



Apple is apparently still pursuing a copyright claim: https://www.macrumors.com/2021/08/17/apple-appeals-corellium...


Yeah, I read it that way too. The title could've been better, though.


Yes but the article also says:

"Part of the two-year case was settled just last week—days after news of the company’s CSAM technology became public."

And

"On Monday, Corellium announced a $15,000 grant for a program it is specifically promoting as a way to look at iPhones under a microscope and hold Apple accountable. On Tuesday, Apple filed an appeal continuing the lawsuit."


Both the headline and the HN title (which is slightly different) use "is suing"--which is a tense in English that indicates an ongoing or even perpetual action--in a way that doesn't in any way imply direct causality on the nested prepositional phrase "that can / does just that": Apple is, in fact, suing--and has been, for a long time now!--a startup that not only "can" "vet its CSAM tools", but "is doing just that" (which maybe itself needs explanation: "just that" doesn't mean "only that" but means something like "emphatically that" in this context).


This is the impression the headline gave me as well.


Correct. I am quite disappointed in the editors.


The quote from the Apple executive is misleading to outright dishonest.

> “Security researchers are constantly able to introspect what's happening in Apple’s [phone] software,” Apple vice president Craig Federighi said in an interview with the Wall Street Journal. “So if any changes were made that were to expand the scope of this in some way—in a way that we had committed to not doing—there’s verifiability, they can spot that that's happening.”

Apple uses complex cryptography to shield themselves and their list providers from accountability. You cannot determine if they've included non-child-abuse images in the database through inspection.
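
Roughly, the idea is that every database entry is blinded with a server-side secret before it ever reaches the device. A toy sketch in Python of why inspection gets you nothing (this is not Apple's actual protocol, which uses elliptic-curve blinding inside a private set intersection scheme; the HMAC here just stands in for any keyed one-way transform):

    import hmac, hashlib

    # Held by Apple / the list providers; never shipped to the device.
    SERVER_SECRET = b"hypothetical-server-side-blinding-key"

    def blind(perceptual_hash: bytes) -> bytes:
        # Keyed and one-way: without SERVER_SECRET nobody can recompute
        # this mapping, so nobody can test whether a known image's hash
        # is among the blinded entries shipped on the device.
        return hmac.new(SERVER_SECRET, perceptual_hash, hashlib.sha256).digest()

    # The device only ever holds blinded entries (contents illustrative):
    on_device_db = {blind(h) for h in [b"hash-1", b"hash-2"]}

    # A researcher holding the hash of some suspect non-CSAM image cannot
    # check membership, because blind() is uncomputable client-side.

So even full access to the on-device database tells you nothing about which images are in it, which is exactly the accountability gap described above.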


It's crazy that people argue about what is being scanned rather than about the very fact that scanning will take place at all. Imagine a landlord coming to your place and looking around for drugs. That is an insane violation of privacy regardless of intent.


Landlords have the right to enter their property at any time given "reasonable" notice, which is typically 24 hours, and almost all states have provisions for "inspecting the premises". I don't have to imagine; this happens almost daily.


You'd have a darn good case for suing your landlord if they started going through your drawers, though. The inspection is for condition of the building and fixtures.


Hypothetically you could remove the drugs and be sure they're gone before inspection. Images being "deleted" from the cloud... not so sure


s/your landlord/your house maker/


The feature is only supposed to be about pictures stored on iCloud, so the landlord comparison is more correct.

When Apple starts analyzing pictures on iPhones rather than in the cloud, then your substitution will become correct.


They scan the images on the iPhone before they are in the cloud. That's the whole point of Apple's system.


Nobody seems to remember that the EARN IT Act nearly passed in a bipartisan vote last year in the US. Apple's CSAM scanning is better than the risk of that coming back.


No, but you can verify that only photos uploaded to iCloud are scanned, and can verify that photo metadata beyond the security voucher is not generated or sent to anyone but Apple.


The former, yes. The latter no: because Apple can send it on further beyond your ability to observe. :)


Yeah but once you move something from your personal device to a cloud operated by a private entity why would you expect privacy anyway?


Horrible counterargument: what's the point of Apple advertising privacy ad nauseam if your shit isn't safe and secure for your eyes only?


How can you verify what Apple does on its iPhones?


Apple has special iPhones for security researchers.


Why can't we all be security researchers?

Why are we defending this move by Apple?

Richard Stallman was right about everything. In twenty years, people will be standing up for Apple as they upload your health data to your employer and insurance provider.

"It won't matter if you eat right and exercise," is a quote I expect to hear.


"And even if it matters, so what, everyone does it"


Because most people don't give a fuck on this topic. It ticks the boxes we need ticked. You're different, that's cool. Shout it from the rooftops. I support you.


And VW has special software for emissions tests. The whole point of security testing is to test the same iPhones normal users have, without the doubt cast by special test devices.


Yes, but they're loaner units primarily intended to contractually obligate security researchers to work for Apple. I trust a security researcher using an official Apple device about as much as a Trump-funded 2020 election audit.

Apple uses the technical protections on the phone to make it very difficult to actually even be in a position to do security research without also being NDA'd. There is no owner override like there is on, say, M1 Macs. Apple's position is that nobody but them loads OS code onto your iPhone. Not even you - and you can't practically do any research or auditing on things like the CSAM scanner without having the ability to poke around in the OS.

If Apple had an owner override on iOS-fused devices, then we could load our own kernels, call into the neuralhash framework, and so on to actually validate that the system does what it says. But that would also mean that Epic could sell Fortnite skins outside of the App Store, and we can't have that. So instead we need to gag and muzzle security researchers... which also makes them no longer independent auditors of the iOS security model.


> Trump-funded 2020 election audit

How droll.

Don't claim in one breath that an iPhone can't be looked at by security researchers, and in the next say "well, not the ones I want".

I understand if their security research policy doesn't go far enough, but let's not pretend there is nothing.


Researchers can vet the client-side part of the tools, but per the Security Research Device program agreement they can't discuss anything publicly without Apple giving them the go-ahead.

And presumably the hash match database downloaded to the device is encrypted and unable to be examined.


Even if they weren’t lying through their teeth, this changes nothing. My objection is to Apple believing that they can use a device I own for proactive law enforcement and then acting on that belief. Nothing about how they do that is even worth considering. Apple is spying on you for the police.


The last sentence is the simple truth that no amount of Apple marketing dollars can hide.


And the thing is they're spying on you for nothing - at least not to catch pedos.

Pedos are extremely tech-savvy, they need to be to survive. Starting now, none of them is gonna use Apple products and that's it.

My guess is they'll catch as many pedos as terrorists that were caught by the TSA.


> Pedos are extremely tech-savvy, they need to be to survive.

This is not at all borne out in reality. When the FBI rounded up a big ring of CSAM creators/consumers a few years back it came out that they (the people sharing) had rules for how to interact with the community that would have fully protected them, but many of them were sloppy. Same thing with the amount of CSAM that FB reports.


> [In 2018, Facebook Messenger] was responsible for nearly 12 million of the 18.4 million worldwide reports of child sexual abuse material, according to people familiar with the reports.

https://www.nytimes.com/interactive/2019/09/28/us/child-sex-...


The narrative here--which I feel like a lot of people aren't grokking somehow--is that to analyze Apple's CSAM tools you need to be able to extract them from a phone to debug and work with them, which involves reverse engineering the implementation; and Apple recently said quite strongly that a reason you can trust their client-side CSAM is because, by virtue of being on the client, security researchers can do this analysis.

Only, simultaneously, Apple hates the idea that people ever should get access to the software that runs on their phones and reverse engineer it: they tend to downplay results that are found in a way that often involves going to war with the security research community, they sued Corellium--which provides tooling to security researchers--and insisted that their clients were doing things that were inherently illegal, and they are so stingy with giving general access to their devices that not only can you not opt out of their lockdown they won't sell you special bright yellow open devices either... after many years of pleading with them, they finally decided to allow some researchers access, but it requires not only being invited but then signing off on gag clauses that are generally considered to violate the ethical responsibility of practitioners.

It thereby feels like Apple is talking out of both sides of their mouth... though, of course, that's nothing new for them :/. On the one hand, they want to claim that security researchers are important to their overall security strategy; but, on the other, they simultaneously abuse and prosecute people who dare to either directly pull apart their systems or have the audacity to provide the tools required for others to do so.

And, for anyone who is stuck in the mental frame "BuT I tHoUgHt ApPlE lOvEs SeCuRiTy ReSeArChErS", barely over a year ago (wow time flies when you are living alone and physically falling apart during a pandemic, huh? ;P), I wrote a thread on Twitter that documented a ton of the issues that we run into with Apple, including using specific examples, and touched on this lawsuit against Corellium. FWIW, I don't personally know of anyone in the security industry that thinks Apple is doing well on this front, and I doubt many exist.

https://twitter.com/saurik/status/1295024384596312064?s=21

Also: here is a thread on Twitter from a few days ago (started by Runa Sandvik, the senior director of information security at the New York Times) about Apple's recent statements, as well as a direct link to a reply sub-thread from Kurt Opsahl--the Deputy Executive Director and General Counsel of the EFF--that quickly got updated re the Corellium appeal.

https://twitter.com/runasand/status/1426232172109869057?s=21

https://twitter.com/kurtopsahl/status/1426314930001567751?s=...

(edit) I am realizing it is probably also worth explaining another key detail here that is probably more than just a bit confusing: one reason this is particular news right now is because, in addition to the big CSAM background story, Apple just announced an appeal of the case they lost to Corellium.

I think it is important to triple underscore that: a lot of people know about how Corellium and Apple recently settled, but that was over other claims that Apple (seemingly) gave up on; Apple can't appeal that AFAIK. However, in December, Apple had most of its (extremely weak...) case dismissed by the judge.

https://www.reuters.com/article/us-apple-corellium-idUSKBN29...

> U.S. District Judge Rodney Smith ruled in favor of Corellium LLC, saying its software emulating the iOS operating system that runs on the iPhone and iPad amounted to “fair use” because it was “transformative” and helped developers find security flaws.

It almost certainly isn't the case that Apple decided to file this appeal because of Corellium's press release, as it surely takes more than a day to put an appeal together and file it ;P. It will be interesting to see if Apple manages to put together a more coherent argument in their appeal.


> Apple hates the idea that people ever should get access to the software that runs on their phones and reverse engineer it

From: https://www.macrumors.com/2021/08/17/apple-appeals-corellium...

> Back in December, Apple lost a copyright lawsuit against security research company Corellium, and today, Apple filed an appeal in that case, reports Reuters.


I realize that “child safety features” is the absurd euphemism in the title of the article itself, but the title really should be changed here on HN. Allowing a title like “child safety features” to describe this technology within a community that both knows better and is prominently featured in Google gives that description credence. It should be changed to something like “…invasive screening technology” or “…government backdoor into your iPhone”.


How long before people figure out the CSAM service endpoints and block them via a PI Hole device?


As I understand what has been said, the CSAM detection is part of the iCloud Photos upload process, via attaching some metadata to the photos that are being uploaded. So you could block this service, but it'd be functionally equivalent to just disabling iCloud Photos entirely.


This makes sense; iCloud can just be blocked with the rest of the malware domains.

For fun corporate and university IT types can start adding iCloud-related domains to their internal blacklists.
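
The mechanics would be something like the following hosts-format entries, which Pi-hole can consume as a blocklist. The domain names are made-up placeholders; the real iCloud Photos endpoints would have to be identified from network captures first:

    # hosts-format blocklist; domains below are hypothetical,
    # not Apple's real endpoints
    0.0.0.0 photos-upload.icloud.example
    0.0.0.0 content.icloud.example

As noted above, though, blocking the upload endpoint is functionally the same as just turning iCloud Photos off.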


For consistency, they'd need to block Google, Facebook, and everyone else who's scanning all your uploaded content.

See: https://en.wikipedia.org/wiki/PhotoDNA


The distinction there is that it's the server doing the matching and reporting, not the client. People expect remote servers to be accessible to government searches, but not their own personal devices.

This distinction is even codified in U.S. law. The government needs a warrant to search your phone, but only needs a subpoena to search a remote server that's storing your files[1].

But yes, I can see why that distinction might feel a little arbitrary at times, particularly in the modern age where cloud storage is so common. Perhaps the 4th amendment should cover third parties storing "papers, and effects" on a person's behalf.

[1]: https://grandjurytarget.com/2020/10/28/by-search-warrant-or-...


> The distinction there is that it's the server doing the matching and reporting, not the client. People expect remote servers to be accessible to government searches, but not their own personal devices.

Speaking solely for myself, I don't see a meaningful difference between these cases. They're both "content you upload to a server is scanned", with the only difference being that the scan happens immediately before upload rather than sometime after upload.

My opinion would be notably changed if Apple was scanning content you're not uploading, but the current system doesn't seem to allow for that.


The difference is that Apple needs only to flip a flag in that process to scan all the pictures on the device, iCloud or not. It's their software, and you have only their promise that they scan only pictures meant to be stored in iCloud.


The reason the distinction is meaningful here is that Apple end-to-end encrypts iCloud uploads, meaning that Apple does not have access to the files. Scanning on the device is this weird middle ground where they do sort of have access, and in some cases might very well end up with access (false positives).


Apple doesn't end to end encrypt iCloud Photo uploads -- they do transfer and store them encrypted, but they still have a key to view them if they want to.


This [0] seems to indicate that they do:

> iCloud secures your information by encrypting it when it's in transit, storing it in iCloud in an encrypted format, and using secure tokens for authentication. For certain sensitive information, Apple uses end-to-end encryption. This means that only you can access your information, and only on devices where you’re signed into iCloud. No one else, not even Apple, can access end-to-end encrypted information.

[0] https://support.apple.com/en-us/HT202303


Read that support article again, it agrees with me. :-D

There's a section "End-to-end encrypted data" which explicitly lists the things which are actually e2e, and iCloud Photos isn't on that list.


You are indeed correct, I totally missed that section.


The notification has one title, the title here has another, the article title has another, and I'm immediately assaulted with pop-ups and page scroll shifts... it can be REALLY annoying checking HN sometimes, you know?!


Did the font break on my device, or is everything just in bold? I cannot read it like that.


Apple is now the bad guy.


Groklaw PJ, where are you now that you are needed the most?


I got an ad overlay from the top of my screen, a growing banner from the bottom, and a fade-in pop-up over that. Within five seconds of loading the page, precisely 0% of it remained unobscured.


Serious question: Why do you browse the web without an ad blocker? I can't imagine subjecting myself to that kind of torture.


Why should I? I don't bother; that way I know which sites not to visit because they can't put together a usable website. If I go out of my way to force their site to be usable, then they see they get traffic while doing such things, instead of learning not to or going out of business.


> Why should I?

To reduce the risk of malware delivered via your browser?


uBO shows you what it has blocked on each site. So you can see which sites would subject you to ads without actually having to expose yourself to the psychological terror of seeing those ads.

It also blocks things that you don't see (tracking), improves performance and prevents malware infections.
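
For anyone curious what uBO filters actually look like, here are two hand-written examples (the ad-server domain and the CSS selector are hypothetical; uBO's bundled filter lists cover the common cases automatically):

    ! Network filter: block all requests to an ad server
    ||ads.example.com^
    ! Cosmetic filter: hide a newsletter overlay on one specific site
    technologyreview.com##.newsletter-modal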


Edit: Replies are far more informed than I am. I'm using Firefox on Android for the reasons outlined in my original comment, preserved below

---

> Serious question: Why do you browse the web without an ad blocker? I can't imagine subjecting myself to that kind of torture.

As best as I know: doing this on mobile (assuming that's their platform) requires both:

* a non-iOS device (Android basically)

* a non-Chromium browser on said device (Firefox basically)

That pairing is the only reason I can adblock on mobile. Not sure if things changed on Chrome or related browsers on Android, but as best as I remember, iOS and Android+Chrome aren't adblock-friendly.


On iOS, there's AdGuard if you want to use Safari, and Firefox Focus if you don't.


Firefox focus blocks ads in Safari too. Any content blocker will work in Safari, as far as I know.


FWIW, I use Firefox Focus in Firefox on iOS, and it didn't stop one of the things the parent mentioned.


NextDNS works really nicely across the OS and apps. It also blocks ad/tracking content in iOS. It's actually quite easy to roll this yourself, which I intend to: WireGuard + Pi-hole.
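
A minimal sketch of the client side of that WireGuard + Pi-hole setup, assuming the Pi-hole answers DNS at 10.0.0.1 on the VPN server (keys, addresses, and the endpoint hostname are placeholders):

    [Interface]
    PrivateKey = <client-private-key>
    Address = 10.0.0.2/32
    # Send all DNS to the Pi-hole over the tunnel, so ads and trackers
    # are blocked at resolution time for every app, not just the browser.
    DNS = 10.0.0.1

    [Peer]
    PublicKey = <server-public-key>
    Endpoint = vpn.example.com:51820
    # Route everything through the tunnel.
    AllowedIPs = 0.0.0.0/0, ::/0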


One would need to subject oneself to the pushy notifications, the whims of Brendan Eich, and the dubious approach of the BAT cryptocurrency [1], but the Chromium-based Brave browser on Android has a built-in ad blocker.

I think Opera might have one too.

[1] <https://davidgerard.co.uk/blockchain/2019/01/13/brave-web-br...>


If I am not wrong, don't Brave and Vivaldi (Chromium browsers) have inbuilt ad blocking?


You are not wrong; I'm using Vivaldi on Android and it has a built-in ad blocker. On this page, though, there are two big dialogs: one for cookies on top and "2 free stories remaining" on the bottom, so even with the ad blocker only 1/3 of the screen is visible until you manually remove these annoyances.


> a non-Chromium browser on said device (Firefox basically)

Mmmm. I can use Adblock in Edge Beta on Android.


I turned off my ad blocker to check; this is madness. Two slide-ins (top and bottom) upselling their paid subscriptions, instantly another overlay sliding in at the top about cookies, scrolling down just to try to bring the content up darkens the screen with another overlay asking for a newsletter signup, briefly followed by a slide-in from the right promoting some other product. All before seeing a single word of article content. If anybody ever tells me using an ad blocker is unethical, this is my new go-to example of why I think the opposite.


I love newsletter signup modals.

Always use "legal@oracle.com"

edit: now I recommend privacy@chevron.com


That's the beauty of using Firefox + uBlock Origin on both your computer and non-iOS mobile - you will completely avoid these kinds of irritating annoyances, experience quicker loading and more responsive websites, and save a lot of bandwidth. (On iOS, AdGuard does the same but is limited because of Safari. Google is also working to cripple ad blocking on Chrome, so Firefox is currently the best browser to comprehensively avoid such ad infestation and trackers online.)


I've been using 1Blocker on Mac and iOS for years, and every once in a great while I have it disabled temporarily and can't believe what people have to deal with, where everything is covered in ads. Every once in a while some feature on a site doesn't work, so I disable content blockers for that one page temporarily and all is good, or I can always just open up Chrome for a particular site on those rare occasions.


I’m using AdGuard on iOS. Whereas 1Blocker only allows one type of filtering to be active at a time in the free version, AdGuard can block many things in Safari on iOS for free. Ads, trackers, social widgets, annoyances. All blocked for free by AdGuard.


It's shocking how borderline unusable websites can be. And it's all google ads.


1Blocker isn't even good; it hasn't blocked YouTube ads for months now.

I regret paying for it.


I moved to Wipr after being disappointed by changes in 1Blocker.


Simply turn off JavaScript and watch all inaccessible pages instantly become accessible, with all ads and popups gone. The magic. The 90s HTML page.


I recently put NoScript into a less-trust mode and manually enable sites and links if a webpage I want is broken.

It is pretty amazing to auto-bypass paywalls, to see how much faster sites load, and to see how many external JavaScript sources there are on every site by default.

It's a little annoying when I realize that X or Y page doesn't look or work right and I need to adjust, then reload, maybe a couple of times, but overall it's worth it! Magic is right.


Indeed. Fortunately, uMatrix exists and is very helpful – I turned off cookies and scripts for the page, and all was good.
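
For reference, "cookies and scripts off for this page" is just two lines in a uMatrix ruleset (using this article's host as the scope):

    technologyreview.com * cookie block
    technologyreview.com * script block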


For me the whole page is just some kind of abstract image captioned "MS Tech" with a tiny shred of the actual article at the bottom of my screen.


Yay. 2 free stories remaining.


Most misleading headline of the century.


This is outdated reporting; Apple did drop the suit against Corellium a few days back.

Funnily enough, the same NeuralHash has already been generated for completely different images, so good luck explaining to Apple/FBI why your 4th of July pics burned through all of your safety vouchers.
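
For those who haven't followed the collision reports: NeuralHash, like any perceptual hash, keeps only a coarse fingerprint of an image, so distinct images can map to the same value. A toy illustration using a classic 64-bit average hash in Python (not Apple's actual NeuralHash, but the same failure mode in spirit; requires Pillow, and the filenames are hypothetical):

    from PIL import Image

    def average_hash(path: str, size: int = 8) -> int:
        # Downscale to 8x8 grayscale and threshold each pixel against the
        # mean: only 64 bits of coarse structure survive, so visually
        # unrelated images can and do collide.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        return sum(1 << i for i, p in enumerate(pixels) if p > mean)

    # average_hash("fireworks.jpg") == average_hash("unrelated.jpg") is
    # entirely possible, which is why the system relies on a multi-match
    # threshold rather than a single hit.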


Already covered in the article:

> In the lawsuit, Apple argued that Corellium violated its copyrights, enabled the sale of software exploits used for hacking, and shouldn’t exist. The startup countered by saying that its use of Apple’s code was a classic protected case of fair use. The judge has largely sided with Corellium so far. Part of the two-year case was settled just last week—days after news of the company’s CSAM technology became public.

> On Monday, Corellium announced a $15,000 grant for a program it is specifically promoting as a way to look at iPhones under a microscope and hold Apple accountable. On Tuesday, Apple filed an appeal continuing the lawsuit.


[flagged]


This doesn't look like "fake news" to me.

Timeline:

Monday, August 16: Corellium launches its "Open Security Initiative" to fund "research projects designed to validate any security and privacy claims for any mobile software vendor". The announcement prominently lists Apple's privacy and security claims about its CSAM scanning as one of the topics that would be eligible for funding under this initiative. (https://www.corellium.com/blog/open-security-initiative)

Tuesday, August 17: Apple appeals the copyright case that it lost against Corellium. Reuters reports that the appeal was a "surprise" after the recent settlement. (https://www.reuters.com/legal/transactional/apple-files-appe...)


Correct timeline:

August 2019 - Apple sued iOS virtualization provider Corellium for copyright infringement and DMCA violations

December 29, 2020 - Apple loses copyright claims in lawsuit against U.S. security bug startup

August 5, 2021 - Apple announces new protections for child safety

August 17, 2021 - Apple says researchers can vet its child safety features. But it’s suing a startup that does just that.

Unless there is an iTimemachine, I struggle to see how Apple sued a company in August 2019 for saying in August 2021 that it will help vet the CSAM tools announced in August 2021.


Apple settled the lawsuit on August 10, 2021, just to file an appeal on August 17, 2021 (one day after Corellium's announcement). That is the focus of the article, not the original 2019 filing.

From the Reuters link in the article:

> The appeal came as a surprise because Apple had just settled other claims with Corellium relating to the Digital Millennium Copyright Act, avoiding a trial.

> Experts said they were also surprised that Apple revived a fight against a major research tool provider just after arguing that researchers would provide a check on its controversial plan to scan customer devices.

https://www.reuters.com/legal/transactional/apple-files-appe...


It’s going to be practically impossible for most researchers to vet the CSAM detection tool without Corellium. It’s already very difficult with Corellium, but Apple is going out of their way to ensure such research is infeasible in most cases. That isn’t limited to their CSAM detection tool—hence the lawsuit that predates that tool.

Apple is claiming that researchers can vet the CSAM detection feature while simultaneously attempting to take down organizations that make such research possible. It’s a stupid statement on their part.


Neither the headline nor the HN title say that the lawsuit is "for saying" anything... they both merely--and very correctly--claim that Apple "is suing a startup that does just that". Are you claiming that Apple is not suing Corellium? Alternatively, are you claiming Corellium does not "do[] just that"?


No they didn’t. They did settle, but they subsequently appealed a week later.


So, it is maybe worth explaining this a bit: Apple lost these claims in December... and like, really badly: the judge outright dismissed them; but then the trial was going to continue with the rest of the claims, which were settled.

Apple is now appealing the claims they lost, not the ones they settled (they can't do that: that would undermine the premise of settling anything at all). Legal complaints are not atomic all-or-nothing affairs in this way.


Thanks for the added context. I assumed it was either something along those lines or the settlement was never actually finalized. It’ll be interesting to see what comes of all of this.


Note to self: make sure every settlement agreement includes a clause like "the agreed-to settlement amount triples on every subsequent legal action by $other_party concerning this subject".


You mean Apple lost, but they've already appealed:

https://www.reuters.com/legal/transactional/apple-files-appe...



