> Apart from the “security concerns” of allowing use of non Apple services
I would think that if the EU requires Apple to allow third parties to use the new screen sharing APIs with the same privileges, that would be a security concern.
I kind of figure that if I'm sharing a constant stream of screenshots of my screen with one company, why not share them with every company? Might as well send it to the hackers too, I guess.
I don’t know if you’re joking, but this is actually an amazing example of a very real organizational challenge: people get tired of dealing with edge cases, so they give up altogether. I absolutely 100% trust Apple to do what it thinks is best for profits, and I don’t see any way that eroding/risking their privacy stance would pay off. Not to say “I don’t have anything to hide”, but I definitely don’t have anything profitable to hide!
I'm pretty sure that SharePlay screensharing is encrypted like FaceTime is, so I don't think you're sharing that stream of screenshots with Apple by using it. Their concern would be that if you allow third parties to use that API, they might stream your screen elsewhere.
My jest was aimed at Siri's upcoming Onscreen Awareness feature in Apple Intelligence (AI). I'd also like to share my screenshot stream to be analyzed by Microsoft's Copilot+ Recall feature for ultimate AI analysis, and by any other company that wants to send my screen to AI. Alphabet Intelligence? Sign my screen up!
Apple announced that many of the new features are not on-device. There are three categories (see the sketch below):
1. On device
2. Private cloud compute (aka Apple’s servers)
3. OpenAI (but only if you approve each time it’s used)
This isn’t really a new thing - you could always tell that some of Apple’s AI features depended on remote servers by disconnecting from the internet, using Siri, and seeing how much of it stopped working.
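For what it's worth, here's a minimal Swift sketch of how that three-way split might be modeled. The names (ExecutionTier, mayRun) are hypothetical, not Apple's actual API; the point is just that tier 3 is gated on per-use approval:

    // Hypothetical model of the three tiers above; illustrative only.
    enum ExecutionTier {
        case onDevice
        case privateCloudCompute          // Apple's attested servers
        case openAI(userApproved: Bool)   // third party, per-use consent
    }

    // Tiers 1 and 2 run without prompting; tier 3 requires explicit
    // user approval each time it is used.
    func mayRun(on tier: ExecutionTier) -> Bool {
        switch tier {
        case .onDevice, .privateCloudCompute:
            return true
        case .openAI(let userApproved):
            return userApproved
        }
    }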
(2) I guess you can argue; they say that's all encrypted, and they're sharing the OS with security researchers to audit. (3) I don't think is applicable: OpenAI is not getting screenshots for awareness, they're just getting "I asked Siri 'write a limerick about the DMA,' Siri can't do that, can you do that?"
I find (2) to be kind of a BS narrative. It’s not like they are open-sourcing the code. All big tech companies have security audit programs; does that assuage your security concerns when Google, Meta, etc. do it? Basically, they are saying they do it on the backend and you should give them credit as if it were on the frontend, because some security researchers are going to get a Disneyland tour.
Personally, I’m actually somewhat encouraged that tech companies are subject to legal process (in the form of lawsuit discovery, etc.). Tech companies have paid billion-dollar settlements for allegedly breaking the law in segments of their business that don’t generate billions in profits. Obviously there’s a lot of room for them to push the limits of the law, for disagreement about what the law is, and there may be ways for them to get away with things. But the fact that audits happen, and that code and logs can end up in court, acts as some kind of constraint on their behavior.
“We’ll take the extraordinary step of making software images of every production build of PCC publicly available for security research. This promise, too, is an enforceable guarantee: user devices will be willing to send data only to PCC nodes that can cryptographically attest to running publicly listed software” [1].
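Mechanically, that promise amounts to a client-side gate: the device checks the node's attested software measurement against the public list and refuses to send otherwise. Here's a rough Swift sketch of that check, assuming hypothetical names (PCCNode, publishedMeasurements, send); the real protocol relies on verifiable transparency logs and hardware attestation, not a plain set lookup:

    import CryptoKit
    import Foundation

    struct PCCNode {
        let address: String
        let attestedImage: Data   // software image the node claims to run
    }

    func hexDigest(_ data: Data) -> String {
        SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
    }

    // Stand-in for the public list of production PCC build measurements.
    let listedBuild = Data("pcc-build-example".utf8)
    let publishedMeasurements: Set<String> = [hexDigest(listedBuild)]

    // Send only to nodes attesting to publicly listed software.
    func send(_ payload: Data, to node: PCCNode) -> Bool {
        guard publishedMeasurements.contains(hexDigest(node.attestedImage)) else {
            return false   // unlisted build: the device refuses to send
        }
        // ... transmit payload to node.address over the attested channel ...
        return true
    }

    let good = PCCNode(address: "node-a.example", attestedImage: listedBuild)
    let bad  = PCCNode(address: "node-b.example", attestedImage: Data("patched".utf8))
    print(send(Data("request".utf8), to: good))   // true
    print(send(Data("request".utf8), to: bad))    // false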