
It's totally dumb that a functioning phone is bricked by an update because of repairs done in the past. Imagine the same thing happening to your car. "Sorry sir, the software update done to your car has now disabled the vehicle because in the past someone not related to x (insert name of car company here) has repaired it, your car is now junk (you can't even resell it) and you'll have to buy a new one".

It's just petty revenge because you had the temerity to go to a party other than Apple to get your phone repaired, and maybe even saved some money in the process. So now, in retaliation, we'll destroy your phone in software. I'm sure this will go down well with the various EU courts.



Seems more like short sightedness than "petty revenge."

The new home button hardware can't be securely validated, and a future OS update fails on the unexpected condition of invalid Touch ID hardware. I can imagine that this wasn't a prioritised testing scenario, likely not even considered by Apple when developing iOS.

It's a lousy way to fail — the phone should just disable Touch ID and Apple Pay, and anything that relies on the secure co-processor.

But it feels far more like there's an `else` branch sitting in the iOS codebase somewhere where a programmer has written:

//NOTE: This should never, ever happen

//.. code that triggers error 53
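The difference between that fail-hard branch and graceful degradation could be sketched like this (a hypothetical illustration in Python, not anything resembling Apple's actual code — the function and state names are made up):

```python
# Hypothetical sketch of the two failure modes. Names are illustrative only.

def handle_sensor_check(sensor_is_paired: bool, fail_hard: bool) -> str:
    """Return the device state after validating the Touch ID sensor."""
    if sensor_is_paired:
        return "normal"            # hardware verified, full functionality
    if fail_hard:
        # The "this should never, ever happen" branch: error 53, phone bricked.
        return "error_53"
    # Graceful degradation: keep the phone usable, drop only the
    # features that depend on trusting the sensor.
    return "touch_id_disabled"

# A verified sensor boots normally; an unverified one should degrade, not brick.
assert handle_sensor_check(True, fail_hard=True) == "normal"
assert handle_sensor_check(False, fail_hard=True) == "error_53"
assert handle_sensor_check(False, fail_hard=False) == "touch_id_disabled"
```

The complaint in this thread is essentially that Apple shipped the `fail_hard` behaviour when the third branch was available.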


There's a lot of stuff that depends on the secure element - in fact the phone would be quite useless without it.

In fact, when you first reboot your phone, even contacts cannot be accessed until you authenticate with your passcode to unlock the secure element. Incoming text messages only show the phone number.

You're right, however - a Touch ID sensor that cannot be verified should not brick the phone. Apple should just disable Touch ID and sever any link between the sensor and the secure element.


> There's a lot of stuff that depends on the secure element - in fact the phone would be quite useless without it.

Disclaimer: I don't own an iPhone with Touch ID.

However, it seems to me that the phone should still work if you logged on with your PIN instead of with TouchID. It should therefore be as useful as most phones that didn't have TouchID in the first place (which happens to be all my Android and iOS smartphones).


Correct. In fact, a passcode is always required the first time after you reboot your phone. The passcode secures the Secure Element, which contains the fingerprint data used by Touch ID.


On TouchID phones, the PIN is held in the same secure enclave as the TouchID data.


OK, thanks! Can you reset the PIN without knowing the original?

If you can't, there's no need to brick the phone...


The issue is Apple cannot verify a secure touch ID replacement over a compromised touch ID replacement. Without knowing if your replacement is secure the change potentially compromises the security of the whole device.

IMO bricking on touch ID issues is extreme, but maximises the security of the device.


>IMO bricking on touch ID issues is extreme, but maximises the security of the device.

We are all smart people here and there are several ways to have security without bricking expensive hardware.

First, the update can wipe the device instead of bricking it.

Second, Apple can provide an option to replace the fingerprint chip and charge $150-$200, or whatever it costs.

There would be several better solutions that the most profitable company in the world could figure out if they wanted to. It's funny how their particular solution happens to make them even more money through shutting down third party repairs and making people buy new phones.

This is like your home alarm software (made by the home builder) remotely burning down your house and telling you to build a new one because someone may have tampered with home access and could possibly enter your home.


  Second, Apple can provide an option to replace the fingerprint chip and charge $150-$200, or whatever it costs.
At the end of the article, it said that affected customers should contact Apple Support. Are you sure they are not offering a hardware fix at that point? It doesn't sound to me like they're just leaving people hanging.


From a different article:

>When Olmos, who says he has spent thousands of pounds on Apple products over the years, took it to an Apple store in London, staff told him there was nothing they could do, and that his phone was now junk. He had to pay £270 for a replacement and is furious.


There is a disconnect between the Apple stores and phone support. I went to two Apple stores to try to get my watch band replaced or fixed under warranty and was told by both of them "no way, no how" - but phone support had no problem replacing the band.

I find the stores are somewhat inconsistent in their application of policy. (Particularly if the policy isn't well defined ahead of time, as in this case.)

(As an aside, the practice of requiring an appointment to talk to a support person or even just drop off a broken computer is maddening.)


Alternative interpretation -- "A customer voided his warranty by installing some rando third-party aftermarket parts, and is furious that it didn't work out."


Yes. Just as destroying the phone with a hammer maximizes the security of the device. Effective but entirely useless.

The phone would still work perfectly fine and safely if Touch ID were disabled and input from the sensor weren't trusted.


> The issue is Apple cannot verify a secure touch ID replacement over a compromised touch ID replacement. Without knowing if your replacement is secure the change potentially compromises the security of the whole device.

What are you even talking about?

If the fingerprint scanner is suspicious, just disable it and leave the rest running. And this is in fact what happens, until a software update is installed and then the phone suddenly decides to brick itself completely.


Does the 911 feature still work on these phones? 911 should work even without a SIM card and without any other authentication. Purposefully disabling a phone in this way may have bigger repercussions than just 'security'.


How? TouchID is the less secure authentication than password/PIN anyway (which is shown by the fact that you need to enter PIN/Pass right after boot). How would just disabling TouchID auth be a worse option?


>TouchID is the less secure authentication than password/PIN anyway (which is shown by the fact that you need to enter PIN/Pass right after boot).

The fact that you need to enter PIN right after boot, just shows that they use "two factor authentication" to make it even more secure.

It doesn't IN ANY WAY show that TouchID is "the less secure authentication" method of the two.


You can do anything you want on the phone without using Touch ID at all. The fingerprint sensor is not a necessary factor in their implementation, while the passcode is.


> You can do anything you want on the phone without using Touch ID at all

I believe ApplePay requires TouchID.


I can't try because Apple Pay isn't available here yet. According to this support document it works without Touch ID (emphasis mine):

> To help ensure the security of Apple Pay, you must have a passcode set on your device and, optionally, Touch ID. [...] To send your payment information, you must authenticate using Touch ID or your passcode.

https://support.apple.com/en-us/HT203027


Fingerprints are impossible to change and can be brute-forced. Therefore, fingerprint security is less secure than a password that can be changed.


Brute forced with what? Trying different fingers?


A fingerprint, like any piece of data, is handled at the lowest levels as a number. A number with some constraints, but a number.

By feeding numbers into the scanner instead of fingers, you can accomplish the same effect as feeding random strings into a password box. Further, it's also possible to take fingerprints through social engineering, or by getting at the database of a company that uses fingerprints as security. Five bucks says someone's already storing a bunch of fingerprint data as plaintext.


>By feeding numbers into the scanner instead of fingers, you can accomplish the same effect as feeding random strings into a password box.

Isn't this exactly why they DON'T allow you to use the iPhone with a potentially tampered with HW/TouchID -- e.g. the very feature/issue we're discussing?


Well, yes.

I'd argue that fingerprints for security are just silly to begin with.


> The issue is Apple cannot verify a secure touch ID replacement over a compromised touch ID replacement. Without knowing if your replacement is secure the change potentially compromises the security of the whole device.

The correct solution there would be to pop up a warning saying the TouchID hardware has been tampered with, and giving the user an option to validate it.


That wouldn't really be a good idea. Someone could steal your phone and replace the TouchID hardware. Then this popup comes up and they say, oh yeah this hardware is totally legit! Then they get your data, impersonate you, charge stuff etc.


The prompt would have to be after you authenticated your phone in some other way, like via the passcode.

I think it's totally OK not to accept authentication from an unvalidated device, but a legitimate user should be able to do the validation.
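That proposal — accept the unvalidated hardware only after the legitimate user proves their identity some other way — could be sketched like this (hypothetical Python; the states and function names are invented for illustration, not Apple's API):

```python
# Hypothetical sketch: only a passcode-authenticated user may approve
# a replaced (unverified) Touch ID sensor. Names are illustrative only.

def handle_unverified_sensor(passcode_ok: bool, user_approves: bool) -> str:
    if not passcode_ok:
        # A thief who swapped the sensor but lacks the passcode gets nowhere,
        # so the popup alone never grants trust.
        return "locked"
    if user_approves:
        return "sensor_revalidated"   # user vouches for the repair
    return "touch_id_disabled"        # phone stays usable without Touch ID

# Without the passcode the prompt never even appears.
assert handle_unverified_sensor(False, True) == "locked"
# An authenticated user can decline and keep using the phone sans Touch ID...
assert handle_unverified_sensor(True, False) == "touch_id_disabled"
# ...or explicitly accept the replacement hardware.
assert handle_unverified_sensor(True, True) == "sensor_revalidated"
```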


I think the post is referring to a hotel maid scenario.


Fingerprint scanners are useless for security. My fingerprints are everywhere, especially all over my phone. Touch ID merely buys time, which can increase security, but if they get my fingerprints and make a dummy finger, then they need very little time to open my phone. If they are determined, they'll do it. If they are not, they probably won't care about the data on my phone.


They have at most 48 hours (or perhaps 24?) and 5 tries to find your fingerprint and unlock the device. TouchID will discard the keys and require a passphrase if it is not used for a while or after the fifth invalid fingerprint attempt. The window of opportunity is not that big. I would not characterize it as useless at all.
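The lockout policy described above could be sketched as a small state machine (a rough illustration; the 5-try and 48-hour figures come from this comment and are not verified here):

```python
import time

# Rough sketch of the described lockout policy: after five failed
# fingerprint attempts, or after a long-enough timeout, Touch ID is
# abandoned and the passcode is required. Illustrative only.

class TouchIdLockout:
    MAX_ATTEMPTS = 5
    TIMEOUT_SECONDS = 48 * 3600    # window claimed in the comment above

    def __init__(self):
        self.failures = 0
        self.last_passcode_unlock = time.time()

    def fingerprint_attempt(self, matched: bool) -> str:
        if time.time() - self.last_passcode_unlock > self.TIMEOUT_SECONDS:
            return "passcode_required"      # keys discarded after the window
        if matched:
            self.failures = 0
            return "unlocked"
        self.failures += 1
        if self.failures >= self.MAX_ATTEMPTS:
            return "passcode_required"      # fifth miss ends the attack window
        return "try_again"

lock = TouchIdLockout()
for _ in range(4):
    assert lock.fingerprint_attempt(False) == "try_again"
# The fifth failure forces a fallback to the passcode.
assert lock.fingerprint_attempt(False) == "passcode_required"
```

So an attacker with a dummy finger gets five guesses inside a bounded window, which is the sense in which Touch ID "buys time" rather than being useless.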


> The issue is Apple cannot verify a secure touch ID replacement over a compromised touch ID replacement.

Couldn't they just ask the user? Use the backup password to authenticate.

If it's my device, I want to be the one who chooses what I trust.


In fact, when you first reboot your phone, even contacts cannot be accessed until you authenticate with your passcode to unlock the secure element.

I don't have a password on my iPhone. I don't need one, because I don't store any critical data on the phone, or spend any significant time in environments where it's likely to be stolen. And I guess I'm naive enough to assume that no one from the NSA Tailored Access Operations department is going to sneak into my bedroom at night and install a malicious fingerprint sensor.

So, no, there is absolutely no reason for Apple to brick my entire iPhone if the sensor fails validation. They should act to maintain the level of security chosen by the user... no more, no less.


That's true. I meant to say it should disable any hardware that was directly tied to the secure element, such as Touch ID and Apple Pay's NFC chip. I didn't mean to imply all encryption services in iOS should be disabled.


The scenario they're protecting against appears (to me) to be a bad actor changing the hardware. If the Touch ID sensor is bad, and you propose falling back to a PIN, who's to say the digitizer isn't compromised too and is recording touches?

Sounds a bit like a downgrade attack if they didn't fail fast and hard at the earliest opportunity.


> If the Touch ID sensor is bad, and you propose falling back to a PIN, who's to say the digitizer isn't compromised too and is recording touches?

So why not brick the phone out of the box? Even if the finger scanner is original, somebody may have compromised the digitizer anyway.


I don't think they are protecting against that scenario so much as not accounting for it. I expect Apple's assumption is that they provide all components for their devices. People who install unauthorised third party components can no longer have those devices serviced by Apple — so it no longer matters to Apple whether those devices are compromised, because they aren't really "Apple" devices at that point anyway.

This botched handling of replaced hardware that hasn't been paired with the Secure Enclave ties into the above. Apple doesn't expect people to replace such hardware through a third-party, so they don't think to engineer their software to fail gracefully when it happens.


There's no need to be so hostile and assume malice when there are plenty of perfectly sound explanations otherwise.

Apple is a fairly security-conscious company now, so security tradeoffs should not be a surprise.


Assume malice? How do you mean?


There is only one valid reason to authenticate the fingerprint scanner before using it, and that is to prevent the use of aftermarket replacements.

No matter what the motives behind this mechanism were, it was put in place exactly to prevent 3rd party scanners from working.

And if they implemented authentication and didn't even test what happens if it fails, then well... how do they know it works at all?


If you have a secure enclave within the device, then any hardware which has a direct connection to that secure enclave must be authenticated. It doesn't matter about aftermarket replacements.

The entire purpose of the secure enclave is defeated if it trusts any hardware connected to it.

I'm not saying they didn't test what happens when it fails. I'm saying they didn't do user testing on what happens when it fails. I'm sure the engineers tried out the hardware authentication system. They just didn't test the whole scenario once iOS was sitting on the end product.

So yes, it was put in place to stop any hardware that could not be trusted from accessing users' secure data. But no, it was not done to prevent aftermarket replacements.

The only reason I can see Apple caring about aftermarket replacements is because they are often low quality, and cause customers to go back to Apple with unauthorised repairs. (I've witnessed this more than once in an Apple store, someone coming in who had their screen replaced outside Apple and the touch digitiser was failing. Apple just sends them away.)


> If you have a secure enclave within the device, then any hardware which has a direct connection to that secure enclave must be authenticated.

Consider reading the description of iOS security features linked somewhere in this thread.

Because what you are describing is a disaster, not security. If some off-chip sensor had access to fingerprint data or crypto keys, anybody capable of installing such a chip would also be able to simply dump all the data himself in the comfort of his lab.


If I understand this correctly (not an iPhone person), the Touch ID sensor is just a fingerprint scanner?

As a standalone measure, biometrics make a shitty password substitute because you can't change a fingerprint if it's compromised, so shouldn't the iPhone be secured on the premise that the fingerprint scanner is already compromised, hence losing it should not qualify as a downgrade attack?


Touch ID is a fingerprint scanner, but the Touch ID system is paired with the "Secure Enclave" in Apple's AX chips.

Secure Enclave is a separate coprocessor running its own L4-based microkernel. This hardware is directly paired with security-sensitive hardware (Touch ID, Apple Pay NFC chip, etc). It provides all cryptographic operations for data protection key management and maintains the integrity of data protection even if the kernel has been compromised.

So when you stick a third-party Touch ID sensor in an iPhone it's obviously not going to be paired with the secure coprocessor. It doesn't really matter whether biometrics are shitty passwords, the iOS update process realises there is compromised hardware touching the Secure Enclave and fails in the worst possible way for the user.
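The pairing check that a third-party sensor fails could be illustrated as a challenge-response against a factory-provisioned shared key. This is a stand-in sketch using HMAC; Apple's actual scheme (per their security guide) uses AES key wrapping over a provisioned shared key, so treat the details here as illustrative only:

```python
import hashlib
import hmac
import os

# Illustrative challenge-response pairing check. A device-specific key is
# provisioned into both the sensor and the Secure Enclave at the factory;
# the HMAC construction here is a stand-in, not Apple's real protocol.

FACTORY_KEY = os.urandom(32)   # shared by the original sensor and the enclave

def sensor_response(sensor_key: bytes, challenge: bytes) -> bytes:
    """What a sensor answers when the enclave challenges it."""
    return hmac.new(sensor_key, challenge, hashlib.sha256).digest()

def enclave_verifies(response: bytes, challenge: bytes) -> bool:
    """The enclave checks the answer against its provisioned key."""
    expected = hmac.new(FACTORY_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)
# The original sensor holds the provisioned key and passes verification...
assert enclave_verifies(sensor_response(FACTORY_KEY, challenge), challenge)
# ...while an aftermarket sensor, with some other key, cannot.
assert not enclave_verifies(sensor_response(os.urandom(32), challenge), challenge)
```

The aftermarket part has no way to obtain the provisioned key, which is why a third-party home button can never pass this check, legitimate repair or not.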


Does this mean your fingerprint never leaves the scanner's coprocessor and is inaccessible to iOS and the other processors and OSes within the phone?


Basically, yes. The Secure Enclave is hardware isolated from the rest of the chip.

Apple's own security guide explains it best [1]:

> The Secure Enclave is responsible for processing fingerprint data from the Touch ID sensor, determining if there is a match against registered fingerprints, and then enabling access or purchases on behalf of the user. Communication between the processor and the Touch ID sensor takes place over a serial peripheral interface bus. The processor forwards the data to the Secure Enclave but cannot read it. It’s encrypted and authenticated with a session key that is negotiated using the device’s shared key that is provisioned for the Touch ID sensor and the Secure Enclave. The session key exchange uses AES key wrapping with both sides providing a random key that establishes the session key and uses AES-CCM transport encryption.

Regarding the actual fingerprint storage, it looks like the encryption key is kept in the Secure Enclave and the entire decryption and verification process occurs within the Secure Enclave. However the encrypted data itself may be stored outside the Secure Enclave:

> The raster scan is temporarily stored in encrypted memory within the Secure Enclave while being vectorized for analysis, and then it’s discarded. The analysis utilizes sub-dermal ridge flow angle mapping, which is a lossy process that discards minutia data that would be required to reconstruct the user’s actual fingerprint. The resulting map of nodes is stored without any identity information in an encrypted format that can only be read by the Secure Enclave, and is never sent to Apple or backed up to iCloud or iTunes.

[1] https://www.apple.com/business/docs/iOS_Security_Guide.pdf
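The session-key negotiation described in the first quote — both sides contribute randomness, and the result is bound to the provisioned shared key — can be sketched loosely with the Python standard library. The real exchange uses AES key wrapping and AES-CCM transport encryption; the HMAC-based derivation below is only a conceptual stand-in:

```python
import hashlib
import hmac
import os

# Loose sketch of the quoted session-key negotiation: each side contributes
# a random value, and the session key is derived from the factory-provisioned
# shared key plus both random values. Real protocol: AES key wrapping + AES-CCM.

shared_key = os.urandom(32)    # provisioned for this sensor/enclave pair

def derive_session_key(shared: bytes, sensor_rand: bytes, enclave_rand: bytes) -> bytes:
    return hmac.new(shared, sensor_rand + enclave_rand, hashlib.sha256).digest()

sensor_rand, enclave_rand = os.urandom(16), os.urandom(16)
k_sensor = derive_session_key(shared_key, sensor_rand, enclave_rand)
k_enclave = derive_session_key(shared_key, sensor_rand, enclave_rand)

# Both legitimate endpoints derive the same session key...
assert k_sensor == k_enclave
# ...but an eavesdropper who saw both random values, yet lacks the
# provisioned key, derives garbage.
assert derive_session_key(os.urandom(32), sensor_rand, enclave_rand) != k_sensor
```

This is why the application processor can forward the encrypted fingerprint traffic without being able to read it: it never holds the provisioned key.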


Yeah, that's the source of that quote, 'sub-dermal ridge flow angle mapping', which at the time was described as 'how we know it's really your finger', along with supposedly measuring 'micro RF fields' to ensure it was a live finger.

Except it could be defeated by a laser printed fingerprint on a piece of paper (initially).


Yes, at least according to Apple: https://www.apple.com/business/docs/iOS_Security_Guide.pdf (page 7).

"The Secure Enclave is responsible for processing fingerprint data from the Touch ID sensor, determining if there is a match against registered fingerprints, and then enabling access or purchases on behalf of the user. Communication between the processor and the Touch ID sensor takes place over a serial peripheral interface bus. The processor forwards the data to the Secure Enclave but _cannot_ read it".


Yes. The OS can only tell if the fingerprint matched or not. The checking is done solely in the Secure Enclave. Presumably the OS is also told which fingerprint matched?


I'm pretty sure there was no direct malice here (except for the usual Apple disregard for 3rd party components), but the security reasoning behind this move is still questionable. The only danger of a compromised Touch ID sensor is that it could record your fingerprint and let the attackers replay that fingerprint to access all your encryption keys in the Secure Enclave.

That would be a huge vulnerability, if there hadn't been thousands of other ways to record your fingerprint, and while most of them are less accurate than a trojan Touch ID, they're also much easier to pull off.

And at the very worst, if a sophisticated malicious actor got the chance to meddle with your phone, they could just skip the Touch ID sensor altogether and install a stealthy fingerprint digitizer in the touch screen or on the back of the phone.

So in short, Apple's security measure, if my understanding is correct, does absolutely nothing to protect the user.


THE critical security system of your phone has been tampered with. PIN data, Touch ID data, crypto data and everything else related to security is on that same bus. You do not send secure info over that channel.


Aha, that makes more sense.


Too bad the compiler doesn't raise a fatal error for comments like that.


It may even be against the law in Europe because the phone was working perfectly after the repair and only the later update locked the phone without the ability to unlock it.

To keep with the car maker example: in Germany, the car makers used to void the 10-year warranty on the paint if the regular checks were done by an independent car shop. This was then declared illegal.

I suppose Apple will soon release an update and declare that this was "a bug".


How will affected users get the "new update" if their phone is already broken by the previous one? I mean, I hope that in the case of a phone already broken by their own update, Apple at least gives them a new iPhone for free; otherwise they are going to get sued hard here in the EU.


No, the phone did not work perfectly after the repair. The fact that the user didn't realise it didn't work perfectly doesn't change that. The repair compromised the security of the device.

Imagine if you have your tires changed by an independent car shop and a month later one of your wheels falls off on the highway. Do you start complaining to the car's manufacturer because 'it worked perfectly before'? No, you don't.

The repair shop didn't repair it properly, if it was repaired properly the new TouchID sensor would be securely paired with the Secure Enclave and this issue would not occur.


Your analogy is way off.

This is more like your Tesla's key fob malfunctioning and getting it repaired by a non-Tesla dealer. The dealer could've put in a backdoor to get into the vehicle.

Tesla releases a big new update for their car software and now your Tesla is completely bricked and Tesla refuses to repair it, saying you have to buy a new car.

Is that acceptable?


With regard to warranty repairs, automakers can (and often do) deny coverage due to the presence of non-OEM parts. Outside of warranty/safety repairs, they are certainly not obligated to perform service.


Almost, except the Tesla store just says you have to buy a new (authentic) key fob rather than a new car.

Apple or other authorized repair shops can still fix phones that have been disabled due to security chain errors.


I don't think so, see my other comment.

https://news.ycombinator.com/item?id=11048311


Hunh. My bad, I was under the impression (based on some other comments) that replacing the home button/finger scanner with a legit one and updating the security pairing would make the issue go away, but it looks like I was wrong.


And how can I be sure that the Apple store is fixing the phone with a trustworthy component? And so on... Apple's excuses make no sense this time.


Except that the wheels didn't fall off the phone. This lockup happens due to code proactively added by Apple. You are confusing three different issues: the design of the system, the legality, and how security should work. In this case, none of those three items align. This is Apple's problem - they chose the easiest option for themselves, not what would benefit customers legally, functionally, or by securing the device properly.

You are fundamentally misunderstanding the threat model. What is the exact threat that Apple is guarding against? Is it an evil maid attack planting new sensors, switched devices, someone's fingers being cut off? All of these require different mitigations - none of which for a general purpose consumer phone are to brick the device when upgrading.


It would definitely be illegal in France, which has a law against planned obsolescence - but the EU doesn't have one yet. https://en.m.wikipedia.org/wiki/Planned_obsolescence#Regulat...


Only if it was "planned obsolescence" and not "a security measure".


I don't think you can excuse something as "a security measure" when it is clear that it is not a reasonable security measure and the reason for disabling the device is something else.

Anyway, this confirms that I'll stay away from Apple stuff.


>Imagine the same thing happening to your car.

That sort of exists now. There are parts, like the throttle control on Volvos, ABS units on BMWs, etc., where the unit is coded to your vehicle.

You can't, for example, swap an otherwise identical throttle from a junkyard into a Volvo. It puts the ECU into limp mode, and the car is essentially useless.

Now, that's not the result of an over-the-wire firmware update, but that's really the only piece missing. When/If vehicles start regularly updating firmware over the wire, you'll start seeing stuff like this.


How is this the top comment?

We can see that it is clearly nothing to do with 'revenge' or 'retaliation'. The evidence clearly contradicts that.

Apple makes strong promises around security and Touch ID in particular. This is clearly designed to maintain the integrity of that system.

It's certainly a terrible failure mode - the error message should explain what the problem is.

But imputing these bizarre motives just reveals bias on the part of the commenters.

The alternative would be at some point a headline like "compromised Touch ID sends fingerprints to bad actor".

[this was obviously going to be downvoted, but for the record nobody has either made an argument for, or provided evidence for the 'revenge' motivation, while the 'poorly implemented security policy' explanation is clearly supported by the evidence.]


> The alternative would be at some point a headline like "compromised Touch ID sends fingerprints to bad actor".

There would be no such headline. Before update to iOS 9, the affected phones functioned normally, but with disabled Touch ID.

And if you have to know why people come up with theories like that: either Apple didn't think about what happens to users who already had this hardware replaced, or they did think about it. Maybe those who accuse them of malice simply overestimated their competence - after all, they make big claims about high quality, attention to detail and whatnot.

Not to mention that the first part of this comment makes sense and probably expresses well how people feel about this screwup.


> Before update to iOS 9, the affected phones functioned normally, but with disabled Touch ID.

This is irrelevant. The problem is that those affected phones could still have had compromised fingerprint sensors.

Apple did the right thing in protecting people from this, but did communicate about it poorly.

The only malice in this situation is on the part of the people accusing apple of being motivated by 'revenge'.


> This is irrelevant. The problem is that those affected phones could still have had compromised fingerprint sensors.

... which wouldn't be used after the phone determines it's an aftermarket part. Equally well the attacker may have installed a malicious piece of brick inside.

> Apple did the right thing in protecting people from this

Sure, they protected users from using phone with replaced home button and disabled fingerprint scanner by bricking the phone completely.

Hard to tell in what proportions malice and/or stupidity were involved in this case, but either way it wasn't "doing the right thing".


Stop saying they bricked the phone. It's untrue. The phone is producing an error message and the problem can be rectified by visiting an Apple Store.


Maybe something has changed after this article, I'm not going to make such claims anymore without verification.

However, you have to understand that when I posted this, I knew about cases (including TFA) when the problem had indeed been "rectified" by visiting a store (sorry, couldn't resist ;)) but no case when the device had actually been fixed and data recovered, whether this is technically possible or not.


>Imagine the same thing happening to your car.

Given that apple is working on an iCar, it's just a matter of time I presume.


This whole fiasco makes me wonder if the link between the TouchID sensor and the rest of the phone is so fragile and easily spoofed that they are concerned a fraudulent sensor could be trusted implicitly by the phone, and therefore used to inject malicious data ("here, I am sending you [USERS] thumbprint").


What if that "repair" was done by NSA, CIA, etc? Should the phone boot like nothing happened? Seriously?


Looking at teardowns, like the one at ifixit [1], the touch id sensor seems to be a pretty standard imaging sensor that heads to an NXP chip. I'd be willing to bet that the encryption of the print happens on the nxp chip instead of the imager, so if the NSA/whoever were doing a "repair", they'd probably just put an MiTM chip on that insecure path for later playback. Against a state actor, Touch ID is a triviality.

[1] https://www.ifixit.com/Teardown/iPhone+5s+Teardown/17383


Maybe they want to move encryption onto the sensor chip in future generations, because the scheme you described is indeed a joke.

But to be honest, it's not like fingerprints are such a hard to obtain secret in the first place.


Fingerprint scanner disabled due to unauthorized modification. Please contact support or type "I want hackers to steal my data" 10 times to reenable.


Hell, they could, on boot, display a message saying something like "This phone contains a non-genuine Apple part".

Then the question is, could the NSA/CIA/etc trick the phone into thinking the repair was valid?


Almost certainly so. Decap the chip, pull the flash contents out to get the key, etc, flash a new backdoored chip, you're done. Hobbyists have been decapping chips and pulling flash for a while, so this is certainly not beyond the abilities of the NSA/CIA/etc. You wouldn't need the exact same make of chip, just one that presents itself as the same.


On boot? Many naive users never boot their phones after the first time they open the box.

So then, what, if not on boot, show it all the time, with no option to suppress? If you offer a way to suppress it, that will be used by the bad guys.


I say show a message whenever Touch ID is activated. Then again that would be really freakin' annoying. Still better than bricking the phone.


LOL, given how those organizations work, they wouldn't do the "repair" - they would ask Apple to install it for them.

I'm sure they already did that assuming they don't have better ways.


My thoughts exactly.


Remember the woman who proved that Apple consistently slows down sold iPhones with "updates" right before the launch date of a new model? [0]

Apple is in the business of selling (overpriced/overengineered) hardware. Their tactics make that very clear.

The only problem with that is that there is no real competition in mobile phones anymore; it's just a giant duopoly. Sure, you can switch to Google/Android, which as the giant ad company (in other words: personal data hoover/mass surveillance company) is just as bad.

My only hope is for Ubuntu to make the right moves soon.

[0] A source: http://www.dailymail.co.uk/sciencetech/article-2709502/Does-...


Remember the woman who proved that Apple consistently slows down sold iPhones with "updates" right before the launch date of a new model?

No, because that never was a fucking thing.

Seriously, this is supposed to be a community of reasonably well-informed tech-oriented people. Step back for a second, take a deep breath, and think before you spread nonsense like this. Jesus.


Thank you! This shit is getting out of hand.


That does sound ridiculous.

But please don't comment like this and https://news.ycombinator.com/item?id=11047606 on HN. That just makes the threads worse. Instead, please stay civil even when some people are being silly.


You are, as ever, totally right; there's no reason to be rude. Sorry.


> You are, as ever, totally right

Good lord no. But thank you for the polite response and intention to change. It really is a collective effort.


That was frankly horribly researched. They started off with a supposition, and ignored any data that didn't help prove it.

They completely ignored that a new handset release tends to come in lockstep with a new OS release. This is relevant because it means many, many people are changing the operating characteristics of their existing handsets at the same time, so their phone is no longer a "known constant", and the same results could appear even if a new generation of phone hadn't launched at the same time.

Seriously, if you take their graph and replace the labels 3G, 3GS, 4, 4S etc. with iOS 2, 3, 4, 5... the data is still accurate, but the take-away assumptions change entirely.

They ignored that Geekbench data shows scores for any given generation staying roughly constant over the phone's lifetime. That's a fantastic data set that isn't coloured by changing expectations over time.

And they were quite happy to brush off the Samsung numbers, which showed exactly the same changes over time but didn't spike around releases, completely ignoring that Android has an entirely different update strategy (having to wait until your telco "blesses" the update, etc.).

If you look at their graphs and think of each label as an OS release rather than a new handset (which is equally accurate), the story they now tell is that iOS updates see much faster adoption than Android updates.

Which we already knew. Apple's update strategy is that every compatible handset can update "today", whereas Android updates are staggered by varying levels of support from different manufacturers and carriers.
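The relabeling point can be shown with a toy example (all numbers and weeks made up): when OS releases and handset launches land in the same weeks, spike data fits both explanations equally well, so it cannot distinguish between them.

```python
# 12 weeks of hypothetical "iphone slow" search interest, spiking in
# weeks 3 and 9 (indices into the list).
interest = [10, 11, 10, 55, 12, 10, 11, 10, 9, 60, 12, 11]

handset_launch_weeks = {3, 9}   # hypothetical handset launch weeks
ios_release_weeks    = {3, 9}   # iOS releases ship the same weeks

def spike_score(weeks):
    """Average interest during the given weeks minus the average elsewhere."""
    inside  = [v for i, v in enumerate(interest) if i in weeks]
    outside = [v for i, v in enumerate(interest) if i not in weeks]
    return sum(inside) / len(inside) - sum(outside) / len(outside)

# Both labelings explain the spikes exactly as well, so the data alone
# cannot attribute them to new handsets rather than OS updates.
print(spike_score(handset_launch_weeks) == spike_score(ios_release_weeks))  # True
```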


> Remember the woman who proved that Apple consistently slows down sold iPhones with "updates" right before the launch date of a new model?

No. The story linked from the Daily Mail (that's not a "source") is more nuanced.

http://www.nytimes.com/2014/07/27/upshot/hold-the-phone-a-bi...


From the link: "The important distinction is of intent. In the benign explanation, a slowdown of old phones is not a specific goal, but merely a side effect of optimizing the operating system for newer hardware"

When we try to judge intent, we look at the context. So let's look at the context: people do know that iPhone updates will probably slow their device (even though Apple doesn't tell them that). And there's no way to stop the nagging update notification (aside from jailbreaking).

So let's not be naive. Maybe it's hard to prove at the legal level, but there's a pretty decent chance that this is intentional by Apple.

EDIT: if you downvote, please explain why you think this is wrong.


Software getting slower with updates is a fact of life. It's been an obvious and expected thing since I started being aware of computers and updates in the late 1980s and I'm sure it was a thing even before that. The odds that Apple is doing this on purpose, rather than as a standard side effect of cramming ever more features into their stuff, are so low it doesn't bear more than a moment's consideration.

What's next, saying Apple deliberately destroys batteries after a few years, rather than that being a natural consequence of battery chemistry? That Apple deliberately makes its screens shatter when dropped?


I'm not arguing against the update making the device slower. I'm arguing that the system is built to push people to upgrade (via irremovable nagging), even when that's clearly not what's best for them.


Huh? Of course you're not arguing against the update making the device slower. You're arguing that the update not only makes the device slower, but that this is a deliberate action by Apple to make older devices slower. And I'm saying that's ridiculous since it happens to pretty much all software anyway and all Apple would have to do to make this happen is just develop updates the same way everyone else does.


"Remember the guy who proved that Apple consistently slows down sold iPhones with "updates" right before the launch date of a new model?"

No, I don't, but I'd like to read more about it. Got a source?


>> Sure, you can switch to Google/Android, which as the giant ad company (in other words: personal data hover/mass surveillance company)

While on the iPhone: do you use Google search? Do you use Google Maps?

Well, if you do, there's not much sense in hiding from Google. And if you use Android, you can use tools like CyanogenMod's "Privacy Guard" to hide your data from all/most app authors, which I don't think you can do on iOS.

Also, unless they target you specifically, I don't think Google on Android collects your keypress data as long as you use an alternative keyboard, so you can use anonymity apps for special circumstances. But I could be wrong about that. The same goes for encryption keys, unless you're targeted.


With Microsoft shutting down Windows Phone, BlackBerry giving up on making its own OS, Palm's webOS now just a TV interface, and Firefox OS rightfully shut down, the market has very clearly proven that there is no room for a third mobile OS.

Hell, there's barely any room for a third manufacturer, let alone a third operating system: http://www.theverge.com/2016/2/3/10894200/android-smartphone...


No, it doesn't; it just shows that people google "iPhone slow" more when new models come out / Apple releases major versions.

> No matter how suggestive, he says, the data alone doesn't allow anyone to determine conclusively whether their phone is any slower.

> There are other explanations for why an older model iPhone may slow down, he claims.

> For instance, the latest version of the Apple operating system, iOS, is always tailored to the newest device and may therefore not work as efficiently on older models.


Do you have a source for that? Curious to read it.


Well, regarding hardware and UI styles you actually have a lot of choices on the Android side.

And it is really hard to argue that Google intentionally slows down old Android phones.

The source is available for everyone to study and you can root most devices and install Android versions that don't even use any Google software.


I could maybe see it as a good thing from a security point of view, although I'm not sure bricking is the right solution here. If my self-driving car got hacked (whether physically or its software), I'd rather it stopped working than kept driving while compromised.


Where did the hacked self driving car come from?



