I wish he had commented on the remark that Apple is going to make "bars 1, 2 and 3 a bit taller so they will be easier to see". Right, so they're easier to see. Not intending at all to play any Jedi mind tricks on you.
How stupid is it that Apple is going to increase the height of the first three bars? This sounds like the kind of design by committee that happens elsewhere. Not naming names, because it happens all over the place.
It's as if someone in engineering stood up and said:
"Hey, we knew our 'bar' algorithm was shit when we implemented it, and we in engineering said something about it, but no one listened. So fuck those guys in marketing, we need to implement a realistic algorithm, pronto."
Then the marketing team responds:
"Well, it might be good engineering, but users certainly aren't going to get the warm and fuzzies. Isn't there something else we can do?"
UX reluctantly speaks up:
"Well, we could make bars 1-3 taller, so they appear more like the old 4-5 bars."
I made this comment/question in the other thread about the iPhone 4 letter, but I'll paste it here:
"i'm wondering why bars are not equally the same height across. the quantity alone should be enough (1 vs 5) to convey strength. with the height being variable, it seems to make things seem worst in the worst possible way -- 5 will still be 5, but when its a 1, because the height is so much shorter, it seems worst than 1. what is the upside to variable height when you already have variable width in terms of number of bars? it's not as if there's an algorithm that determines the correct height for that particular signal, as there is a (faulty?) algorithm to determine the correct width (bars)."
I think it is because the "varying heights" icon has become part of brand identity. Woe betide the putatively-informational display that becomes part of the "brand identity".
But then you're back to square one, except you've made the problem worse by dividing it into 100 units instead of 5. What's "100%" mean? It's pretty clear what "0%" means (no signal), but what's "1%" mean?
100% would mean whatever equates to getting full telephony/full data bandwidth. As throughput drops, you can certainly apply a percentage to it, until you get to 0%: no throughput.
People would then learn that at x% they are getting decent data speed, at y% things become unusably slow, at z% voice service drops off, etc.
All the more reason to give yourself a bigger domain (101 elements instead of 5). You don't need to map into the percentage domain directly from signal strength; you could easily define a function of data/voice connectivity as throughput to a percent scale.
"But data and voice have different quality of service criteria and blah blah blah!" So make up a function that accounts for this. Or add another indicator for data exclusively and keep the bars for voice.
> you could -EASILY- define a function of data/voice connectivity as throughput to a percent scale.

> "But data and voice have different quality of service criteria and blah blah blah!" So make up a function that accounts for this.
Translation: insert magical code here?
You remind me of people who say "Your videogame is so slow, why don't you use multiple cores? Just use threads, you could easily do it."
And yet this problem has not only been solved but subsequently broken by marketers. Some things that seem easy are actually hard. I'm sure defining the appropriate function isn't all roses, but a semi-competent software/radio engineer could figure out something with the right properties. With Apple's budget, they could easily do this.
Measuring RF and presenting it in a way a normal person can understand is difficult. I've had first-hand experience trying to explain to customers that a 0dB signal is not a problem -- in fact it's ideal in our system -- or that there's such a thing as too much signal, or that signal quality alone isn't a very good indication of reliability/speed without taking into account other important factors such as SNR or real bit error rates, or that a ~4dB signal swing due to thermal conditions alone is completely normal. In the wireless world you have to represent all this in a little 20- or 30-pixel-wide meter.
A better method than the traditional 5-bar icon meter would be a status icon that could show signal strength and signal quality, and also take into account other experience factors like latency and speed. So you could have 5 orange bars showing strong signal while also acknowledging some other factor causing less reliability, such as SNR problems or channel congestion. You could have 3 green bars showing good RF and network conditions. More accurate, but we'd have to wean people off the old system. Ideally you'd have a choice in your settings between a basic and an advanced meter icon.
Actually, I think they didn't think the bar algorithm was shit when they implemented it. They just changed what "5 bars" meant: "~100% reception" vs "good enough reception for all uses".
No. The engineer is crying because fudging the arbitrary visualization is considered the solution instead of "implementing a realistic algorithm, pronto".
Since the code is secret, you have to trust them that, despite having misrepresented signal quality in order to show their phones in favorable light before, this time, they will show a precise measurement.
A friend of mine said that Apple should just switch to showing either 4 bars or no bars (and patent this, of course). With proper customer education everyone will soon be able to see the wisdom of this decision. It either works really well (with some room for improvement) or it doesn't, and it's AT&T's fault.
"the iPhone 4 did show a greater dropoff in signal strength in every holding position compared to the iPhone 3GS."
"However. . . the iPhone 4’s reception is definitely better in low-signal situations than the other two phones."
“Reception is absolutely definitely improved,” AnandTech wrote. "There’s no doubt in my mind this iPhone gets the best cellular reception yet, even though measured signal is lower than the 3GS."
Next update, you should see OS 2.1's 3G signal display change reversed to what it used to be. Let's hope they fix the bigger (read: REAL) issue related to the proximity sensor that actually kills calls.
Huh? Apple's PR letter specifically says that their 'mistake' has been there since the "original iPhone". And the 'grip of death' can kill calls and more importantly, IMO, affects data transfer rates (as I use my iPhone more as a browser than a phone and naturally hold it the 'wrong way') so I'm not sure why you conclude it's not a real issue.
Gruber has never tired of making jokes at AT&T's expense even though he personally has great service in Philly and AT&T has the fastest network. Let's see if he brings the same level of snark to his Apple and Jobs posts henceforth.
Well played, sir. My emphasis on REAL in the parent comment was meant as proximity sensor bug being more severe/higher priority (IMHO) than antenna bug in terms of dropped calls.
This letter illuminates an important question: what the hell do those bars actually mean, and who watches to make sure there's fairness-in-advertising when an AT&T commercial makes claims about "5-bar reception"?
I keep wondering: if the iPhone 4 does indeed have better reception than previous models, doesn't that mean that the previous models' (3G, 3GS) antenna design is partly to blame for dropped calls, and not just AT&T?
That letter is a distraction. It's not about the bar display. The iPhone 4 drops signal by 19.8dB when held normally (compared to not being touched at all), while the previous one drops by only 1.9dB. http://www.anandtech.com/show/3794/the-iphone-4-review/2
No software update will be able to fix this. Externalizing the antenna was a fun idea in theory, but badly realized in practice.
I bet iPhone 5 will have a yet again redesigned antenna.
The reality is that Apple will likely insulate the antenna bridge with a non-conductive coating, e.g. the diamond vapor-phase deposition that Anand and Brian mentioned.
I wouldn't be surprised if Apple followed his (their) recommendation exactly. SSD manufacturers actually switched/redesigned controllers for future models based on Anand's findings and recommendations before. The man (and Brian Klug) is usually spot on in his (their) analysis.
I've seen this repeated at a lot of places, but I'm quite sure this should be "diamond-like carbon" and not actual diamond deposition.
Even then, I could not see any source for this, besides one blog comment somewhere saying the screen coating "may even be" DLC. All articles on the topic made no mention of the composition. Where does this information come from?
Not that it matters a whole lot, but I find it annoying when people repeat technical information carelessly. Even to a layman (and I know nothing about chemical deposition), doesn't "diamond vapor-phase deposition" sound fishy?
I know about CVD and have read this article. The applications listed are mostly related to cutting bits and actual diamond. It says notably: "[some nice properties] would make it a nearly ideal non-stick coating for cookware if large substrate areas could be coated economically." If coating cookware with it is economically infeasible, I imagine coating a screen would be pretty expensive too.
Which is why I doubt the iPhone used this and would like to know where that info came from.
The "real-world" performance is only better if you don't hold it in your left hand.
I can't hear very well out of my right ear, so I've always used my left hand to hold phones to my left ear. I never had any trouble with my 3GS, but my iPhone 4 is almost instantly transformed into an iPod Touch 4 when I hold it normally (i.e. the "wrong" way).
Personally, I like the rest of the phone's new features enough to adapt my grip during non-voice usage (which is most of what I use it for anyway). However, people aren't exaggerating the attenuation issue at all.
Though I had seriously considered it, I decided not to return it (and do worry that I may yet regret that decision).
I use my phone for voice calls infrequently, so the speed upgrade, better camera, and amazing display outweigh the reception issue for me. I also spend a lot more time with WiFi available than not, and WiFi reception doesn't seem to be affected.
It wasn't an easy decision though. I paid a premium for a device that is effectively defective by design. That stings no matter how you slice it.
Yeah but is the real-world antenna performance more than 18dB better? Probably not, because that would be revolutionary. So the net result is still negative.
It's radio performance, not antenna performance. The electronics attached to the antenna are more sensitive, so the SNR is better even though the total power delivered to the electronics by the antenna might not be.
That page you linked to explains why the bar display matters:
If your phone is at -60dBm (5 bars), a 24dB drop gives -84dBm (5 bars).
If your phone is at -84dBm (5 bars), a 24dB drop gives -108dBm (1 bar).
Worst case:
If your phone is at -90dBm (5 bars), a 24dB drop gives -114dBm (0 bars).
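The threshold arithmetic can be sketched as a toy bar mapping. The cutoffs below are invented purely to reproduce the numbers in the example; real firmware thresholds are unpublished and vary by carrier.

```python
# Hypothetical RSSI-to-bars cutoffs (dBm), chosen only so the worked
# example above comes out right. Real values are not public.
# RSSI >= cutoff -> that many bars; below the last cutoff -> 0 bars.
CUTOFFS = [(-91, 5), (-98, 4), (-102, 3), (-106, 2), (-112, 1)]

def bars(rssi_dbm):
    for cutoff, n in CUTOFFS:
        if rssi_dbm >= cutoff:
            return n
    return 0

# The same 24 dB attenuation looks wildly different depending on where
# you start, because the top "bar" covers such a wide RSSI range:
for start in (-60, -84, -90):
    after = start - 24
    print(f"{start} dBm ({bars(start)} bars) -> {after} dBm ({bars(after)} bars)")
```

This is the whole perception problem in four lines of data: a phone sitting anywhere above -91 dBm shows 5 bars, so a fixed drop either vanishes entirely or falls off a cliff.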
That can be fixed with software. Given that the article says the iPhone 4 performs better than the 3GS at -113dBm, the perception of the problem is twice as bad as the actual problem. So it might be worthwhile to fix that perception, though the fix will have other side-effects (a perception of worse reception overall).
One thing nobody is questioning: why is the perceived signal strength (what is displayed) affected by how users hold the phone? If the displayed signal strength doesn't correlate well with actual signal strength, I don't see how gripping it a certain way should make any difference.
The little bar diagram that phones display is an oversimplification and generally wrong. It's a bit like downsampling your monitor to 5 pixels and complaining that they don't accurately represent the original. Most bar displays compute some form of signal or signal-to-noise ratio, but even that is a joke compared to the complexity of modern digital multiplexing, shared control channels, inter-tower hopping, and that doesn't even consider what happens to your voice packets after they are on the ground at a tower. Your phone doesn't even have the proper instruments to deduce the likelihood that you will have a clean connection.
If people want to complain about actual dropped calls, that's one thing. To nitpick the weather prediction that is your reception "bars" is absurd.
My question has nothing to do with the formula used to render the display, or the accuracy of visualization of signal strength.
What I'm questioning is: if it's truly just a display (software) issue, why is it affected by gripping the phone in a certain way? The software problem (incorrect visualization) is seemingly triggered by a hardware issue (gripping the phone). They describe the problem as if it is strictly a software issue, but it is clearly triggered by a hardware issue.
I understand that you think you are right. I do. But you're clearly a software person (as are most people here), just like me, and we see the world as simple and properly abstracted. The reality here is that this is not a simple problem, and will never become one: there are just too many variables.
Here's an example that might be simpler to relate to: audio. The power of sound is measured in watts (as any stereo will advertise). If someone were to make a noise measurement tool, they might display the level in watts. This would be wrong because we don't perceive sound linearly, and "a little more" would actually display as much more, particularly at higher levels.
But even reporting volume in decibels does not reflect how "loud" something is. We perceive loudness in a complex manner, and even at low volume a person yelling can sound loud. This is why pop music and commercials are louder at the same volume[1].
Now imagine that we want to create a display of 5 bars that shows how loud something is. We would start with the logarithm of power (dB) because that's pretty easy. But now people put their hand over the device and it still sounds loud, even though our display shows it to be half as loud. Is this a broken system? Not really -- just a complicated one.
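That toy loudness meter might look like the sketch below. The reference power and bar width are arbitrary, as they are in any such display; the point is only that a log scale makes big physical changes look small.

```python
import math

# Toy 5-bar "loudness" meter: log of power (dB), quantized into bars.
P_REF = 1e-12       # reference power in watts, roughly the threshold of hearing
DB_PER_BAR = 24.0   # arbitrary: each bar spans 24 dB of a ~120 dB scale

def loudness_bars(power_watts):
    if power_watts <= P_REF:
        return 0
    db = 10 * math.log10(power_watts / P_REF)
    return min(5, int(db // DB_PER_BAR) + 1)

# Covering the device halves the power, but that's only ~3 dB,
# so the bar display doesn't move at all:
print(loudness_bars(1e-4))   # ~80 dB
print(loudness_bars(5e-5))   # ~77 dB -- same number of bars
```

Halving the physical power removes only about 3 dB, which is exactly the "it still sounds loud even though the display says half" effect described above.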
So please, I understand that you do not understand. But try to understand that this is complicated, and Apple actually has some idea what they are doing.
Yeah. Such as "We decided from the outset to set the formula for our bars-of-signal strength indicator to make the iPhone look good - to make it look as if it "gets more bars". That decision has now bitten us on our ass."
What does the karma here get you? I mean, a higher-rated comment might be seen by more people, but I don't think your total lifetime karma count affects that at all.
Hilarious. I read bits of the letter to my wife this morning as she dashed about before work (she ordered her iPhone 4 a few days ago) and after the bit about bar display I said "the subtext here is that this is AT&T's fault!"
The blame-shifting approaches subliminal. If you aren't looking for the hidden message, you probably won't see it.
I'll probably get downvoted for this comment but in this case, I think Apple tried to hard-code in a reality distortion field with the bars and it backfired.
A heavily-biased 3G signal meter that shows 5 bars for over half the signal range seems to have been implemented in iPhone OS 2.1 back in 2008. Problems show up with that display implementation on the iPhone 4 because the antenna is on the outside, so attenuation crosses the threshold from 5 bars to "other".
AT&T, on the other hand, is recommending a more accurate approach instead of interpreting 50% of the range as 5 bars.
Not sure why people are reading this as an AT&T issue.
>I'll probably get downvoted for this comment but in this case,
Good post except for this bit. Please don't bring that phrase to HNs. Good posts can stand on their own.
And yeah, I think the main issue is that the RDF algorithm is collapsing. It was originally a great way to alter perception about AT&T's service, but it backfired due to the issues the 4 may or may not be having with the antenna. The problem of calls suddenly dropping at low signal has been there all along, but with the antenna holding issue (or "issue"), the bubble has been popped.
Out of curiosity, is Apple the only company that does this? Does anyone else display <100% as 5 bars, or is that strictly an iPhone thing?
If you are in favor of the capitalist system you have to respect Jobs's attitude of: "I think we have the best products. If you don't think so, please return it for a full refund and buy a competitor's. Thanks, bye."
I was thinking about buying an iPhone 4 when I qualify for an upgrade later this month, but given all of the issues with the antenna, doesn't it seem like they're going to release an upgraded version of the hardware in a relatively short period of time?
the antenna issue seems like it is indeed software and really a non-issue. the "feature" to show 50% of the signal range as 5 bars was introduced in the OS 2.1 update back in '08. Here is a graphic to showcase that: http://fscked.co.uk/post/754590440/this-infographic-hopefull...
The fix is to undo that hard-coded reality distortion field in favor of AT&T's recommended formula. In this case, AT&T isn't the bad guy; it seems like Apple was trying to show their phones had better signal by misrepresenting the strength.
The other, IMHO bigger, issue is the proximity sensor. To me, that's a NEW issue, and needs to be addressed (hopefully by software patch), since I've had issues with 100% of my calls.
but what does any of this have to do with the fact that the signal goes down when you hold the phone in a certain way? whether it drops from 4 to 2 bars or 2 to 1 based on whatever algorithm they have surely doesn't change the fact that it is still going down?
it goes down because if you're at the floor of that green bar (the one that fills up half the range) and then put your hand over it, you cross the threshold and it looks like a dramatic decrease.
what should happen, as most likely with other phones, is that the range should be evenly distributed so that a drop of 1 bar is a drop of 1 bar. because the iPhone 4's distribution is weighted/biased towards the top half of the spectrum, it fails when you cross the threshold. This "feature" seems to have been introduced in OS 2.1 back in '08.
You also can't avoid drops in signal when putting your hand over the phone -- this happens with every phone. It's just that the iPhone's visual representation algorithm makes it seem far worse than it really is, and they will fix the way it displays in the next version.
EDIT: Why the downvote? Gruber even stated that the issue is the introduction of the biased bar meter. The exposure of the antenna just highlights the issue when you're at the threshold between 5 bars and "other".
If you read the blogs and apple's own letter, it does in fact happen with previous iPhones.
EDIT: again, why the downvote? To quote apple's letter:
"To start with, gripping almost any mobile phone in certain ways will reduce its reception by 1 or more bars. This is true of iPhone 4, iPhone 3GS, as well as many Droid, Nokia and RIM phones."
"..this mistake has been present since the original iPhone"
But this doesn't happen to the same degree as on the iPhone 4. The iPhone 4's antenna placement makes this happen with a relatively standard hand position. I've tried this on a Pre and an EVO, and I can't get either to do it, although I'm sure there must be some hand position where it happens -- but it shouldn't be one of the most common hand positions.
I don't disagree that the signal goes down if held in a certain position with the iPhone 4. I just think we might all not have noticed it had they not implemented the 3G signal meter "feature" in OS 2.1, since the signal degradation might be only 1 bar (see: anandtech's article with numbers). In real practice, my calls get killed because of the proximity sensor issue from my face, not the antenna degradation from my hand.
24 dB of loss is a staggering amount of power. Watt for watt, it means about 1/250 of the signal power is actually available to the phone's radio hardware when you hold it the "wrong way."
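The arithmetic behind the "1/250" figure is just the standard dB-to-ratio conversion (power ratios use a divisor of 10, voltage ratios 20):

```python
# Convert a 24 dB loss into linear power and voltage ratios.
loss_db = 24
power_ratio = 10 ** (loss_db / 10)    # ~251x less power reaches the radio
voltage_ratio = 10 ** (loss_db / 20)  # ~16x less voltage at the antenna feed

print(round(power_ratio))    # 251
print(round(voltage_ratio))  # 16
```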
That is simply not acceptable regardless of what various blogs, experts, and apologists say. The fact that it was necessary for Ars Technica to hack the phone to obtain a quantitative figure at all is also damning.
In all honesty: do you feel the same anger about the Nexus One? If you feel Apple should recall or replace the iPhone 4, do you feel Google should do the same with the Nexus One?
I ask because -- per Anandtech's numbers -- the Nexus One can lose up to about 30% of signal strength depending on how you hold it. Which, while less than the iPhone 4 (which can lose up to about 38%), is still a rather large issue.
I'll admit I stopped reading when I got to the part about the iPhone 4 losing 24 dB when held a certain way. How does the article reconcile 24 dB (a 250:1 power ratio, or a 16:1 ratio expressed in terms of voltage) with 38%? These are two vastly different values.
The Anandtech article only gives the straight-up dB (iPhone 4 loses up to 24.6 dB, Nexus One loses up to 17.7). The percentages I thought came from it as well but I can't find them now; I'm currently hunting for the article I'd seen them in.
At any rate, the question stands and I'll just rephrase it in terms of just the dB numbers: is 17.7 dB OK but 24.6 dB not? If so, on what basis? If not, where's the outrage over the Nexus One?
And that's without getting into questions like "is it OK to lose a bit more signal if the ability to keep functioning on reduced signal has improved enough?"
Read the Anandtech article. Yes, it loses some strength, but that's not unique to the iPhone 4 or to iPhones, and they have actual numbers for different types of phones that you can use to compare and consider what you'd like to do.
> the antenna issue seems like it indeed is software and really a non-issue
Incorrect. Read the text on the page you linked to and the original Anandtech article: the iPhone 4 loses almost 20dB of signal when holding it naturally in your hand, without using a case. By comparison, a Nexus One loses about 10dB, and the 3GS loses about 3dB.
Yeah, the iPhone 4 loses more signal than others due to the antenna being on the outside. But real-world performance shows it still outperforms the 3GS and other phones. Other parts of the Anandtech article show that the iPhone 4's sensitivity is greater.
Is there any evidence either way on whether the proximity sensor problems are hardware or software in nature, and if they're hardware problems, whether they can be worked around in software?
You get a 30-day return period with AT&T or Apple. That's the part of this story that has never jibed with me. If lots of people are having this problem, we should be seeing lines at AT&T and Apple stores of people waiting to return their phones, and, as far as I know, we're not.
I didn't do a return because of the antenna issue. I did do an exchange because of the proximity sensor issue, though. And the replacement phone still has the problem. All my calls are affected by this. It randomly detects my face as input and initiates FaceTime, puts calls on hold, randomly tries to SMS people, or just ends the call. The real issue is the proximity sensor, as that is a bug killing calls now.
If the iPhone does 95% of what they want, and they return it, now they get 0% of what they wanted (and they can go buy some other phone and get 80%). What they really want is Apple to improve it, that's their best outcome.
Corporations have reached the masterstroke of PR where they don't even have to lie about how they are lying to you.
They just tell you outright how they are going to fool you.
Next BP will issue a press release admitting how they are moving around sand to cover up oily sand to improve perception and how people should appreciate that.
This has been happening in politics for quite a while now. Serious pundits and campaign advisers openly discuss issues of narrative, image, and perception on mainstream news programs, with little regard to how well that perception matches reality, and without any concern that viewers will read between the lines and realize that they're being sold an illusion.
Meanwhile, everyone knows that no politician writes their own speeches anymore, but clap and cheer anyway when they're told what they want to hear by a professional marketer whose job is to create warm fuzzies around a brand.
"Upon investigation, we were stunned to find that the formula we use to calculate how many bars of signal strength to display is totally wrong.
"Oh, and we also disabled the Field Test mode so that you can no longer obtain a quantitative RSSI reading in dBm. Because if somebody were to, say, hack the phone to re-enable it, they might notice that holding the phone by the antenna could degrade its noise figure by as much as 24 dB (http://www.anandtech.com/show/3794/the-iphone-4-review/2), killing all but about 1/250th of the signal power seen by the front end. And we don't know about you, but that just wouldn't be magical enough for us.
"Not only that, but we've either fired the RF engineers who designed our 'magical' antenna, or we're just now getting around to hiring our first ones. Sure, we only release one or two new products a year, but we apparently need three full-time PhD-level antenna engineers ( http://www.engadget.com/2010/06/30/apple-hiring-iphone-anten... ).