I'm so glad to see actual numbers for this. Time and time again I've been called delusional for saying that Android devices have more observable lag than iPhones. It is a very, very small component of the overall device and OS that has an enormous effect on the overall user experience, at least to me. It doesn't seem to be that big a deal to many, as evidenced by phone sales, but it drives me absolutely nuts and is the main reason why I won't make the switch to Android.
For comparison, here's another set of benchmarks from a guy who appears to use the same methodology (240fps camera, count frames between input and screen response in custom lightweight apps).
And, to add a TV game console into the mix, apparently the latency between input on a PS3 wireless controller and home-screen interaction is also about 50 ms.
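For reference, the frame-counting arithmetic behind all of these numbers is simple; a minimal sketch (the function name and figures are mine, not from the benchmarks):

```python
# Estimate latency from high-speed camera footage: count the frames
# between the input making contact and the first visible screen change,
# then divide by the capture rate.

def latency_ms(contact_frame, response_frame, fps=240):
    """Input latency in milliseconds from frame indices."""
    return (response_frame - contact_frame) / fps * 1000.0

# 12 frames at 240 fps works out to the ~50 ms PS3 figure above
```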
From the linked page: "THIS IS HOW COLORS! PERFORMS ACROSS DEVICES – WHICH MIGHT NOT BE HOW OTHER GAMES/APPS BEHAVE ON THOSE SAME DEVICES. Each device has their own way of doing input and rendering, and we have done more work on latency on some devices than on others."
In other words, one test was designed to be a benchmark from the ground up, and the other is a cross platform app.
The Note has different tech in it (stylus support) than the vast majority of Android devices. Also, I could not find the result to verify whether it was for touch or stylus.
Not surprised at Nintendo's good showing. I remember noting that the older DSi felt incredibly responsive when using its stylus to draw. The several drawing apps I've tried on iOS haven't given me the same impression that my input is immediately laying down ink.
I've got no direct knowledge of this in particular, but I've heard that mouse movement is one of the highest-priority interrupts in the OS and won't be preempted - so that number is probably very small, though the application's response when you click may take longer. This is also why you'll sometimes see the entire computer locked up except for the mouse movement.
Part of that is due to having a hardware mouse cursor. Basically all the interrupt handler has to do is load the new coordinates into some registers on the video co-processor, and the co-processor takes care of all the work of blitting the cursor around the screen. That makes processing mouse interrupts very lightweight.
Resistive screens don't need to be scanned like capacitive ones. The moment you make contact on a resistive film, the controller can detect the resistance change and perform the A-D conversion. There's more latency in the host communication traffic than in the conversion itself.
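As a rough illustration of why the conversion itself is cheap: on a 4-wire resistive panel the touch forms a voltage divider, so one A-D reading maps linearly onto a coordinate. A sketch, assuming a 12-bit converter (names and ranges are mine):

```python
ADC_MAX = 4095  # assuming a 12-bit A-D converter

def adc_to_px(adc_count, screen_px):
    """Map one axis's raw ADC reading onto a pixel coordinate."""
    return adc_count / ADC_MAX * screen_px

# A mid-scale reading lands roughly mid-screen; no scanning loop needed.
```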
On the other hand, I have rolled my eyes at people like Gruber who went on and on about how the iPad is totes for creation and not just consumption, and how styli are stupid - after spending time with my iPad and trying to draw and "touch write" in apps myself.
Apple makes great devices, and I wouldn't use any other phone and tablet, but I think people like Gruber have been completely blind to the technological shortcomings of iOS devices in this area, because Steve Jobs told them styli are stupid, and the iPad is going to disrupt this-and-that.
It also brings up the whole "to stylus or not to stylus" discussion again, because one of the main reasons you wouldn't want to use a stylus for an iPad is that it highlights just how (comparatively) poor the touch latency is for the tasks that lend themselves to work with styli.
It perhaps best underscores that iPads are very much - still - disruptive technology, which by definition is in many ways inferior than sustaining technologies, but makes up for it in other compelling ways.
It's one of the many reminders that we shouldn't get too busy throwing out everything paper and analogue. Disruption is not synonymous with obviation.
I'm just saying that Apple have a good reason to discourage people from using styli with their iPad, for reasons other than the new finger-based touch paradigm. (As it exposes inherent shortcomings.)
I don't fault Apple and Jobs for that, though; I am just bothered by Gruber's rather naïve praise and uncompromising defence of the iPad as a device for creation over consumption, an argument I think he more than Apple and anyone else champions.
I think you misunderstand the point he is trying to make. Gruber's "iPad isn't for creation" jokes are purely a rebuttal to all the naysayers who repeated, ad nauseam, that iPads are only for consumption. iPads may not be the best drawing/production/editing/writing devices out there, but it's clear that people are using iPads for professional-level work.
I don't get the sense that Gruber thinks the current incarnations of the iPad are even particularly good for much beyond gaming and reading.
The iPad is eminently mediocre for any principally visual task such as sketching or painting. The touchscreen, while highly responsive, simply lacks the resolution (even with a stylus) to do detailed work without a massive zoom ratio.
For things that don't require visual precision, such as music production or writing or concept diagramming, the iPad is highly capable. I look forward to the day its touchscreen can handle pen input to an equal degree.
Wow, you had better explain that to my brother - he just finished illustrating a children's book with beautiful water-colour style images - all done on an iPad. But maybe I imagined that...
People do great things with mediocre tools all the time, so that does not tell us anything about the quality of the iPad as a tool for this type of drawing.
They do. When they don't have access to better tools for one reason or another. This is not my brother's case though - he simply found the iPad to be a superior solution to the problem.
I agree. I've been doing professional illustration for print (among other things) since the late 70s. My most recent published project is a book cover illustration. I have access to a huge range of media - gouache, acrylics, oils, pastel, oil bars, inks, etc - as well as pressure sensitive graphics tablets and various painting and illustration software for desktop platforms.
I was free to choose any of these, but I chose to use an iPad because it allows the immediacy of sketching and painting with natural media and the significantly faster turnaround of digital image making.
Like any tool it takes some adjustment, but once it clicks, there is nothing more immediate than using your finger(s) to directly paint an image. The fat touch region is a non-issue once you've learned to use it - after all, there's nothing terribly immediate about flexing one's fingers and having a mark appear 2-10 cm away at the tip of a pencil, pen, or brush - we've just become so accustomed to it that it seems "normal." A bit of time with an iPad and tablet finger painting seems equally immediate and natural.
Speaking as someone who regularly sketches on his iPad, this is just wrong. You want to know what sucks for graphics work? A trackpad — it took me fifteen years before I could manage 3D and bezier manipulation with a trackpad as well as with a mouse. (I could draw freehand with a mouse within a day.)
I'll second that the iPad is a great creation device, just not for proper stylus-based creation - some paint styles, most drawing, and any handwriting. But it is good for the more common creative applications - a lot of photo apps, music, keyboard-based writing. If Apple also made a stylus-based product I'd get it in a heartbeat (because, ironically, the iPad is weakest as a paper-notebook replacement for meetings and sketching ideas), but if I had to choose between those types of creative applications, there's more mileage in the iPad.
The Android numbers from the OP are just for a few devices - e.g. the Tegra devices with DirectTouch aren't included. Also, from another post above, the Note has better latency than the iPhone 5. If you want to go Android, you can mostly get what you want :)
Is it safe to update to the nightly releases? Can you do it automatically?
I just upgraded to the latest release candidate (from a 1+ year old version) and it re-locked my phone and disabled the Google Play store. I finally figured out after much googling that I had to reflash google apps but was pretty stressed for a while there.
I'm wondering if perhaps I auto-update to nightlies if I wouldn't run into that problem again, but then I'm also worried about a bad release.
This is probably because you came from 10.1 and upgraded to 10.2.
Then it locks your phone again, but it's easy to re-enable. Updates between 10.2 versions don't lock your phone.
You can just download new updates in the configuration > system submenu of your phone.
Just download and then click install. I've never actually had a problem with it (even though they are nightlies), and sometimes the battery and system improvements are noticeable, which is really cool.
So, I've never experienced a bad release, and I'm always able to downgrade my phone to the previous release (I download a new upgrade, install the new release, and after some days delete the old installation files).
Most days, I just do an update at midnight when I'm going to sleep.
One day, my phone went from 1 day of battery life to 2 days (after the update from 10.1 to 10.2, around the fifth nightly update I did).
One piece of advice: if you install 10.3 in the future, install GAPPS in your system/app directory first, so you can install it immediately.
It's complexities like this that I'm too old to have time for, and they would make me more than happy to pay $x/year for a premium CyanogenMod "subscription" service, as floated recently, that takes care of all this for you.
I hope I see the battery life improvements, that's one of my main complaints about Android.
I've been running nightlies for maybe 8 months to a year now. I update every few days to a week. I've had two issues in that time:
1. When the alarm clock app was rewritten/significantly changed, it began crashing on startup. I cleared the app data for it and it worked fine.
2. More recently, when they merged in the 4.3 branch from AOSP, it required an update to gapps as well. I didn't actually read the change/upgrade logs and ended up just reinstalling my phone clean.
Really not all that many problems when you consider it, and not all that serious. The worst case scenario (reinstall phone, restore from backups) only came up once.
I have a Samsung Captivate, an iPhone 4 and a Nexus S (developer phone) all purchased in Q4 2010. Nexus S updated until 4.1, iPhone is still current with iOS 7 and the Captivate got ONE update (manually via USB) from 2.1 to 2.2... the Software Update NEVER updated.
I had this same experience. In 2010, I decided to buy the Captivate over the iPhone 4 after reading a bunch of reviews online. IIRC, I bought it right when it was released, which almost exactly coincided with the release of Froyo. After four or so months of putting up with a laggy interface, cheap (feeling) build quality, and no software updates in sight, I sold it and finally bought that iPhone 4, which I used for the next 3 years.
Recently, I decided to give Android another try and went with the Galaxy S4 when that came out. I was pleased with how far the OS had come since Eclair, but it still felt less responsive than even my 3-year-old iPhone. As for the build quality, that's my biggest regret in buying a Samsung product again. It's hard to beat the feel of an Apple device, but at least the Nexus 4 and HTC One are trying.
Omitting flash-storage sizes, there are the Nexus 4, Galaxy S4, HTC One, and soon the Moto X, which Google will update like they do any other device they sell. Better variety than the iPhone models, I'd say.
The key difference is that you don't get three mostly similar iPhones - you get four vastly different phones with meaningfully different feature sets, and they're even made by different companies. Way more variety and choice.
I bet all of the Google Play-sold devices will see at least 2 major updates. (My GNex got 4.1/4.2/4.3 - and unlike my iPad 3's iOS 7 update, it actually got better with every update.) That's good enough for me, considering that Android updates are really not the same as iOS updates - most apps, including the keyboard, get updated independently of the OS.
I'm glad to see numbers on this too. Hardware benchmarks like this are nice because they can capture end-to-end latency, but they're difficult to run. If you want to easily capture some latency numbers yourself, I've been working on an all-software benchmark for input latency: http://google.github.io/latency-benchmark
One of my biggest uses for the iPad is as a control surface for synthesizers (via Lemur). You need a high-precision screen. There is a reason iOS devices are popular as control surfaces and synthesizers.
And just for those who care, Rheyne does a phenomenal job of using iPads as controllers (http://vimeo.com/72861463)
It depends on the device. Some Android devices are very snappy. For example, Samsung's 7" tablet appears to be very snappy to me - and I'd be hard pressed to pick the difference between it and my iPad 3rd gen.
However, there are a few Android phones that are particularly laggy. The Galaxy S III comes to mind, as of when I last used one. It didn't feel anywhere near as responsive as my iPhone.
The international version or the US (LTE) variant?
I have the Verizon SGS3 (albeit running CyanogenMod) and the responsiveness is very good. TouchWiz (the Samsung Android overlay) probably adds some latency and CyanogenMod uses a newer kernel (3.4.xx vs 3.0.xx) so that might also contribute to the difference...
Agree and disagree. The touchscreen is the most important aspect of a touchscreen device. Observable lag affects the user experience whenever the user moves faster than the device can keep up with.
It is a big deal but I don't think the majority of users are bothered because they see it as an inherent trait of a computer to lag.
The latency on Android gives the phones the feeling of using a computer. The lack of latency on iOS gives phones the feeling of actually manipulating things on the screen.
iOS latency is still well above what's noticeable by human perception. Tack onto that all of the delays you get once you're running a newer version of iOS on older devices (which I suspect is the majority of people using Apple products), and I think it's pretty obvious that there's less of a gap in how people perceive the interaction model than you're suggesting.
It's indisputably obvious how much better the iPhone screen is than other platforms. How is, for example, the difference in menu scrolling not apparent? I can't believe we needed a study to realize that. And this is coming from someone who's never had an iDevice in his life, just Android and Windows.
The reason it isn't apparent is that most people do not have the opportunity to really compare the two phones side-by side, with everyday actions: swapping between apps, swiping menus or notifications, etc, and so VERY few people get to really experience the nice things about one platform that the other doesn't do well, and vice-versa.
I don't think it's that rare... people often have multiple devices.
I have an Android phone and an ipad. It's very clear after a little use that the ipad is "snappier" (and animations are smoother etc), but iOS also feels very very limiting by comparison with Android. There are so many little things that Android just does right, that iOS...doesn't.
So which is more important, smoothness or functionality? I suppose it depends on the person and how they use the device... but really, I'd like both... :]
Besides the things iPhone copied blatantly from Android phones (Notification center, 'Today', Quick toggle settings panel, new multi-shot camera etc.), you have:
100 things iPhone 5s-5s can't do that Android can.
http://www.youtube.com/watch?v=uVTrazT99Ps
It's very possible that there's a difference, but it simply isn't large enough to be something that even registers for me - when I use my device my attention is not on trivial stuff like that. There are annoyances and things I notice lag, but touch response or menu scrolling just are not amongst them.
I think people here are significantly overestimating how much most people pay attention to these things.
There are a lot of things to learn from Apple: I don't [want to] understand why browsing the Internet on the iPad Mini is smoother than on my high-end notebook, or why the browsing experience on my original iPhone was better than on the Samsung S2.
Just like the pixel density of an image, beyond a certain value, makes no difference to the human eye, perhaps the touch screen response time also gets perceived only up to a certain speed. Any faster and it makes no difference. Outliers with above average visual capability do perceive things differently and maybe you have a similar ability with respect to noticing touch screen response times. That might explain why most people don't care so much. Just theorising here, but if anybody has any numbers, it would be good.
These latency numbers are around the border of minimum perceptible visual latency, which is somewhere around 100ms (http://stackoverflow.com/a/2547903/547213). A difference between 55ms and 123ms is enough to matter.
Compare any two drum-set applications - no drumming skill needed - on even a high-end Android device vs a 4th-gen iPod Touch, and the iPod has much better response aurally. In fact, Android instruments are universally so laggy they are unplayable. If there is a usable Android drum program/hardware combo, I'd like to see it... So far, I don't know why people even make instrument apps for Android, other than sequencers.
There's a certain amount of lag I can get used to with Android. I noticed that upgrading from 4.1 to 4.2.2 didn't eliminate the lag, but it reduced it to being much more usable. I used a WP8 phone for a while, and I found that even though the swooshy animations can take a relatively long time to complete, it was so responsive in starting an animation that it was hard to go back to Android.
According to this test WP8 isn't any better than Android. Well, specifically the Lumia 928 isn't any better than the Galaxy S4. Need to be specific about which device, since we're almost exclusively talking about hardware here (touch controller, GPU, and display controller) - the OS isn't doing much.
Are you sure? The article doesn't seem to say whether it's a hardware or software issue, and the iPhone's hardware parts aren't exactly secret proprietary stuff... I mean, the Galaxy S4 is supposed to have the better PowerVR GPU, for example. I'm thinking the difference is in the software stack; I think you'd be surprised at how much code gets run between an input event occurring and an app reacting to it on the screen.
Anyway, it'd be interesting to know if running Android on an iPhone showed the same delays.
I mean, fundamentally it's both. Touch controllers and their firmware can add 15-45ms of latency based on their technology (hardware) and the quality of their filtering in the firmware (software). On the other side of the spectrum it's all about fill-rate, inherent latency in drawing pipeline, whether the input dispatching is phase locked with the display refresh, etc. etc. Again, some of this is software, some of it is hardware, but most of it is directly tied to the specific device. The actual input pipeline on Android, once you get out of the touch controller, is pretty negligible today (<1ms from touch controller interrupt until the application receives the input event).
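Putting rough numbers on that breakdown, as a back-of-the-envelope sketch (the per-stage figures are illustrative assumptions drawn from the ranges above, not measurements of any device):

```python
# Illustrative end-to-end latency budget for a touch pipeline: the touch
# controller dominates, OS input dispatch is negligible, and rendering
# plus display scan-out each add roughly a whole frame.

budget_ms = {
    "touch_controller": 30,  # panel scan + firmware filtering (15-45 ms)
    "input_dispatch": 1,     # controller interrupt -> app event (<1 ms)
    "app_render": 16,        # one 60 Hz frame to draw the response
    "display_scanout": 16,   # one refresh to get it on the glass
}

total_ms = sum(budget_ms.values())
# lands in the same ballpark as the measured devices
```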
The hardware that matters here is not the GPU. It's the touchscreen controller, the display controller, the associated firmware, and to a lesser extent the drivers for those. All either Apple proprietary or heavily customized.
The GPU can matter as well. On the Nexus 10, for example, the fill rate is pretty low compared to the screen size, so applications have to be very very careful not to exceed the limit or they'll end up adding frames of latency (which looks like touch or interaction latency).
But in general you're right - those are the components that contribute the large majority of touch latency on a device.
The test measures the "minimum response time", but I don't think that's the most useful measurement of responsiveness. After all, changing the software does make a difference in how "laggy" the phone seems.
Although I periodically am annoyed at Apple for various things, it's frequently clear to me that they understand that certain aspects of the user experience are really important.
Much of their competition treats user experience just like any other consumer electronics company: If it doesn't crash, ship it!
Most Android customers don't value these little touches. Being able to install various ROMs and have widgets on the home screen, plus other customizations, are more important to them.
The fact that one platform is ahead or behind in one area isn't telling of what a mass of half a billion or a billion plus customers value or don't value.
Most Apple fans don't realize that negligible differences in touch responsiveness against a small, cherry-picked selection of Android devices don't make up for flexibility, cost, choice...
Except that your finger is bigger than that so you can't see what you're doing anyway. On ANY touch device. STYLUS (as you said) ftw. Or a mouse pointer. Infinitely small.
You are comparing size with time, which is an invalid comparison without further constraints (e.g. a visual response constrained to emanating out from the touch of a finger). Thus, you are making the assumption that all visual responses to the input of a finger will only occur directly under the real estate occupied by a finger - an assumption that I would argue is entirely unfounded in the realm of modern phones.
Agreed. To me it's either that, or when you're at the minimum size of a numbering system with discrete quanta. Also, you can do a crosshair with the middle pixels empty, so I guess that would be infinitely small, and a closed crosshair would be 1.
Then complain about how they have no idea how to use it and call the nearest family member when their clock widget "disappears" (on the next homescreen page).
To get an idea of the impact and importance of touch latency, see this intriguing demo by the Microsoft Applied Sciences Group: https://youtube.com/watch?v=vOvQCPLkPt4.
Drawing is one of those things that has felt awful on iOS devices to me.
Yep. There's a decent homebrew paint app on the Nintendo DS that runs circles around the smartphone counterparts... The DS has a joke of a processor, terrible resolution, and no memory worth mentioning. But it has a resistive pressure-sensitive touchscreen and a stylus. That's enough to make a huge difference. Resistive touchscreens are fast and pressure-sensitive, allowing far better conditions for art.
The sad thing is that official games aren't allowed to support the pressure-sensitivity provided by the DS hardware, so it's only used in homebrew apps like the paint app you mentioned.
I remember reading somewhere that the DS homebrew "pressure sensitivity" was merely a hack. Since you hit more neighboring pixels when you push the stylus down, you get more "pressure". (This could be outdated information, though.)
>Drawing is one of those things that has felt awful on iOS devices to me.
Compared to what, since, according to TFA they have the smallest touch latency?
Perhaps you haven't tried the right apps. Most drawing apps (including major names, like Autodesk's) have slow-ass drawing code. Heck, some painting apps are even slow on my iMac (Corel Painter, ArtRage).
That's not because of "touch latency" though. It's because of slow draw engines. And I say that, because I've seen apps with very fast responsiveness.
Try Procreate (http://procreate.si/), which uses the fastest engine I've seen (specially coded in OpenGL and 3D-accelerated). And check out the artwork created with it by some of the community users (there are 2-3 videos on their site; it's AMAZING).
Two other apps I found fast (but not as fast) are: Ideas, by Adobe, and Paper.
Paper, dude(tte)! The analogue, lowercase-letter kind. :)
Paper and Procreate are excellent apps, but the input lag is very noticeable on my iPad 3; if you move your finger really fast and stop, you can see that it takes a short amount of time before the drawn line catches up with your finger.
It wasn't long ago that dead-tree books were preferable to tablets, before Retina was a thing. And you could still argue print books are preferable.
Sometimes, the best Technology can achieve is the bar set by the non-digital world, and until then, it's going to feel grating to pedantic curmudgeons like yours truly. :)
+++
EDIT: A closing thought. The highest bar is always human perception:
(1) The optimal FPS for a videogame is, to my knowledge, 60.
(2) The optimal display resolution for a reading device is, TMK, ~326 PPI (Retina).
That is the goal. Anything lower than that will stick out and annoy people like me. Anything higher is not for the engineers, but the marketing department.
The "catches your finger" thing may be input lag, but it seems to me that much more often it's amateur-level coding.
One of the first things you learn if you try to do a simple drawing app on a slow machine (think Commodore 64 or similar early 80's 1-2MHz computer), is that if you try to process every position change, you will lag badly.
The first rule of writing a decent paint app is to decide how large a delta you can accept between each position where you actually draw, and drop events accordingly. Too small a delta and you will lag; large deltas and you either need to draw connecting lines or you'll get "dots" instead of a continuous line, but your lag will be limited to that of the input device. Even an early-80's home computer can give you "lag-free" painting this way, at the expense of precision if you make large, very fast movements across the screen.
This is "paint app writing 101". Yet a lesson that seems to have been lost on most people writing paint apps these days, possibly because they've become accustomed to computers fast enough that they're no longer constrained by it.
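The delta rule described above can be sketched in a few lines (a minimal, hypothetical version; a real app would also interpolate line segments between the kept points):

```python
def decimate_stroke(points, min_delta=4):
    """Keep only points at least min_delta pixels (Chebyshev distance)
    from the last kept point; drop the rest to bound per-event work."""
    if not points:
        return []
    kept = [points[0]]
    for x, y in points[1:]:
        lx, ly = kept[-1]
        if max(abs(x - lx), abs(y - ly)) >= min_delta:
            kept.append((x, y))
    return kept  # draw line segments between consecutive kept points
```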
I don't know about iOS, but I do know from observation of the 30+ draw/paint apps I've tried on Android (while trying to find one I'd actually be happy to use - I did not find any; I'm picky) that this is the problem for the vast majority of them. There's a huge spread in observable lag between them, so even if we assume the fastest of them are limited by input lag, the extent of the lag in the majority is still down to crappy coding.
(the quality of paint apps for Android is just beyond awful in general; don't know about iOS)
As much as I love my iOS devices, they still have MASSIVE latency compared to my Wacom tablet. There's just no comparison. I guess the touch processing had a lot of overhead.
Touch processing adds somewhere in the ballpark of 15-45ms of latency, depending on the technology used in the screen and the quality of filtering in the touch controller. Anything above ~20ms of latency (total from touch to display) is perceivable by humans, and ~20ms is usually a good approximation for noticeable increases or decreases in latency (i.e. shaving off 2ms doesn't really matter, but shaving off 15-20ms will be noticeable).
> Compared to what, since, according to TFA they have the smallest touch latency?
Drawing apps can easily end up with significantly higher latency than the baseline if they aren't careful with how they do input smoothing and such. It's really easy to get that wrong and bloat your touch latency.
I've done some research which involved the perceptual effects of latency, so I was interested in the test rig from their video and found a paper they published in ACM about it:
http://edgey.com/wp-content/uploads/2013/01/p453-ng.pdf
Turns out that to achieve this 1ms latency system (once you have that, it's simple to add artificial latency) they had an FPGA directly processing touch data from a resistive sensor over a 2 MBps serial (optical) connection, computing the center of mass and directly driving a DLP projection display so that parallax between the surface being touched and the display would be eliminated. Nice rig!
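The centroid reduction their FPGA performs is conceptually simple; here's the same computation in plain Python (illustrative only - the paper's system does this in hardware on raw sensor data every frame):

```python
def center_of_mass(grid):
    """grid[row][col] holds a pressure sample; return the (x, y)
    pressure-weighted centroid, or None if nothing is touching."""
    total = sx = sy = 0
    for y, row in enumerate(grid):
        for x, p in enumerate(row):
            total += p
            sx += x * p
            sy += y * p
    if total == 0:
        return None
    return (sx / total, sy / total)
```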
Of course, the video is a little bit misleading when it talks about a 1ms refresh, since touch response stops being the limiting factor before that point. AFAIK the fastest refresh rate for any phone on the market is 250Hz. That's a 4ms period, which is great, but of course you also have to factor in the time to actually process the touch event, change the image, and render it.
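The arithmetic behind those figures, made explicit (a trivial sketch):

```python
def period_ms(rate_hz):
    """Refresh (or sampling) period in milliseconds for a given rate."""
    return 1000.0 / rate_hz

# period_ms(250) == 4.0  - the fastest rate mentioned above
# period_ms(60) is ~16.7 - a typical display refresh, paid once per
# frame of processing/rendering on top of the touch response itself
```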
It's not misleading at all. They specifically said it was a test setup and stated that their aim is to be at around 1ms latency in the next decade. Obviously that means working on the other limiting factors to get there as well. It's not like they'll just try to get the touch response to 1ms and ignore screen refresh rates.
That's not unlikely, and it's not insane either. Processes are getting bigger, frameworks are taking more memory, and things take longer to fetch back in when you switch apps. So if you want apps to be snappy, you have to make sure they're in memory. Android devices have settled on clubbing the problem over the head with 2GB of RAM (basically keeping stuff from being paged out to start with), while Apple went with 1GB and has to play more subtle tricks.
Performance is about tradeoffs. The stuff measured in the article is response time of foreground apps. Getting the app into the foreground to begin with isn't free.
Android and iOS make no use of swap, but they certainly will page out things that can be ejected (so not process heap, but mapped disk pages, including executable data segments, etc.). There is a little more involved with Android, and no doubt a lot of hackery under the hood in iOS, but both OSes run kernels with traditional VM systems, and they work.
It's definitely not paging in the traditional, OS-level sense, but the standard Android (and, I think, iOS) behaviour of providing applications with a framework to restart exactly where they left off should be pretty much the same in theory. In practice, not so much.
Err, I see no way in which the activity life cycle is like paging.
In typical paging the application's memory is taken from RAM and placed onto a slower storage medium to free up RAM.
In the activity life cycle, Android is just telling the app "you're in RAM, but you're not running" or "you're going to get killed because I need the space, if you want to give me a bit of state information I'll hang on to it in RAM for you".
I don't see how the two are comparable at all, and I don't see how you could have paging without using a secondary storage.
I find some of the new transitions annoying, too, but it simply doesn't make any sense to say that they are "to hide choppy performance."
We know this because the transitions are very smoothly-rendered. Which takes processing power; the animations and such would not be smooth if there was a CPU issue they were trying to cover up.
It's an intentional decision. There's plenty of performance under the hood, or they wouldn't be able to afford the extra CPU cycles to draw fancy animations.
That being said, I hope they shorten some of these transitions, as well. In many ways, iOS 7 is faster and more responsive, and so it'd be nice if it were faster in this way, too.
Not really, the type of animations they're doing are very cheap and can be performed entirely on GPU, without using much memory bandwidth, I/O, or CPU time.
In the second link, the comparison uses the beta of iOS 7 - the video was uploaded June 16th. Comparing stable releases to prerelease software is a bullshit comparison.
One video is 3 months old; the other, a month. Very likely there would be performance issues in beta software, to say nothing of the likelihood that debugging code was left in that won't be in the RTM build.
My even bigger problem is Safari: they made web content load noticeably faster, but changed the UI to be unpredictable - you almost never know when it's going to bring in the bigger URL bar at the top or the whole lower bar. By introducing screen changes that you don't control, they made the experience much worse.
For the first time in years iPhone starts to behave like Windows before turn of annoying settings. And here I can't even turn them off. Jobs would simply not let them be.
It's not unpredictable; it's been out for three days. You simply have not learned to predict it. But it's very predictable, and I bet you'll be used to it within another day or two.
It is. The bar appearing when you bump the end of the page is predictable. But it also appears when I'm just trying to scroll within the middle of the page I'm viewing. Sometimes when I want it to appear, it doesn't. More often it appears "just because."
This kind of responsiveness is exactly the sort of thing Apple is famous for, since they design both the hardware and the software, but it is surprising that Samsung is still behind.
I think another thing that is rarely taken into consideration is how resolution impacts responsiveness. The more pixels the graphics hardware has to push, the slower the screen redraws.
Pairing bigger devices with big resolution means that achieving consistently high performance is harder without equal advances in graphics processing.
If I'm not mistaken, they license a lot of stuff from Synaptics. But that said, many do so and don't get it working as well as Apple does. I talked with a Synaptics guy who said it's all about the implementation, and "bad" or "slow" touchscreens are generally just poorly optimized in the OS.
Interesting that none of the Android devices tested are Tegra-based. Nvidia, being gaming-focused, had to do something about touch latency, so they built DirectTouch, which was designed for lower latency and lower power draw. Given the architectural differences between DirectTouch and the alternatives, I'd expect it to do at least somewhat better. See here for example - http://www.youtube.com/watch?v=DehlRJZPsDY .
I have the Lumia 920, and when I have to develop on my iPhone 4 or use a friend's iPhone 5/4S/4, this is my main problem. The Lumia screen is so responsive that even the iPhone 5 feels sluggish to me. Typing is especially challenging, so it's nice to see that $AAPL addressed screen response. The next upgrade needs to be the screen size.
I'm not sure. I just know that my 920 feels very responsive, I actually haven't looked into why :). I'm sure that's the case for most consumers. They just notice something better but don't investigate.
It isn't just your Lumia. I have a Samsung Focus (a first-generation Windows Phone 7 device), and when I am using an iPhone or Android device, I always notice how unresponsive they feel. This is most noticeable (to me) on the keyboard. On Windows Phone devices, keyboard input is absolutely instantaneous; with the others, I feel like I'm waiting for the key to be pressed.
I didn't notice this until I got my WP phone, but now that I've had that experience, I have trouble going back (note: I switch between iOS, Android, and WP7 every few weeks).
I remember my time with a Galaxy S2, and the screen size was a pain. I really appreciate the path Apple chose with the size, so I don't get your point on this.
I had the original iPhone, the 3G, the 3GS, and the 4. When I saw the size increase of the 5, I switched to Nokia just to see if a bigger screen (not just a longer one) was really a big deal. I think a 5-inch screen is a really good size as long as you can keep the bezel size down (which it isn't on the Lumia 920). Everyone has different preferences; I just wish $AAPL had more options. The whole "thumb" argument is baseless.
I think you have valuable insight here and you've got some helpful anecdotal evidence, but the whole dollar sign before the ticker thing made me immediately think of the "M$" people.
I'm assuming you're using ticker syntax, but it almost made me question your post, especially considering you're speaking against the iPhone. Just a knee-jerk reaction from a passerby.
Oh sorry, sometimes I speak in "Twitter Speak" using the $<stock-symbol> to identify a company. So I would say $MSFT, not M$. I need to stop doing that and just type the company name out :).
The funny thing about you commenting on me "speaking out against the iPhone" is that I just spent 12 weeks building a game for the iPhone [1]. And right now I'm working on a cross-platform authentication method using a phone number, and the platform I'm testing it on first is iOS (then Android, then WinMo). I'm actually a mobile-first, iOS-first developer who is realistic about the state of the ecosystem. I can't in good conscience just jump on a bandwagon and say everything is awesome when it isn't.
It seems like an important thing that benchmarks didn't really exist for until now, but I think the topline number is off: 55 ms vs. 114 ms, so how do they get 2.5 times faster?
Two times faster is already very impressive; no need to exaggerate further (although I guess that's how you get the clicks).
It is possible the author was using the best times for each device for the calculation, which does approximately equal 2.5x. That said, I agree with your point.
Looking at the numbers a bit more closely I'm guessing the source was comparing iPhone 5 to the pack of Android phones where it's 2.1 to 2.3. This source said 2.0 to 2.5 which seems accurate with that so maybe it's Venturebeat playing up the bigger number.
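Putting rough numbers on it (a quick sketch; the 55 ms and 114-123 ms endpoints are the figures quoted in this thread, not an independent measurement):

```python
# Sanity check of the headline "2.5x faster" claim, using the figures
# quoted in the thread: 55 ms for the iPhone 5 and 114-123 ms for the
# slowest Android devices tested.
iphone5 = 55
android_low, android_high = 114, 123

print(round(android_low / iphone5, 2))   # 2.07 -> the "2x" reading
print(round(android_high / iphone5, 2))  # 2.24 -> still short of 2.5x
```

Neither endpoint reaches 2.5x against the worst Android figure, which supports the suspicion that the headline number rounds up.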
Now that kind of flame war starts again.
I remember the times when every Mac magazine ran a comparison between Macs and "PCs" in which they measured the time to flip images in Photoshop and were proud when the Mac was 0.5 sec faster than the Windows counterpart.
I don't care about that kind of benchmark, because it's not really relevant to the daily routine of handling a smartphone.
Can I do some calls? - Fine
Can I sync my calendar and contacts? - Fine
If the reaction/response time of the GUI is acceptable and free of hitches, I don't care about 1/10 sec in response time.
Apple has had much better multitouch support from day one as well. They started with a grid of sensors that could detect over 10 touches or something like that. Android devices have slowly struggled up from single touch, to single touch with a few multitouch gestures, to multitouch with an inability to tell apart certain situations, to real multitouch for certain numbers of points. Mostly due to manufacturers always going with the cheapest touch sensors they could get away with.
It also helps that they patented putting two fingers on the screen at once, so everyone was afraid to implement it until Palm went mainstream with it on the Pre.
Yeah, as far as I know mainstream Android hardware has supported multi-touch all the way back to the original G1, it's just that no-one dared do anything with it due to Apple's patents.
It would be nice to know what their testing methodology was - specifically, was the measured times from touch to screen response, or from touch to OS response?
> We built simple, optimized apps to flash the full screen white* as quickly as possible in response to a touch. The apps contain minimal logic and use OpenGL/DirectX rendering to make sure the response is as quick as possible. Since these are barebones native apps doing nothing more than filling the screen in response to a touch, this benchmark defines the Minimum App Response Time (MART) a user could experience on a mobile app on each device.
Not misleading, and fairly standard practice. There is no value in starting at 0 when the lowest device tested is 55; all you'd do is shift everything to the right or make it smaller and harder to see the distribution (which is the whole point: showing the spread from 55 to 123).
As long as the graph goes up in consistent increments (in this case 10) and shows that all these devices could move to the left (i.e. get quicker than 55) I'd say the graph has done its job.
Imagine if the x axis started at 50. The difference between the iPhone 5s and the next one would look massive, much more than the 2-2.5x it is. That a lot of people do it ("standard practice") does not mean it's not slightly misleading.
No, it's misleading and terrible practice that is only used to introduce bias. It is absolutely unacceptable to hide the origin point on histograms, because it obscures the relative comparison between values.
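To put numbers on the axis-origin point (a quick sketch using the 55 ms and 114 ms figures from the article; the bar-length calculation is just illustrative geometry, not from the article):

```python
# How starting the axis at 50 instead of 0 distorts the apparent ratio
# between two bars, using 55 ms and 114 ms as the two values.
def apparent_ratio(a, b, origin=0):
    # length of bar b divided by length of bar a, as actually drawn
    return (b - origin) / (a - origin)

print(round(apparent_ratio(55, 114), 2))             # 2.07, the true ratio
print(round(apparent_ratio(55, 114, origin=50), 2))  # 12.8, wildly inflated
```

With the origin at 50, a 2x real difference is drawn as a nearly 13x difference in bar length, which is the distortion both sides of this exchange are arguing about.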
I would love to see the code for each of the tests. And most importantly the optimizations used for each.
It's always important to get the baseline right before arriving at a conclusion.
Would I, with my naked senses (eyes, touch) be able to tell the difference between a 55ms response time and 117ms response time? Alternatively, does such a difference add up or combine with other factors that ultimately make it noticeable to me? How? And while we're still at it, is this winning performance by Apple touchscreens a function of the quality of the screen itself (or components), other hardware (ICs and what nots) that work with the screen and are a factor in how it responds, superior code at OS level? I'd love to hear such details, as opposed to merely telling me 55ms vs 120ms.
In general, your visual response time is about 200 ms. Anything below that is fine for regular visual feedback such as highlighting a button after it has been pressed. This is due to latency in the processing of visual information in the brain.
Touch screens are somewhat unique, however, in that they track physical movement directly. This means they not only have to compete with the latencies in the visual system, but also with the predictions of physical movement that we make. Even though we perceive visual information only at a ~200 ms delay, our predictions of the physical world compensate for that. Thus, even small latencies are perceivable.
I think Microsoft did some research on this and they concluded that even single-millisecond delays in touch screen tracking were noticeable. They even claimed that sub-millisecond delays felt "completely different" and much more real.
Perceptible audio latencies are much lower; I believe this is why designers don't include many audio cues in mobile UIs. You can do that with the 3-6 ms latency of keyboards/mice, but not with 50-100 ms latencies.
And this is pretty much the difference between implementing the user-space in Obj-C versus Dalvik... I'm sure Meego would have been much better than Android, as surely Ubuntu touch and Tizen will be...
So in addition to having smaller pixels than Android devices with undetectably small pixels, Apple now has faster reactions than Android devices with undetectably fast reactions.
I'm looking forward to hearing about the new iPhone's unmatched fidelity in reproducing ultraviolet images and ultrasonic sounds.
Even the iPhone's measured 50 ms must be clearly detectable (I don't have one, so can't check).
A trivial example: take an app with a picture of a snare drum that plays a drum sound when tapped. It's well known that if the delay after the physical touch is more than, say, 20 ms, it will sound "not instant"; the sound will be perceived by your brain as a separate event, and even 10+ ms will feel delayed to any drummer.
An intuitive way to sense latency is to drag your finger and see how far behind the cursor trails.
Research showed that people can perceive even quite low latency:
http://www.techspot.com/news/47784-advances-in-touch-screens...
(NOTE: the article incorrectly states this is a device - it is NOT, it's just an experimental setup to measure latency perception, i.e. fake.)
I would expect the whole processing path - touch, bandwidth, CPU, RAM, OS, app, GPU, display - to factor into latency, just as it does for VR (see John Carmack's talk on latency for Oculus Rift).
Interestingly, mouse cursors seem instant to me (below perceptible latency), though it's slightly harder to tell, because the mouse isn't on the screen (and if you put it on the screen, the scaling is way off).
Does anyone have any idea how the iPhone 5 touchscreen compares to the Nokia Lumias? The touch on them is pretty damn fast too, clearly faster than Android devices. But the iPhone does still feel slightly faster... Maybe Lumias and iPhones are almost on par.
> "The team built a device dubbed Touchscope that can measure response times to a level of accuracy that is plus or minus 4 milliseconds. It then adds the cloud processing response time to calculate the actual delays experienced by users."
No way the cloud is processing anything having to do with touch handling.
You misread. They make a mobile app that streams stuff from their cloud. Therefore, to calculate actual delays in their app, they must add network response times and touch latency.
As if a single person on earth EVER complained, "boy, does my smartphone touchscreen not react"... yeah... Actually, I wouldn't have thought it would be >100 ms for my HTC One, but it's certainly small enough that I cannot see a difference.
It is really noticeable when dragging something with your finger and the object you are dragging doesn't stay under your finger. I'm actually surprised the delay on all those devices is so high; I mean, I can ping a Google server in another city 140 miles away in under 10 ms...
The article misinterprets the data. It actually measures the full UI latency, which includes touch input, the event system, and the UI graphics system. It is somewhat well known that Android's event and graphics systems have higher latency; at least, I've taken it for granted for 5 years. The touchscreen is a hardware component, and its latency mostly depends on the hardware choice. The Nintendo 3DS and PS Vita have much lower latency because their software stacks are highly optimized for games (which obviously want low latency).
This must be one reason that musical instrument apps are unusable on Android (all the way from my Moto Droid 2 to my Galaxy S3, one can't play drum apps in a way that sounds decent at all) while they are generally excellent on iOS.
Per discussions I've had in the past, the audio layer in Android is part of the problem. A 60-100 ms touch delay is a lot too, though, and that surely adds to the perceived lack of responsiveness.
Is anyone else surprised by the wave of positive media towards Apple these past few weeks? I'm honestly so used to Apple being bashed that I was even expecting this article to be negative near the end. I was also expecting to see the top HN comment say something along the lines of "no this is BS I could never use Apple products".
It's fun to measure these sorts of numbers for all kinds of devices. You don't need equipment that is too fancy, a Casio Exilim EX-ZR10 (cheap!) is good enough to get numbers within 4ms by framecounting the high speed videos. I personally find input lag on Android devices infuriating, I wish more attention were devoted to fixing it.
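The frame-counting arithmetic is simple enough to sketch (a rough illustration; the 13-frame example is my own, chosen to land near the 55 ms iPhone 5 figure, not a number from the article):

```python
# Converting a frame count from a 240 fps high-speed video into
# milliseconds of latency. One frame spans 1000/240 ~ 4.17 ms, which
# is also the quantization error of the method.
def frames_to_ms(frames, fps=240):
    return frames * 1000.0 / fps

print(round(frames_to_ms(1), 2))   # 4.17 -> per-frame resolution
print(round(frames_to_ms(13), 1))  # 54.2 -> roughly the iPhone 5 figure
```

This is why a 240 fps camera is "good enough to get numbers within 4 ms": the uncertainty is one frame either way.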
It can be even easier than that: I wrote a benchmark that allows you to measure input latency for web browsers without any specialized hardware or cameras: http://google.github.io/latency-benchmark
Of course this technique can't tell you the hardware latency of a touchscreen or display, but it can measure all of the latency introduced by software, which can be quite significant (and as a software developer, the software latency is the part you can actually influence).
A couple of friends and I used to compare the iPhone/iPad vs. Android by touching the middle of the screen and wiggling sideways really fast. iOS is always 100% precise: the area under your fingertip is exactly where you started, while Android only started to keep up after Jelly Bean, and is still not as good.
"Even a two-year old iPhone 4 beat out the other Android devices,” Relan said. “You expect this from Apple’s design team, while others may view their responsiveness as good enough. Now we know why the Android touch keyboard is not as snappy."
They know why but don't bother to tell us in the article...
This kind of makes sense. Whenever I've used an iThing, the screen always felt a lot more responsive than other devices. From a software perspective, their kernel is optimized for I/O, and so is the BSD userland. Not a surprising benchmark.
114 ms is quite perceptible latency; it even breaks the persistence-of-vision barrier:
"The human eye and its brain interface, the human visual system, can process 10 to 12 separate images per second, perceiving them individually." (Wikipedia)
For reference, the average game updates the screen every 16 ms (60 fps) to feel smooth.
Given those figures, response latency at that level can be quite noticeable.
Here's a domain which it totally wrecks for Android: music apps (synths, etc). In audio, a delay of 114ms makes an electronic instrument annoying to play live. Musicians strive to get their audio latency around or below 32ms.
If you have 114ms touch registration latency (from the moment you press something on the screen) and then add the audio processing latency, then it gets very ugly. That means that, however advanced your CPU and however small your audio buffer, you'll always have 114ms of latency minimum.
To put it in perspective, it means you'll be about a 1/16th note behind the beat on a 120 bpm track.
Latency is the basic reason audio developers avoid Android, and 90% of the cool stuff is for iOS (Moog, Waldorf, Korg, Cubase, etc).
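The 1/16th-note comparison checks out with a little arithmetic (a quick sketch; `note_ms` is a hypothetical helper, not from any app mentioned here):

```python
# How 114 ms of touch latency compares to note lengths at 120 bpm.
def note_ms(bpm, division):
    # duration in ms of one 1/division note (division=4 is a quarter note)
    return 60000.0 / bpm * 4 / division

print(note_ms(120, 16))  # 125.0 -> a 1/16th note lasts 125 ms at 120 bpm,
                         # so 114 ms of touch latency alone is nearly a
                         # full 1/16th note, before any audio latency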
Is that the human eye with good vision? Does this include people with poor vision?
I don't understand the link between vision and audio. Can you explain what you mean?
1) if you want UI visual response to be perceived as instant, you'd like to aim at 100ms or below - and you can't draw a response that fast if your code only gets the input signal 101ms after the actual physical touch.
2) if you want audio cues (say, a click sound when clicking a button; or app that plays drums/piano/whatever), then the noticeable latency is even lower; if you have to approach 100 ms then it will sound disconnected from the actual touch.
3) A common platformer or fighting game running at 60fps has 16ms per frame. If there's a 100ms input latency, then the action (say, jump or block) happens on a game world 6 frames off from what the player saw when doing that action; which tends to be too late. Only pro-players worry about single frames; but any average player is affected by 6 frame difference.
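The 6-frame figure in point 3 follows directly from the numbers (a back-of-the-envelope sketch, assuming a 60 fps game):

```python
import math

# How many 60 fps frames a given input latency swallows.
def frames_of_lag(latency_ms, fps=60):
    return math.ceil(latency_ms * fps / 1000.0)

print(frames_of_lag(100))  # 6 frames, as in point 3 above
print(frames_of_lag(55))   # 4 frames even at the best figure measured
```

Even the fastest device tested costs a player several frames of reaction on top of the game's own rendering pipeline.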
For input devices - keyboards, mice, gamepads, whatever - the acceptable latencies (not good or great, but barely acceptable) are measured at tens of ms, tops. A 50 ms difference is not tiny, it's huge.
Human perception can notice anything above about 15ms of latency. If you want to easily see the touch latency on your phone and have an Android phone, go into developer options and turn on Show Touches. It's pretty readily apparent once you see that.
This is one of the main reasons I switched from a Droid to an iPhone. I also had an iPod touch at the time, and the difference in lag when surfing the web was always frustrating on the Droid.
Excuse me, but why all this energy on a millisecond-scale performance difference? Why don't you guys focus on building apps that solve problems, or on new companies? Your energies and brains are misplaced. Maybe it's fun, though.
Here's the deal: Instagram, Pinterest, and Snapchat probably never thought about the time it takes an iPhone of generation X to respond. They didn't design for it; they designed what the user needed and wanted.
I would say millisecond snappiness isn't perceived by, let's say, 98% of the users of these systems. Focus on the application. What's missing in mobile? Facebook and Twitter are missing the picture. Focus on that, guys. You are smart, but put your energy into the big problems, not some low-level hardware/software layer.
This is like saying that "so what if game runs at 15 FPS, it's still better than a worse game running at 60 FPS", this may hold true to certain kinds of games and be a huge deal breaker for others. E.g. Chess/board games vs. Counter-Strike/FPS games.
In the end it boils down to user preference, user experience, and individual weighing of those. Some people don't care if the image stutters a bit at times, and for others it's a major annoyance. Thus simplifying all this to "performance doesn't matter, good ideas do" absolutism is nonsense.
Besides, some of us care about things like efficiency. I don't want to pay for a 1200 MHz processor when an 800 MHz one would be equivalent, had the programmer thought a bit more about how to implement things instead of ignorantly parroting "prematu... ALL optimization is the root of all evil!", as seems to be common "wisdom" these days. That's nothing but sloppy culture, laziness, and ignorance mixed together with a bit of cost analysis thrown in. User experience matters, and poor responsiveness is one of the most detrimental things for it in many applications.
Perhaps it stems from Steve Jobs's philosophy on quality:
“When you’re a carpenter making a beautiful chest of drawers, you’re not going to use a piece of plywood on the back, even though it faces the wall and nobody will ever see it. You’ll know it’s there, so you’re going to use a beautiful piece of wood on the back. For you to sleep well at night, the aesthetic, the quality, has to be carried all the way through.”
As for Instagram, Pinterest, and Snapchat: they won the lottery. They were in the right place at the right time, and when that happens, you could design a piece of junk and still experience massive success, at least temporarily, as long as you have the one feature that matters (re: MySpace, MS-DOS).
Quality should be there overall, but especially in the things that really matter. That may be features for an app like Facebook, or performance in a highly responsive game. My point is that blindly pursuing performance is not achieving quality. Quality is in the eyes of the user and has different meanings depending on what the user perceives and is doing. We could say the web experience in browsers has extremely bad quality characteristics from an engineering point of view, yet it changed the world. So maybe Steve wasn't so spot on :)
On Instagram: have you read about its start, and Pinterest's too? Instagram began as Burbn, an app that wasn't widely used but did attract a niche (of presumed drinkers). What made Instagram is that they recognized that people on Burbn were really into the pictures taken of their hobby. Instagram's next app was apparently a buggy system! But they recognized the uptick because users wanted to post pics (with the filter stuff). So there was no real quality at first, just a feature not well exposed in the social market. Pinterest similarly seemed to recognize that wedding dresses and women gained them traction. In my judgment it's this recognition of opportunity and gaps in the social market, combined with what you say, the right time and place. The right time and place can also partially mean there is a need in the market for something that's not yet well done.
I'd say designing user experiences so it no longer feels like you are using a computer is one of the most important problems in technology today. Doing so alters the way the mind and brain deal with feedback and completely changes the way users end up using a tool.