The author is way off base carping about the CPU. It's the fastest CPU in any mobile device because it's likely a 3-issue ARM11-derived CPU with out-of-order execution: that means PA Semi (now part of Apple) has either designed a chip based on the Cortex A9, or started with an ARM11 and tacked on similar features the way Qualcomm did with the Snapdragon (in the Nexus One), instead of opting for a Cortex A8-derived core.
When the Tegra 2 and OMAP4 (Cortex-A9-derived ARM CPUs) start shipping in bulk the playing field is going to be leveled: now everyone's got access to a 3-issue OOOE ARM11. Only these will (optionally) be multi-core, something Apple probably isn't interested in as it's not doing multitasking.
> When XYZ is released the playing field is going to be leveled
When manufacturers start taking notice that unresponsive, laggy, slow devices are unpleasant to use, the playing field can start levelling. Until then everyone else is doing a Wintel - hardware many times faster, normal activities just as slow as ever. It's not hardware, it's software that matters, it's design.
I've been humming and hawing about whether this will be worth a look, given the obvious things to gripe about. I heard on one of the post-announcement streams that the first impressions were roughly "heavier than expected, unbelievably responsive". This quote from DF, "if I had to sum up the device with one word, that word would be 'fast'", really pushes me strongly in its favour. I'm interested.
If nobody else is going to push for responsive software and Apple are, then I'm going to lean more in Apple's direction.
> It's not hardware, it's software that matters, it's design.
Well, we are discussing the Daring Fireball post, and Gruber does mention in multiple places 'setting the software aside.' So bringing the software into it is kind of side-stepping the actual article. If anything you should be calling Gruber out for ignoring the software, rather than jumping all over the parent poster.
I would rather put it the other way around: one of the most noticeable things about the iPhone / iPod touch is its slowness, the little glitches in the animations where the rendering drops a few frames, the pauses before there's a response to pressing the big button.
Wintel very much isn't "slow as ever", in my experience. Desktop machines are another class of hardware altogether; I have 12GB of memory, an SSD, 8 logical cores and 2 monitors on my desktop - nothing slow, laggy or unresponsive about it.
The iPhone has the feel of a device where they care about responsiveness but the hardware isn't up to it. Every Windows mobile phone I've seen has the feel that the software is built badly and nothing the hardware can do will help.
> Wintel very much isn't "slow as ever"
You are right, I'm spoiled by fast hardware; compared to the machines of yesteryear my desktop is a joy. But with a processor capable of tens of billions of operations per second, measured against my human senses, nothing should take long enough for me to notice it.
When using Visio and accidentally mousing over the My Shapes menu the whole program locks up for about 10 seconds while it finds zero shapes from a network drive and then doesn't cache the result. While using VMWare vSphere client, almost every click, view change, menu popup, any action at all takes a blinking and a flickering and a delay. Explorer happily does slow folder and drive refreshes and ignores right click popup menu requests on many occasions. IE 8 opening a new tab is a laggy occasion. SQL Server Management Studio 2008 is a blinkenflicker fest all over.
Here and there, slow calls are in UI threads, network requests are blocking, network requests are slow out of all proportion with the amount of data that needs to be transferred due to odd protocols and many layers, crunching data is not offloaded onto other cores, frequent results aren't cached, the spinny circle cursor and the "program is not responding whoops it's back now" affair happen all too often.
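For the simplest of these the fix has a well-known shape. A rough sketch (TypeScript purely as illustration; the function names and the URL are made up, and the real offenders above are desktop apps rather than web pages) of "keep slow calls off the UI thread and cache frequent results":

    // Rough sketch of "don't block the UI, cache frequent results".
    // listShapes and the /shapes URL are invented for illustration.
    const shapeCache = new Map<string, Promise<string[]>>();

    function listShapes(folder: string): Promise<string[]> {
      // Reuse an in-flight or completed request instead of asking the network again.
      const cached = shapeCache.get(folder);
      if (cached) return cached;

      const request = fetch("/shapes?folder=" + encodeURIComponent(folder))
        .then(response => response.json() as Promise<string[]>);
      // Drop failed requests from the cache so a later attempt can retry them.
      request.catch(() => shapeCache.delete(folder));
      shapeCache.set(folder, request);
      return request;
    }

    // Hypothetical UI hooks, stubbed out so the sketch stands alone.
    const showSpinner = () => console.log("loading...");
    const hideSpinner = () => console.log("done");
    const renderShapeList = (shapes: string[]) => console.log(shapes.join(", "));

    // The handler never blocks: it fires the request off and paints when it comes back.
    async function onMyShapesHover(): Promise<void> {
      showSpinner();
      try {
        renderShapeList(await listShapes("My Shapes"));
      } catch {
        console.log("request failed");
      } finally {
        hideSpinner();
      }
    }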
I can see two reasons. A) it's hard. B) It's not a priority. I can forgive the iPhone because it mostly works well. I can forgive small companies writing desktop software. I can't forgive the likes of Microsoft and VMWare, because they have masses of cash and eat hard problems for breakfast, so the only conclusion is that they don't care that I get a twinge of reluctance to use their software, a frowny lemon eating face when I wade through it, when I'm drumming my fingers and using the free time to grumble about their software to my colleagues.
Fast, responsive, small, simple software is a joy to use; I am cheered by PuTTY, Paint.NET, 7-zip, VirtualBox, VLC, AnyClient, Vim. Even Adobe deserve praise for the improvements in startup time since Adobe Reader 6 and the plugin fiasco.
VMWare vSphere client, as it fires up the coal boilers and defrosts the Java VM, whips at the tired donkey graph drawing libraries and runs all its server queries through a dialup connection simulator, oh no. No no no.
If Apple are going to focus on speed, I'm going to focus on giving them my cash. I can feel my mental liking of the iPad converging: the decision weightings on its faults are weakening and its pros are strengthening, a preliminary RDF field is forming to buffer my perception of the change in my bank account, and the justifications are coming more and more easily to the fore.
I will only end up disappointed, apps will chug, serious web pages will lag and layout will take ages, iPhoto will slow down arbitrarily while scrolling ... but by then it will be far, far too late. :/
A lot of MS software design has been historically clueless about network latency, I'll grant you that. I sometimes joke about imagining a world where the web is built on top of DCOM, where the view in your browser is rendered via object-oriented RPC calls from the server, etc.
Apple are a much more insidious threat, though, in my opinion. One of the reasons I first got into programming was because one of the first computers I got didn't have any I/O other than the keyboard, the monitor and speakers - the tape drive was broken. A flashing BASIC prompt (ultimately MS licensed) greeted me at every power-on. The only way I could do anything with the machine was to write the program myself; and I could keep nothing of what I wrote, and had to write it all again if I wanted it again. Every night I wrote an alarm-clock program to wake me up in time for school the next morning.
Now if I were that age again today, and had an iPhone or iPod touch in my pocket, or god forbid an iPad in my bag, I couldn't even write a program on it, much less save it on the local file system. Apple is almost intrinsically hostile to software engineers, as I see it. Developers are like rodents to be trapped by promises of a slice of a market willing (and hopefully forced) to pay for software; but the trap is far tighter than anything MS has ever laid.
> I couldn't even write a program on it, much less save it on the local file system.
Can you not save text files on the iPad?
That's pretty much the only thing you need for a js application that runs in Safari (which these days has a wicked fast js interpreter and supports canvas, WebGL, and the audio & video tags).
Maybe if the server had a way to remember or reopen and tidy your code it would be closing in on PG's RFS#5.
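As a minimal sketch of the sort of thing Mobile Safari will already run (canvas only here; the element id, sizes and numbers are all made up):

    // Minimal canvas toy: a ball bouncing around the screen, all client-side script.
    // Assumes the page contains <canvas id="screen" width="320" height="480"></canvas>.
    const canvas = document.getElementById("screen") as HTMLCanvasElement;
    const ctx = canvas.getContext("2d")!;

    let x = 160, y = 100, dx = 2, dy = 3;

    function frame(): void {
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      ctx.beginPath();
      ctx.arc(x, y, 10, 0, Math.PI * 2);
      ctx.fill();

      x += dx;
      y += dy;
      if (x < 10 || x > canvas.width - 10) dx = -dx;
      if (y < 10 || y > canvas.height - 10) dy = -dy;
    }

    setInterval(frame, 16); // roughly 60fps; no SDK, no compiler, no approval process.

Crude, but it's a complete interactive program that needs nothing more than the browser to run.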
You can't cron-job anything, and you can't work with any language other than Javascript unless you make a server-side to-Javascript compiler for it. Another user suggested that there is now or soon to be inter-app file transfer, so depending on how that is implemented it might open other avenues.
2.5 years of a closed phone platform is an insidious threat? Where were you for most of the 2000's?
I predict that within 2 years the iPhone OS will allow you to run unapproved apps provided you click through enough, "Hey, we're warning you! Don't come crying to us when you manage to blow up your dock connector", messages.
iPad isn't a phone. iPod Touch isn't a phone. To the degree that these devices are successful, they will be replacing user-programmable devices from the bottom up.
It's all very well for you to "predict that within 2 years" everything will be fine. That's not the way it is today, and it's not the way it has been in the past. Everything I know about Apple tells me they have no love for developers whatsoever, and that their rise is one of the worst things that could happen to this industry in terms of open platforms. Apple likes their monopolies much, much more than MS does.
Something is only user programmable insofar as the user has the capability to program it.
An open device today still wouldn't be user programmable because most people would not have the ability to do so even if they had the desire. For those people (read >95% of the market), functionality and ease of use are so, SO much more important than theoretical freedom.
It might be a threat to your/our way of life, but I'm not convinced it's a threat overall.
You can't write a program for your TV or wristwatch, microwave or landline phone, LCD screen menu system or DVD player.
People can't do surgery without planning years in advance and studying for years in specialised schools, and this has been the case for hundreds of years, yet children still grow up to be doctors and surgeons; and all sorts of other professions which don't have free-home-tinkering-kits available.
If "everybody" wants an iPad and "nobody" cares that they can't program on it, then we're the minority in a tyranny of the majority situation and the only way out is having enough money to pay a company to design your own alternative.
It feels like something is really being lost - a lot of humans worldwide growing up as content consumers, not realising that if they could shape the right forms, they could do the same kinds of magic themselves - equals, not subservients[1].
But is anything really being lost? If you were a kid today would your nightly alarm clock program have the same charms when run up against your mobile phone's built in alarm? Maybe it's time for software development to grow up and realise the days you could build a spreadsheet in your home are gone with the days you could make cough syrup from home, and the industry needs to become a more professional, 'trained adults only' kind of industry.
In the early days of machine code, software was the electric circuits that rich eccentric people built in their labs. BASIC was the valve radio circuits that Feynman puzzled over as a kid. C was the 555 timers and 74xx logic gates.
We were hoping that software would be a new kind of material to work with altogether, each iteration building on the previous work so that as it gets more powerful it still presents a small surface area, stays easy for beginners, kids, curious people to put modules together.
Instead, modern software is the integrated circuits that need years of training to use. It didn't become drag-and-drop coding; it became yet another discipline that takes years of training.
Instead of an OLPC, which is both a hackable device and itself the result of hacking, we get the iPad: you hack on your specialist hackstation (Mac/PC/...) and view the result in the iPad browser. Like making 3D renderings in Blender and viewing them in a picture viewer, or a nationwide phone exchange that ends in a handset with a pad of numbers, or paint designed in a large chemistry lab that ends up in a small pot with a brush.
Like many other industries.
[1] I twigged the other day that this is a major difference between me and a hobbyist friend of mine. He treats things with interest as if he is an equal and has every right to learn about them, willing to dive into a subject and believe he can get results. I treat things as nuanced and designed by experts, full of details to be cautious of and traps for the beginner, something to worry about and avoid. I like his style.
My PC is my TV and DVD player combined. I don't know what an "LCD screen menu system" is. I don't have a land-line phone; but I can write a program for my phone, and without needing to pay yearly for the privilege.
Or printing: "SampleForm.Print;" in my case, at the extreme (Delphi).
The microwave, I'll grant you. It's handy for heating up tea that's been brewing long enough to cool down too much; that's about it.
Multi-core processors are plenty useful in a non-multitasking environment. A single process with many threads is sped up by having multiple cores. Apple is definitely interested in multi-core processors; look at Grand Central Dispatch on OS X. I would argue that it is one of the most interesting approaches to multi-threaded/multi-core programming in recent OS history.
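As an analogy only (Web Workers in a browser standing in for GCD-style queues; this is not Apple's actual API, and the numbers are arbitrary), here is a toy sketch of a single process, no multitasking anywhere, still keeping several cores busy:

    // Toy sketch: one app, many threads of work, partial results gathered at the end.
    const workerSource = `
      onmessage = (e) => {
        // Burn some CPU: sum the squares of the chunk we were handed.
        postMessage(e.data.reduce((acc, n) => acc + n * n, 0));
      };
    `;

    function spawnWorker(): Worker {
      const blob = new Blob([workerSource], { type: "application/javascript" });
      return new Worker(URL.createObjectURL(blob));
    }

    function sumSquaresOn(worker: Worker, chunk: number[]): Promise<number> {
      return new Promise<number>(resolve => {
        worker.onmessage = e => resolve(e.data as number);
        worker.postMessage(chunk);
      });
    }

    async function parallelSumSquares(numbers: number[], cores = 4): Promise<number> {
      // One chunk per core, all crunched at the same time, sums added up at the end.
      const chunkSize = Math.ceil(numbers.length / cores);
      const workers = Array.from({ length: cores }, spawnWorker);
      const partials = await Promise.all(
        workers.map((w, i) =>
          sumSquaresOn(w, numbers.slice(i * chunkSize, (i + 1) * chunkSize)))
      );
      workers.forEach(w => w.terminate());
      return partials.reduce((a, b) => a + b, 0);
    }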