This is absolutely right. My company has been working on a medical device accessory for the iPhone, and the possibilities of the "tablet" form factor are amazing.
Lower Cost Structure - In our industry (diabetes) you give away hardware to get ongoing disposable revenue. This hardware is expensive to develop and produce, and you spend a lot of effort on areas that don't add much value, e.g. reinventing the wheel re: display drivers. The tablet completely changes the economics of the industry.
Higher Product Quality - By taking advantage of core tablet attributes like the color touch screen and processor, you can do things feature-wise that would be cost-prohibitive with custom hardware. The amount of "ooohs" we get showing off our iPhone UI vs the current LCD one is staggering.
New Revenue Opportunities - Again, our business has thrived on a single revenue stream: the disposable test strips. With connections to the web, all manner of "virtual goods", subscription services, and other digital business models open up.
Overall it is a huge win for both user and entrepreneur and is going to fundamentally change a bunch of hardware businesses.
Interestingly they're still devices. It's just that with iOS (and soon Android) this is the first time a general-purpose operating system is perfect for a device.
Cocoa's for-scientist, rapid-development ideology is perfect for using the OS to create an ethereal device.
Doesn't it concern you that Apple could shut you down at any stage? They could easily just decide that your app isn't suitable for their store. This would be especially true for industrial applications where there might be good reasons to want to step outside their TOS or guidelines.
The original author appears to be using an iPhone for prototyping, and clearly states that it is the "tablet form factor" that is making things so interesting. Even if they were using an iPhone there would be no point in putting the software on the app store, as it would be useless without the appropriate hardware. (Edit: Just noticed the author's latest post saying that yes, they will distribute through the app store and sell an accessory. I stand corrected.)
In general, yes, it is a bad idea to use consumer technology for devices with the sort of lifetime you find in the medical industry (often up to 10 years). Because of the high levels of regulation and testing, it's not cheap to swap out a component for something equivalent when a manufacturer stops producing the one you originally designed and tested your system with.
I'd also be concerned that on Android the user might have a device that doesn't allow off-Market installs, and then I'd have to use a distribution mechanism I don't prefer (maybe even an alternative market like the one Amazon was rumored to be creating).
Less of a concern. If Apple decides to go into the Medical gadgets field and updates their terms of service accordingly, you are basically hosed. There's no recourse. Not even an alternate 'off-Market install'.
Unless your customer's Android device happens to come from AT&T, where you have the exact same issue.
(This isn't to say I don't agree with you that they shouldn't be able to kick you off, but, you know what? No mobile platform is good in that respect.)
Yes, but your program will still work on other Android devices that don't try to prevent off-Market installs, so the software development effort isn't wasted.
When has Apple shut down apps in the store? There was the purge of bikini pic apps and the Google Voice issues, but I can't think of another example of good, useful apps being pulled out of the store capriciously.
I'm glad Sanofi-Aventis is moving forward with an iPhone accessory that will be a medical device, and will be really pleased if they're able to get it through FDA. The Agency has put a lot of scrutiny on mobile platforms in the past, which is one of the reasons you don't see all kinds of medical technology running on smartphones and tablets. Trust me, it's not because those of us in the medical device industry don't want to utilize these platforms. (Note: this is distinct from EMR/EHR, which medical staff can put on whatever they choose)
Not being an iOS developer I'm a little in the grey here. How would developing a product for the private market go in terms of distribution? I gather your product is for mass consumption?
Worth noting here that most analysts feel that the reason the original iPod was successful was that it only did one thing in an extremely friendly format.
I think the key question here is whether or not the app-universe grows in size until consumers desire separation of widgets again.
I know from my own experience that I found I maximize productivity by having separate devices responsible for separate things. For instance, when I pick up my blue iPod it's for education -- I keep books and lectures on there. But when I pick up my black iTouch it's for fun -- I keep only tunes there. My phone -- although it has all kinds of neat wizardry in it -- I use solely for talking to other people.
Perhaps both trends are true. Perhaps we end up individually separating our apps into physical devices based on preference instead of tradition. Neat stuff.
This, in my opinion, is the genius of Apple's choice of form factor. By omitting any physical buttons apart from meta-functionality, and allowing only full-screen applications, an iOS device essentially becomes dedicated hardware for whatever it happens to be running.
I remember marveling at the YouTube app the first time I picked up an iPad. It was the best YouTube experience I'd seen; it felt like I was holding a purpose-built device for YouTube viewing. The home button simply prepared it to morph into a different purpose-built device.
Video is the iPad's killer app for me - I don't check email with it, I do a fair amount of web browsing, and I have read a few books through Kindle on it - but with video (a ripped DVD, iPlayer or a movie rented from iTunes) it is simply wonderful.
[NB: I noticed the other evening that the iPad, at the distance I normally cradle it, appears larger than our 50" plasma does at whatever distance that is from where I sit - no wonder it feels so immersive]
To watch an action movie with others I'd put it on the big screen and turn up the surround sound. Watching a history/science documentary by myself I prefer the iPad.
The one major improvement I can think of would be if the screen could deform itself and make actual, raised buttons. Then it would be even more of a dedicated device towards whatever application you were running.
I think this is actually part of Apple's strategy. People have collections of used iOS devices gathering dust or going on eBay. Now they can hook the old iPhone up as a GPS or alarm clock, or whatever, and buy the newest model guilt free.
When my iPhone runs the iPod app, it almost physically becomes an iPod.
In a sense, the tablet format is more than just a software or hardware platform. It's more akin to a new kind of material - one that you can mold at will, through software, into a myriad of distinct widgets.
Do you mean it almost becomes an iPod Touch? Because the iPod pre-Touch was defined by its clickable scroll wheel interface.
Don't underestimate hardware. A slightly dodgy software keyboard like the iPad gets rave reviews, where slightly too small netbook keyboards get slated for not being perfect full size keyboards. It's like Johnson's famous comment on female preachers: "Sir, a woman's preaching is like a dog's walking on his hind legs. It is not done well; but you are surprised to find it done at all."
The key of course is that a capacitive touch screen can do lots of different things okay and effectively disappear when not in use, rather than doing one thing really well and sitting there as a big hardware lump the rest of the time.
Good point. When I hand my Nexus One around to people who haven't used it before, they always bump one of the soft buttons or put it in a state which confuses the crap out of them.
Microsoft's control was superficial, and that's how they like it. They don't want to control what you put on your computer, they just want you to pay them the same reasonable price.
I agree MS are a lesser evil than Apple. Their culture is far friendlier to developers. However, what you've said is rose-tinted.
MS did try to push people into software as a service (but they failed at it). They support DVD region encoding. They have introduced DRM music services. They struck agreements to hamper attempts to sell dual-boot systems. I didn't experience this but understand that at one point Windows Media Player would happily scan existing content on the drive and helpfully encumber it with DRM. SharePoint takes data in from all sorts of sources but it's much harder to get it out again.
MS have also EOL'd products that dominate major market segments despite users who would have paid plenty for stability: original VB, Excel before the UI change, every NT release.
These are the actions of a company trying to increase its control on users, attempting to control what they do on their computer.
By way of comparison, traditional unix vendors would be more helpful to deep-pocketed companies who wanted to pay to continue to use old software. MS could have back-ported a firewall and other patches into NT4 that would have hardened it and kept it going if they'd wanted to - these are the teams that reverse-engineered Photoshop and hundreds of games and then hotpatched new code over their memory segments when they loaded, so that DOS and W3.1 software would run fine on Windows 95 at release. Interesting alternate history: what if MS had tried to keep NT4sp3 tight and stable, kept drivers coming, and brought it forward conservatively?
> Microsoft's control was superficial, and that's how they like it. They don't want to control what you put on your computer, they just want you to pay them the same reasonable price.
Oh, how a decade of stagnation can change perceptions...
It has nothing to do with Microsoft's stagnation. I've lived the past decade, and I'd rather be forced to use a mildly overpriced platform where I can run any code I want than a [wm]ildly overpriced platform where the code I can use is dictated by people who have a political agenda, and flat out refuse to let me run what code I want on the property I've ostensibly purchased.
Microsoft killed off all sorts of competitors using their sales channel with suppliers and other pressure tactics. You don't know what you're missing because it never happened.
I wish people would stop saying that Bill Gates has always been a nice guy because he's retired and giving away his money.
Don't fool yourself into thinking that MS is more open, they don't have much of a mobile OS for you to compare with iOS anyways.
So you mean if OS X had won instead of Windows, there would be more competitors, right? Nope, there would be even fewer, because Apple makes its own hardware; it would have killed off a lot of hardware companies, AMD and ATI being big examples.
And Apple is going to kill off so many hardware companies because if it wins like Windows did, it's going to be the sole arbiter of who gets to build hardware for it.
All MS wanted was for you to pay the Windows tax. Apple wants to control the hardware as well. E.g. Windows Phone 7 is an OS that various OEMs can run on different devices. iOS is basically just firmware for Apple's devices.
BS. MS cared very, very much about what software you ran. Remember Netscape? Borland? Microsoft J++? They put pressure on OEMs not to ship Netscape in ways that make anything Apple has done with iOS pale in comparison.
It may be hard to appreciate today, but in the 90's you couldn't start a software company without a ready answer for how you were going to survive if MS decided they had to kill your product to protect the Windows monopoly.
There is a difference between competing in a business environment and limiting what your customers can do with your product. Although competition may indirectly determine what customers can do (by destroying businesses that may have created software they wanted) it is different from directly controlling what customers can do the way Apple does.
I don't think anyone claimed that Apple's censoring of porn, for example, is the same kind of thing as Microsoft murdering competition and slowing down development.
For porn -- relax, much online video works on the iPad. :-)
I would argue almost all areas are better with competition. If you prefer monopoly solutions, there is still North Korea (Cuba is going a bit capitalist, I've seen).
Aside from that, Apple can certainly be prudish, but that's more out of a desire to remain unobjectionable rather than a desire to control what is acceptable for society. They don't care if you play controversial games, they just don't want to be associated with such games.
I left a similar comment on Reddit about another topic, but don't Apple's contributions to open source make them vastly less evil than Microsoft?
Just look at WebKit and how Apple's competitors build (or are in the process of building) their entire foundations on top of that tech, and Apple still contributes.
Microsoft has actually contributed to a lot of open source projects and released a number of their own projects as open source (F# being a recent example). It's just that their schizophrenic nature - with their R&D releasing open source software and their marketing department demonizing it - and complete lack of any open source strategy make these releases far less visible than those made by Apple.
I don't understand why people point to WebKit as a big Apple contribution. Basically Apple took KHTML, which is under the LGPL, and built WebKit out of it. It is basically forced by the license to release the source. Your point would have more validity if KHTML were BSD licensed and Apple released the source out of the goodness of their heart, or if they had built WebKit from scratch and then open sourced it. Apple basically saved a lot of effort by using KHTML and would have been sued if they didn't release the source.
Microsoft's entire empire was built on platform lock-in and network effects -- from business to business, to customers, and to the homes of their employees. It was a staggeringly successful strategy that everyone copied, Apple being the most prominent exception. They've always made products to please the consumer who bought them and nobody else. Cherry-pick counter-examples all you like, but this is pretty much how it was.
Really? I don't think that Apple has acted with such altruistic intentions. Certainly they're not just pumping out whatever they can to get more cash, but they're obviously working to make a profit.
If their number one priority was really to please the end-user, they'd be selling everything at cost, open-sourcing and giving away OS X for free, etc. People are generally greatly pleased when they can get cool stuff for cheap or free.
Apple wants to please their customers so much that Apple is deciding what apps their customers are allowed to use, because their customers are too stupid to correctly choose the applications that work best for their needs. And if you attempt to circumvent this restriction, Apple will do everything in its power, technically and legally, to stop you from doing so.
Did Microsoft ever endeavor to do this? Personally, I find top-down control of the entire distribution channel more of a "platform lock-in" thing than encouraging the use of proprietary IE extensions.
It seems to me that Apple develops more to please Apple than to please end-users. The worship that Apple gets is so very silly, in my opinion.
It has nothing to do with altruism, it's about selling directly to the person using the product vs selling to someone who forces other people to use the product. Even if the seller is acting primarily out of self interest, users are going to end up much happier buying something because they like it, and not because forces out of their control have conspired to make it their only viable option.
Apple wouldn't be in existence without the billions it made volume selling to schools. From grammar school to graduate school I was forced to use an Apple/Mac.
Your comment, while mostly fiction IMO, does have a nice storyline.
Why do you keep insisting that people are "forced" to buy these things? Nobody has been forced to buy Microsoft. If your job required a Microsoft product, and you didn't want to buy Microsoft products, you could find a new job. You were not forced to buy from Microsoft any more than anyone has been "forced" to buy from Apple.
I've known plenty of people whose jobs didn't even require the use of a computer; if you are so picky about the software you use, perhaps you could consider a line of work that doesn't involve much computer usage.
You don't have to use Microsoft if you don't want to either -- just don't work somewhere that requires MS products. Do you suppose that there aren't companies where using Apple products is required? There are.
I'm just saying, who knows if in X years you'll also have to use Apple? You know, there was always an alternative to Microsoft products, like there is for Apple ones, but people actually vote on this with their wallets: if the vast majority chooses Apple, you will be forced to use their products too. I'm sure of this because I would never have thought about working with Objective-C, and here I am doing iOS apps because that's where the money is.
It does not counter his argument, but this doesn't mean he's right, either. His argument states that given a platform function, developers have historically used/exploited said function in ways that its creator had not intended. In terms of device functionality, this takes the form of new applications for and of that device. The OP surmises that increasing the number of "opened up" functions by providing an API for developers will increase the usefulness of the device, per the previous statement. Following his reasoning, the ultimate functional and useful device is "open" in the sense of FOSS, because developers have the ability to exploit all functions of the device. My question to that conclusion would be: what if hackers made changes to the device that ran counter to the initial design philosophy of the device, which is what the users initially purchased the device for? Is this always a good thing? Is this always a bad thing? The answer is neither, and this is why the implied (as interpreted by myself) conclusion is false.
The OP's argument doesn't necessarily follow, because usefulness does not equate to functionality and hackability. Good design is not something you can quantify by "featurefulness". If it were, users would have stopped buying new copies of Microsoft Office years ago. Following from the end line of Paul Graham's post, if you give hackers an inch, they will take you a mile. PG doesn't state whether or not that is a mile in your preferred direction of travel.
Right, Apple's shareholders are out there saying "I'm so glad we're making great stuff". Apple wants to make money, no matter how much fandom we bestow upon Apple for having a great design aesthetic, Apple is a corporation whose aim is to make money. Apple is a golden boy because they aren't Microsoft, and if we keep comparing the two, we never give Apple an honest shake.
I don't know. I've always gotten the feeling that Jobs is a perfectionist to the point of it almost being a pathology. I doubt his motivations are primarily focused on money, especially at this point.
It's probably safe to say titans of industry tend to have a diversity of motivations. In markets like finance, I don't doubt that it's often as simple as money, but the cultural currencies of other markets can be equally alluring pursuits. Industrial design, and in the last few decades, tech, have a currency of having a positive impact on users' lives. UX, basically, specifically with the opportunity cost of bad UX.
If there were any way to quantify this -- "quality-of-life improvement credits" -- Apple would likely be quite rich in these as well as simply in financial currencies.
Jobs may be a perfectionist, but he is ultimately just the CEO. As such, he's an agent to the true owners (shareholders). His laser sharp focus on design has been extremely profitable and thus his interests are aligned with those of shareholders. When these interests get out of sync, see ya Steve.
It's hard to imagine now, but there will be an Apple without their fearless leader one day. They are a profit-seeking venture like any other--brilliant marketing has people convinced otherwise.
Apple's aim is to make money in the same way that your aim is to reproduce before you die. It's technically true — that's what all lifeforms are here for — but it obliterates a lot of important detail about you as a person.
I agree - Apple is a business, and it strives to make money - should it not??? The fact that they have realized that great user experience (which MS is also now making a priority - Office 2007-10 is a perfect example) is a form of added value that consumers will pay heavily for is simply a testament to Apple's leadership (NOT just Jobs!). Having an unbelievably talented marketing team doesn't hurt either.
My question is: what has Apple done to directly propel the world (particularly the 3+ billion people in the "third world" - impoverished and middle class) towards globalization? Microsoft (perhaps in their bid for control) has donated & sold (or subsidized) millions of products to this group of people. In some countries, apple hasn't even attempted to sell their products at reasonable prices despite the opportunity. Isn't ushering all parts of the world into the next decade as important as creating the next great iOS?
Also disturbing is the fact that Apple is sitting on a pile of $50 billion in cash while the Foxconn workers in China who actually make iPhones have very bad working conditions and pay. C'mon Apple, a few dollars go a long way in China.
Apple actually pays a subsidy to Foxconn workers above and beyond the wage being paid to them by Foxconn, based on a device's profitability. That being said, the assembly workers manufacturing the iPhone 4 and iPad are the highest paid among all Foxconn's laborers.
If the workers were not happy with their workplace and conditions, they are free to leave, or even better, organize a labor union.
Also, be advised that GE could also pay their Chinese laborers a bit more, if you would be willing to pay more for your microwave, your water heater, your light bulbs, your cookware, etc. It's not going to happen. Let China enjoy being the manufacturing capital of the world while we still have fossil fuels, k?
One that includes why "ways <inventor> could never have imagined" is desirable so it would convince people who don't already believe it to be the case?
If they used the accelerometer to measure how hard they hit people over the head with iPad when mugging them, and triggered a string of copycat attacks, that wouldn't be desirable for Apple or society.
It's almost arguing against firewalls - why restrict computer access when developers can find so many new and interesting uses for it? (Buffer overflows, spam and more are why. Default-deny is the good-practice alternative. Assuming that's my stance, your suggestion isn't a convincing argument at all.)
Yes, and the "X can be used for bad" argument you put forward is one that could be used against any advance in technology, with or without open standards or APIs.
And your firewall example is not relevant. Hackers aren't using firewalls against the companies that employ them, they are circumventing the firewalls. What you said is more like "people can climb over fences easily, so we should stop building them around our yards"
Yes, and "X can be used for bad" would be as weak as "X can be used for good" is weak in favour of something.
My example is relevant because it directly counters edw519's implicit claim that "if an idea can be used in surprising ways, that is good" by citing open network connections as an example of something used in surprising ways which are bad, and our opposition to that by using a "not open" stance as the agreed best position.
To be convincing that openness is good, it would need to be established that significant good comes from it, or that more good than bad comes from it, or that at least serious bad cannot come from it or similar.
Saying that X can be used in previously unexpected ways is neither here nor there as an argument for open APIs or collaboration, whether or not I like open APIs and collaboration.
I think that this argument is much stronger than your original one, and in fact, one that I would make myself. Although I still don't see how it helps your firewall example, because I can't understand how circumventing a firewall can be considered "using it in a surprising way."
Actually, on rereading your original comment, I think I misunderstood that paragraph. The exact wording is still very confusing to me, but what I think you were trying to say is that edw519's implicit argument (for all X, X should be open because it can be used in surprising ways) can be applied to network connections, where network connections are the X. I thought you meant that firewalls were the X, and that idea is what I was arguing against.
I think that can be applied to technologies that aren't open -- and it can be a property of the technology itself.
In fact, one entire category of technology that's overlooked right now is sensors. Besides sound (mic) and light (camera), accelerometers and magnetometers are already there. Some other detectors of "state" that make sense:
Pressure (think of being able to predict rain)
Humidity/Temperature (think of an environment controller)
Directional Microphone (4): Conferencing, noise cancellation, etc.
Having outputs is good too. I don't know a single smartphone owner I've talked to about it who wouldn't love for it to have a general IR out for remote control applications.
Here's the thing though: It's not THAT big of a deal to grab my remote instead of my phone. I'd rather keep the cost of the phone down by not having an IR sensor/transmitter that only works for one function since my TV already comes with a remote that works perfectly well.
The best way I could think of to actually use the iPad for this would be have a person lay his arm (including wrist) on it. The iPad can use the touch sensitive surface to measure how thick his wrist is and how thick his arm is, and then guess his weight.
Using an accelerometer would likely be a worse user experience than just stepping on a scale, but a system where the iPad actually measures your body fat could be easier to use.
However, it will likely not be very accurate. Perhaps if one were to set a reference weight at the start, and it just measures delta using the touch screen?
Using the accel, a possibility is to see how close a person can swing the iPad towards his body. I assume that this would change dependent on the amount of fat around his arms.
A common place people gather fat is on their thighs - thin people often have space between the legs and heavier people not as much. Also a possibility, even if a not very clever one.
So it would be possible to give an estimate of upper body weight if a person would swing the iPhone with hand straight. It could give the body mass as a multiple of the hands mass. It would be easy to cheat though.
What about a dumb add-on mechanism which acts as a cradle for the tablet and which tilts it at an angle depending on the downward force? That was my first thought, anyway. I'm sure the concept (accelerometer = orientation sensor) can be refined.
The way to fix that problem is with sources. Wikipedia is becoming more and more aggressive about deleting articles without sources, because it has been burned enough times already by articles that 1) plagiarize copyrighted materials, or 2) appear to cite sources but actually "fudge" the sources, or 3) blatantly push a point of view by omitting sources that are readily available. The quality screen for new articles is slowly becoming more stringent, so, no, that is not at all a record for speed of deletion of a new article. The long-term slog, for which I wish Wikipedia well (as a registered Wikipedia editor) but would have to be paid to do in the current contentious editing environment, is to add correctly cited, balanced sources to the existing articles that have the problems just mentioned.
You're right. Fuller says that etherealization was a name suggested by someone else, but that he preferred to stick with ephemeralization. I fixed the essay. Thanks!
What blows my mind is that I might have been able to see this coming in the 80's - but Buckminster Fuller saw it coming in 19 fricking 38. That man was incredible.
Or, I suppose, Jacquard cards replacing cams in the 1800's. There's a lot more history to this than it's easy to see today. But Fuller's insight, Good God.
Whenever people drop accolades upon historical figures (Fuller included) I always have to wonder if there is a bit of survivorship bias at play. I'm not saying that Fuller wasn't a complete genius (he definitely was), but I wonder how many other incorrect predictions, and other "geniuses", have been forgotten because they were incorrect.
How is this a bias? What value would it be to pay attention to all the incorrect statements people have made over the years, except to understand why they were wrong? People like Fuller who had insight that seems to have lasted over time are interesting because we can learn something not just from the statements they made, but from how they thought that gave them insight.
Imagine you have a set of people over history, all thinking in different ways, all making statements about the future. There's a good chance that some of those statements will be true, just by chance. If you don't account for that, you will be biased towards people who just got lucky (the survivors).
I understand the selection principle, but do we realistically think that's what's happening? Are there really lots of people recording detailed, insightful, coherent, highly varied, but wrong theories of the future that we are rejecting? Why would I have to imagine it if it were really going on?
Yes. "detailed, insightful, coherent, highly varied, but wrong theories of the future" are made all the time.
The really amazing things are not the genius predictions that are right and venerated, but the predictions that are wrong but the predictors are still venerated.
http://en.wikipedia.org/wiki/Thomas_Robert_Malthus - 1798 predictions of "gigantic inevitable famine" because of population growth. (Industrialization and technology have allowed population growth dramatically beyond Malthusian limits. The same arguments are still made today however.)
http://en.wikipedia.org/wiki/Criticisms_of_Marxism - "the socialist revolution would occur first in the most advanced capitalist nations and once collective ownership had been established then all sources of class conflict would disappear" http://plato.stanford.edu/entries/marx/ "labour intensive industries ought to have a higher rate of profit than those which use less labour" (argue about whatever forces you want that have defeated socialism, but the argument continues)
http://en.wikipedia.org/wiki/Simon%E2%80%93Ehrlich_wager - Malthusian environmentalist Ehrlich predicts hundreds of millions of people starving to death in the 1970s and 80s, and "a genuine age of scarcity." He also loses a 10-year bet against economist Simon. "All of [Ehrlich's] grim predictions had been decisively overturned by events... Repeatedly being wrong actually seemed to be an advantage, conferring some sort of puzzling magic glow upon the speaker." [Wired]
You would probably like reading "Empires of Light" - one of the interesting insights I gained from that book was how important Westinghouse was to Tesla's success.
I was mystified at this. I can't imagine a way to measure weight with accelerometer that doesn't involve spinning around in circles or jumping up and down on springs.
Isn't it obvious? Drop your iPad on the floor; it won't fall exactly downwards, but will veer ever so slightly off course towards you, and the amount of displacement is proportional to your mass. Done.
It wouldn't need to actually measure your weight. Bathroom scales are just there to collect dust in the corner and make people feel bad whenever they look at them.
Just tweeted pg this but thought I'd say it here too. This is exactly what iRobot did when they saw the Roomba platform being taken apart and used for projects. They gave hackers the tools to use their platform and now sell quite a few Roomba based development kits instead of just plain old robot vacuums.
Haven't we been calling them tablets for ages? People have been designing around the idea of tablets for decades. They had the ideas for the applications of tablets fifty years ago. I'm not sure I understand what this article says that is in any way original.
Yeah, I was nonplussed by the response to this piece. If it had 2007 stamped on it instead of 2010 I'd get it, but it's recycling ideas that have become commonplace over the last couple of years.
The fact that you can change font sizes easily means the iPad effectively replaces reading glasses.
Really? I mean it might help someone not need reading glasses for the tasks they do on the iPad... but they still have to have them to read the dinner menu, instructions on the box of food, and so on. If you can't replace them fully, how good is it? The iPhone replaced regular cell phones because you no longer need two devices.
Just a thought - I've worked with a legally blind woman for a few years. She uses a variety of magnifying gadgets and devices during meetings to read handouts or lunch menus. I think that the present gen tablets are coming very close to being able to replace these expensive magnifying gadgets - and incorporate other technology.
Or maybe an app that will immediately redisplay what the camera views in a larger format on the screen. Probably be a bit clunky, but might not be too bad. Might even help more for people who need high magnification for reading but whose distance vision is still good.
It not only magnifies, it lights up with the camera flash, it can reverse the black/white or add a color filter if you find that easier, take a snapshot and let you zoom around it, etc.
PressReader will fetch the daily edition of almost any newspaper in the world, and then read it to you. That's got to go some way toward replacing reading glasses.
I still find it more comfortable to sit with my mobile devices and reading glasses than with extreme font sizes. Even on my desktop computer I now prefer reading glasses... and even more so as the device shrinks.
This isn't that surprising or revolutionary, it's all just part of the inevitable movement towards ubiquitous computing. It's obvious that we won't always be reliant upon one device's built-in sensors to constantly gather and supply relevant data in real-time, and inevitably we'll have an incredibly integrated network of real-time, physically collocated devices.
For now, tablets are great. And Apple is great at supplying them. But by no means does this mean anyone will be enslaved to Apple in the long term – someone else has the opportunity to create an open platform that enables any and all technologies to communicate with each other. Someone else will have to sell this platform to businesses, governments and, most importantly, consumers. And someone else will have to create the other, new interfaces by which we access and derive meaning from this data collection. And the challenge of preventing this from being too closed, too proprietary, is what will distinguish the best approach from the most profitable approach, and where we as users can choose to avoid a "client monoculture."
The tablet approach is just a step in an ongoing direction. It's way bigger than this.
Replacing keys sounds like an interesting idea, but on one hand you have the problem that the tech needs to be rock-solid (if github is down I get mildly annoyed. If I can't get my door to open, I freeze to death) and you are competing against an already established technology that works really well (rfid tags) and isn't very expensive.
That said, I would love to hear more about your idea, if possible.
"For historical reasons, the device in your pocket or purse - the one that you use to browse the Internet and send email, is called a "phone." We need a new name for that thing." Scott Adams. He suggests calling them "head". http://www.dilbert.com/blog/entry/phone/
On a side note, I really need an innovation for keys, they scratch my iPhone! So, go, that new yc company, go!
We don't need another name for it. Everyone knows what it is. Some people understand why it's called what it's called, the others can either read up on it, or never consider it.
What we now call a phone is developing so quickly that calling it anything related to what it currently does will mean we have to give it another new name in a few years.
This raises a question of how Apple will deal with "Made For iPod" interfaces that get increasingly generic. Right now they have a good framework for evaluating apps and hardware produced by one company to work together. They don't have a good way to understand hardware from company A working with software from company B.
What happens when someone wants to release a NES inspired D-Pad controller for iOS but wants to allow existing game makers to create apps that support it? Right now that is sort-of possible but it's very high friction.
Apple is a company that likes to build the whole stack from hardware to software; they feel it is necessary to create beautiful experiences. Will they compromise on this to facilitate a world where you can connect your iPhone to any device in the house?
If they don't, progress may stagnate, hacks (like communication over wifi) will persist, and potentially they are giving up market share. Obviously they need to maintain the integrity and stability of the iOS devices, but in my opinion they err too far on the side of caution.
Tablets are obviously great, but does anyone think they'll really replace cameras or GPSes? It seems to me that tablets will cut the bottom out of these markets (those with casual interest in photography or GPS or computing won't need to buy a dedicated device) but they'll never approach the quality of an SLR or a dedicated GPS. Or am I just being short-sighted?
> they'll never approach the quality of an SLR or a dedicated GPS
They won't for people who need all of the functionality of an SLR or dedicated GPS. But how many of the people buying such things actually need all of the functionality? Quite often, they don't. They buy more than they need to indicate status, or because they fear having an inadequate device.
Tablets provide an alternate form of status at the moment. And the fear of inadequacy is mitigated by their popularity: If everyone is buying one, how bad can it be?
Single-purpose cameras and GPS devices are well on their way to marginalization as extremely niche devices. I think that will be true of almost everything a tablet can replace.
It's going to take some major technology leaps for cell phone cameras to replace SLRs, but I do think they're already cannibalizing the point-and-shoot market, which was a pretty big part of the camera market (certainly by volume at least).
I think if phones started coming with a better flash, it would be very hard to justify bothering with a separate point-and-shoot camera. Although, to be fair, most point-and-shoots have a pretty crappy flash as well.
The SLR option depends on the future of imaging hardware in the near term, but what does dedicated GPS have that's so great?
Another generation or two to sort out the always-on/background bugs of current iOS GPS apps and the battery life bit, and tablets will take over - fast processors, big screens, easily available hardware already present to develop on (sound, internet access, etc). Why not?
I think you are spot on in the SLR department (point-and-shoots are another story), but I do believe the dedicated GPS dies via the tablet. I used an iPad 3G recently as one and it worked great, much better than the generic GPS I was given. I figure with tablets and car manufacturers' built-in solutions, dedicated GPS will retreat to a niche market for specialized tasks.
It's not just about replacing the functionality, it is more about combining the functionality with some other in a clever way that the dedicated hardware could hardly do.
For example: gps + web -> checkins
Another important thing is the form factor that lets me have a camera with me everywhere.
Reaching better quality is only a matter of time.
A tablet will do most things well enough for most people, but there will always be niche markets. Look at how music is stored. There is still a market for records. :)
An interesting question for the DSLR makers is what they could learn from the tablet. The pictures are great, but especially at the entry level there's opportunity for improved interfaces and easier publishing. (http://www.bythom.com/design2010.htm) has some interesting ideas.
I think one of the biggest issues is transfer. It's SO easy to get your pictures off the iPhone -- no wires, no computer, dead-simple (hence the Flickr success). High-end camera-makers would be wise to add 3G/wifi/cloud features, although the file sizes of high-res photographs might be prohibitive for a while...
it'll get better and better, meaning it'll squeeze those SLR and dedicated GPS from the bottom, and you'd only use them in extreme and specialized cases.
So for all intents and purposes, I think it'll replace cameras and GPSes for the mainstream consumer, and you'd only use dedicated hardware in specialized cases.
Doesn't the etherealization of hardware mean that we won't be referring to tablets, mobile devices, or laptops at all?
Isn't the only difference between an iPad and an iPhone the screen size? So really, we're starting to refer to these devices based on size rather than power/memory/speed.
Why do you place more value on "power and memory", than the way people use a device (tablet vs. PC)?
Let me give you a clear example: my computer 10 years ago had 128 MB of RAM. My iPhone has 16 GB of RAM. Does that make my old computer not a computer?
My point is that the usage is what matters. I still use my computer to browse the web, compile code, run Excel and Word, etc.
To be pedantic, Flash memory is closer to DRAM than to disks, although it's used for storage. It has similar access cycles to DRAM, and its controller-plus-array chip architecture is closer to DRAM's than to an HDD's.
We're both trying to make the same point. I used 'power and memory' because I consider those to be the equivalent of 'form factor'. Computing won't reach etherealism until we no longer have to consider this form factor over that form factor, just like we no longer consider memory or speed.
I think they're still just computers, and will be thought of as such. Sure, they're computers you hold in your hand, not the kind that sit on your desk. Gaming consoles and set top boxes are morphing into general purpose computers too. Those are computers that sit on a table near the TV. But they're all computers the same.
"Computers" still get viruses, require backups done by the user to be safe from data loss, have out-of-date software packages (each with its own broken update mechanism), take ages to boot up, are clunky and slow...the list goes on.
Sure, none of those are specific to the "computer" form factor. But they are all things that people think of when they think about their desktop computer (some mac and linux users excluded, of course: but that's still only 10% of the computer-using population). NONE of those items I just listed are problems with any "Tablet" computer.
The form factor is different, but so is the very model of user software and computing.
iPhones:
* Have the potential for viruses ("don't call it an exploit, it's a jailbreak" is not great security reporting)
* Require backups done by the user to be safe from data loss
* Since I'm currently rocking a 3G running iOS 4 -- are "clunky and slow"
Of your 5 points, boot up and software update are reasonable. The other 3 can absolutely be issues.
That's not the point. Has anyone actually made a virus? No. Will somebody? Yes. Are they perceived as better in that regard? Yes.
The point is that they are not perceived in that way, so people are not scared of them. It doesn't matter how true it actually is, it matters what people think.
Phones take a while to boot too, but like most modern computers you just wake them from some kind of sleep mode when you want to use them.
Most phones don't use package management in the linux sense, so if Apps A, B and C are all using some buggy library that isn't part of the OS then who knows on what schedule they're going to be updated since you're carrying three copies of the same buggy library (This is ignoring side-loading and jailbreaking).
Other stuff being replaced by smart devices: watches, alarm clocks, portable radios, cheap digital cameras, spirit levels, dictionaries and perhaps soon your wallet and physical mass-produced books. Looking at some of the creative stuff people do with mounting their iPad in vehicles, perhaps iPad-like devices will replace traditional dials in cars in the near future.
The only reason we even consider calling them "mobile devices" is that the iPhone preceded the iPad. If the iPad had come first, we wouldn't think of the iPhone as a phone; we'd think of it as a tablet small enough to hold up to your ear. Hence the joke calling the iPad a giant iPhone. That was a pretty good description. If the future of telephony is VoIP, then that is pretty much bang on.
> "perhaps iPad-like devices will replace traditional dials in cars in the near future."
The dashboards of the near future are going to start piping a lot of control over to paired devices, but I think they'll maintain their current level of sophistication in and of themselves. But those paired devices will be a huge opportunity.
And in the same arena: automotive diagnostic machines are going to be replaced. (e.g. OBD tools becoming an OBD->bluetooth adapter + software) As well as repair and service manuals. (Who needs a book to tell them when to rotate the tires if the car tells your phone?)
I'm actually working on a college project getting OBD-II diagnostics to your i-device. Some amazing stuff is available already, e.g.: http://www.devtoaster.com/products/rev/
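Since the thread is about an OBD->Bluetooth adapter plus software, here's a minimal sketch of what the software half can look like, assuming an ELM327-style adapter exposed as a serial port and pyserial installed. The port name is an assumption and this obviously isn't the Rev code - it just shows the general shape of polling one PID (engine RPM):

    import serial

    def query(port, cmd):
        # Send an ELM327/OBD command; the adapter terminates replies with a '>' prompt.
        port.write((cmd + "\r").encode("ascii"))
        return port.read_until(b">").decode("ascii", errors="ignore")

    with serial.Serial("/dev/rfcomm0", 38400, timeout=2) as elm:  # port name is hypothetical
        query(elm, "ATZ")         # reset the adapter
        query(elm, "ATE0")        # turn command echo off
        raw = query(elm, "010C")  # mode 01, PID 0C = engine RPM
        # A typical reply looks like "41 0C 1A F8"; RPM = ((A*256)+B)/4
        data = raw.replace("\r", " ").split()
        if "41" in data:
            i = data.index("41")
            a, b = int(data[i + 2], 16), int(data[i + 3], 16)
            print("Engine RPM:", ((a * 256) + b) / 4)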
I think there's already been some slow moves in that direction - I recall the presenters on Top Gear playing with a couple of vaguely similar things, eg. a Lamborghini where you could cycle the appearance of the dials (so you could make it look like a "jet fighter" or whatever) and a Skyline that had a screen with heaps of different displays (g-force meter and what have you).
Those aren't near a fully programmable tablet but I think that's still a long way off since there might be regulatory problems, eg. maybe you have to be able to see the speedometer at all times. I'm not totally convinced I want a fully programmable display though anyway - I don't think that a million customisable settings would enhance my car. But I'm sure there are people who would like it...
Last summer, I rather surprised myself by using my android phone as a level while working on a backyard building project.
Also in the summer, I was camping when my flashlight died en route to the washroom. On the side of the path, I downloaded and installed a flashlight app, and then used it to find my way.
I volunteer in an after-school guitar class at my son's school, and use my phone to tune the kids' guitars before class starts.
A few weeks ago, a website I maintain went nonresponsive and I used my phone to ssh into the server and restart apache.
Just for fun, I installed an app that measures my heart rate using the camera.
Just five years ago, if you had suggested these uses for a phone, I would have thought you were nuts.
Am I the only person who thinks the iOS usability is declining as its feature set and resulting complexity grows? It's still simple for me, but while my mother could handle the original iPhone, I think she'd get a little confused by the current one. (Double and triple clicks on the home button, cut and paste UI popping up unexpectedly, etc.)
Of course the complexity will increase as a function of the number of features.
However, I'd make the following claims:
(1) Some of the functions are simply not used that much. You can get by perfectly well without knowing what double and triple clicks on the home button do, for example. If you don't know they exist, you will never even notice.
(2) iOS has fewer features than Android, and arguably this makes it somewhat less powerful but also less confusing for my mom (e.g. no task manager, no intents).
(3) Switching to a Mac was actually quite challenging coming from PC land ("where's maximize?", "where's a working Alt-Tab?", etc). I'd say the usability still kicks in as far as how easy it is to do your job for most after the learning curve has passed. Your mileage may vary, but I find myself enjoying using my Mac more than I ever did using Windows on my PC for usual tasks, and that's part of what makes it usable for me. I'd expect a similar phenomenon with iOS.
I think a really disrupting and interesting field will be tablets as replacement for textbooks in education. The possibilities to create amazing educational material are endless.
On the other side, you have textbook publishers, who generally earn a lot of money from ever-so-slightly changing editions and will do a lot to keep that income stream from dying...
A few years back I was working for a supplier to academic libraries, who also did a lot of ebook sales to this market.
In my final days there they were just starting to get interest from some institutions who wanted to buy Sony eBook Readers (as the best devices then on the market) preloaded with 30-50 books, then dish these out to students on some high-cost courses. They felt the students would prefer this (a not unreasonable belief with that volume of material) and it'd be easier for them to manage.
So, yes, this sort of device (in the broad sense) will very likely replace textbooks at least partially. I saw it happening first hand a few years ago and see no reason it should have slowed down since.
These guys are trying - http://www.kno.com/ - but I think it's silly to literally copy the physical form factor of a textbook. I think textbooks will not be replaced by devices like this, but stuff like the Khan academy.
> Many if not most of the special-purpose objects around us are going to be replaced by apps running on tablets.
I respectfully disagree.
The trend for specialised vs. generalised devices seems to go in cycles over a period of a few years, in a similar way to the classic thick vs. thin client cycle. Consider games consoles vs. gaming on PCs, the iPod vs. mobile phones with media storage, etc. Neither extreme is ever going to take over entirely, and the bias moves as technology evolves.
I think this is mostly driven by trying to balance convenience and power. When new tools come along that are generic enough to make a certain broad class of jobs easier, we tend to jump on them. Many jobs get moved to those devices, and specialist devices that used to perform those jobs become obsolete. On the other hand, if you get too generic, you start to introduce waste and therefore inefficiency, which pushes things back the other way. Also, if your generic device is OK at doing lots of things but not particularly good at any of them, there is still a market for specialised devices that do a particular job better because their priorities are more appropriate.
We used to write software that ran on desktop PCs, but it turned out that a lot of practically useful software is essentially a simple user interface to a simple database. Native applications had common pain points in this field that could be overcome by hosting the code and data centrally, in areas like installation/updating/backup. Thus Web apps were born.
However, today, we're seeing major players in the industry trying to turn just about everything into such an application, and they are failing. It turns out that while Web apps are great for presenting relatively simple database UIs, they are relatively weak at performing most other tasks. Cloud computing is a pretty direct extension of the same argument.
I suspect things will go the same way with phones/tablets/mobile devices. A generic mobile device with a bunch of common built-in peripherals and sensors will solve a wide variety of real world problems, and thus various kinds of mobile app have been born. No doubt many more variations will follow over the next few years, as these devices support new functionality that was not previously available and ideas will spring up to take advantage of that functionality. The devices will be good enough for these purposes and will be widely adopted as a result.
On the other hand, Swiss army phones could easily start to suffer from both overspecification in breadth of features and underspecification in performance of individual features. For example, the suggestion in the article to replace reading glasses with a smart phone seems unrealistic and oversimplified to me: it sounds great initially, given that we have cameras and screens on these devices, but then you consider the vast range of different reasons that people are prescribed glasses, the consequent individuality of each prescription, and the fact that glasses do not generally require holding in your hand to use them.
In short, I'm afraid I don't buy pg's argument here at all. A certain class of applications, some of which already exist and some of which will be developed, will probably move to handheld multipurpose devices. However, specialised tools aren't going away any time soon, because any generic device is always going to be either a poor replacement for a good tool or too highly specified to be efficient for a broad market, even if the technology exists to combine high-quality implementations of all the required features within the required space and cost constraints in the first place.
I think you make a good argument, but miss the mark on one important point. The specialization/generalization you're talking about seems to apply to apps, not the devices themselves.
Devices like the iPhone and iPad replace a lot of things; the example of glasses really wasn't a strong one. Consider this list of things being replaced: phone, mp3 player, GPS, maps, compass, books, wrist watch, stop watch, alarm, photo album, voice recorder, notepad + pen, calculator. From now on, most people with a tablet device will use that instead of anything on the list most of the time (except probably notepad + pen). These things will just never be nearly as popular as they once were, and the list is very incomplete.
It's just too convenient to have all those things condensed into one easy-to-use device, and in many (but certainly not all) cases the tablet is better than the device it's replacing.
An iPod makes a terrible compass, wrist watch, photo album, notepad, calculator and/or camera. The advantage is that a bad X is better than no X. So, while most people may end up using their cell phone as a stopwatch, people who need a good stopwatch will still buy one.
If I were to pick a word for what is generic, I think I would choose "platform". As I see it, the distinction is not really about either a particular device or a particular app, it's about whether the combination of hardware and software is dedicated to a specific purpose, or whether it provides a foundation with a range of specific capabilities but relying on another layer of software on top to provide end user functionality, which is where things like Web or mobile apps come in.
In terms of handheld/mobile devices, I suspect you're being a little optimistic with your list of dedicated tools that can be replaced effectively by building on the more general platform.
For example, digital photo frames are great, but you probably have several of them around your home if you use them. Those devices don't each need to come with an accelerometer, a GPS system, and audio connectors, and all those useless (in this context) extras push up the cost.
Books are an interesting case, where I can see technology improving and converging to the point where tablets really do take over, but I expect it will be quite some time before this becomes the norm. I wouldn't be surprised to see a passive, full colour, high resolution screen inside the next five years, but there is much more to this issue than just the technology. I personally believe that reconciling the various commercial, social, logistical and ethical issues is going to be the hard part, and a lot of these probably won't even enter popular debate until the tech is ready to do the job.
Perhaps the underlying theme here is that even if the technology in a mobile device is capable of replacing many other tools, whether it can do so effectively is a different question, and one that has to take into account factors like cost-effectiveness as well as technical capability.
I think in slightly simpler terms. Value isn't only about function; it's about form and also timely availability. Some devices are required at a certain point in time, and although an application could replace them, it most certainly will not. Take the simple mirror: you could use an app on a camera-enabled tablet (which might even allow some cool things, like virtually trying on different looks), but the simplicity of our current solution is very comfortable.
The same argument can be applied to the scale example pg gives. Sometimes, when you are dripping wet in the privacy of your bathroom, a scale is just perfect. That doesn't mean your scale will never talk to an app that lets you track your weigh-ins; it just means a scale is a dedicated device that people will likely want to keep for the foreseeable future.
But pg is dead-on on many items, as are you. There will be, as has been, a migration of specialized devices to software solutions.
Where the line will be drawn (due to consumer preference i.e. value creation) is up for debate though!
Personally, I'm going back to a wristwatch because it gives a totally different perspective on time passing by than a cell phone does. The magic is in the habit of watching it all the time.
And I love good maps; it's quite sweet NOT to get lost when you run out of battery.
I know you specifically asked about the YC startup, but there are a few really cool lock-related startups out there. There have been a lot of NFC (near-field communication) startups popping up for the last few years, but those require special hardware in the phone, so a company (I can't recall which one at the moment -- sorry) came up with a novel approach: have an app on the phone that plays a series of tones on the speaker, which the lock receives and uses as an audio key. While this could very well be dangerous if not designed properly, it's a really novel approach, and could very well be more secure than existing electronic locks; I know that in the hotel lock industry at least, security is horrendous as it stands, so hopefully this could help there.
I stayed at the airbnb corporate apartment and mentioned a reprogrammable lock as an awesome upgrade to their service back in 2009. I hacked something crappy together with a Schlage lockset for my own use, but ended up having to go to Afghanistan a couple weeks later. I actually prefer the usability of what I had to how the Lockitron appears to work.
Hotel locks are amazingly lame. Audio is an interesting output from the phone; flashing the display in front of a photosensor in a pattern would be cool too.
Mechanical locks are basically obsolete in the age of digital photographs and rapid prototyping...if you let me look at a mechanical key for a second, I can make you a perfect copy in 15 minutes.
>have an app on the phone that plays a series of tones on the speaker, which the lock receives and uses as an audio key
The series would have to be time-dependent, or you'd be vulnerable to replay attacks. Or maybe the lock plays a challenge and the phone plays the response.
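For what it's worth, the crypto needed here is tiny. A toy sketch of the challenge-response variant (plain Python; the shared secret, the function names, and the omitted audio-modulation layer are all my assumptions, not how any particular lock startup actually does it):

    import hmac, hashlib, os

    SECRET = b"provisioned-at-pairing"  # assumption: shared by the lock and the phone app

    def lock_make_challenge():
        # Lock side: a fresh random nonce, which would be played as tones
        return os.urandom(16)

    def phone_respond(challenge):
        # Phone side: answer with an HMAC over the challenge
        return hmac.new(SECRET, challenge, hashlib.sha256).digest()

    def lock_verify(challenge, response):
        # Lock side: a recording of an old exchange fails because the nonce differs every time
        expected = hmac.new(SECRET, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

    challenge = lock_make_challenge()
    assert lock_verify(challenge, phone_respond(challenge))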
My phone's battery died last time I was flying home from a business trip because I was listening to music in-flight. I already freak out a bit when my phone dies -- I can't imagine being locked out and not being able to call anyone.
Or perhaps a treadmill in front of the door which generates electricity, connected to a tesla coil, to support wireless charging of your phone, to send a message over the Internet, to unlock the door :)
I agree this is a problem. While I like the idea of a remote-controlled lock, from what I've read about how Lockitron does it, this is a major problem. (Lockitron is basically a SheevaPlug with WiFi to the Internet, talking over Z-Wave to a commodity remote-controlled deadbolt.)
Also, it isn't always the case that you have signal in an entryway, especially in weird apartment buildings, basement apartments, etc.
Also, I am not comfortable taking my $800 iPhone out of my pocket in front of a door in a dark alley, whereas a fist full of keys makes a damn fine improvised weapon.
Also, I don't want to do a WAN connection, web API call, WAN link to home, and low-power RF link to the door every time I want to unlock my door. Especially if both of my hands are full of bags.
> Also, I am not comfortable taking my $800 iPhone out of my pocket in front of a door in a dark alley, whereas a fist full of keys makes a damn fine improvised weapon.
If this startup is done right, I see no reason to force you to take your phone out of your pocket in order to use the key functionality.
It's not like the physical shape of the phone has anything to do with the key aspect, so it may as well just stay in your pocket. As you approach a door, it'll unlock.
Key issues: you don't want it to just unlock automatically whenever you approach. You don't want to run wires to the door lockset, and it needs to be very low power. Ideally you want to use an existing OEM lockset, because making things out of atoms is painful. You don't want to abuse the battery on your phone via a background app either.
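One way to reconcile "it unlocks as you approach" with "you don't want it unlocking every time you walk past" is to require both proximity and some cheap signal of intent. A toy sketch, with the signal-strength threshold, the intent gesture, and the numbers all made up for illustration:

    import time

    RSSI_THRESHOLD = -60   # dBm; assumed "standing at the door" signal strength
    INTENT_WINDOW = 10.0   # seconds since the user last signalled intent (e.g. tapped the phone)

    last_intent_time = 0.0

    def record_intent():
        # Called when the user makes the intent gesture
        global last_intent_time
        last_intent_time = time.time()

    def should_unlock(rssi):
        close_enough = rssi >= RSSI_THRESHOLD
        intended = (time.time() - last_intent_time) <= INTENT_WINDOW
        return close_enough and intended

    record_intent()
    print(should_unlock(rssi=-55))  # True: at the door, with recent intent
    print(should_unlock(rssi=-80))  # False: too far away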
There is such a thing as over-engineering. In fact a real engineer, as discussed here recently, knows when simple, proven, low-tech solutions are the way to go. Keys aren't going away anytime soon.
In 2001, Bill Gates said that by 2006 tablets would be the most popular form factor for PCs. They obviously weren't back then. It's funny how Apple changes ecosystems.
I think MS could have delivered on the prediction if they had changed their software to match the form factor. All Windows tablets run a (basically) stock Windows XP / Windows 7. If the iPad used a (basically) stock OS X, it would not have sold! Too many small buttons, mouse-centered experience, etc.
Apple's key was realizing that the software had to change, too.
I doubt that MS could have created the tablet boom in 2006. Less than two years ago, Steve Jobs said "we don’t know how to make a $500 computer that’s not a piece of junk." I think he was right; the hardware just wasn't there four years ago to create something enticing enough. An iPad with the technology of the original iPod touch would be too slow, have too short a battery life, would have cost $800 or so, etc.
The fact that you can change font sizes easily means the iPad effectively replaces reading glasses.
I disagree. Reading glasses are close to the eye and magnify without sacrificing the ratio of text to viewable surface area. People who need to significantly increase the font size (i.e. the same people who would use reading glasses) are going to be constantly interacting with the iPad to tell it to pan/scroll the viewable (magnified) surface around so that they can see everything. Pagination is only a partial workaround (you still have to interact, just dealing with a large increase in page turns) and only makes sense with text-type data (e.g. pictures lend themselves to panning, not paging).
In general, the tablet enables all the ideas that people have had over the years that were perfect "except you'd have to carry around a computer to run the thing". I'd love to see communications protocols and hardware (next gen bluetooth?) developed to allow devices that need a computer to wirelessly use the one in my pocket.
I'd also like to see more innovation in the space where users hold tablets while they're facing a television set. The tablet-as-remote-control where the program listings are on the tablet. The tablet-as-gaming-controller where you and your opponent both have tablets (draw a path on a map to move a character instead of guiding the character turn-by-turn).
I'm trying to think of something where I'd need more computing power in my pocket - most apps seem to either have enough power in the device itself or draw the extra needed compute/storage from the cloud... just wondering what you have in mind.
I'm saying that many things you own would work better if they had a little computing power in them, or if they had access to a computer. So, for instance, it would be too expensive to put a PID controller in your toaster oven. That's why there's a simple thermostat and a simple timer in there. But imagine if your toaster could send temperature data in a Bluetooth-esque way to your tablet. The tablet does the proportional-integral-derivative calculations and sends commands back to the toaster. So you get all of the benefits of PID controlled temperature without the cost of the PID controller.
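For what it's worth, the PID math itself is essentially free; something like the sketch below (plain Python, with made-up gains and a fake temperature sample) is all the tablet would have to compute each second, so the real cost being saved is the dedicated controller hardware and its interface, not the arithmetic:

    class PID:
        def __init__(self, kp, ki, kd, setpoint):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.setpoint = setpoint
            self.integral = 0.0
            self.prev_error = None

        def update(self, measured, dt):
            # Proportional-integral-derivative terms over the temperature error
            error = self.setpoint - measured
            self.integral += error * dt
            derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # Hypothetical loop: one temperature sample per second arrives over the radio,
    # and the returned value becomes the heater duty cycle sent back to the toaster.
    controller = PID(kp=2.0, ki=0.1, kd=0.5, setpoint=200.0)
    duty = controller.update(measured=25.0, dt=1.0)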
Another idea (likely in the works somewhere): tablet as guitar effects pedal.
Interesting thought. I don't know anything about PID controllers - are they so compute intensive that they need a processor significantly more expensive than a Bluetooth controller?
On the guitar effects pedal: Apple has actually run an iPad ad that included a shot of a guitar plugged into an iPad running an "amplifier emulator" (not sure what the right term is for that sort of thing). Not sure which app it is, but it's probably $10 or less. :)
Tablet is the wrong term. In fact, I fear it becoming the term to use, as I think it immediately prunes potential avenues of exploration. I think mobile or personal device is a better term, as it's really about the intent of the device, which is to be with you always. Tablet really seems to describe the form factor, and I'm not convinced that the form factor is all that important.
Form factor means everything; look at how many new iPad only apps there are, strictly due to increased screen size. Jobs made this clear when he nixed the 7" iPad.
Let me be clearer. I don't mean form factor isn't important in general. I mean form factor isn't the underlying principle that unites these various devices.
Is the Droid Pro a tablet? What about the rumored Playstation Phone? Tablet seems to imply a form factor, and I don't think that's what is important in how we categorize these devices.
It's like if we called PCs "beige boxes" when they came out or if we called TVs "two through thirteen dial machines".
I expect the form factors for these devices to continue to evolve and improve, yet I think the general category of device will be the same (much like how a 3D LED HDTV is still a TV, although looks very different than an old B&W TV from the 50s).
Here's the thing: people who aren't geeks often like and even prefer single-purpose devices. There is, as has been noted (see http://g4tv.com/videos/44277/DICE-2010-Design-Outside-the-Bo...), a "pocket exception", but for larger devices, people who aren't us are often intimidated or confused by devices that do many things.
And of course there are dangers in making a machine more general-purpose than it needs to be. Machines that are too general-purpose become more susceptible to, and sometimes more tempting targets for, hacking. (As usual, ground well-trod by xkcd: http://xkcd.com/463/, http://xkcd.com/801/)
The thing is, a lot of people will use them as single use devices. They will just have very different 'single' uses for them.
Let's start with the iPod. Everyone uses it to listen to stuff. My brother listens to rap music, almost exclusively. My niece listens to whatever boy band is en vogue today. Very different genres of music. I love audiobooks on mine. Along with music. I have an aunt that has church sermons on hers.
Move to computers. I have not used a word processor in months. I spend a helluva lot of time in spreadsheets, though. My niece, with her school reports, is the opposite. My nephew is almost exclusively a gamer on his computer.
Even the web. Most people have a dozen or so sites that account for 90% (made up stat) of their browsing time. It is a different dozen for all users though.
So yes, the vast majority of users will use their devices for only a handful of purposes. It will be a different handful for each user though.
As much as the software is the window to opportunity for these devices, the hardware is just as important. Apple's form factor is exceptional! Their industrial design sense and abilities are one of the most important ingredients in their success, I think.
Google unfortunately discounts this, or rather, is late to understand that UX matters as much in the tangible world as in the intangible. When you pick up a phone, or rather a tablet, your first impressions are based on the physical device. The intrinsic value during this interaction is irreplaceable by any software, no matter how good.
Was Paul trying to coin the word "tablet" for these kinds of devices? I don't know how this is anything new; I have read of the iPad "tablet" or "tablet computer" many times.
The real thing that makes these "tablets" interesting is the number of features they have packed into them. The "ease of use" that people are talking about when referencing Apple's products is just a part of Apple's marketing for their devices. Android (and Maemo and MeeGo, etc.) devices are much more capable and much more hackable. Why the praise for such a bland device as the iOS ones when there's an awesome OS and device market sitting right next to it?
"Capable" and "hackable" are completely different than "ease of use". Apple really does have great ease of use (for normal people), and part of how they get it is by reducing the other two.
I continue to disagree with that concept. Mac OS X would not suddenly become better for normal users if Apple ripped out Terminal.app and forbade anything that replaced its functionality.
Features don't make platforms interesting. Windows CE devices had features pouring out of them, but no one cared. Similarly with Maemo. (I can commiserate: I loved my N800. But no one else did.)
Use makes things interesting. Particularly as relates to platforms. If the platform isn't used, its potential is a moot point.
My mobile device progression has basically gone like this:
Palm OS -> Maemo -> iOS -> Android
I could do more on Maemo and even Palm OS than I could ever do on iOS and Android. There were features everywhere... The Sony CLIE series of Palms had multitasking abilities, great MP3 players, better organizer features, and more powerful apps on average than my iPod Touch felt like it had. Maemo had features everywhere... I could video chat before FaceTime was ever dreamt up, I had full-featured applications (Abiword with a foldable, pocketable keyboard is missed), and web browsers came out that were fantastic (I could use full Facebook, Google Reader, and such in the Tear browser just as well as on my desktop). And the cool interaction features existed too... on Palm OS I had a universal remote control. On Maemo I could use a Bluetooth'd Wii Classic Controller to play Mario in a NES emulator. Features were everywhere.
But why did I switch to iOS and later Android?
They just weren't fit for a touch interface. Using dedicated apps, as in the iOS model, fits much better for ease of use with a touch interface. When I got the iPod, I switched from using full-featured versions of websites to settling for a reduced, more information-based web experience... and having to make notes of what to look at later on the computer. It was a shift from bringing the computer experience with me to having a distinctly mobile experience. When I have a computer nearby, it's not so bad. It does make for a nicer experience.
Android feels a lot like iOS... almost like a slightly more functional and open copy of it. It's nowhere near where Maemo was. Still, it's enjoyable as a mobile experience, instead of just transplanting everything I could do on a desktop to a mobile-sized device.
That said, I still miss Palm OS. I almost went for a used Palm OS phone instead of an Android phone, but I decided I wanted something built for the Internet.
IMHO, for the "average" consumer, there's very little difference in either the capabilities or the ease of use of an Android phone versus an iPhone. It's almost Pepsi vs. Coke.
Hackability is completely off the average consumer's radar.
I think it's clear that Steve Jobs is fully aware of this. It's why he changed the name of the company to Apple Inc. just before announcing the iPhone - he wasn't indicating that he was going to start building hundreds of different consumer electronics products - he was telling the world that the iPhone and its variants would replace most of them.
Apple's tablets are just one aspect of this etherealization. The real hero here is software. From the ability to create a physical three dimensional object to the manipulation of DNA - software makes it possible.
What's reassuring is that Apple doesn't have control over all the hardware interfaces that make (and will make) this possible.
Another possible recipe for a startup (not saying this would be easy) is to find an important way that Apple is handicapping their devices and overcome it on Apple's own platform in a way that Apple will allow. If you can do that, users will love it and buy your product.
>If you can do that, users will love it and buy your product.
I'm trying now to remember a story relayed here about a company who developed a product that bested Apple, Apple considered buying them/it but the negotiator dropped the ball or something and then Apple wiped them out by besting them.
Anyway, if you beat Apple, then they'll either lock you out (why handicap it if they'd let you carry on?) or integrate your idea while working around your IP. You're going to need lots of IP/anti-monopoly lawyers. You might get bought out too, especially if that's cheaper for them than the court cases ... I guess what I'm saying is that building hardware on top of Apple's platform seems like it would always be limited in some way.
You could be referring to the story about how Panic's Audion nearly became the basis of iTunes, rather than Soundjam which they bought instead, though it is quite an old story.
"Hi Steve, it's Cabel, from Panic."
"Oh, hey Cabel! Nice to meet you. So tell me, what'd you think of iTunes?"
"Well, I think it looks great! You guys have done a great job with it. But, you know, I still feel we'll do all-right with Audion."
"Oh, really? That's interesting, because honestly? I don't think you guys have a chance."
Or overcome that limitation on a different platform? (Android, WinPhone 7, etc)? After all, I think a lot of the early interest in Android came from overcoming the limitations of carrier exclusivity (in the US at least) and form factor limitations (options for hardware keyboards, replaceable batteries, larger screens, etc).
The other question is for Apple - what other sorts of addressable hardware could be added to the iPad to make it even more versatile (temperature sensors, various transducers, etc.)?
Seems like there's still a lot of work to do, and a lot of room for growth. A survey of how people use their iPads, coincidentally from today: http://www.businessinsider.com/ipad-survey
That feed doesn't update when the article is published. That's what I was trying to say. Anyway, it's not that important to get it in real time. Oh, the signs of my urge to be informed under the influence of the real-time web.
If you want to see our product, here is a nice review (http://www.fastcodesign.com/1662351/blood-glucose-monitor-fo...)