Weak PC Market Catches Up to Microsoft (nytimes.com)
33 points by Toshio on July 19, 2013 | 51 comments



Every time I read a report about the weak PC market, I am left stunned by the lack of innovation we've seen in desktop computing for the past ~10 years. I routinely reference my blog rants on the subject, and I'll do so again [1]. The desktop in 2013 needs to differentiate itself from portable devices--to stand out as providing something that simply cannot be achieved with a small screen. Desktops should be evolving toward immersion. The clearest path toward immersion is to make displays much larger and higher quality.

Microsoft and all of the desktop PC hardware manufacturers should be investing heavily in larger form-factor, high-density displays. I believe that combining an immersive display format with the ongoing developments toward gesture and touch input would give desktop computing a necessary renaissance.

These companies should feel ashamed that it took ~8 years to progress past the 30" 2560x1600 form factor to a $3,500 ASUS "4k" monitor. In 2013, a "4k" monitor on the desktop should be entry-level technology. After all, entry-level phones pull off still higher density. (Yes, I know that manufacturing a high-density large screen is expensive and that's the point; it's expensive because hardly any R&D is going into that channel and these companies have no idea how to stir up demand for something new and innovative on the desktop.)

A 4k LCD monitor should be entry level. Had desktop displays progressed steadily rather than regressing (thanks to the taint of "HD") after the IBM T220, an enthusiast desktop today would be sporting ~8k OLED or better.

[1] http://tiamat.tsotech.com/displays-are-the-key


Why?

I think they got desktop computing pretty much spot on. It's ubiquitous, cheap, reliable and good enough.

People aren't buying stuff because they're pretty happy with what they've got.

And 4k on the desktop? Fuck, I still really don't care whether it's a 1280x800 or a 1920x1080 display.


Technical progress depends on getting people to be unhappy with what they have because there's something new that is genuinely better.


Exactly. The problem is that nothing is genuinely better. It's just turd polish.

The only thing that has made a difference to me in the last decade is an SSD. On a daily basis, I can't tell the difference between my 2007 laptop and my 2013 high end workstation.


If what is referred to as progress leads to things like flooding the market with glossy, reflective 1920x1080 displays for PCs, then I have issues with it.

There's been more than just the SSD that has been huge as far as progress goes. Multi-core processing pretty much ended all the hangups and fears of programs endlessly cycling on your CPU and hindering your chances of killing them easily. Now we're to the point where you can have an app hang and not even notice it right away as it pegs one of four or more cores.

Not sure exactly when mouse wheels became the norm (the early part of the 2000s, but not sure the year), but I'd hate having to click to scroll everything all over again. Try playing some of the older PC games from before the era of the scroll wheel; the UI looks pretty dated on them when you realize you can't scroll. Semi-related: the addition of more gestures to laptop touchpads as well.

Wifi was still pretty new for the average person 10 years ago as well. I mean we all had wifi routers I would guess, but the average consumer wouldn't be thinking about it for a few more years.

Prices dropping dramatically on IPS-type displays is also huge, to me at least. If only non-glossy, high-resolution monitors above 1920x1080 would become more predominant.


I stepped back from an HP Veer to an older Samsung smartphone. The difference in DPI is enormous, and only after using a different font and a nice theme to hide the effects of the low DPI as much as possible was I able to stand the Android device.

Which means I'm sold on 4k displays (and higher), if the effect is even close to the one I experienced: once you get used to it, you won't want to go back. I imagine it's like the effect of old curved CRTs, which became unbearable after getting used to a (good) TFT or even one of the later flat CRTs.

The resolution increases we experienced earlier, or at least I did, always came with a bigger screen: a 17" at 1024x768, a 22" at 1280x1024, and now ~24" at 1920x1080. So most(?) of us never experienced a real DPI increase on the desktop, only on smartphones or tablets.
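
The numbers back this up. A rough sketch (using the diagonal sizes and resolutions above; the arithmetic is mine, not from the thread):

    # Rough PPI for the screen generations mentioned above.
    import math

    screens = [("17in 1024x768", 1024, 768, 17.0),
               ("22in 1280x1024", 1280, 1024, 22.0),
               ("24in 1920x1080", 1920, 1080, 24.0)]

    for name, w, h, diag in screens:
        ppi = math.hypot(w, h) / diag   # pixels along the diagonal / diagonal inches
        print(f"{name}: {ppi:.0f} PPI")

    # Prints roughly 75, 75, and 92 PPI -- barely any density gain
    # across three desktop generations.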

I really hope it will be the next step after the SSD. Higher DPI-displays are genuinely better.


Precisely. Those who are ensconced and comfortable with the inertia of "good enough" may feel that way simply because they don't know how much they would enjoy a high-definition, full-view desktop display; nothing like that exists.

I've said elsewhere that without the iPad 3, we'd still have 1024x768 on tablets and there would be those of us shouting for higher resolution tablets, only to be frustrated by counter-arguments of "1024x768 is fine!"

If you disagree with me--if you believe that a high-density, large-profile desktop display is just plain silly--I hope that one day, if and when such a device does become available and you sit down in front of one and mutter "wow!", you will remember this conversation. :)


The thing is, portable computing is now cheaper and more ubiquitous than desktops. There was a time when laptops were more expensive than desktops, but now that they are similar in price or cheaper, consumers are picking portable devices.


That is the problem with exponentials: in real life they are really s-curves.

If you are upset with the state of your desktop computer, you need look no further than the accountant next door. You see, "we" (and I use that term to encompass people who are interested in computers and changing the computer to create new things) are a small market, and "they" (the set of people who use computers as a tool to get their job done) are a huge market. And they don't care about 4K screens. In fact 1920 x 1200 was more than they needed, so they would buy the cheaper "TV" glass (aka 1920 x 1080), save a few bucks (or a lot of bucks), and be just as happy. Meanwhile I pay $400 - $450 for a display because I want every pixel I can get.

It took a guy who had a penchant for making "insane" requests to demand that his company put a high pixel density display on a phone. While at the time it was "crazy talk," the reality is that it really is nicer looking at a high-ppi screen than at a low-ppi screen. And in markets where that translates into sales, they have come to dominate the "high margin" bracket.

We don't have a "high margin" bracket. We don't have anyone making "workstations" any more (not in the sense that Sun used to make them), we hardly have anyone making general purpose non-server type computers any more (all you need is a thing to host a browser to display what the real computer wants to show you). The Pixel is perhaps the most interesting example of this at the moment.

Once the network becomes sufficiently fast and reliable, the low marginal cost of delivering a new program on a server trumps the high cost of upgrading the thing you sit at. And the network has gotten a whole lot better with gigabit links being the 'norm' these days.

I've watched computers go from things that filled rooms and talked to dumb terminals over a serial line, to amazing local computation engines with multiple displays, back to computers that fill rooms and talk to a browser over the network. Why do you even want a desktop "computer" any more? Don't you want your computer to be able to talk to all of your peripherals: keyboard, mouse, display, phone, and television? That puts your computer in a box attached to a fast network that your peripherals are also attached to. All you need where you sit is enough juice to light up the peripherals. And we've pretty solidly reached that point.


> Why do you even want a desktop "computer" any more? Don't you want your computer to be able to talk to all of your peripherals: keyboard, mouse, display, phone, and television? That puts your computer in a box attached to a fast network that your peripherals are also attached to. All you need where you sit is enough juice to light up the peripherals. And we've pretty solidly reached that point.

YES! You have hit a nerve. I very much want my singular computer to be a box--somewhere, anywhere--permanently attached to a high-speed network, and to connect to views of applications running on that computer from every single device I own [1].

However, for the time being, my desktop computer is the closest thing I have to a central/master/singular computer. It is the workhorse in my device arsenal. The fact that I want every device I own to be a terminal to view applications running on my singular compute device is a source of continuous frustration. But it is what it is. In today's world, each device wants to be its own entity, much to my distaste.

But yes, if I had my way, even my desktop work environment would simply be a very large, high-definition display with some input devices--a terminal viewing applications running on my compute device. Of course I don't actually need the compute device to be above, below, or beside my desk. It could be in my garage or at a data center.

All of this is tangential to the fact that I want my workspace, my desk, to feature a very large, very clear, immersive display. In the short-term, if manufacturers are feeling the pinch of dipping sales, they must make desktops a focal point. I'd argue that the model I describe in [1] would also work to their advantage; I feel the first vendor to pull off that model--to unify the compute experience for consumers across all of their devices, yielding singular applications with multiple views, continuous/seamless state, no synchronization, self-control, federated backup--will enjoy quite the windfall.

[1] http://tiamat.tsotech.com/pao


I always thought curved or arced monitors (http://www.desktopreview.com/default.asp?newsID=868) would be a really good way of innovating desktop displays, but the price tag is extremely steep right now.


I suspect these days you could achieve something similar using a set of projectors without the heavy weight, large size, and unpleasant power requirements. This kind of technology does seem to be available, but for specialised/high-end applications rather than effective desktop use.

That's really too bad, because if someone did make a good quality display with a physically large viewing area, high ppi, and a gentle curve so it wasn't distorted in the corners, I'd buy it at almost any price. After a bit of experimentation, I estimate that my productivity increases about linearly with the number of large, high-res monitors on my desk up to as many as I've ever been able to try. Unfortunately, the amount of neck pain/stiffness I get also seems to go up rather sharply after a few days if I'm using anything other than one central, ergonomically-positioned display.


The lack of innovation is because of a lack of competition; the mobile space is very competitive, innovative, and healthy. For example, the latest high-end smartphones sport a 1920x1080 display, which is the maximum resolution my monitor currently runs at. In my eyes the technical leaps of mobile are ridiculous.

I'm wondering when some of the mobile innovations will result in some cheaper, high-quality laptops, because as of this moment $600 gets me a brand new, top-of-the-line phone or tablet, while the same $600 in the laptop market gets me an entry-level laptop.

What annoys me the most is that Microsoft has been trying to push touchscreens for over a decade at this point. At first the problem was that they were pushing a keyboard & mouse interface onto touchscreens, and now they're pushing a touchscreen interface onto keyboard & mouse with their Metro UI. Nothing has really changed; it's just flipped now.


I'm a big fan of desktop computers, but I honestly don't care about larger monitors. When I hear about immersion, I think about something like the Oculus Rift and Leap Motion, not a gigantic LED display, which I might not even have space for. 22" is good enough for me.

To me, desktop is mostly about: 1) Quality controls. A good keyboard and mouse are much, much more convenient than a bad keyboard and a touchpad or touchscreen, provided that the UI isn't designed by idiots. (IMO, the capabilities of mouse and keyboard UIs are still severely underutilized and slowly choked down by substandard UI libraries. There is so much more that can be done here.) 2) Dedicated workspace/gamespace. With a good chair, speakers, proper lighting, a proper desk. That's the kind of thing that contributes to "immersion" for me.


It sounds like you and I mostly agree that desktops should leverage immersion. You make a very good point that a quality workspace is a key part of the puzzle.

I personally would prefer a large, high-density monitor. I like that peers and family can still interact with the same output that I'm seeing. But I also find a wearable display to be interesting.

As I hinted at earlier, Leap Motion would go extremely well with a large, high-density monitor. I'd love that level of immersion. It's getting closer to Minority Report / Avatar. I especially find the Avatar-like UI model very appealing. For us to get there, we'd also need an application/UI model that I call PAO (which is a separate rant).

Finally, I totally agree that the UI libraries need an injection of new thinking.


>Desktops should be evolving toward immersion.

Disagree on two points:

1. The innate immersive quality of desktops nicely illustrates why desktops are out of fashion.

The trend is for people to be less immersed in a single point of focus: while people are watching tv, they are also checking social media on their tablet/smartphone.

Sure, desktops are immersive and probably they should play to that strength, but - on the other hand - this is why people are leaving them on their desks.

2. I personally have absolutely zero interest in a high-density monitor: my eyes can't distinguish HD resolution anyway. I would value a faster frame rate/shorter display lag far more. Paying a significant premium for screens which are no-better-and-maybe-worse on these fronts doesn't interest me at all.


1. I believe a substantial part of the problem is that the desktop manufacturers have lost all will to even attempt to market innovative desktop technology. If immersion is out of fashion, that's a matter of fashion having been steered by better marketing.

I have three 30" displays on my desktop. When I want to create and consume at home, I strongly prefer my desktop computer to my tablets and my cell phone. I can relax in my chair with a full-screen video on one monitor, my social media on another, and something I am composing (words, code, whatever) in the center.

I would pay substantial money to replace all three of these monitors with one that spanned the same width (~6 feet), was 50% taller (~3 feet), with slightly higher pixel density, and was of course seamless. As someone else said, ideally it would be slightly concave so that from my vantage point, it doesn't appear convex.
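
To put a rough number on that wish (my own back-of-the-envelope figures, using the ~6 ft x ~3 ft dimensions above and a modest bump over today's ~100 PPI 30" panels):

    # Rough pixel budget for the hypothetical single display described above.
    width_in, height_in = 72.0, 36.0   # ~6 feet wide, ~3 feet tall
    ppi = 120.0                        # "slightly higher pixel density" than ~100 PPI
    print(f"{int(width_in * ppi)} x {int(height_in * ppi)} pixels")
    # ~8640 x 4320 -- roughly the "~8k or better" class panel mentioned up-thread.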

The manufacturers are as much to blame for our culture's embrace of "good enough" on the desktop as consumers' passivity. I personally am anything but passive about my desktop demands; but you're correct, many are. Still, I feel those passive consumers could be stirred to an active, interested, or perhaps even high-demand state if desktop technology moved forward.

If Microsoft, HP, Dell, Lenovo, the whole gamut, are concerned about desktop sales, they should wake up and give us a reason to buy a new desktop PC.

2. That's a matter of preference, or tolerance of today's mediocrity. As I've ranted elsewhere, many/most people are also comfortable with MPEG artifacts and lossy compression. I long for the day when bandwidth and capacity allow us to consign lossy compression to the dustbin.

My eyesight isn't what it used to be now that I'm much older. But I still can clearly see the shocking difference in clarity when I hold an OLED high-definition phone flat to my 30" LCD monitor. It's night and day. The phone's display makes my 30" LCDs look like ancient history. Sad thing is: they're quite new.

Try it. Sit 3 feet from your desktop monitor, as I am, and hold your phone up flush with your monitor. Which looks better?
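
For a rough sense of what that comparison shows, assuming a 30" 2560x1600 desktop panel and a ~4.7" 1280x768 phone (roughly a Nexus 4 / Lumia 925 class screen; the numbers are my own estimate):

    # Quick PPI comparison: 30" 2560x1600 desktop panel vs. a ~4.7" 1280x768 phone.
    import math

    def ppi(width_px, height_px, diagonal_in):
        return math.hypot(width_px, height_px) / diagonal_in

    print(f"30in desktop: {ppi(2560, 1600, 30.0):.0f} PPI")  # ~101 PPI
    print(f"4.7in phone:  {ppi(1280, 768, 4.7):.0f} PPI")    # ~318 PPI

That's roughly a 3x density gap, which is why the phone held flush against the monitor looks so much sharper.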

Imagine if your desktop monitor looked that good. But also filled your field of view. Maybe that's not for you, but it's for me, and I would pay dearly for it.


I appreciated your suggested experiment, so I tried it out.

My results were perhaps not exactly what you anticipated:

- My phone has 114 ppi (and wonderful battery life)

- My primary computer screen has 128 ppi (and wonderful battery life - sorry, I don't own a desktop)

As I said, I don't personally value high-density screens and this apparently affects my purchasing decisions quite strongly.

I can easily accept that we have different technological desires and I am interested in your point of view :)

I suspect you are right about the direction the desktop market should go, but it won't tempt me back to desktop (and in general, the desktop market will surely continue to shrink). I already have most everything I need from a desktop, plus portability.

Probably, we are both in minority niches - at either end of a spectrum.


Oops! I assumed you had a high-density phone to compare versus a desktop monitor. At some point, I still recommend doing the experiment. Grab someone's iPhone 5, Nexus 4, Lumia 925, whatever. And hold it up flush with a typical "HD" desktop monitor.

My phone's screen is beautiful. I want to pull at the edges of my phone's screen and stretch it to fill my entire desk.

No doubt we are both in minority niches.

I've long ago acknowledged that I value the quality and size of displays more than most people I know. I spent the bulk of my income from my first high school job on a monitor.

My tastes haven't changed, but technology's pace has stalled out. Still, I make do with 30" LCDs. And anyone who sits at my workstation enjoys it once they use it. It should come as no surprise that people like things that give them more and better: better clarity, more color depth, better contrast ratio, more space to view and display information, more immersion, etc. But it's also not surprising that people don't necessarily know they would want something better and they certainly don't immediately warm to the idea of spending additional money to have better. In other words, they feel satisfied with what they have. It's good enough. As a technophile, that "good enough" complacency drives me bonkers. :)

But that's sort of the routine with popular adoption of technology. Some of us don't know we want something until it's out there and our friends and colleagues have it. I proactively want affordable large high-density displays; and I suspect a large (enough) body of consumers would reactively want the same were they to exist.


I want to agree with you, but in the past hasn't a combination of high-end upgraders and improved manufacturing driven progress? Will enough people pay for a "4K" monitor to justify building whatever has to be built to make it practical?

I am seeing this: "These companies should feel ashamed that it took ~8 years to progress past the 30" 2560x1600 form factor to a $3,500 ASUS "4k" monitor. In 2013, a '4k' monitor on the desktop should be entry-level technology"

and wonder about the extent to which companies push consumers versus consumers pushing companies.


A good point! I have strong opinions on the matter [1] [2]. Yes, consumers are partially to blame. As evidenced in this comment thread, some consumers have reached a level of "this is good enough, I am satisfied." Presented with an alternative that does not exist today, at a price point they could afford, even those who consider today's technology to be "good enough" would be stirred to reconsider.

I should clarify the point I am making when I say that Microsoft and others should be investing in higher-quality displays. They should be working to drive down the manufacturing costs, allowing such devices to be sold to enthusiasts, then pro-sumers, and eventually to regular consumers. For Microsoft to benefit from new technology, the technology needs to be within reach of a sufficiently large market.

In [1] below I point out that when the 30" LCDs debuted in ~2007 (if my memory is correct), they sold for $1,100. I bought my first 30" LCD in 2008. For $1,100. That monitor malfunctioned in 2011. I looked to replace it and ultimately did. For $1,150.

Meanwhile, every other desktop monitor size saw falling prices. Was the steady price for 30" LCDs evidence of price fixing [3]? I'm not sure. But I doubt the cost to manufacture a 30" monitor remained steady at, say, $600 for several years while every other monitor size became just a sliver above disposable. Finally, starting around 2012, the Korean manufacturers started shaking things up. Bravo to them!

[1] http://tiamat.tsotech.com/pretend-its-a-tablet

[2] http://tiamat.tsotech.com/hd-sucks

[3] http://news.cnet.com/8301-13578_3-57348830-38/lcd-makers-on-...


Microsoft and Intel have been sucking all the profits out of the PC market for the last 10+ years. The PC makers are operating on razor-thin margins and have little budget to innovate. When they do innovate, they have no ability to update the software as required, depending on Microsoft. Microsoft seems content to ride out the Windows monopoly on the desktop. Apple seems to be doing slightly more, but their focus is no longer the desktop either. I don't see any killer apps coming that are going to push the desktop to the next level.


Is there any real benefit to surfing the web at 8K resolutions? What apps would benefit from 8K? The PC market is weak because the software is weak. Everything moved to the web.


We've benefited from everything moving to the web - we have actual choice in operating systems now, and it should only get better with more competition. Let's be honest, the PC market was not what I'd call innovative and healthy for the last decade.


Looks like somewhere between 2k and 4k is the "retina" threshold for a 22" monitor two feet away.
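
A quick back-of-the-envelope check of that claim, assuming the common ~1 arcminute-per-pixel "retina" rule of thumb and a 16:9 22" panel (my own numbers):

    # "Retina" threshold for a 22" 16:9 panel viewed from 2 feet,
    # using the ~1 arcminute-per-pixel rule of thumb.
    import math

    def ppi(width_px, height_px, diagonal_in):
        return math.hypot(width_px, height_px) / diagonal_in

    distance_in = 24.0                                   # 2 feet
    one_arcmin = math.radians(1.0 / 60.0)
    retina_ppi = 1.0 / (distance_in * math.tan(one_arcmin))

    print(f"retina threshold at 24in: {retina_ppi:.0f} PPI")         # ~143 PPI
    print(f"22in at 1920x1080:        {ppi(1920, 1080, 22.0):.0f}")  # ~100 PPI
    print(f"22in at 3840x2160 (4k):   {ppi(3840, 2160, 22.0):.0f}")  # ~200 PPI

So the threshold does land between today's 1080p panels and a 4k panel at that size and viewing distance.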

I don't really want a monitor bigger than 22" - although multiple monitors are definitely great.


I'm not sure I understand. If you acknowledge that multiple monitors is great, why would you limit each monitor to 22"? Do you actually like having the monitors' plastic bezels in your field of view?

I suspect that if you were to actually use a single monitor that was the size of two 22" monitors side-by-side, you'd prefer it. The question then is how much more would you pay for that? I'd like to see the desktop manufacturers we're speaking of in this thread step up with R&D to make larger, better monitors more affordable.

I agree that multi-monitor configurations are great. I just want them to be unnecessary. I previously wrote this elsewhere:

Organizations and individuals alike have historically compensated for this disappointing reality by using multiple monitors, side-by-side. This should have been a short-lived condition--a signal to the manufacturers that larger screens are desired, in order to view and interact with more information at once. But display manufacturers seemingly ignored that signal, and PC manufacturers also looked the other way.


Nope. I don't have the two monitors side by side - they're at an angle to each other, so that as I turn my head it's more likely that I'll be looking at the second one straight on, rather than at an angle. I tried having them make up (effectively) one large monitor and it was a worse experience for me.


Oh, in that case you're agreeing with the other commenter who said the single large monitor should be curved, and I agree with that.


You could argue that the desktop PC has evolved - it is now the tablet. Perhaps it is more correct to say that a branching has occurred.


Yes, it's more like a branch.

I have both a powerful desktop and a powerful tablet.

From my point of view, they are quite distinct and every single time I sit down to consume or create, I will choose my desktop if I am at home. If I am on the road, I will select my tablet. The reason is simple: my desktop has massive displays.

If my desktop had a regular "HD" style monitor, the difference between the two would come down to less dramatic matters such as "my desktop renders twitter.com a little faster" or "I like typing on my desktop more." Taking away large monitors from the desktop leaves a lot more room to speak of the tablet's unique upsides.


I don't care. Nobody in my family cares. None of my friends care. Even the gamers I know don't really care about more pixels. We, all of us, use computers all of the time, on our phones, our PCs, watches, glasses, tablets, readouts everywhere we go . . . computing is damned near ubiquitous. We're a hop, skip, and a jump from Blade Runner already.

Further, I don't understand how the lack of higher pixel density or bigger displays amounts to a "lack of innovation" in the PC market. That's a wonderful non sequitur that leads to argument (as seen in the reactions to your comment), but it's still a non sequitur.

Microsoft used to be driven by the goal of a computer in every home and on every desk in every office. That was crazy at the time. That was also an amazing clarion call that all of us 'softies (I was one back in the day) could get behind. And Microsoft, evil as it can sometimes be, was willing to drag everyone else along into that glorious future.

Mission accomplished. (Applause is appreciated here but not expected.)

Now they're flailing a bit. Both Microsoft and the PC makers. And it has nothing to do with graphics. Not really. It has to do with not finding a new niche to fill to make the way humans experience the world better. More pixels is not better for most of us. And just to restate: let's make the world better for humans.

Once upon a time spreadsheets were a killer app. Everyone who wants to use a spreadsheet probably has a machine that can do that now. Then there was the Internet . . . again, probably everyone's happy enough now. What's the next killer use for a PC? That remains to be seen. That's the point of the article. And if there were such an easy answer as "we need better graphics" I'm reasonably sure that tech journalists, naive as you might think they all are, would all rejoice because they'd all already be busy explaining that to all of us in the kind of detail that corporate sponsors' ads would happily show us. That makes them money, sells copy, and gives them a warm feeling as they drift off to sleep at night. Wonderful!

If you really want to kvetch about how someone's missing the boat, please point your finger at everyone here (including the two of us) and ask us all what matters to most people and what we're doing about it. I don't mean "cult of the new" crap that the echo chamber talks about most of the time on HN. I mean "something that substantially changes the lives of a large body of non-techy people". Consider health, travel, money management, alerting, data backup, data protection, et al. I see very geek-specific examples here on HN but I've yet to see many examples that are universally useful for regular people. I have seen none that I'd recommend to my parents without expecting them to need hand-holding.

In the mid-90s there were lots of people buying PCs running Windows to play games and run spreadsheets and be online and write papers and find porn and . . . you name it. There were needs. There aren't now. They're satisfied.

And that is why the PC and Microsoft are stalled. And that is why most startups (or non-startups) don't matter. And that is what you should think about as you fire up your next plan to be king of the world. People, not pixels. What do they need?


If you’re a technical person who uses computers for your work, a weak PC market is a blessing.

There is absolutely no reason the general public should be using the traditional desktop computer. They should be using limited use devices tailored to the things that people want to do the most – prattle, get directions, browse the web, and buy stuff. They can use their smart phone or tablet for these. Such devices fit the technical commitment level of the general public, not demanding much in the way of learning or expertise.

Computers should be for computer people who are willing to make the investment required to use them properly and effectively. The general public has consistently shown they do not have this capability, and they only serve as a drag on those who are technical.

As we move forward and the majority of the population is on phones and tablets those who have made the investment to use computers well will have a decided advantage in our increasingly technological world.


Interesting point of view. I wonder if the people growing up on smart-phones and tablets today will be unable to use productivity devices in the future, or whether tablets will become the new productivity device. My guess is that at some point in school, most students will need word processing and various other productivity tools, and will become acquainted with them that way.


I had a similar thought recently. People who are currently in their 20s and 30s had a unique privilege of growing up when fully-featured computers appeared in almost every household. A bored 13 year old kid getting a Pentium-166 desktop for Christmas is much more likely to start tinkering with the system, learn to program, hack, etc. than if his or her first computer was a walled garden iPad. Now with the huge success of smartphones and tablets, it may well happen that most households of 2025 won't even have a computer suitable for anything but media consumption and "apps".


My Dell Mini 9 still has some killer features that I can't seem to find on the market:

1. < $300

2. No fan

3. SSD

4. GPU

5. x86

Because of the need to support Windows, a lot of netbooks are missing the SSD and add a fan. A lot of the Chromebooks are Atom. The one Samsung model maybe fits this bill, but I think it is on its way out.

I had hope for Haswell, but it turns out it may break #1.

Anyway, I guess I need to start thinking about Atom and maybe what kinds of OpenGL ES things I can do, and forget about OpenCL or some such, which is probably appropriate anyway.

The point is that the machine is awesome because it is small, quiet, and an amazing terminal into the various clusters I utilize.

Thanks Microsoft, for killing them off.


It's arguably point #1 that's killing the PC market. $300 isn't much money to work with, and there's a huge difference between a $500 computer and a $300 one. At the lower price-point, you're getting bargain-bin everything, every corner cut.


It's called market saturation. Most people don't need hardware upgrades for facebook and shitty flash games.


If developers were pushing the current CPU and GPU capabilities on PCs there would be more demand for upgrades, but instead most games target the current console standards and offer little else extra if they find more power available.

I suspect there will be a blip over the next 18-to-24 months as the next generation of consoles becomes common and developers start targeting them rather than the last generation, so what is now "bog standard" on the desktop will not be good enough; then it'll settle again once the people who care about high-spec games on the PC have upgraded.


The most common uses for PCs are Microsoft Office, web browsing, and email. None of those should be 'pushing the current CPU and GPU capabilities.' Most people aren't gamers or enthusiasts; those groups don't drive the market anymore.

Let's also keep in mind that most publishers would like to sell games to more than the 4 (size artificially small for emphasis) people who buy the latest graphics card every 6 months. That means targeting hardware on a 1-2 year lag at least. Hardcore gamers, while a market that spends money, are a very small group. Targeting the most powerful hardware doesn't make sense.


> If developers were pushing the current CPU and GPU capabilities on PCs

And why should developers do so? If Drew Crawford's blog post "Mobile web apps are slow" and its more famous follow-up taught me anything, it's that we developers are already ridiculously spoiled with our current desktops and laptops. On laptops, we satisfy our infinite appetite for performance by "plugging it in, strapping a 2-lb battery to it, and throwing in a few fans" (to quote a comment on the first blog post mentioned above); for a desktop, we plug it in, use a large 20+ pound case, and throw in more fans. I can't help but wonder if these computers will someday be near-universally regarded as monstrosities akin to gas-guzzling SUVs. I think it's time more of us developers (myself included) learned to work within real constraints again, by using devices that are designed to be fanless and battery-powered, e.g. modern phones and tablets, for more of our day-to-day activities and software testing.


I'm with you. What we're witnessing here is the confluence of two different trends: first, the push to mobile, as everyone has been pointing out for the last few years (yawn), and second, the plateau in HW requirements. We've reached a point where our PCs are powerful enough to do everything we need them to, so there's no need to upgrade them (yes, we still need affordable 4k displays). This has happened for two reasons:

1) What were once desktop apps have become online services so as to both reduce shipping time and reach a wider audience (browser improvements obviously are the main force behind this even being possible)

2) Apps that were previously the preserve of powerful desktop computers are now available on mobile phones (who needs that photo-editing software when I can apply a filter to the pic I've just snapped with my phone?)


The big question though is whether sales are falling because people are switching to tablets/smartphones entirely, or because they are buying tablets for a subset of their use cases. I'm a big PC fan and I would never replace my laptop with a tablet as my primary computing device. However, my next purchase will be a tablet, because I don't have one at the moment and it's great for surfing the web from the sofa. The marginal return of purchasing a tablet is therefore higher. That doesn't mean my PC is going anywhere soon.


Another possibility is that older PCs are still good enough. My 4 year old Core 2 Duo MacBook with an SSD has felt speedy enough for all my day-to-day tasks (including driving a 24" display) that I just haven't bothered upgrading.

How well do Windows 7 and 8 perform on older hardware?


A computer that came with XP (let's say prior to 2006? 2005?) will probably still run faster with XP (though XP is literally 12 years old now..). However, from what I understand a computer that came with Vista will run faster with Windows 7, and faster still with Windows 8.

Vista upped system requirements from XP, but since then, MS has lowered system requirements and OS footprint on each release.

Remember though that there are PCs that were genuinely quite good (back then) which came with XP and then could also easily run Vista. This is sort of what I was trying to get at with the 2005/2006 limit. Meaning that some PCs that came with XP might still be able to run faster with Win7/8. Just don't count on it if you got it in 2001 :)


As tablet hardware and software become more capable, I would expect to see PC hardware sales drop. There may never be a perfect convergence where, for example, it's possible to develop tablet software on a tablet. But as the use cases increase, particularly for content creation, fewer people need the capability of a full PC.


I suggest keeping an eye on the Atom-based Windows 8 tablets. These tablets run full Windows 8, with the ability to run Win32 apps, not just Windows RT. I haven't tried to run Visual Studio on such a tablet yet, but it seems feasible.


MS should just calve off its different divisions. Seems like they are getting dragged down by forays into markets they aren't good in, e.g. tablets.


That's not necessary. Microsoft's net income this quarter was $4.97 billion. They're doing great. They failed to meet Wall Street's expectations this quarter, but that doesn't mean they're getting dragged down.

Additionally, the first few revisions of any new Microsoft product tend to do poorly, but Microsoft has huge stockpiles of cash that they can throw at the problem until they break into a market, dominate it, and start producing positive revenue.


Firing Ballmer would be the first step.


I've long thought that it may have actually been better for Microsoft in the long run if the antitrust trial had resulted in breaking them up. Who knows though.

The constant forays into new markets aren't new - they've been doing that since the mid-90s at least. At that time they had essentially "won" the PC market and achieved 90%+ market share with Windows and Office. Those products grew and grew until they had nowhere to go. You'd think that would be great for them (and it has been), but stock prices for the tech industry are driven by growth, not profits. If you aren't constantly growing and expanding, your stock price will flatline or start declining. So, for the last 15-20 years Microsoft has been taking the profits from their cash cows and throwing them into one attempt at expansion after another - MSN, Web TV, Zune, Xbox (which was ultimately successful, but not especially profitable)...on and on.

Hardly any of their expansion attempts worked out for various reasons. Often their products were clones of some other company's already successful product, and coming out with something just as good (or even a little better) a year or two later is just not good enough to unseat a firmly established competitor. They also push the Microsoft and Windows branding hard, on everything they make, but those are just not brands that most consumers have positive associations with.

That's the fate of many tech companies, and it's kind of depressing: they have huge success with some core products, blow up over a short period of time, and effectively achieve monopolies with those products. The core products become cash cows, reliably raking in tons of money quarter after quarter, but they have nowhere left to go. So the company dedicates itself to throwing that cash cow money at one (usually) failed attempt at expansion to new markets after another, but hardly anything takes hold and they are punished by the stock market and their shareholders, despite the fact that they are still insanely profitable. In the worst case, some sea change in the industry comes along after a decade or two, and all of a sudden their dependable cash cow starts drying up. It happened with IBM, it's been happening with Microsoft, and it'll probably happen to many of the top tech companies right now. It sucks and it's stupid, but that's the nature of the tech industry; being publicly traded is a double-edged sword because if you're not constantly growing and expanding, your shareholders think you are failing. I wish more companies could settle into a state of doing one thing really well and not have constant pressure to expand, because that's ultimately what kills them.


The desktop/PC market has lost all innovation. I'm not sure why. Things that we take for granted in other form factors--touchscreens, built-in battery backups, seamless Bluetooth integration, voice commands--should be standard by now in desktops. Once desktops assimilate what's great from other form factors, they can leverage the space on the desk to do things the others can't: 3-D, very-large screen formats, wall displays, and gesture recognition.

Instead, somehow the desktop crowd started chasing gamers, creating faster and faster video cards and overclocked processors on the high-end. That was a nice crutch for a few years, but it's not a growth strategy. For the desktop/household PC to grow, it needs to develop into something that it's currently not: an immersive computing experience that's part of your household. That might even include starting to team off with builders to make the PC part of the normal decorative process of designing houses. There's no reason some kind of swappable PC with a wall display couldn't be part of household room designs.

The form factor is dead because the industry has lost the ability to execute on a vision. Instead they're just trying to see how long they can milk the cash cow. Looks like we're now beginning to see an answer to that question.



