> My personal gripe was how poorly Microsoft thought about and handled security issues. My Linux computer is as far as I know virus free.
He is comparing Windows 95 to Ubuntu 11.04 (if you follow the link in that sentence)!
> They [devs] did not bother developing for other platforms because those platforms were economically irrelevant and the Microsoft developer tools worked.
Then he doesn't even make the connection to virus writers targeting Windows throughout the late 1990s and 2000s.
> Windows security issues are everywhere and it did not need to be so.
Sorry, but that's mostly due to the desktop market size and Windows' share of it.
Everything after Windows XP had security at its core.
Blame the users who are clueless: emailing viruses to all their contacts, downloading Trojans and warez with backdoors, etc.
And again, he is comparing decades old MS OSs to latest versions of Linux and OS X.
> Nowadays nobody under thirty writes anything on Microsoft developer tools unless they are demented or brain-dead.
> > Windows security issues are everywhere and it did not need to be so.
> Sorry, but that's mostly due to the desktop market size and Windows' share of it.
No, it's not. It's mostly due to Microsoft ignoring security for years because it wasn't important to them. They didn't have to have everyone running as root by default in all versions of Windows before Vista (AFAIK in XP Home you can't actually set up restricted users). They didn't have to have lots of open ports offering things like RPC to the world. They didn't have to have all files executable by default, based solely off the hidden part of the filename in AnnaKournikova.jpg.exe.
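As an aside, the AnnaKournikova trick is easy to sketch. Explorer's "hide extensions for known file types" default stripped the real final extension from the displayed name, while the OS still used that extension to decide whether a double-click executes the file. A rough Python illustration of the logic (the helper names are hypothetical, not actual Windows APIs):

```python
import os

def displayed_name(filename, hide_known_extensions=True):
    """Mimic Explorer's old default: strip the final extension from the display name."""
    if not hide_known_extensions:
        return filename
    base, ext = os.path.splitext(filename)
    return base

def is_executable(filename):
    """Windows decides executability from the real final extension."""
    return os.path.splitext(filename)[1].lower() in {".exe", ".com", ".scr", ".bat"}

name = "AnnaKournikova.jpg.exe"
print(displayed_name(name))   # AnnaKournikova.jpg -- looks like a harmless picture
print(is_executable(name))    # True -- but double-clicking runs it
```

The mismatch between what the user sees and what the OS executes is the whole attack.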
There are now supposed to be 300M Android devices worldwide, which is within an order of magnitude of Windows' numbers 10 years ago, and you don't see Android phones being compromised remotely within fifteen minutes of being connected to a network. There's no equivalent of Blaster or Sasser or anything close to that level.
It's partly due to Windows' market share that it got targeted so heavily, but those opportunities wouldn't have been there if they hadn't ignored security for so long.
I think you've missed the point on multiple fronts...
> and you don't see Android phones being compromised remotely within fifteen minutes of being connected to a network.
Again, why do you (and others) keep comparing today's Linux/Android/OS X with a 10-15-year-old Windows OS?
Security has been at Windows' core since XP, and by all knowledgeable accounts it is just as good as Linux's... as long as you know how to use it and deal with it. Today 95% of the problem is clueless Windows admins and bad user decisions.
As far as my own experience goes, I've run Windows 3.1, 95, 98, 2000, XP, Vista, and all the rest without ever having been compromised. So it is possible, at least.
What you're doing is the same when people complain about IE 6 vs. the latest version of Chrome...
IE6 came out in 2001, and at that time it was the most standards-compliant and feature-rich of all the browsers on the market (well, except for IE 5.5 for Mac OS).
> They didn't have to have everyone running as root by default in all versions of Windows before Vista (AFAIK in XP Home you can't actually set up restricted users). They didn't have to have lots of open ports offering things like RPC to the world. They didn't have to have all files executable by default, based solely off the hidden part of the filename in AnnaKournikova.jpg.exe.
Of course they had to do all that. Windows users back then were generally not very savvy, and anything that got in their way was a disaster waiting to happen. Also, it was a different time. Even today most Windows home users don't understand the file system with its drives, devices, directories, subdirectories, and files. And you wanted them to understand user security and how it interacts with the applications they ran? No.
> but those opportunities wouldn't have been there if they hadn't ignored security for so long.
I guess they should have gotten a time machine to the future to pull all that work and knowledge back to the past. Windows XP should have been based off Windows 7.
My point is that what is possible today was not possible 10, 15, or 20 years ago, from both a tech and a user point of view... Just because someone can do OS security well today doesn't mean you can blame someone else for not doing it well decades ago.
As of ~2008, a very competent technical friend and co-worker had fired up a virgin WinXP instance, on the corporate intranet, to access some HR website which was strictly MSIE only.
Within the 15 minutes that instance was live, it had been compromised.
Anecdata, from a mythical extraterrestrial at that. But I'll stand by that and his experience.
> Again, why do you (and others) keep comparing today's Linux/Android/OS X OS with a 10-15 year old Windows OS.
You argued that Windows was targeted solely because of its high market share. I'm drawing a comparison to another platform with high market share; there simply wasn't anything comparable ten years ago. And it is not obvious to me that it's not a valid comparison; Microsoft were a huge company who had been developing Windows for fifteen years at that point. Android is a lot younger, so you could just as well expect it to be less mature and therefore less secure.
And yes, I know it is possible to run it without being compromised. You obviously knew what you were doing; millions, even tens of millions of others didn't know and wound up with their computers zombified into botnets. That wasn't all because of their ignorance; there were times when a newly installed XP machine would be compromised less than fifteen minutes after being connected to the internet, which wasn't enough time to install the patches it needed. That can't be considered that user's fault, especially when they've just sat through half an hour of being told how they're installing The Most Secure Version Of Windows Yet!
> Of course they had to do all that. The Windows users back then were generally not very savvy...
Now you are missing my point. Microsoft didn't have to do anything. They could have built an operating system that was harder to use but more secure. I contend that it's even conceivable that they could have built an operating system that was roughly the same for ease of use, but still more secure; maybe they'd have been slower to market or had to compromise elsewhere. The point is that security was not a priority for them for years, they obviously just weren't that concerned. That may ultimately have been the right path for them, because they arguably didn't pay a high price really, but I don't personally consider it the technically best course.
> My point is that what is possible today was not possible 10, 15, or 20 years ago, from both a tech and a user point of view... Just because someone can do OS security well today doesn't mean you can blame someone else for not doing it well decades ago.
I think this is where we fundamentally disagree. I don't see why you think security is only something that can be achieved now and why it couldn't be ten or fifteen years ago. In the Unix world, people have known not to run as root for decades; Microsoft chose to ignore that for a long time and ultimately have been forced to shoehorn it back in for Vista. They could have done that in XP, if not long before; it certainly had the capability for it, they simply cut that out of XP Home and chose bad defaults for XP Pro.
> And it is not obvious to me that it's not a valid comparison; ... Android is a lot younger, so you could just as well expect it to be less mature and therefore less secure.
There is a reason why PC games are so much more advanced in their design, graphics, and game-play today than they were 10-20 years ago.
By your logic, there is little reason why Crysis 3 should not have been developed 15 years ago. If they can do it now, and the product/market fit for it is good now, why not 15 years ago!
That's just not how it works.
> In the Unix world...
Different world, different people, different needs/wants.
Microsoft didn't ignore anything; they have maintained a billion users for more than a decade. And profited more than most companies with an increase in sales every single year. They did something right, more than they did something wrong.
Ease-of-use was a priority for them over security until after XP because...
1. It would have impacted their users negatively (remember the backlash over the new security in Vista? People couldn't handle a pop-up, couldn't understand privileges, etc.).
2. Their previous OSes started from a single-user standpoint, and it's difficult to change that (while maintaining backwards compatibility, 3rd-party drivers and software, support, etc.).
You make things seem so easy. That's not how it works.
No, that is a totally different thing, and a fairly ridiculous comparison. You clearly can't have Crysis 3 15 years ago because the computers weren't powerful enough. Please don't apply a bad metaphor and tell me that's my reasoning, because it clearly wasn't.
Security does not require computing power, it needs careful code. By your logic, OpenBSD would have been an insecure mess 15 years ago, and nobody's web server would be getting hacked today. That is not how it works.
> Everything after Windows XP had security at its core.
> Blame the users who are clueless, that are emailing viruses to all their contacts, download Trojans and warez with backdoors, etc.
I also blame MS for having insecure configurations by default. It's possible to secure a NT system quite well but there were a lot of compromises made in the defaults for convenience.
> Sorry, but that's mostly due to the desktop market size and Windows' share of it.
And yet, Unix/Linux had and has a much larger portion of the Internet-connected server market share but without the constant stream of critical vulnerabilities that Windows servers endured in the early 2000s. By your logic, Windows should have been a safer server choice because Unix servers were constantly falling to new attacks.
> And yet, Unix/Linux had and has a much larger portion of the Internet-connected server market share but without the constant stream of critical vulnerabilities that Windows servers endured in the early 2000s.
A couple of things...
1. I really don't know what the Linux server market share was in 2000 in comparison to NT Server. Nor what the breakdown of kernel vs. user-space patches and vulnerabilities is. It also gets more complicated when you consider that the two were used in pretty much different situations and for different purposes.
2. That quote was in relation to the desktop/home-user market.
So I'm not going to go there as I don't want to compare apples to oranges, and on limited knowledge.
> By your logic, Windows should have been a safer server choice because Unix servers were constantly falling to new attacks.
How you're getting that from what I said makes no sense to me.
Just counting kernel security patches will give the wrong numbers. The "kernel" includes device drivers for every imaginable hardware component you can possibly run Linux with. In any real server, the security exposure is a fraction of that. If AMD processors require a patch, my Intel boxes will be safe. If there is an exploitable bug in my 3COM NIC, my Broadcom ones will be fine. In any running Linux machine only a tiny fraction of the kernel codebase is active and running.
It's really like adding all the vulnerabilities in the Windows kernel to the vulnerabilities of every device driver ever shipped in a box or made available on the web for every conceivable device you can buy.
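The distinction can be sketched with made-up advisory data: out of everything tagged against the kernel tree, only advisories touching components a given machine actually loads affect that machine. (The component names and CVE ids below are purely illustrative.)

```python
# Hypothetical advisory data: (affected component, advisory id) pairs.
advisories = [
    ("amd-cpu", "CVE-A"),
    ("3com-nic", "CVE-B"),
    ("broadcom-nic", "CVE-C"),
    ("core-vfs", "CVE-D"),
]

def relevant(advisories, loaded_components):
    """Only advisories against components this machine actually loads matter."""
    return [cve for component, cve in advisories if component in loaded_components]

# An Intel box with a Broadcom NIC is unaffected by AMD or 3COM driver bugs.
intel_box = {"intel-cpu", "broadcom-nic", "core-vfs"}
print(relevant(advisories, intel_box))  # ['CVE-C', 'CVE-D']
```

Raw patch counts treat all four advisories as equal exposure; any one machine only ever runs a subset.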
> Nowadays nobody under thirty writes anything on Microsoft developer tools unless they are demented or brain-dead.
I'm 29 now, switched to mostly doing C# two years ago, and I love it. Its mix of pragmatism, familiarity, and modern constructs is unparalleled, and no other truly modern language has IDE support this good.
The only thing I don't love about it is the vendor lock-in, but in reality this is a smaller problem for many applications than it seems.
Yeah, the author is apparently unaware that C# is one of the few truly multi-platform languages; it runs on Android, iPhone, OS X, Linux, Windows of course, and whatever other platforms the Mono people have since added.
I don't even get the complaint about vendor lock-in. What does using C# lock you into (especially compared to other languages)? Sure, it's possible to use C# in such a way that you're stuck with Microsoft tech, but that's not mandated.
The lock-in is mostly such that if you code in C# on Windows first, getting it to run on Mono later may be difficult, if you didn't pay close attention to what you were doing from the start. In this sense, it's rather different than e.g. Java.
But not too different from all that POSIX-specific Ruby and Python code out there, admittedly.
Yeah, but that statement applies to pretty much any language really, not just C#.
If you code your Java app to expect to find documents in /home/<user>/ or link your Java app with binary libraries that are platform-specific, then you're going to have the same trouble.
Cross-platform support is rarely (if ever) a matter of just compiling into a new binary.
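For what it's worth, the hard-coded-path problem above has the same fix in every language: ask the platform rather than assuming a Unix layout. A minimal sketch of the idea in Python (Java has `System.getProperty("user.home")` and .NET has `Environment.GetFolderPath` for the same purpose; the function names here are my own):

```python
from pathlib import Path

# Brittle: assumes a Unix filesystem layout and breaks on Windows.
def documents_dir_hardcoded(user):
    return "/home/" + user + "/Documents"

# Portable: ask the platform for the home directory and join paths natively.
def documents_dir_portable():
    return Path.home() / "Documents"

print(documents_dir_portable())  # e.g. /home/alice/Documents or C:\Users\alice\Documents
```

The lock-in isn't in the language; it's in assumptions like the first function's.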
> Mono is awesome but it's not perfect by any means.
If by "not perfect" you mean it doesn't implement the whole of .NET, then yes. For me, this is good enough: http://www.mono-project.com/Compatibility. I'm not looking to play catch-up with MS's implementation (good to see Mono lists WPF as "no plans to implement").
What I don't like about .NET in general is that it's still catching up on package and dependency management.
He was actually being positive about the vendor lock in, and saying that it is damaging to MS that people are using cross-platform stuff. If he were aware of Mono, that would probably make him more negative.
I'm 27 and have been programming in C# for 3 years using Visual Studio Express or Professional (started with Express and got a license for Pro 2 years ago).
You can use Microsoft developer tools to create platform-agnostic code. So I fail to see how using a great product would make somebody "demented" or "brain-dead". If this attitude is common in other parts of the world, then it is no wonder that it is hard to find people who know MS languages well.
"You can use Microsoft developer tools to create platform agnostic code" - at least someone gets it. I'm (just) 30 and I have yet to find an IDE that pleases me as much and handles most of the languages I use (to make money, among other reasons) daily. I know it's cool and such to say MS sucks and does everything badly, but calling users demented? Ha, I almost stopped reading there.
"The only thing I don't love about it is the vendor lock-in, but in reality this is a smaller problem for many applications than it seems."
I hate to suggest this, but write that statement down and put it somewhere you will find it when you're, say, 40. I've known a number of Microsoft-dedicated (and Apple-dedicated, for that matter) developers over the years, and I can't think of any right off-hand that are still programming.
I see what you're saying but I think you're mistaking me for a different type of programmer. I'm not at all Microsoft-dedicated. I know 5 languages very comfortably, and quite consciously choose C# when given the freedom, because I like it most. I strongly doubt this will remain the case, just like I doubted that I would stay on Ruby forever when that was what I spent my time hacking. Most projects I work on professionally are a mix of languages, usually at least (cross-platform) C++, C# and JavaScript.
Self-reply here: naturally, none of this means that the author is wrong on the big picture. I wouldn't be surprised if MS indeed is very much dying. Sad, cause I like their dev tools.
I read Bronte Capital all the time. He is a hedge fund manager in Australia, and is very good at sniffing out fraudulent or short-worthy stocks and shorting them.
I'm not too confident in his ability to assess technology though, and I think he may be a bit premature on this MSFT call.
I do believe that Windows 8 is a mistake made out of desperation. Given that the change in interface will confuse a lot of people, I think enterprises will be wary of making the switch because of all the massive retraining it will require. Any little change in enterprise environments requires retraining, so I think enterprise adoption will be very slow. Maybe MSFT will wake up and change the interface back, or offer some sort of switch to change back and forth, before it ships a final version.
*Prediction: this will wind up with a lower corporate take-up rate than Vista (ie next to none).*
*Prediction 2: this will accelerate, rather than slow down, the rate at which enterprises take their enterprise-specific software into platform-independent programs.*
*Prediction 3: by stuffing this up Microsoft has just about lost its bet on moving the retail computer market into docking cloud computers. Apple will do this. And they will do it by stealth.*
Unfortunately, Microsoft seems to have survived the Vista experience without too much injury. And having had some experience with enterprise platform migrations, accelerating a "glacial" speed is still pretty slow. And finally, I still have doubts about Apple's capability and interest in the general "retail computer market". (Although I could be wrong about that.)
Whether one agrees or not with the facts and reasoning of the author (a successful hedge-fund manager with a widely followed blog), this blog post is important in and of itself because of what it represents: a tectonic shift in the business community's perception of Microsoft.
The business community is now openly doubting the future relevance of the Windows platform!
What the post represents is one small stockholder's opinion about a minor position [1], with a link-bait title, echoing popular opinions of the blogging echo chamber.
He has never been particularly bullish on Microsoft. He wrote about his "new" concerns nearly two years ago. [2]
No, this is historically very typical for Microsoft.
Windows ME sucked. Windows XP was great. Windows Vista sucked. Windows 7 was great.
At least that was the perception. Reality is more nuanced, of course.
And businesses are very conservative; they never upgrade immediately.
For business, the alternative to Windows 8 is Windows 7 or Windows XP. It isn't OS X or Linux or Android. Microsoft does better if they switch to 8, but they don't lose if they don't.
I agree that Microsoft has several chances to get Windows 8 (i.e. the tablet-plus-computer OS) right, so this is just strike 1.
That said, Windows is already a minority computing platform (in terms of new devices sold), so perhaps it cannot be as complacent as it has been in the past. If Windows 8 flops monumentally, when Windows 9 ships in 2014 we may all be docking our pads or phones to a keyboard and/or 2160p wallscreen to use virtualized Windows (when we have to), running legacy software on ten-year-old Microsoft licenses.
There were no serious threats to their dominance before; now they have iOS and Android, and even OS X is starting to gain more market share than I'd be comfortable with if I were Ballmer.
I really don't see the point of so many HN users comparing apples with oranges. iOS and Android have no relevance in this discussion as they are not real OSes. You cannot develop on them; you cannot do work on them except for replying to emails and browsing some web pages. And that's how it will remain.
OSX is great, but please post the link from Apple store where you can get a decent working laptop for under $500. Good luck with that.
From a developer's perspective, I think we should keep a more clearheaded approach to this entire debate and not turn HN into a fanboy forum.
> iOS and Android have no relevance in this discussion as they are not real OS'es... And that's how it will remain.
This sounds seriously shortsighted. So far, Android and iOS are focused on media consumption, but I'm already starting to see them used for 'work', like taking notes in meetings. Asus' transformer line (tablets with a keyboard dock) shows where the next step might be.
There's no rule that 'real work' requires a WIMP (windows, icons, menus, pointer) interface. It's perfectly possible to imagine that in a few years it will be possible to develop on Android devices with some peripherals attached. There's no fundamental obstacle to it.
I don't think the PC is dead - there's still a lot of software and user experience built up around it. But it's quite clear by now that tablets/phones are becoming serious competition.
> iOS and Android have no relevance in this discussion as they are not real OS'es.
"Mavicas and Digital Elphs have no relevance in this discussion as they are not real cameras. You cannot expose film with them, you cannot do work on them except for taking candid pictures and sharing them with friends. And that's how it will remain."
It is always dangerous to assume that you're in the market you think you're in, and not the market that your customers are telling you that you're in. Microsoft seems to believe that they're in the "real computer OS" market, but their customers believe they're in the "lets me run my apps" market. And to those customers - whose vote ultimately matters the most - Windows is competing directly with OS X, iOS, and Android as a way for people to run the software they want to run. Increasingly, that software is the kind of stuff that runs on the tablet or smartphone that they're doing most of their computing on.
> You cannot develop on them, you cannot do work on them except for replying emails and browsing some webpages.
So? All the employees in our client companies do their work exclusively through email and internal webapps (which are hosted and developed on Linux). And that's true for a whole lot of companies.
I don't see why an "Android workstation" - or Chrome OS, or similar - wouldn't work for them.
1) Windows 8 is changing Windows in pretty fundamental ways (being touch first, leaving the "PC" behind), and I can't foresee anything but minor changes in Windows 9 to this direction. I doubt they will try to go back to making Windows 9 a true successor of Windows 7. That seems very unlikely to me now. Microsoft is all-in with this tile-based version of Windows. Even their logo has changed to reflect that. So businesses who don't want to stay on Windows 7 forever, should take into account alternatives.
2) Even if you hated Windows Vista, there was nowhere to turn in 2005 but Windows XP. Now there are some pretty good alternatives, and people are getting used to using different operating systems than Windows, which I think is a huge deal, because it's usually very hard to convince users to use another OS.
> Windows 8 is changing Windows in pretty fundamental ways (being touch first, leaving the "PC" behind),
I just don't see how someone can say this if they've used Windows 8 for any amount of time. I'm never in Metro. I don't get it. I don't have any touch devices, and I wouldn't even know that was an option except for the advertising.
> Microsoft is all-in with this tile-based version of Windows. Even their logo has changed to reflect that. So businesses who don't want to stay on Windows 7 forever, should take into account alternatives.
There is a usability consideration because of the interface changes: if the only thing keeping you with Windows is the interface, then this makes sense. But if a company needs software to work on a platform they're generally familiar with and have support in place for, they're not going to seriously consider switching from Windows by the time Windows 9 comes around.
I wonder what would happen if some company decided to throw a lot of money at the Wine Project and release some enterprise platform based on Linux that boasted full compatibility with old Windows desktop software?
Yeah, it's interesting that it has become OK to say these things in the open. In the past, any doubting of MS was smacked down by simply pointing at Windows and Office, as well as MS's cash pile and market cap. It has been sort of painful seeing their series of sizeable blunders treated with something akin to a shrug, but I guess those chickens are finally coming home to roost.
> Nowadays nobody under thirty writes anything on Microsoft developer tools unless they are demented or brain-dead.
Hmmm... I am 29, I write C# and .NET for a living... there are a few lads in the office that are .NET devs, under 30... I must be in a black hole then...
His attack on using developer tools was harsh, but I'm over 30 so it didn't apply :)
But seriously, I'm afraid he is close to the truth. As a career Windows programmer, I have great fear for the future after having tried the 2012 preview -- I've never felt so lost on a computer in my life. For my own selfish sake, I keep praying they'll make some changes before final release.
In my case, I sell my own software. So having a market that will buy my software is first priority. Providing support all day is not how I prefer to make a living. (that's fearful me talking)
I'd love to know the percentages of independent Windows devs that make a living selling software vs percentage of Linux devs that can make a living selling software. Sure, big companies will employ both, but how does it work out for the small guys?
Edit: I will admit Linux and its ecosystem are looking more and more attractive all the time...
You don't have to go full time into it. Just putting some hours in the week to toy with it. Doesn't have to be Linux - can be Mac[1] and Objective-C as well, just diversify yourself for learning & (maybe, future) profit.
Try hitting the "Windows" key more often :) Case in point: hitting the Windows key immediately gets you to the tile view, where you can start typing the name of the program you want to launch and it's instantly there.
It's a bit of a learning curve (a very small one), but once you're over it you'll be glad for not having to browse through the Start > All Programs > XXX > yyy mess...
The "nobody under 30 writes MS" statement is obviously false at its core, but perhaps it means something slightly different?
I imagine most of the under-30 devs working on MS software are being hired by established companies to work on enterprise type software or for consultancies building websites using ASP.net etc.
What about people under 30 who are starting their own businesses from scratch? For example startups, they tend to be using stuff like Ruby/Python/JS etc.
So the question is: what happens when the old guard starts to retire? How many will stick with MS because that is what they have used to that point, and will there be interest in switching to newer tools to build newer systems?
> What about people under 30 who are starting their own businesses from scratch? For example startups, they tend to be using stuff like Ruby/Python/JS etc.
As someone who was under 30 during Microsoft's heyday (90s) I can say that most developers under 30 then also didn't use MS tools. Borland tools ruled the roost until VC6 and VB6 (around '98). And most young developers out of school back then knew Unix, not Windows.
My point is that it's a myth that there existed some time when all young developers used MS tools. In fact I'd argue that in terms of mindshare MS devtools are near their all-time high in popularity now -- it's that the Windows OS isn't as popular as it once was among the under-30 crowd.
I'm in a startup that uses C# and ruby, ASP.NET MVC, and PostgreSQL.
All software we write has to pass tests on Windows and Linux (we'll add OS X when we get an OS X computer). I know we're not conventional, but we feel these are the best tools.
Anything else is just an argument from popularity and isn't worth our time.
"Nowadays nobody under thirty writes anything on Microsoft developer tools unless they are demented or brain-dead. Firstly the kids out of the colleges know the platform agnostic stuff well. Secondly when half the computers leaving factories either run iOS or Android (that is are smart-phones) nobody sensible will write in a way that does not allow easy porting to these platforms."
Under-thirty checking in as a user of Microsoft developer tools. There is still little out there that can beat C# all-around. There still is no alternative for Visual Basic.NET. XNA is still a quite popular game development framework.
Kids out of college know what their college specialized in. This is either Java or Visual Studio, and chances are when they graduate they'll get jobs at a company making corporate desktop software on Windows. The best way to discourage an aspiring programmer is to tell him/her "yeah it works, but it's not cool."
> Nowadays nobody under thirty writes anything on Microsoft developer tools unless they are demented or brain-dead.
I see most people saying he is wrong here, though I can't think why.
Are there people who think about A) writing software for Windows? B) targeting the Windows phone? C) deploying to a cloud (private or public) using Windows servers?
If you are targeting a phone, it is most definitely either iOS or Android, probably both. If MS, a distant third (and at that point you'll probably be using PhoneGap, Titanium, or HTML5).
The number of people actively developing new desktop apps for Windows has to be tiny. Maybe even smaller than tiny.
And if you are deploying to anything other than Ubuntu, you're crazy (and potentially fiscally irresponsible... BizSpark notwithstanding).
I get that some people might be using the dev tools, though I would wager (no numbers on this, just gut) that the number of MS Web Devs is far, far fewer than the same open source web devs (PHP, Ruby, Python, NodeJS, Clojure etc).
So, I don't get why people say he is off base.
Frankly, the only people I can see still using MS stuff are the big corporates. IMO, MS is riding the long tail into obscurity. Though, with their financials, it would still be a long, long tail.
> And if you are deploying to anything other than Ubuntu, you're crazy
What? Have you ever seen a deployment to Windows servers? There's a reason AppHarbor's documentation pages are so much smaller than Heroku's: deploying a .NET web app is peanuts. Through Microsoft-only means, it's done with a single button click from Visual Studio. This works really great. There's a convention for how web applications are structured that is very widespread and supported by all relevant tooling.
Really, there may be many reasons for not choosing .NET for cloud apps, but server support is not one of them.
Secondly, you're forgetting a major category for .NET developers: devices. Office multifunction printers, cash registers, machines in factories, any mid- to high-tech equipment really. Windows has a massive market share here, and lock-in is only one of the reasons this is going to remain so. For example, developing a touch-screen interface for an ATM using WPF is very, very easy - definitely among the best options out there. The entire world may be moving to the web, but for devices there is no strong benefit in doing so. Why build in a web server and a windowless browser when you can make a decent native app in half the time, using half the resources? Also, the price of a Windows Embedded license barely matters if the device you're selling costs a few thousand dollars a piece.
Admittedly though, I've no idea how small or large the dev market of machines and equipment is. But it's really pretty sizeable, much bigger than the average HN world-vision warrants.
A lot of software is never seen by consumers, don't forget that. A lot of it is never seen by humans at all.
> What? Have you ever seen a deployment to Windows servers? There's a reason AppHarbor's documentation pages are so much smaller than Heroku's: deploying a .NET web app is peanuts. Through Microsoft-only means, it's done with a single button click from Visual Studio. This works really great. There's a convention for how web applications are structured that is very widespread and supported by all relevant tooling.
Sorry but in a realistic production environment, not just a dev test environment, this is not true. Even Scott Hanselman pointed out ASP.NET has a terrible "deployment story"[1] compared to other options. I'm currently working as an ASP.NET MVC 3 developer and I've been really disappointed in Microsoft's stack in this regard. Life was a lot less stressful back when I was doing Python and PHP deployments on LAMP stacks.
To do it right you'll probably end up rolling your own PowerShell scripts to do one-shot deployments. These work great - but again - it's essentially the same story you have in the Linux world.
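For readers unfamiliar with what a "one-shot deployment" script typically does, here is a minimal, hypothetical sketch of the common pattern (the comment mentions PowerShell; Python is used here purely for illustration, and all names are made up): copy the build output into a timestamped release directory, then atomically repoint a `current` link at it, so deploying is one command and rolling back is just repointing the link at an older release.

```python
import os
import shutil
import time


def one_shot_deploy(build_dir, releases_dir, current_link):
    """Copy build output into a timestamped release folder, then
    atomically repoint the 'current' link at the new release."""
    os.makedirs(releases_dir, exist_ok=True)
    release = os.path.join(releases_dir, time.strftime("%Y%m%d%H%M%S"))
    shutil.copytree(build_dir, release)

    # Create the new link under a temporary name, then rename it over
    # the old one: os.replace is atomic on POSIX, so the web server
    # never sees a half-deployed state.
    tmp_link = current_link + ".tmp"
    if os.path.lexists(tmp_link):
        os.remove(tmp_link)
    os.symlink(release, tmp_link)
    os.replace(tmp_link, current_link)
    return release
```

A real script would also stop and restart the app pool, run migrations, and so on, but the swap-a-link structure above is the core of most home-grown deployment scripts on either platform.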
The number of people actively developing new desktop apps for Windows has to be tiny. Maybe even smaller than tiny.
Wrong. Windows desktop application development is still huge. Take a look at sites like download.cnet.com and look at how many apps get added each day. And this generally doesn't include the huge ecosystem of things like Office add-ins.
The Windows market isn't a growth market right now, but there's still a lot of code being produced for it.
The corporate market is much larger than the start up market. Many corporate web sites, intranets and web applications run on either Java or .NET. This isn't exactly a secret.
That's only one slice of the pie, however. Think of all the companies out there, from engineering firms to banks to manufacturers of Q-Tips.
(Anecdotal counterpoint) For the vast majority of the students at my university, Visual Studio is the go-to solution for any problem. They would much rather figure out how to write ASP.NET because Visual Studio is "awesome" than go around running node.js (what's that?) from a command line.
Anecdotal counterpoint to your counterpoint - I saw the same thing 10 years ago when I was in school (Comp-Sci students gobbling up Visual Studio). It seems the more things change the more they stay the same...
Despite this:
-Qualified .NET developers continue to be difficult to find. Also, they make slightly less money than their Java counterparts, for reasons I don't fully understand.
-The 00's are regarded as Microsoft's "lost decade"
-Microsoft's tools gained little traction among startups... although MS-based ones do exist, the most notable being Stack Exchange - however, that was created by industry veterans, not out-of-college whippersnappers.
Don't count on computer science students to create Microsoft's future. Many will drop the major for something easier.
Yeah, but many of those are going to be the lazy and uninfluential engineers anyway. I'm more interested in why people who do look further than the most obvious option would or would not choose .NET.
I made the decision to jump ship from Microsoft developer products about five years ago. I could already see the writing on the wall. I swapped the remaining proprietary tools I was using on Windows XP over to open source alternatives and jumped from VB6 and .NET over to Python. Then, when it was time to replace my PC, I installed Ubuntu 8.10 instead of Vista. Never looked back.
> Nowadays nobody under thirty writes anything on Microsoft developer tools unless they are demented or brain-dead
The gamedev community is very MS-centered. There are a few people doing handheld stuff, or homebrew console hacking, but the majority use Windows and DirectX. A lot of people use C#, or even develop for the Xbox 360.
Say what you want about Microsoft, and acknowledging the past decade has been a real disappointment, I wouldn't count them out just yet...
If I compare Microsoft's handling of the OEM PC industry of the 1990s to Google's handling of the Android ecosystem, the latter is utterly laughable in comparison (*cough* Nexus 7 screen issues *cough*). That alone is reason enough for me to think they could return. They just need the right leadership. And if you think such a transition is impossible, compare the Apple of today to the Apple before Steve Jobs returned...
> In the late 1990s Windows developed huge market power. Whilst not strictly a monopoly the company had plenty of monopoly characteristics. Sure you could buy a Macintosh - but that market was so small that people did not develop software for Macs and hence Macs were for people who did not need a wide range of software.
So is this, in addition to careful attention to typography, the explanation for why Macs were big with designers? (Many designers do the vast majority of their work in a handful of programs, and one in particular.)
Platform agnostic tools have hardly done anything on the desktop or mobile. If you're writing a mac app without Cocoa or a Windows app without .NET (or other various Microsoft technologies) you're going to have a bad time.
Python and Ruby might be first class languages on the web, but they're never going to be on the desktop/mobile. (Yes I know about things like RubyMotion)
No devs under 30 work on Microsoft stuff? Even if you take that to mean "No dev under 30 wants to work on Microsoft stuff", that is ridiculous.
To my mind, Microsoft ought to be putting all of their corporate weight behind Mono. The CLR, F# and C# are all technically superior to the JVM, Scala and Java, IMHO and in the opinion of many others. But what holds a lot of people back from .NET in the enterprise is that they want Linux in the server room, not Windows. If Microsoft really put their weight behind Mono, then I think lots of enterprise customers would ditch Java in favour of .NET.
How can I say this gently.. Windows 8 looks like ass. I don't know a single person running it today. Things do not look good for Microsoft in this release cycle. I don't think the following one will be any better.
All internal computer now run as virtual machines (not desktops) running on two mondo-powerful Linux servers. The virtualization platform is Citrix. Nobody has a functional box under their desk any more....The company has got rid of the desktop computers entirely (sorry Dell and HP)
What? How are they accessing these virtual machines? Mind meld? In most cases where companies use VDIs the desktop machines are the standard old Dells and HPs because they actually cost less than "dumb terminals" (aka thin-clients). And that's accepting the questionable notion that VDIs are the future.
Nowadays nobody under thirty writes anything on Microsoft developer tools unless they are demented or brain-dead
We have been on a hiring binge lately and it is very difficult to find candidates who know anything but Microsoft tools. Sure, they might know GitHub, but there is a very substantial part of the workforce that still crawls into Microsoft's bosom.
In general this blog post is completely detached from reality. There is the "startup" culture, of course, where everyone runs an iMac and develops iOS and Ruby/MongoDB apps for their EC2 cluster, and then there's the many magnitudes bigger general computing world that holds zero similarities.
Probably with Citrix-compatible thin clients. There are a ton of those out there.
...it is very difficult to find candidates who know anything but Microsoft tools.
That may be true in the US and some parts of Western Europe, such as the UK, but what about the rest of the world? The author of the blog post, John Hempton, is a hedge fund manager who works out of Australia but invests worldwide, and he often thinks and writes in terms of global trends. What is your experience trying to hire candidates outside the US?
Where else is there? Development in India is more ingrained in Microsoft than anywhere else on earth. Looking for blog posts on Silverlight will lead to a poorly written guide by a developer in India. ALL LARGE IT CONSULTING FIRMS have major operations in India, and the majority of them use ASP.NET or WPF. Many are now moving over to open technologies like Rails, but only in small US-specific shops targeting US customers. I haven't heard from or interacted with many Chinese developers who aren't in the US. You mentioned Western Europe specifically - from working with some Microsoft platforms lately, it seems like most Silverlight "vendors" in Europe are in Eastern Europe (most notably, I think, Telerik).
Australia may be the exception to the rule, though I doubt that a quarter of Australian CS majors go on to work somewhere using Python, Rails, or PHP. A good number probably go to Java, another compiled language, or iOS dev, but if they're going into web work, I'd be hard pressed to believe that a majority aren't going into an ASP.NET operation.
Claiming that most developers under 30 don't use Microsoft platforms is just pish-posh. They just don't learn it in class, and don't do it in their free time. That doesn't mean their first job isn't going to be as a DB analyst on some 10-year-old VB.NET application.
> Development in India is more ingrained in Microsoft than anywhere else on earth.
Software development in India follows whatever happens to be mainstream. People need jobs, most of the jobs are in software services, the projects service providers get are mostly Java/.NET, and colleges and people tend to stick with Java/.NET to be on the safe side.
> ALL LARGE IT CONSULTING FIRMS have major operations in India, and the majority of them use ASP.NET or WPF.
Most of the consulting firms in India don't get a choice, even for the greenfield projects. Goldman Sachs comes in saying this is what you need to do and use Java, the consultancy plays along.
That said, from what I have seen, Java overshadows .NET by a big margin.
I agree with all of that; I should have been more clear in referencing the web stuff. Perhaps I'm being unintentionally disingenuous about Java's role in all of this.
Probably with Citrix-compatible thin clients. There are a ton of those out there.
Why would you post this reply when I specifically address that in the next two sentences?
That may be true in the US and some parts of Western Europe, such as the UK, but what about the rest of the world?
India is overwhelmingly Microsoft-centric. Eastern Europe is overwhelmingly Microsoft-centric. Much of Russia is very Microsoft-centric.
But nonetheless your oddly defensive argument (do you know the author?) sounds suspiciously like the "oh yeah, they're big in Germany" retort. Anyway, what I am replying to is what I see as an immediately ridiculous claim that people under 30 don't know Microsoft, when the overwhelming evidence says otherwise.
The author of the blog post, John Hempton, is a hedge fund manager who works out of Australia but invests worldwide, and he often thinks and writes in terms of global trends
This is one of the stranger appeals to authority that I've yet read. When I think "technology trends expert" I don't think "hedge fund manager". That someone invests money gives them zero authority on technology trends. I should note that my business is hedge fund technology, so the circle is kind of complete here.
I'm from Australia, and based on the job ads I see and on having worked here for 10 years, Microsoft is the dominant development paradigm (a quick search on the job site seek.com.au returned nearly 300 results for "ruby" and more than 700 for "asp.net"), so I don't think the parent's condescending notion of "this is a GLOBAL thing you Yanks wouldn't understand" works either.
I get the same impression about Florida (I've friends there). Lots of Microsoft. Even the Java apps run on Windows servers. I'd love to see more Python there.
... it is very difficult to find candidates who know anything but Microsoft tools.
This may depend on a variety of things, such as the job location and desired skill level, but this blanket statement is false. For example, it seems that a significant number of Hacker News participants - who are developers - develop in other languages and platforms.
... completely detached from reality.
My personal opinion is that developers who don't learn other languages, platforms, and tools are completely detached from reality. There will always be (perhaps seemingly a majority of) people for whom being a developer is just a "job" and learning new things isn't necessary. But, again in my personal opinion, this ultimately harms you.
Finding good people is hard. Many others have written about this, but don't be discouraged into thinking there are only Microsoft developers out there.
There will always be (perhaps seemingly a majority of) people for whom being a developer is just a "job" and learning new things isn't necessary.
Absolutely. And I don't begrudge those people - I initially learnt .NET myself and then broadened my horizons, but many of my previous co-workers have families, time-consuming hobbies or other such interests. There's nothing inherently wrong with having "just a job" if you're content with other things in your life. It seems like a specifically (oddly) US-centric view that there should be anything wrong with that.
Here's a quote from the article that makes a good point:
... When firms were asked why they have difficulty hiring, 55% picked "lack of available applicants," but essentially the same percentage, 54%, said candidates are "looking for more pay than is offered" (many more than the 40% selecting lack of "hard" skill). This is an important reminder that the labor market is a market.
This may depend on a variety of things, such as the job location and desired skill level, but this blanket statement is false.
The blanket statement was a personal observation about our own hiring, so I can assure you with complete conviction that it is not false. Further, I addressed the alternate universe of the start-up world, which comprises a tiny, tiny percentage of software developers and is what HN caters to.
Just to be clear, I don't like that most candidates outside of the startup-sphere are so Microsoft- or Java-centric. In fact it is a battle that we constantly have to fight (hire somebody and then have to argue every single decision that isn't the Microsoft Way). Yet I have enough real-world development experience to find that claim so ridiculously detached from reality that the author lost any and all credibility on tech matters.
I agree with most of the article, but the "The Ubuntu Unity failure" was only a temporary one.
The author says it got better, but in reality Unity has already overtaken Windows' UI, and it will overtake Apple's UI with one of its next iterations.
As a younger (though not that young) developer I have to agree that using MS developer tools is the wrong way of doing things for most new software - not all of it, though; it certainly has its uses. But I am strongly against proprietary software that only runs on one OS, as that will only lose you business. Even the game platform Steam seems to have gotten to that point; a Linux port is on its way.
The great thing about proprietary software and one OS (Windows specifically, though OS X too) is that people using those operating systems expect to pay for their software. I'd love to come out with a Linux version of my software, but I'd then be competing with 10 free versions and, worse, a mindset that expects everything to be free. I gotta eat!
There is definitely a hunger in the Linux community for quality commercial applications in areas where open source has not traditionally managed to perform so well (image/video editing, games, etc.).
You would certainly have limited luck trying to sell a C compiler or a package manager to Linux users, but there are certainly areas (including some developer tools) where there could be a market.
If you released (say) an image editor for Windows and Mac you would be competing with Photoshop as well as countless other programs, whereas if you release for Linux you may have fewer potential users (although there are still an estimated 30 million), but you are only competing with the GIMP.
There is a subset of Linux users who would never consider any commercial software, but this is a pretty small percentage - at least, that is certainly what Valve is banking on.
Mac users do expect to pay for software, sure, but Windows users? Not so much, since Windows seems to be the platform with the highest piracy rates as well as countless horrible "freeware" programs.
If you released (say) an image editor for Windows and Mac you would be competing with Photoshop as well as countless other programs, whereas if you release for Linux you may have fewer potential users (although there are still an estimated 30 million), but you are only competing with the GIMP.
Thanks for that - a bit surprising really (current stats show Linux users out-giving the other two). Is that because they are more willing to pay for software as Linux goes mainstream, or because they're desperate for games? (honest question)
This is the same issue with mobile development that the author seems to be pushing everyone towards. Your app might really be worth $50, but guess what... you'll be selling it for $0.99 or you won't be selling it at all.
> I agree with most of the article, but the "The Ubuntu Unity failure" was only a temporary one.
Absolutely. Unity in 12.04 is very solid. There's still a lack of settings options, but I don't think that warrants the whole thing being called a "failure".
Wait, so he's saying that he's moving from MSFT to AAPL because AAPL has better lock-in? Personally, I'm staying with MSFT, because MSFT has already learned the follies of relying on lock-in and is moving on, while AAPL is repeating the mistakes Microsoft made in the '90s.
> Nowadays nobody under thirty writes anything on Microsoft developer tools unless they are demented or brain-dead.
Completely false statement.