> So forgive me for being smarmy and offensive, but I have no interest in being a ‘web guy’. And there are two reasons for this. First, it’s not a challenging medium for me. And second, because the vast majority of Internet companies are filled with bad engineers – precisely because you don’t need to know complicated things to be a web developer. As far as I’m concerned, the Internet is responsible for a collective dumbing down of our intelligence. You just don’t have to be that smart to throw up a webpage.
Sighhhhh. The entire tone of this article suggests somebody who has never done a lot of web programming, and is making a classic "I-haven't-done-it-so-it-must-be-easy" kind of judgement, with a very idealised view of the glamour of his own sub-discipline of software development.
Sure, there is a huge amount of very mundane web programming in the world. But there is a proportionally larger amount of very mundane non-web programming in the world, too. 95% of the programming that goes on in the world is not "challenging" to a good developer.
Taking a very reductionist viewpoint, web programming is just the same as non-web programming, but with a demand that the output from your program is structured in particular ways. I'm sure anyone can think of hundreds of examples of technically challenging and innovative programming that is best suited to delivery in a web environment.
I get as annoyed with really bad web programmers as the next guy, but that's only because the web permits users to inflict their work on others much earlier in the development process than most mediums. Back in the days when I was first learning BASIC, the commercial internet was not so well-developed that I could easily inflict my software on a large group of people. Even if I did, my "death valley" clone would probably have had limited appeal.
With the web, pretty much anybody learning a language like PHP or ASP or whatever can basically do some copy-and-paste programming and have something up on the web ready for public consumption before they have even finished figuring out how programming works.
Yes, it lowers the barriers to entry. I think it's a great thing that a musician working at a conservatory can throw together her own web site using PHP and shoving some SQL into tables. That's success, just as it's success when an accountant can hack spreadsheets together with macros and so forth.
> With the web, pretty much anybody learning a language like PHP or ASP or whatever can basically do some copy-and-paste programming and have something up on the web ready for public consumption before they have even finished figuring out how programming works.
That was the route I took, and thankfully I was sufficiently chastised by better programmers for unleashing stupid, insecure PHP software that I became more cautious.
And that's learning, right? I did exactly the same thing and I consider it a positive thing that people can easily produce software that is usable by other people without having to convince others to install software or do any "distribution" as such.
1. The 'web platform' is difficult to write for because there are many different browsers and operating systems.
2. The programming language environment for the web is immature and has sharp tools.
3. The people who write web apps are of poor quality.
#1 is true enough. It is painful. That's why we write tests, use automated systems, and are driving for standards. Of course, he could help by persuading Microsoft to play nice with others on web standards.
Also, he's forgotten the days of programming in assembly language, where you had to pick the processor and BIOS you were targeting. That was really ugly, but that's where companies like Microsoft got started.
#2 is also true enough. It is an immature environment. But that will change.
#3 is totally true. There are poor engineers working on the web, but that doesn't mean there will always be. It's a virtuous circle: the better things get, the more good people will move to it.
It strikes me that he's in real danger of missing an 'innovator's dilemma' situation. Yes, programming for the web is 'crappy' compared to picking a single operating system with very mature advanced tools. That could be the equivalent of the non-hydraulic backhoes and massive hard disks detailed in the Innovator's Dilemma. Perhaps he just hasn't seen the change that's coming, coming.
Point #1 is meaningless. You want to have compatibility with major browsers for more market share, otherwise nothing stops you from supporting only IExplorer or Firefox.
It's the same situation with desktop applications, although the situation is a little better because of Windows. But if you don't want your client to download dozens of MB for the latest .NET, then you use .NET 2.0; if you want more market share, then you start working on a Mac port. If you want to target mobiles, then you start working on iPhone and Symbian ports, while keeping an eye on Android. If it's a game you're developing, then you start working on ports for PS3 / Xbox / Wii.
The whole allure of web applications is that you can reach a larger audience faster, with no middle-man between you and your customers (no installation required, no manual upgrading to the latest version, no piracy).
Your work can be a lot easier when you're using cross-platform tools, but then again, the same goes for browser incompatibilities. Javascript hurts? Then use jQuery or something heavier like GWT, or maybe OpenLaszlo. CSS hurts? Then use Javascript to simulate missing properties (and the situation is a whole lot better than providing an OS-consistent GUI for both Windows and Mac).
I think points #2 and #3 are in conflict, because #3 happens because the barrier to entry is lower. How can that be if the tools we use for the web are immature?
His main point is that the web relies on dynamic languages which don't have IDEs like Visual Studio with intellisense / automatic refactoring or compilers that highlight syntax errors at compile-time (well, he actually mentioned HTML as the language, but I'll close my eyes to that).
While I can see where his point comes from, it also misses the forest for the trees. From my p.o.v. this is a virtue more than a drawback, because it makes you think more about your API (your average .NET / Java API design really makes me cry), and it makes you think more about what's hard in the process of development (unit testing, functional testing, standards, and all that).
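As a rough illustration of what leaning on tests instead of a compiler looks like in practice, here's a minimal unittest sketch; the monthly_rate function is made up for the example:

```python
import unittest

def monthly_rate(annual_rate):
    """Hypothetical API under test: convert an annual rate to a monthly one."""
    return annual_rate / 12

class TestMonthlyRate(unittest.TestCase):
    def test_divides_by_twelve(self):
        self.assertAlmostEqual(monthly_rate(12.0), 1.0)

    def test_rejects_strings(self):
        # No compiler catches a bad argument type here; the test suite does.
        with self.assertRaises(TypeError):
            monthly_rate("12%")

if __name__ == "__main__":
    unittest.main()
```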
He also implies that the web lacks "object-oriented languages" and debuggers ... which is a complete fallacy. There are modern OOP concepts that are not available in C# (mixins, traits, multi-methods), and debugging in Python (just one example) is a lot more comfortable than what's available in VS.NET. Does VS.NET give you a way to inspect and redefine active objects / methods / entire classes at a breakpoint and then resume execution? And how much do you have to pay to get functionality that's available for free in Valgrind/KCacheGrind?
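For those who haven't seen it, here's a minimal sketch of that kind of live inspection using Python's standard pdb (the function and values are invented for the example):

```python
import pdb

def total(prices):
    subtotal = sum(prices)
    pdb.set_trace()  # Execution pauses here with a live (Pdb) prompt.
    # At the prompt you can inspect locals ("p subtotal"), evaluate
    # arbitrary expressions, reassign variables ("subtotal = 0"),
    # and then resume with "c" -- no stop/rebuild/restart cycle.
    return subtotal * 1.2

if __name__ == "__main__":
    print(total([10, 20, 30]))
```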
Also, in platforms with continuations support the debugging takes a whole new perspective.
Overall, the article is just a poor and content-free rant.
It started off interesting, pointing out the advantages of desktop development. Then, it degraded into pointless idiocy when he said that people only program for the web because they're too stupid to do anything else. Is there any need for sharing idiotic rants like this?
There is this strange notion in the industry that, the closer you are to shifting and flipping bits directly, the farther you are from the end-user and the closer you are to the machine, the higher you are in the programmer pecking order. I wrote enough C and Assembly to know it's hard, but in my experience being a "web guy" is just as hard. In fact, it's a harder kind of hard for us introverted folks, because it involves understanding why people interact with your CRUD app the way they do. You can't just poke them with your oscilloscope probes and expect to get meaningful answers.
Managing your own memory is much harder to get right than letting the compiler do it for you, so therefore you must be better if you do it. Your programs have more bugs and run slower... but... it's hard... so it must be better.
Fortunately, eventually all the people that grew up on C will die. Then the field can perhaps move on.
Ah damn I misread your comment (missed the sarcasm) and accidentally voted down. Would appreciate if someone felt like upvoting his comment to counter me :-)
But basically, web apps and non-web apps are the same. The user sees a computer program designed to make his life easier in some way. Inside, our code handles requests and produces results. We use the same programming techniques to process the request and update the UI in some way. The only differences are details; for a command-line app, you can freely block on IO and you produce results with "print". For a web app, you get an "Http Request" object, and you produce an "Http Response" object. Inside the response is code to draw the UI, send more requests to your application, etc. And finally, for a "GUI" app, we receive events instantly and any of those events can affect the program state (and any handler can update the GUI immediately; no need to wait for a "request cycle").
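To make that concrete, here's a minimal sketch using only Python's standard library. The doubling "logic" is a toy I made up; the point is that only the delivery plumbing differs between the two versions:

```python
from wsgiref.simple_server import make_server

def double(n):
    # The actual "application logic" -- identical in both delivery modes.
    return n * 2

def run_cli():
    # Command-line delivery: block freely on IO, produce results with print.
    n = int(input("number: "))
    print(double(n))

def wsgi_app(environ, start_response):
    # Web delivery: receive a request object, return a response.
    n = int(environ.get("QUERY_STRING") or 0)
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [str(double(n)).encode()]

if __name__ == "__main__":
    # e.g. curl "http://localhost:8000/?21" prints 42
    make_server("localhost", 8000, wsgi_app).serve_forever()
```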
Honestly, I think the web model is the most difficult of the three. But I certainly don't look down on anyone that writes the other types of applications, because they are really about the same.
There are CRUD web apps, just like there are CRUD desktop apps. The problem with most web apps is not the application logic. The real challenge lies in scaling the damn thing. You could probably build Twitter in a day, but scaling it to handle thousands of concurrent tweets will take several months of very hard work.
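To put it another way, here's roughly what the "build it in a day" core looks like, as a deliberately naive in-memory sketch (all names invented). Everything the sketch omits, persistence, fan-out, thousands of concurrent writers, is where the months of hard work go:

```python
# Deliberately naive, single-process "Twitter" core: the easy CRUD part.
tweets = []  # global timeline: (user, text) pairs, newest last

def post(user, text):
    tweets.append((user, text))

def timeline(following):
    # Linear scan over every tweet ever posted. Fine for a demo;
    # hopeless at thousands of concurrent tweets, which is exactly
    # where the months of hard work (sharding, caching, fan-out) go.
    return [(u, t) for (u, t) in tweets if u in following]

post("alice", "hello world")
post("bob", "shipping the easy 90%")
print(timeline({"alice", "bob"}))
```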
90% of web apps are easy, just like 90% of desktop apps are easy. The remaining 10% counts.
Why would I want to program for the web? It's too hard! Also, web programmers program for the web because they aren't smart enough to program for the desktop.
Wow, this is some of the worst writing I've ever seen, and that's saying a lot.
The entire thing contradicts itself. One paragraph, web development is way too complicated. The next, he doesn't want to be a web developer because it's not complicated enough for his giant intellect! He talks about "civilized" things like compilers and debuggers and static typing, then writes, "It's a challenging place to be." -- Hey, smart guy, the next sentence should reinforce the previous one, not completely contradict it!
If the piss-poor writing weren't enough, this d-bag has arrogance in spades. Guess what? I programmed C and Assembly, too! I moved to the Web because it is more challenging, not less. To be a good web developer (as opposed to the stereotype he uses in his randomly-contradictory argument) you have to know all sorts of stuff -- languages, frameworks, libraries, browser quirks, UI design... the list goes on. And there's some new evolution every year you've got to learn to keep ahead. Let's go over all the skills I possess to do my job:
HTML
CSS (including quirks for 4 browsers)
Javascript (and frameworks)
Python (along with Django and dozens of libraries)
Sysadmin
Design (not super common for dev-focused folks, but I do this too)
And that doesn't even include all the APIs and other nonsense I have to interface with. Or emerging tech like Comet. Or asynchronous network programming, which I also do daily.
Oh, and since when does having a "process" make you better? "Waterfall? Spiral? Agile? Forget it," indeed! These have nothing to do with Web vs. Desktop, by the way: they're processes used to manage teams. It's too bad you need a team to create your product, Mr. Compiler Guy; I've been a one-man show for a decade. Don't fault me for my ability to have a large, diverse knowledge base.
Web development has a lower barrier to entry, that's a fact. Unfortunately, he not only failed to make the argument that it also has a lower ceiling, he made himself look like a complete tool in the (flailing) process. Here's a pro-tip:
Stick to your IDE, Michael; you can't write for shit.
Confusing web designers with web developers seems to be really, really common. And Michael has fallen into the same trap. I can't make a good layout to save my life. I've had far too many calls of people looking for a developer when they really want a designer.
All the issues he talks about happen at the desktop scale; I have to deal with them at the server scale.
He also compares some lone webdev working out of their home to a programmer working out of an office. When you're a one-man band you rarely need to document those kinds of requirements. This doesn't change whether you are writing for Windows or for the web.
BTW, I do Windows, Web, and recently Cocoa. I find it frustrating sometimes that getting events right in Winforms can be more taxing than doing the same on the web. I do enjoy having my Windows layouts behave, though. If you've had to write for IE6-8, Safari, Firefox, and Opera, I'm sure you'd agree.
To this I say: it is your choice. You can stay at MS and continue developing desktop products. A 'Web Guy' has to deal with bad and ugly browser technologies and mixed language technologies, deal with scaling problems, worry about security, etc. I could think of at least ten web-based projects that are next to impossible to develop. How about http://www.wolframalpha.com/? He is confusing 'Web Development' with 'programming' a blog!
The fact that anyone can program for the web is a bonus, not a disadvantage. It forces the software monoliths to move! My kids all learned programming at high school, but that did not make them programmers. The field has always had the CS scientists mixed up with the Cobol programmers at financial institutions, the Ada defense contractors, etc. Now we have a different landscape, but we can do without this type of bragging, which is akin to ...mine is ..bigger than yours... and you are not a man because...
"The reason most people want to program for the web is that they’re not smart enough to do anything else."
One thing he doesn't understand about the web is the dangerous power of blogging. With one sentence you can permanently brand yourself as an idiot to tens of thousands of people.
Shorter: Based on one or two poorly coded web apps, which are also examples of the best web programming ever, I have concluded that web developers are stupid.
Shorter comment thread: I have decided that all of the hard CS problems in web programming, including, but not limited to, scaling, machine learning and recommendations, high availability and fault tolerant systems, and everything google does, are called "server programming," because I would otherwise have to admit that many web developers are smarter than me. Also, javascript was not designed from the ground up by Microsoft to draw graphics on a canvas, so everyone who uses it for that purpose is also stupid.
> There’s a reason I work where I do. As far as I’m concerned, the people I work with are doing the most innovative and amazing things on the planet. What a thrill it is to be a part of it.
I generally don't take delusional people's opinions very seriously... and this is no exception. This is one of those rare instances that Jeff Atwood is right (http://www.codinghorror.com/blog/archives/001296.html).