Hacker News

> 9. It's designed for large organizations. Large organizations have different aims from hackers.

This is where I think pg's bias steered him wrong. Hackers tend not to like Java (although that is less the case now, with the JVM ecosystem being what it is), but hackers do not solely determine the success of a technology. Furthermore, the type of hacker that pg is interested in (the lisper, the 10Xer, the startup founder) is in limited supply. Certainly they wield more influence per capita, but they still need critical mass to form a stable community around an open source platform in order to compete with large organizations. If you're looking to build a product or a small-scale project, by all means find yourself a Grade-A hacker.

If you want to make the next ubiquitous programming platform though, there are other qualities that are necessary. Consensus building is probably the most important skill that large organizations have that hackers will struggle with at scale. If you have a ton of management and mediocre programmers, it's easier to at least get them moving in the same direction, and they'll be more tolerant of the foibles of a designed-by-committee language. Sun had the perfect storm of strong technical talent to design a solid language, but also the large organization effects to market a new platform to enterprise decision makers.

In a way I think pg's smell test might be more valid for an earlier era. He underestimated the influence of the armies of mediocre programmers, probably because back in the old days there weren't as many and they weren't as mediocre. This is just speculation on my part, but I have to imagine that the quality of the average programmer has declined over time as the numbers have grown and programming has gained recognition as a career path. I mean, how many people were programmers in the 70s because their parents thought that's what they should go to school for?




Something perhaps worth pointing out: modern tooling (issue tracking, source control, automated tests, continuous integration, etc.) makes it easier and easier to extract productive work from mediocre developers, even at scale. One of the things it has taken me the longest to understand in my career is that you can build incredible things with sufficient effort, even if everything at every level is done half-assed.


One of the things it has taken me the longest to understand in my career is that you can build incredible things with sufficient effort, even if everything at every level is done half-assed.

You can, but those "incredible things" have a shit lifespan and they die at unpredictable times, and it costs enormous amounts of money to fix or replace them: in part because business requirements devolve into the "make it work exactly like the old system" style of project that makes good hackers leave and consultants ask for $50,000-per-week budgets; in part because the consultants recognize the unpleasantness of the work and need to hire a team to do it (anyone savvy is going to find a way to collect the benefits while managing, rather than doing, that kind of work).

Mediocre programmers generate write-only code and systems that work well on launch but start to shake shortly after, like American cars in the 1980s.

Here's another thing. You can get seriously good (1.6+) engineers to maintain Postgres or the Linux kernel, because even though the work is hard and painful (and in the OS world, often gratis) it's an opportunity to learn from excellent (2.0+) programmers who built the thing. What were their concerns, and why'd they solve the problem this way? It's a learning process. But no one good is going to maintain the complexity sprawl generated by mediocre programmers for a typical programmer salary. The market rate for a good engineer to do that kind of work is $500-1500 per hour (because there's no career benefit to the work). Pay less and you'll deal with rapid turnover and evaporative cooling (i.e. the good leave and the bad stay). Or, the typical corporate solution is to put powerless junior programmers (who are not competent enough to maintain a system that is falling apart faster than they can fix it) on that type of work, which prolongs the death spiral only slightly.


I voted you back up because I don't think it deserves the downvote, but here's the thing: mediocrity exists and it gets things done. For you and me, who do not wish to tolerate mediocrity, the only option is to find work environments that suit our tastes, but we're not going to stop the machine through our indignation.

Plus, even if a company rejects mediocrity in its DNA, a non-technical organization is most likely going to have structural issues that will drive out good programmers anyway. It's possible for a non-technical organization to learn to value and retain good programmers, but there are many factors that need to come together, and statistically I think the majority will never figure it out.

Finally, there's an amazing thing about code: once it works, it can continue to work in exactly the same fashion indefinitely, as long as nothing changes. From experience we both know that it's much better to do things right, so that you have a maintainable and flexible system that can grow with an organization, but that is not a strict requirement for code to generate value in the short-to-medium time frames that most companies operate on. Just because code is completely unmaintainable, and the cost of comparatively trivial maintenance becomes exorbitant, doesn't mean that it doesn't have real business value. We privileged alpha programmers have the luxury of avoiding jobs that have to deal with this, but it's out there, and it's a huge proportion of production code (I'd wager it's the majority). In the grand scheme of things it's all good, though; nothing lasts forever anyway. Bad code could be one more thing that hastens the demise of an atrophying company, or it could be quietly deleted and replaced by newer systems once it's outlived its usefulness.


there's an amazing thing about code: once it works, it can continue to work in exactly the same fashion indefinitely, as long as nothing changes

No business stays the same. All systems have to change, either due to evolving business needs or because the data inside them are growing. The first means bad code getting changed, generally at incredibly high cost. The second puts stress on critical points in the bad code, generally causing unexpected and intermittent failures.


Of course. I'm not sure how having read my post you could come to the conclusion that I don't understand that.

But the point is that this does not preclude businesses getting value from maintenance-nightmare software.


A pleasing fantasy, but a fantasy nonetheless. The world runs on software these days. Everything around you has software in it, at every level: firmware, micro-code, operating system, drivers, server, client, network, etc. And the vast majority of it was written by mediocre programmers. There's simply too much production code in the world for it to be any other way.

Yes, mediocre programmers create mediocre code, and they have a harder time maintaining that code, but they still manage. Oftentimes they have the advantage of money, resources, and people. With enough devs and a reasonable process, it is possible to "polish a turd" into something that is functional and feature-rich, even if it has a great many flaws.

This is the scary realization that most devs, including myself, have a hard time accepting. The software that your bank runs on? Yeah, that was written by hundreds of devs and if you interviewed each one you would probably only recommend hiring a tiny handful. The firmware your hard disk runs on? Same deal. Also your car's EFI module, your cable modem, your DVR, etc.


Spooky as shit but nonetheless true. Fact is, by definition, most code is written by average/below average coders with no vested interest in it.


I was thinking on this a bit today and I realized that we're actually a little bit better off in the software world, since good software can be copied infinitely. So well written operating system kernels, server applications, libraries, etc. can have a huge impact on the overall software landscape, regardless of how few people contributed code.

But even so, there is still far too much code out there which is the culmination of mountains of mediocrity. It is rather frightening to think about, considering that everything these days relies on software.


You're arguing that the median quality output of programmers isn't raised by good practices, good tools and good management by using examples that feature poor practices, poor tools, and poor management.

Some questions to consider:

Practices: Is the output of the median programmer raised or lowered when all code is reviewed by a senior programmer?

Tools: Is the output of the median programmer raised or lowered when the language includes memory management?

Management: Does management that insists on quality output and backs it up with time or resources raise or lower the output of the median programmer?


I feel like I miscommunicated and we're talking past each other, so let me try to fix that.

I guess I was reflexively reacting to a certain culture that tolerates mediocrity. Yes, tools are important. Code review, source control (Git, not that Perforce monstrosity), and continuous integration are necessary. For every level of programmer, such tools are necessary, but they shouldn't be used as an excuse to keep trying to nail 100 mediocre programmers together and make them build something. It still won't work.

Modern tools should be used because even great programmers have lapses into mediocrity or even stupidity. Version control has saved my ass on multiple occasions. I would, frankly, just assume that every decent programmer is going to insist on using a version control system.

Practices: I agree with you that code review is beneficial-- necessary, even. It's not enough, sadly.

Tools: Usually, tooling wars end up involving the command-line-vs-GUI debate, so I'll address that and ignore the others for now. IDEs are a double-edged sword. Use them when they make you more productive, but if I can tell that your code was written using an IDE, that's bad on you. I like Google's read-only IDE. IDEs are indispensable for certain types of forensic work but IDE-dependence is just awful.

Management: ok, I agree with you here. Managers who insist on quality and are willing to pay for it are a godsend (and also rare).


>> 9. It's designed for large organizations. Large organizations have different aims from hackers.

> This is where I think pg's bias steered him wrong.

I generally agree and it's ironic too since Java the platform was conceived with embedded programming as the target medium.

http://en.wikipedia.org/wiki/Java_(software_platform)

It wasn't until the very end of the '90s that Enterprise Java came on the scene.

So claim no. 9 was weakly founded in only about 1 year of history prior to the OP article.

I still think of Java more as the SE version than the enterprise version.


I think that that's a fascinating look at it, and yet, I think that the success of small startups, and the ever-shrinking tech company, has been a vindication, of sorts. You look at Facebook, and it has been a much smaller company than Google (the most successful company on the map right now, and one that uses Java widely). Twitter (where that Clojure guy Nathan Marz went) is even smaller than Facebook. Is it possible that language/management choices have had an effect on this? I certainly think so.

I think the verdict should still be out on Java, because I think that the success of Java is dependent on two confluent economic factors.

(1) there's a lot of people who want nice stable jobs, but aren't really pg's type of hackers per se, and

(2) "management" oriented people tend to have big pockets in our present economic environment

But as we remember from economics, competitive markets tend to drive prices down to the minimum cost of production. You can deploy vast amounts of capital to build a large organization that hires Java programmers, but is it possible to seed lots of small teams hacking in more productive languages, and is it possible for them to create more software that more people are interested in using? Heck, is it possible to use that model to just develop features that bigger organizations might buy out, instead of developing them in-house with the aforementioned armies? I dunno, but it seems like something worth exploring. ;-)


Maybe hackers tend not to like Java, but that doesn't mean that there aren't hackers that like Java.


Agreed, that is the reason for my parenthetical. Hackers can love anything useful, and they can find use in anything, and Java with its current ecosystem is incredibly useful.

That said, the nature of Java is not as exciting to hackers in general as more powerful languages like Lisp or Haskell or Smalltalk.


I was going to say almost exactly the same thing. The items that he listed are certainly detrimental from a startup perspective, but for a big company I think that they are actually arguments in favor of Java. Unfortunately for Paul Graham, and his prediction, the big companies have had more say in adoption than the startups. I agree with you. In hindsight, it makes perfect sense. But, it wouldn't necessarily have been obvious at the time.

Personally, I dislike the language wars. I think that a good developer should be able to write good software with the tools available. Arguing the opposite always smacked of the "silver bullet" to me.


In the end, all languages are Turing complete, but there is still a difference. It is similar to woodworking (from the small dabbling I've done in it): good tools make you faster and can produce better output. You can produce the same quality with lesser tools, but it takes a whole lot more effort.

But more important than which tools you choose (Makita vs. DeWalt, Chevy vs. Ford) is getting to know them. And that's where the language wars seem to fail: they miss the fact that a guy with 15 years of good Java experience will code circles around that <insert your language here> beginner.


XML is not Turing complete, nor are most configuration file formats. Yet these are used to provide significant behavior. They are programs. Programs are data. Data are programs.
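That point (non-Turing-complete data driving behavior) can be sketched with a toy interpreter in Python. The <pipeline> format and the run function here are invented purely for illustration, not any real configuration standard:

```python
import xml.etree.ElementTree as ET

# An XML "program": declarative data that an interpreter turns into behavior.
CONFIG = """
<pipeline>
    <step op="upper"/>
    <step op="strip"/>
</pipeline>
"""

# Dispatch table mapping configuration values onto actual code.
OPS = {
    "upper": str.upper,
    "strip": str.strip,
}

def run(xml_text, value):
    """Interpret the XML: apply each configured step to the value in order."""
    root = ET.fromstring(xml_text)
    for step in root.findall("step"):
        value = OPS[step.get("op")](value)
    return value

print(run(CONFIG, "  hello  "))  # prints "HELLO"
```

The XML itself can't loop or branch; the Turing-complete part lives in the interpreter. That division of labor is exactly what lets plain data "provide significant behavior."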


> XML is not Turing complete

No, but it's also not a programming language. Of course, there are languages that use XML as their syntax, and many of these (XSLT, for instance) are Turing complete.


> I think that a good developer should be able to write good software with the tools available.

While that's true, it would be hard for me to consider someone a good developer if they, given the choice, chose poor or inappropriate tools for a job.


Sure, if you have the choice. But that's where the Hacker News startup culture and the enterprise development culture diverge. In a small startup you can dictate the tools. In an enterprise company you may not have any say in the matter. You can either complain endlessly about it (and boy howdy, are people willing to complain endlessly) or you can do your job to the best of your ability with the tools available.

I understand where you're coming from though. I just think that you need to know a lot of context about the developer in question before you dismiss him/her over their tool choices. You need to know why they used those tools and how well they learned them.

But what it really comes down to is how fast the developer can learn something new. I'm fairly confident that I understand enough about programming language concepts to be up and running with a new language in a few days. I might not know the API (if one is provided) but I will probably be able to contribute to the team pretty quickly.



