Programming Languages: What tool is right for which job? (hammerprinciple.com)
138 points by obiterdictum on Nov 27, 2011 | hide | past | favorite | 56 comments


I think it's a good question. There are so many languages around that it is hard to choose. I think it's beneficial to get good at a handful of languages and use them. However, there might be a time when a newer language is just so great that you have to use it. For example, in web development should you use plain JavaScript? jQuery? Dart? CoffeeScript?

Personally, Python, Java, and ObjC get me everything I need, and they let me get really good at a few languages instead of trying to keep 10 different ones in my head. Granted, if you know a few languages, any new one should be fairly easy to pick up.

You don't want to start a job with some new language only to find out that it hasn't evolved enough to give you exactly what you need in your project.

I'd love to hear how people decide what language to use in their business. Do you pick Python over Java because it's simpler to write? Do you use PHP because of how well it's documented?

Like others, I'm saying this based on the title as I can't get the page to load.


I think this tool gives too fine-grained an outlook about whether a language is "suitable" or "not."

Some languages (e.g. Matlab, Erlang) were designed for specific domains in which they excel, but most were designed to be general-purpose. So, most of those languages are really suitable for most tasks.

There are few things one might code in Java, for example, that I would not consider using Python for. The two are very different, of course, but for non-specialized applications the practical reasons for using one over the other come down to other concerns: ubiquity, your own familiarity, interoperability with your existing infrastructure, etc. Me, I'll just choose the novel general-purpose language out of interest.

Then again, a few languages I've worked with (e.g. PHP, Matlab) thoroughly deserve to be crucified for their crimes against computer science and developers everywhere.


I did what I always do when presented with a list like this - I checked out Visual Basic. The list makes no distinction between BASIC, Visual Basic and VB.NET. That killed any credibility the list might otherwise have had, and after realising that it appears to be more of a "what's hot, what's not" list, I stopped reading. It definitely does not do what the title suggests it does. No value.


I think it's safe to assume the rankings are for VB.NET.


I doubt it. VB.NET does what C# does. There are more VB.NET code searches on MSDN than there are C# code searches, which doesn't align with the ranking in the article. The rankings make more sense when applied to VB6 and earlier.


That may be the case, but on the other hand C#'s TIOBE ranking is quite a bit higher than VB's. And really, neither metric is a good proxy for how much code is actually being cut in either language.

The only thing that's certain is that language popularity surveys are painfully easy to cherry-pick.


Sorry the site's having issues. I haven't done any significant work on it in the better part of two years, and there's rather a lot more data in it than when I actually worked on it.

So, yes, it's true, I haven't used the right tool for the job. Except that, as with so many things, "the right tool" is "an actively maintained codebase". Looking into why it's crashing now, but don't expect any miracles.


Should be running (a bit) better now, though likely still on the slow side.

Turns out a query I'd written back when there were 20k votes in the system didn't perform so well now that there are 200k votes in the system. Go figure.


Scale is always one of the most difficult things for me to plan for. Over last summer I was working on a registration system for an ISP which worked really well in testing. Once the database exploded to over 1M rows, however, the application's performance suffered noticeably. Unfortunately that system is now live, and the schema can't be changed for a few months... oops
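In my experience the usual culprit in cases like this is a query that falls back to a full table scan once the table grows. A minimal sketch using SQLite's query planner (table and column names are made up for illustration; the ISP system presumably used something else):

```python
import sqlite3

# Hypothetical registrations table standing in for the real system.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE registrations (id INTEGER PRIMARY KEY, email TEXT, created TEXT)"
)
db.executemany(
    "INSERT INTO registrations (email, created) VALUES (?, ?)",
    (("user%d@example.com" % i, "2011-11-27") for i in range(10000)),
)

query = "EXPLAIN QUERY PLAN SELECT * FROM registrations WHERE email = ?"

# Without an index on email, the planner scans all 10,000 rows;
# at 1M rows the same query is 100x worse.
plan_before = db.execute(query, ("user42@example.com",)).fetchone()[3]

# With an index, the same query becomes a B-tree search.
db.execute("CREATE INDEX idx_email ON registrations (email)")
plan_after = db.execute(query, ("user42@example.com",)).fetchone()[3]
```

The silver lining is that adding an index usually doesn't count as a schema change in the sense that breaks clients, so a fix like this can often go in even on a frozen schema.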


I like the idea but feel it's an echo chamber. I haven't seen a single surprising ranking yet.


Really? Go is on there, but ActionScript is not?

I realize that the future of the Flash Platform is in doubt, but Flash is still a great choice for a lot of things, like:

- This language is good for beginners.

- This language is well documented.

- I find this language easy to prototype in.


Why would a beginner want to learn a dying technology? Unless you were writing games, Flash was never the right answer.


I do write games, and Actionscript (3, not 2) is a lovely language.


Since I specifically excluded you, what's your point?


That Actionscript is still a useful language within its niche. And since this post is all about finding the right language for particular niches, I don't see why you'd argue against its inclusion.


I'd already made that point by excluding games from my statement. When someone says "except for X, blah blah blah", you don't respond by saying "yeah, but I do X".


But you can respond by going "But it's silly to exclude X from the discussion". Which is what I have been trying (and, evidently, failing) to convey.


I didn't exclude X from the discussion, I excluded it from my criticism; that means I already agree X is good for games and there's no reason to bring it up.


Although I no longer use it, I learned to program with ActionScript. I studied art and had no background in programming, but I found it fun to play with Flash and AS. For example, getting a webcam involved is a couple of lines of code. Now you can do basic motion detection and all kinds of cool stuff. It is a good way to learn programming as everything is included in one package. You can use the Flash GUI to make a simple interface for your application.


That's an argument for including Actionscript in the list. That way a beginner might know the following:

"This language has a niche in which it is great"

"This language has a niche outside of which I would not use it"

"I enjoy playing with this language but would never use it for "real code""

"I am sometimes embarrassed to admit to my peers that I know this language"

"The thought that I may still be using this language in twenty years time fills me with dread"


Python and Ruby fit into all of those categories. And I'd guarantee that in 5 years they'll be even more popular, easier to hire for, and more polished as languages.

Can ActionScript say that?


I think there are statements about precisely those things.


Something the site didn't answer: what high-level languages are suitable for writing highly-parallel network code? I need to write some code that does stuff like: given a couple million URLs, download each, extract an element, and gather all extracted elements in a list.
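To make the task concrete, here's roughly the shape I have in mind, sketched in Python with a thread pool (the function names are mine; a real `fetch` would use urllib or an HTTP library, and a real `extract` would be an HTML parser):

```python
from concurrent.futures import ThreadPoolExecutor

def scrape(urls, fetch, extract, max_workers=50):
    """Download each URL with `fetch`, pull one element out of each
    page with `extract`, and gather the results into a list.

    Threads work tolerably here because the job is I/O-bound:
    CPython releases the GIL while a thread blocks on the network,
    so up to `max_workers` downloads proceed concurrently.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        pages = pool.map(fetch, urls)   # downloads overlap in time
        return [extract(page) for page in pages]
```

For a couple million URLs you'd also want retries, timeouts, and politeness per host, but that's the core loop I'm asking about.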


I suppose if you use ZeroMQ, any high-level language that has ZeroMQ bindings will be able to do the job. That is 30+ languages, including Python, Ruby, Lua, Perl, Node.js, Objective-C, Java, Scala, etc.


"Highly-parallel network code" is the problem Erlang was designed for. Haskell would also be a good fit.


"Erlang was designed to program fault-tolerant systems" according to Joe Armstrong [1].

[1] http://www.infoq.com/presentations/Systems-that-Never-Stop-J...


How do you make a system fault-tolerant? Make it distributed. How do distributed systems work? By communicating via a network.


Sounds like a job for Clojure; lazy seqs and Java's threading and network libraries make this sort of thing not too terrible.


The choice of language is not so important here, but the choice of a language implementation is. If you're doing a lot of network I/O in parallel, you need a language and a framework that can take advantage of hardware concurrency (multiple threads and/or processes) but can also switch quickly between your tasks (without requiring a full OS-supported context switch, which is slow).

This pretty much rules out e.g. Python and Ruby, because the CPython/CRuby interpreters suck at concurrency (there are other implementations available, which are a little better). You can still run many Python processes which do lightweight switching between I/O tasks somehow (e.g. with Twisted), but then load balancing between processes becomes a PITA. Node.js is similar: there's only one thread running at once. With Node.js (and Twisted) you also have to manually "compile" your code into asynchronous continuation-passing style, something humans suck at but compilers do very well.

Now given the task you said, you could take C and epoll/kqueue and write a small framework yourself, but processing the data you got with C might not be too nice.

Or you could use an interpreter or a compiler that does this automatically for you. This is where the choice of language kicks in. In order for the language implementation to be able to effectively organize parallel execution, it needs some information from the language. C doesn't have any helpful information, which is why it often relies on full OS context switches where all machine state is stored and restored (or a non-portable hack with the stack). Something lisp-y with first class continuations might be helpful, but many lisp implementations don't really do concurrent execution well.

So, for a language that concurrently executes network I/O efficiently while still being high level and fast, my recommendation for your task would be: Haskell. There's years of research and engineering work done on just what you're looking for in Haskell. Just forkIO as much as you wish, write your code in regular sequential imperative style, compile for multithreaded execution (ghc -threaded) and let the compiler and runtime do all the hard work for you.

Erlang might do the job too. If anyone knows of other languages with smart I/O multiplexing and co-operative userland threads or fibers (with concurrent execution!), please reply here!

NOTE: my assumption was that you're actually processing the data you're receiving and you're more or less CPU limited. If you're I/O bound, you don't necessarily need concurrent execution and may get away with Twisted or Ruby fibers or Node.js (without using multiple processes and load balancing or task queues).
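To illustrate the I/O-bound case: the single-threaded cooperative style described above (Twisted, Node.js, fibers) is essentially what Python's later asyncio module packages up, with coroutines doing the continuation-passing transform for you. A toy sketch, where `asyncio.sleep` stands in for a real socket read:

```python
import asyncio

async def fetch(url):
    # Stand-in for a network call; awaiting yields control back to
    # the event loop exactly the way a pending socket read would.
    await asyncio.sleep(0.01)
    return "<html>" + url + "</html>"

async def main(urls):
    # Thousands of these coroutines can be in flight on one thread.
    # The loop switches between them at each await point: no OS
    # context switch, and no hand-written callback chains.
    return await asyncio.gather(*(fetch(u) for u in urls))

results = asyncio.run(main(["a", "b", "c"]))
```

As the parent notes, this buys you cheap task switching but not parallel CPU use; for that you'd still combine it with multiple processes.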


How inefficient is Ruby really? This tool surprised me. Can any Ruby-ists chime in?

http://therighttool.hammerprinciple.com/statements/programs-...


It tends to hang out at the bottom of the Shootout [1]. Its phenomenal cosmic powers come at a price. But note that the ratio of Ruby to Python or Perl is generally within 2x.

(The Shootout is not the One True Measure of speed... but it isn't meaningless, either.)

Generally the only thing this really means, though, is that you wouldn't want to do hard-core computation in pure Perl/Ruby/Python, hence things like NumPy.

[1]: http://shootout.alioth.debian.org/u64q/which-programming-lan...


The interpreter in 1.8 and earlier is definitely pretty sluggish; I believe the 1.9 implementation was intended to be quite a lot faster, and JRuby is pretty fast. No idea about Rubinius or IronRuby or other alternative implementations.

It's definitely the case that stock Ruby 1.8 is slow for a great many applications, though. Having said that, there are alternatives, and a lot of the time execution efficiency is not at the forefront of the objectives for systems written in Ruby.


> I believe the 1.9 implementation was intended to be quite a lot faster, and JRuby is pretty fast.

"faster" compared to what? "pretty fast" compared to what?


As a sibling comment mentioned, 1.9 is fast relative to 1.8. JRuby is pretty fast in general, with the potential to approach Java speed at times, although I think specific things can slow you down a lot. In particular it was the case (not sure if it still is) that exceptions in JRuby were pretty expensive, but many Ruby libraries treat them as cheap, leading to performance problems if you move from one platform to another.

I was not trying to be precise though, just give a feel for the different implementations.


Than 1.8, as per the part you trimmed from your quote.


Is it me, or do a lot of the 'statement' links not seem to load?


It is possible they didn't use the right tool for the job.


Heh, indeed, and in a couple of days: "How I improved my site after Hacker News", etc.


Eh, the site's held up to Hacker News 4 or 5 times in the past. Never been a problem before.

It's been around for two years slowly bitrotting (I really should do some maintenance on it). Just gradually been getting slower and this time it hit the threshold where it was too slow to stay up. It happens.


Yup getting 503s all over the place.


I can't get the index page now. Sorry, I think the traffic from HN may have killed it. :(


Also ran: "What tool should I have built my website with?" -- can't get a pageload


This isn't the right question to be asking.

There are certainly cases where one language beats another at a given task, e.g. Ruby is better at string parsing than PHP. But in many (most?) scenarios the technique used for the job matters far more than the language. Sorting is an easy example: bubble sort is a crap solution regardless of the language.

Disclaimer: I can't read the actual page due to 503's and the cache isn't helpful, so I'm responding to the title and what I can divine from the linked page.


While I agree with this statement (that technique is more important than superficial or trivial language decisions), I think there is a second level of reasoning that needs to be brought into play here.

Once you know several techniques and several languages, and have started to develop a skill for identifying the core bits of a project, the process (or series of questions to answer) becomes:

- what technique will effectively handle the core problems?

- What languages make this technique simple to implement?

and then the real kicker, which in a way comes full circle to the original naive analysis:

- in the best language for $technique, are there any problems with the secondary and ancillary concerns associated with it? (e.g. "I have to do this really fast lookup table of simple operations; C would be great for this, except that the data is irregular and involves a lot of string parsing, which C sucks at", or "the central problem here is just a big fold, so I'll use Haskell, but it involves a lot of tricky memory optimization to be fast enough"...)

Out of this of course arises a meta-technique of learning to partition problems into stages in different languages, which of course has its own problems.


The "tool" decisions are usually made at the beginning of a project.

In that context, I disagree with your assessment. It often takes me a week, sometimes more, to evaluate and choose the right solutions for a new product. The techniques that I use when coding matter little during that time.

What _does_ matter: third party services and libraries that will make it easier to build a better product.

However, I agree that choice of language is only tangentially significant. It's rarely a "pain point" for me. GUI toolkits, cloud services, hosting solutions, payment solutions, libraries for [you name it], etc.: _those_ make up the "tool for the job."


Actually, bubble sort has very nice access locality characteristics which mean it can be a great solution in some circumstances...
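To make that concrete, here's a minimal sketch of bubble sort with the standard early-exit optimisation; on nearly-sorted input it finishes after a couple of strictly sequential, cache-friendly passes over the array:

```python
def bubble_sort(items):
    """Early-exit bubble sort.  Returns (sorted_list, passes).

    Each pass walks the list front to back, swapping adjacent
    out-of-order pairs; memory access is purely sequential, which
    is the locality advantage.  If a pass makes no swaps the list
    is sorted and we stop, so nearly-sorted input needs few passes.
    """
    items = list(items)
    passes = 0
    while True:
        passes += 1
        swapped = False
        for i in range(len(items) - 1):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:
            return items, passes
```

On adversarial input it still degrades to O(n^2) comparisons, which is the grandparent's point; the locality argument only wins when the data is already close to sorted, or small enough to live in cache.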


This was extracted from computer scientists and created by one; otherwise PHP would be number one for the statement "It is so easy to get things done with this language". Right now its 10 statements are all negative (not even one neutral).


Even if you are adept in using a highly efficient and effective language for each job, if no one else is using the language its utility becomes limited. That's because none of us work in a vacuum. Computers run by different people in separate locations cannot talk to each other unless those people cooperate. They have to agree on some things. And further, we're all using software and systems designed and built, in whole or in part, by someone else.

I sometimes wonder what we could achieve if the choice of languages were reduced and we were all forced into "speaking the same language". Would the advantages of a common language supersede the advantages of any one language's design?


You wonder what would happen if we all wrote C++ on Windows? I hope you can answer that question yourself :).

But seriously, I think it's great that we can all work in our favorite language and get away with it. I can write a web service in Common Lisp, and you can use it in your Rails cat picture app, which also uses a brainf*ck script to convert the cat pictures into a suitable format. And your company can also write an iPhone version of your app in ObjC and a desktop version in C++ for Windows and Linux. Man, protocols are awesome :)


It's just a thought. A "what if".

I trace the existence of many languages to having no common instruction set in hardware.

Different assembly languages for different hardware. This was very frustrating for people many years ago.

If protocols (rules) are awesome what if we had had a protocol that asked the chip makers to use a common (extendable) instruction set for all hardware? What if there had only been one assembly language?

It seems that all abstractions, from Pascal or Lisp virtual machines to C to higher and higher level languages to the ones popular today, are all descended from the search for a way to deal with that initial lack of protocol (rule) to get hardware makers to use the same instruction set and thereby let programmers use the same assembly language.

I could be very wrong on this.


> It seems that all abstractions, from Pascal or Lisp virtual machines to C to higher and higher level languages to the ones popular today, are all descended from the search for a way to deal with that initial lack of protocol (rule) to get hardware makers to use the same instruction set and thereby let programmers use the same assembly language.

Maybe I am misunderstanding this.

But Lisp and C derive from very different views of the underlying machine. It is rather hard to unify a lisp machine (or other lambda calculus machine) instruction set with the instruction set assumed by a language like C. Even assuming extensibility of the instruction set, constructing a machine that can execute both the "usual" instruction set and lambda calculus efficiently is very, very hard.

So this may not really be practical.


So with Lisp or even Forth, the proper approach would be to have "Lisp chips" or "Forth chips" (which were recently discussed here)?


Even if there was One True Assembly, few would want to program in it directly, and we would still have different ideas about how to generate it. There would still be functional people and OOP people and procedural people. There would be tradeoffs where different approaches would be objectively better.

I would like a One True Assembly because it would make it easier to develop new languages. :)


I guess this is kind of what is happening in the browser at the moment.


Haskell.


Lisp.



