
I can pretty much tell anyone who asks why your new programming language won't work.

It's because people like me, who've used C for 15 years, Perl for 10, and can't stand the Python/Ruby/C#/OtherNewFangledLanguage types... simply do not care about your language.

Yup, it's that simple. If you can't hook the guys who've been around the block a few times, what hope does your language have?




>and can't stand Python/Ruby/C#/OtherNewFangledLanguage types

Can you not stand them because you tried them and decided that they don't fit you, or because you refuse to acknowledge progress?

I won't commit an appeal to novelty by claiming that new programming languages will always be better, but stubbornly sticking to C, Perl or other good ol' languages for no other reason than to wave your cane at the new kids in town is just as close-minded.

The world of programming is so interesting precisely because there are always new things to explore, recent inventions and innovations to marvel at, and old concepts to learn, rediscover and improve.


Luckily most people's language options haven't ossified like yours. I'm sorry to say that sounds awfully boring. Learning a new, interesting language is one of the most rewarding and enjoyable things I do.


Well, in the end, the only thing that really matters is programmer productivity... and ultimately the proof is in the pudding, I think.

You're right in basically pointing out that if a new language can't communicate quantifiable productivity benefits, then what's the point of taking the time to learn it and switching over?


The digital equivalent of "Get off my lawn!"


How old do you think Python is? :3

(Answer: 20 years old; it was first publicly talked about in 1991. Python is as old as Linux!)


Here's my favorite trivia story about how old Python is:

You know JWZ's rant about how Java could have made integers first-class objects by using one bit as a flag for whether the machine word contained a primitive value or a pointer to an object, at the cost of only having a range of 2^31 instead of 2^32 (or whatever the word size was)? Python apparently had the same discussion in the early days, and made the same decision as Java (i.e. integers aren't tagged immediates; you have to box them to treat them as objects), and the reason was ... that the requisite bit twiddling was slow on DEC Alphas. (In fairness, at the time Guido was working for a government lab full of DEC Alphas.)
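
Not from the thread, but here's a minimal sketch in C of what that tag-bit scheme looks like, assuming a low-bit tag and word-aligned heap objects. The names (Value, box_int, etc.) are made up for illustration, not any real runtime's API:

    /* Hypothetical sketch: steal the low bit of a machine word to mark
       whether it holds an immediate integer or an aligned pointer to a
       heap object. */
    #include <stdint.h>
    #include <stdio.h>

    typedef uintptr_t Value;          /* tagged integer or object pointer */

    #define INT_TAG ((uintptr_t)1)    /* low bit set => immediate integer */

    static Value    box_int(intptr_t n) { return ((uintptr_t)n << 1) | INT_TAG; }
    static int      is_int(Value v)     { return (v & INT_TAG) != 0; }
    static intptr_t unbox_int(Value v)  { return (intptr_t)v >> 1; }  /* arithmetic shift assumed */

    /* Heap objects are word-aligned, so their low bit is always 0 and
       never collides with INT_TAG. */
    static int is_object(Value v)       { return (v & INT_TAG) == 0; }

    int main(void) {
        Value a = box_int(21), b = box_int(2);
        if (is_int(a) && is_int(b))
            printf("%ld\n", (long)(unbox_int(a) * unbox_int(b)));  /* prints 42 */
        return 0;
    }

That reserved tag bit is exactly where the 2^31-instead-of-2^32 cost in the rant comes from: you keep arithmetic on small integers allocation-free, but halve the range of values you can represent without boxing.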



