
Honestly, I've coded loads in perl, and I think the language has a great place in the Unix user's toolkit. I certainly use it for small scripts all the time. However, the 10,000th time I had to type 'man perldsc' because of the ugly hack that perl data structures and their syntax represent ("Access and printing of HoHoAoHoHoAoA", anyone?), I swore never to use it for complex projects. I learned PHP, and never looked back. Do yourself a favour and learn Ruby, Python or PHP. Don't use perl for complex web projects - it's simply not the problem domain it was designed for, and it shows. I mean, look at the code example on that framework's page: "my $self = shift;" .. who can be bothered with such cruft in 2011? Get with the program!
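
For the uninitiated, perldsc exists because access to nested structures looks roughly like this (a from-memory sketch):

    # a hash of hashes of arrays: the sigils and braces pile up fast
    my %HoHoA = ( alice => { pets => [ 'cat', 'dog' ] } );
    for my $name ( keys %HoHoA ) {
        print "$name: @{ $HoHoA{$name}{pets} }\n";   # prints "alice: cat dog"
    }
    # and via a reference it becomes $ref->{alice}{pets}[0] ... hence 'man perldsc'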


Uhm, sorry, but that's nonsense. I know PHP and Ruby - the latter very well. While Ruby is a great language, I prefer Perl in most situations. I consider both of them to be very good, but the primary reasons for choosing Perl are CPAN and the fact that it is older, and therefore a lot of the shortcomings that other scripting languages still have were "fixed" long ago.

And yes, I know the benefits of Ruby's OO. I have used Smalltalk, so please don't tell me all the wonderful things you can do - I know them. But using Perl is like drawing on a lot of accumulated experience; that's Perl's strength. There is a lot of knowledge baked into it, and it's designed for practical work in the real world, not to look good or to be easy to learn. I still recommend Ruby to people who ask me about programming, simply because I think it's easier to understand a lot of things if you start out with a language like Ruby. But staying with one language is just stupid. You will build deep experience with that particular language, but knowing different approaches to solving problems is even better.

Then there is the TIMTOWTDI dogma that receives a lot of criticism, but I think it's what keeps Perl alive. It makes the language so flexible that you can use it like a completely different language: for small scripts, for one-liners, or for huge applications. Also, it's not true that there are no startups using Perl - DuckDuckGo is maybe the most popular example.

But this doesn't mean you should all use Perl. People do things in different ways and think in different ways. It's not about choosing the one language that receives the most hype. Just learn languages and, more importantly, understand them - and by understanding I mean the philosophy. A lot of C people, for example, use Perl as if it were C and then say it's bad because it's like C but slower. GitHub is a nice way to find out how people actually use a language and to pick up some inspiration. Everyone should at least try Smalltalk and a language like Lisp or Haskell, and everyone should learn what assembly is all about. It will make you a better programmer no matter which language you end up using.

Oh, sorry. This is so off topic. I just wanted to know what stuff I might have missed, because it's really been a while since I last had a look at other languages. Sorry!


> it's simply not the problem domain it was designed for, and it shows

Ah yes, the "single purpose tool by design" fallacy. PHP wasn't designed for Ajax or JSON or HTML 5. Perl wasn't designed for the web.

Install a library. It's a general purpose programming language designed for you to use libraries.
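
For instance, JSON is one core module away (a trivial sketch; JSON::PP ships with Perl 5.14+, otherwise it's one cpan install):

    use JSON::PP;
    my $reply = encode_json({ status => 'ok', langs => [ 'perl', 'php' ] });
    print "$reply\n";   # e.g. {"status":"ok","langs":["perl","php"]} (key order may vary)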


I just wanted to say you might be hellbanned; it seems all your recent posts (after this one) are dead.


I disagree strongly with the article's premise. Don't take a job where a language and framework you don't know have already been chosen, and where you are still at the point where building basic websites is a 'project'. Why not? Because the company you are working for obviously has no idea what it is doing with its time and money, and you're therefore unlikely to be in an environment with skilled coworkers.

If you are really a beginning web developer, read the HTTP RFC. Learn HTML. Learn JavaScript. Learn CSS. Learn basic Unix. Pick one simple project and implement it in three different languages. Then implement it with three different database layers. Then implement it with three different web servers. Then implement it on three different operating systems, or at least Linux distributions.

Now try to benchmark and scale. Compare multiple front end load balancers. Compare various NoSQL database architectures and caching solutions. Try some cloud hosting, see where it falls down.

This will teach you far more than fiddling in one framework.


I agree if you're doing this full-time. My job was a part-time Rails job for a professor. There are lots of opportunities out there that you can do on nights and weekends if you have another full-time job. It's a whole different situation from what you're talking about.


+1! The only IT people worth their salt are those who have natural curiosity. Without it, you cannot sustain learning as technology evolves and the industry changes, and your skills will become irrelevant. Incidentally, this is probably why a lot of formally trained CS / software engineering grads actually suck. They never put 2+2 together for themselves, out of interest, in their own time, for fun.


The simple fact is that many people need drugs to deal with day-to-day living in US society. They are told constantly they should be happy, healthy, beautiful, hard working. The US has fewer holidays than other western nations. It also has less international news and arguably a more homogeneous media culture. For many of these people, if they can take a pill and not worry, then they will do so. It's shocking to a non-US citizen how much drug advertising is on American television. Ask any foreigner - they have all noticed. It's the sort of thing that would be outlawed in other countries. Particularly the wording - "ask your doctor for ...". On the other hand, natural drugs with proven benefits (I believe a recent magic mushroom study strongly suggested that a single dose keeps people happier for a year afterwards) are outlawed. THX 1138, anyone? sigh


> Particularly the wording - "ask your doctor for ..."

This of course just adds to the stress and unhappiness as described by Barry Schwartz in this interesting TED talk http://www.ted.com/talks/barry_schwartz_on_the_paradox_of_ch...


So you essentially Web2.0 a TODO text file, doing s/CRLF/<arbitrary new record type>/ and letting some other people view it, maybe ... add timestamping or version control or some other cruft, maybe a couple of mobile-platform-specific interfaces and/or an API (whole thing <1 weekend coding, add 2 weeks for UI/UX) ... somehow con fools into using it, then someone gives you 50 million USD. This is why the US economy is going down the drain. Unbelievable.


Are you serious? Please try and replicate Evernote in 3 weeks.


Firstly, you originally posted this article, so I can see that you are biased against cynical dismissal of the company. (I tend to feel that, in the interest of openness, you should have declared that when refuting my skeptical take on Evernote, but now I've done it for you.)

Secondly, it seems from your background that you are 18 and your self-identity is closely linked to programming 'apps' and the whole money-for-dotbubbleapp2.0 thang. I am happy for your (apparent?) success thus far, as you yourself spin it. However, I would suggest that you consider that perhaps you do not have enough perspective yet to understand that this whole thing is just a game... a very stupid game... and ultimately not necessarily producing anything at all that is meaningful or useful to humanity, the economy, the country, or your dog: despite big scary amounts of money being thrown around.

In answer to your question: I am serious. It appears to me that nothing Evernote has done is actually new or unique. All they've done is drawn a cloud around some stuff that existed before (I can hear the squeak of the whiteboard marker right now...), put some marketing dollars down and made some sexed-up UIs for a few device types. That's it.

Then got a bunch of money.

Of course, they probably wasted a lot more money doing it less efficiently ... and used a lot of stupid Silly-valley old-boy networks to get the cash ... and burned a lot of time. And maybe they actually did produce something new along the way. But basically... nothing new. Skepticism is king.

TL;DR? That's sad, because then I can't recommend that you try The Art of UNIX Programming or any tome on code generation. But I hope you do. Also, take a tip and don't aim for fame - "One day I hope you won't need to look me up to find out who I am" (puh-lease!) ... we were all 18 once.


I will give you $20,000 if you can create a viable competitor to Evernote in 3 weeks.

PS: there are some things called aspirations. I'm sorry I'm not jaded enough yet.


Have you ever tried using Evernote? It is much more than text files, and some of it is quite sophisticated, such as automatic approximate OCR of images.

The UI/UX/API/mobile and desktop apps/backends, all of which you wave off without thinking, are all non-trivial to get right. The fact that Evernote doesn't have much competition right now should tell you something.


I am going to have to stop replying to these comments soon because they're not well thought out. But here's another response.

More than text files? Yes, there's some binary file support too. "Ooh"

OCR is not hard. They probably just use http://code.google.com/p/tesseract-ocr/
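
If so, the heavy lifting is roughly one command (from memory; check the docs):

    tesseract scan.png scan   # OCRs scan.png, writes the recognised text to scan.txt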

The fact that they don't have 'competition' proves nothing. Maybe nobody wants their 'product' because it's not useful. As others have commented even here - people who've actually tried multiple times to "get it" - it's really not useful for a lot of people; they just don't see the point.

Maybe the lack of people copying their "product" is because it's just marketing fluff around a solution to a problem that doesn't exist (really, very much, for most people, apparently).


If you think you could create full-featured native apps for five different platforms, plus the web UI and a synchronizing backend that can support millions of users, all in three weeks, I have a hunch you are overestimating your abilities.


They are profitable and hiring people. How could they possibly be bad for the economy?


In reality, economics is not about money.

If you produce resources or services of value, then the future is rosier because others will trade with you in exchange for access to those resources or services.

If you make some piece of software junk that's dismally easy to copy and do so based on speculative capital, particularly when it provides arguably zero unique value, then what you've done is wasted everyone's time... including that of your own workforce. See http://en.wikipedia.org/wiki/Evernote#Similar_products_and_s...

Unfortunately, this sort of thing is rife in the US.

Back to reality: I cannot conceive of China suddenly sending squillions of RMB to the US because someone figured out how to Web2.0 a TODO text file. Hence, skepticism on real economic value.


You do have a good point. I mean, deep deep deep down, Evernote is just a glorified Notepad. Most of their users are on the free plan, and I can't imagine them being a $100 mil company selling glorified Notepads for just $5/month when they aren't that popular in the enterprise market.


That's like saying "Dropbox is just a glorified folder."

It's true enough, but that doesn't mean that your underlying premise is valid.


Then spend 2 weekends coding and make something better.


Why would you replicate something that already exists? That's what Evernote has done, with very minimal spin. It's pretty close to empty marketing crud.

A superior solution would be any version control client and regular files. I'm not intimately familiar with Evernote but I'd wager that there are at least ten major features of good version control systems that Evernote doesn't have. Features like decentralisation, transparency, open source, mature security model, mature fork and merge model, commit hooks, etc.

Failing that, even rsync would be a good solution for a single user. It's probably more efficient on the wire than whatever system Evernote uses.
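
Concretely, something like this (a sketch - adjust paths and host; any DVCS would do in place of git):

    cd ~/notes && git init -q && git add . && git commit -qm 'snapshot'   # history, fork/merge, hooks for free
    rsync -az ~/notes/ user@host:notes/                                   # compressed incremental sync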


If you are correct (not saying you are) then someone could build a wrapper around those on Android and iPhone and make a killing.


"CVS should be enough for everyone" ;)


Or ask: what sucks about trying to use Dropbox on your mobile?


In the end, most web apps are just glorified CRUD interfaces. But glorified CRUD interfaces are still valuable businesses.


I agree. However, in my worldview, life being short, one should eventually learn to think past value as presently defined by the US market, because the cracks in that particular form of economic rationalism have grown far too great to hide.

J. Krishnamurti: "It is no measure of health to be well adjusted to a profoundly sick society."


It's almost never the wrong decision to try something new.


I liked WebOS and was enthused by the launch. I wanted to buy a device. I was in Taiwan, one of the world's consumer electronics centers. There were no devices. FAIL.

If you can't even distribute your product, then why bother building it? It seems they spent so many years and so much money on R&D that their marketing and distribution fell flat.

I'm sorry but a forced-smile charismatic keynote (and that's being generous), even a very well presented one, cannot make up for the lack of global distribution. The US is not relevant as a leading consumer electronics market anymore. You need to either be global, or go to Asia.


This is lame and not HN-worthy because the use cases are almost zero.

0. Premature optimisation is the root of all evil.

1. Approximately zero big sites use Ruby.

2. Of the large sites, very few would actually care that much about tiny on-wire messaging optimisations, since other elements are far more significant to bandwidth use.

3. On-wire messaging in binary destroys transparency, which means PITA debugging and wasted time for programmers and API-using developers. Programmer time is far more important than computer time or a few bytes of bandwidth in most developed nations.

4. You have to pay for the binary serialisation with additional client-side code, which means you've actually gone backwards until you've communicated enough that the delta between JSON and binary-format message sizes has outgrown this initial load-time penalty.

5. Initial load-time penalties are often more important than subsequent use, since this is generally the only time client bandwidth (the only side of the bandwidth equation relevant to performance in 99.99% of cases) is going to be seriously challenged.

6. Refer to point zero.
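
To put rough numbers on 3 and 4 - a back-of-envelope sketch (Perl, since that's what I have handy; the same trade-off applies in Ruby, and the byte counts are approximate):

    use JSON::PP;                      # core JSON module
    my $msg  = { id => 42, ok => 1 };
    my $json = encode_json($msg);      # e.g. {"id":42,"ok":1} - 16 bytes, greppable on the wire
    my $bin  = pack('NC', 42, 1);      # 5 bytes, but opaque without the schema in hand
    printf "JSON %d bytes, binary %d bytes\n", length($json), length($bin);

You save 11 bytes per message and pay for it with schema code on both ends.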


Good lord you're a troll.

0. Few people who use this quote actually know its origin and context.

1. Scribd, Hulu, Justin.tv, Slideshare... there are tons of ruby sites in the top 500, not just Twitter.

2. Serialization isn't just for the wire; it may also be stored in memcache or the like. But even on the wire, cross-section bandwidth is scarce in large clusters. There's a reason Google developed Protocol Buffers.

3. Debugging by wire sniffing is shitty, period. I don't think binary vs ASCII is very meaningful. What matters more is the infrastructure on each end, e.g. whether the client and server have good reflection, monitoring and logging capabilities. These are what affect programmer time.

4. This is an RPC format; sending payloads to browsers is not a use case. Even so, on modern JavaScript implementations, decoding a byte-packed binary format is likely faster than the eval path used by naive browser JSON parsers, because it doesn't build any intermediate forms like token streams, ASTs, etc.

5. Same as 4.

Your view is both myopic and hyperbolic.


G+'s manageable exposure model may threaten the whole online dating arena when opt-in interest groups (Facebook 'page' replacements, possibly co-branded with Google Groups) with geolocation support and event functionality (possibly co-branded with Google Calendar) happen. Discuss.


I posted the same request. I run a Facebook page for a small special-interest community with a few hundred members. Members can meet each other and share relevant info, and a few people have moderator powers to handle the borderline-commercial crap that inevitably pops up from time to time. G+ doesn't have this.

Maybe Google Groups + G+ can have babies and make a superior opt-in, shared-community solution to Facebook pages.

The thing is, from Google's marketing perspective, this is also similar to LinkedIn and .. well .. it would just look way too much like "more of the same" for them to push it hard at launch. But the demand's clearly there; I think we'll see it added.

