The strict Pragma is a Cultural Marker (modernperlbooks.com)
75 points by Mithaldu on Nov 24, 2013 | 59 comments



Worth reading even if you aren't a Perl guy. Every language has this type of 'cultural marker' that sets your expectations for a codebase when you skim a few files or just the project structure. On that first dive into someone else's code[1], these markers are invaluable: they are the first cue into the author's mindset and background, and a signal for future maintainers or collaborators. It's like stepping off the plane and hearing people around you speak with your own regional accent. Even if you've never been there before, you now know some of your knowledge still applies.

[1] Rule of thumb, if you haven't touched your own code for a long time, think of it as being written by someone else. You are not the same person you were when you wrote it.


> Rule of thumb, if you haven't touched your own code for a long time, think of it as being written by someone else. You are not the same person you were when you wrote it.

This is really insightful, most especially the part I italicized.

I can think of a few particularly embarrassing moments when I've dug through old code of mine and immediately thought "What kind of idiot wrote this crap?" followed by a brief moment of horror when the realization sets in that the idiot was me. :)

I think it would also make a fantastic addendum to this [1] discussion (and the article to which it is attached), especially since many new programmers don't think much about code maintenance. Usually it amounts to "I (or someone else) will eventually rewrite this." Then it enters production.

[1] https://news.ycombinator.com/item?id=6789504


I've come to (try to) document / code as if the next person reading it will be both insane and stupid.

Usually that's me. Usually it was worth the extra effort :) And at the very least it's a measure of progress, as I'm no longer that insane and stupid.


Oh gosh, yes. I've found that approach to be the most helpful. In a year (or three?), I know that I won't remember anything about what it was supposed to do. I'm pretty sure I'm stupid or forgetful. Probably both.

I like your assertion of documentation being a measure of progress. As dumb as it sounds, even if you're the only consumer of an API you wrote internally for some one-off application, documenting it thoroughly is a tremendous time saver. Granted, I still slip up occasionally, especially if it's a quick script of sorts, but I find more often than not that when I do regret not having taken the time to document something, it's (now) usually something deceptively simple that I wish I had better curated, because it eventually finds new life as part of a greater work.


Yeah, especially with one-off scripts I keep upgrading my machine / reinstalling to clean things up, and wondering wtf -nRT means and why I thought it was important. Something like half the time I google / manpage around for a while and find out it wasn't important, or assumed something incorrect about what it was trying to do.

I've started wrapping everything I do frequently in small bash scripts, and documenting what and why. Saves a lot of time when migrating to a new system, since I can glance and say "nope, don't need it" within a couple seconds, or remember why it was useful (and sitting in a cronjob somewhere) and reinstall the things I actually use.


When writing scripts, don't use the shortened flags; use the long versions.

For example, the deploy script from a simple php app I wrote:

    rsync --recursive --links --verbose --rsh=ssh --exclude-from ./exclude_from_deploy.txt --delete ./ kazan:/srv/simple-paste/


You can't step in the same river twice; that's true in all domains, not just software. Admittedly it's worse with some languages than others (assembler and APL come to mind), but the same truth still applies.


I find that when I use 'strict' in my Perl, I have a psychological shift in how I write the code. Instead of hacking together some code, I start to take it more seriously, and start to care about everything from what I choose as my variable names to the overall structure of the code. I start to care about maintainability. To be honest, I end up eliminating lots of clever Perl idioms and make the code much more C- and Java-like, explicitly declaring most of my variables at the top of functions and making them verbose and consistent. I start to document everything and write lots of comments explaining what I'm doing. It really doesn't add much overhead to the overall development.
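
To make that shift concrete, here's a minimal sketch of what the pragma actually enforces (the variable names are purely illustrative):

    use strict;
    use warnings;

    my $total = 0;                 # strict forces an explicit declaration
    for my $amount (1, 2, 3) {
        $total += $amount;
    }

    # Without strict, a typo like the following would silently create and
    # update a brand-new global; under strict it is a compile-time error:
    # $totl += 10;   # "Global symbol "$totl" requires explicit package name"

    print "$total\n";              # prints 6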

I code as if I have to give it to somebody else, even if that other person is me in the far future.

It's served me well. I've returned to old pieces of code I wrote years before (even multiple tens of KLOC), and after a couple of read-throughs I was able to get to work without much fuss.

Fitting in with Larry's idea of programming languages as real languages, it's as if I've code-switched from a colloquial language to the kind of formal language I might use when giving a presentation.


> explicitly declaring most of my variables at the top of functions

While I largely agree with the rest of your comment, that one bit I can't generally agree with.

If you take care to keep all your functions below 15 or so lines, i.e. such that they fit on one screen, then that's fine.

As soon as you write bigger functions you end up impacting your code quality in two ways:

- readability and refactorability go down, because coders after you have to read more code to see the context of a variable

- the chance of bugs goes up because you're more tempted to reuse a predeclared variable in a loop, instead of redeclaring it at every start of a loop; or more insidiously, reuse the same variable as the counter in multiple loops (see the sketch below)

So please: Declare your variables as late as humanly possible, optimally only directly before they're used.
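
A minimal sketch of that second failure mode, with purely illustrative data:

    use strict;
    use warnings;

    my @matrix = ([1, 2], [3, 4]);

    # Risky style: one predeclared counter shared by nested loops. Because the
    # inner loop reuses $i, the outer loop's state is silently clobbered and it
    # terminates after a single pass.
    my $i;
    for ($i = 0; $i < @matrix; $i++) {
        for ($i = 0; $i < @{ $matrix[0] }; $i++) {
            # ... whatever work was intended; both indices are now the same $i
        }
    }

    # Safer: declare each counter in the loop itself, as late as possible, so
    # every loop gets its own lexical and the clash is impossible.
    for my $row (0 .. $#matrix) {
        for my $col (0 .. $#{ $matrix[$row] }) {
            print "$matrix[$row][$col] ";
        }
        print "\n";
    }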


Or, if that late just isn't your style, declare a few at the beginning of each small chunk of code where they are needed. Then, guess what? You can probably easily factor that chunk out to a new function, and end up with the lots of small functions solution.


I don't disagree with your points at all. Short-lived variables I do tend to declare later in the code.

But most of the major state variables (especially the ones that get returned, or important data structures I want to keep track of in the code flow) I stick at the top with comments and explanations of what the variable is and why it exists.


That's better to hear. I'd still invite you to try to make a conscious effort to make your functions smaller and more numerous. That might help you reduce the number of variables you need to actually keep track of. :)


Perl was the first language I really engaged with; it started my career. Why did it fall out of favour? (I left web development to write applications/C, then returned some years later to write APIs, but everyone had stopped using it.)


It was Perl 6.

As soon as it became clear that the move from 5 to 6 was going to be discontinuous, if you wanted to be a Perl developer you had to decide if that meant being a Perl 5 developer or a Perl 6 developer. Being a Perl 6 developer was the clearly marked path to The Future, but Perl 6 was years away from being ready to use in production environments, and it never seemed to get any closer. Perl 5 was practical for real-world use, but choosing to be a Perl 5 developer meant worrying that you were investing time and energy learning skills that were already marked as obsolete and writing code that would end up having to be re-written.

That kind of uncertainty is fatal to a language. Why put up with it when there are so many other languages out there with clear upgrade paths? Especially when the upgrade path for the one you're using now appears pretty similar to learning a new language anyway?

So developers drifted away from Perl to other languages, and that, as they say, was that.


I agree. All of what you said, along with an ambiguous release date for 6 that always seemed to be just around the corner, made me back off from digging really deep into 5, thinking 6 was constantly just a few months away.

After 3 or 4 years of crying wolf, I and the rest of the industry simply moved on.

When 6 finally started coming out, the complex release mechanism and the difficulty in firing up a fresh install of *-nix and typing "perl" really left a bad taste in my mouth.

Lots of that has finally cleaned up, but Perl6 is not a default package in any OS yet, and there's still years of performance improvements yet to do before it's really production ready.

I think the very fast progress of Go has also made me rethink spending time with Perl 6 and instead consider taking some time to get familiar with Go.

(I say all this as a very huge Perl fan who's solved some very hard problems in the past with Perl and had it help me make a successful career out of being able to solve those kinds of problems without lots of fuss).


I was once one of the most prolific authors on CPAN, and this, along with my answer above about lack of performance, drove me away.

Perl(5) just needed classes and a clean-up of a few bits of historic cruft (like the need to end modules with "1;", indirect object syntax, etc.), and it could have been an awesome language. Perl6 decided to throw the baby out with the bathwater.


Many people will attribute the "6" in "Perl 6" as the cause, but I attribute it to the "Perl" in "Perl 6". Perl 6 appears to be a sufficiently different language from Perl 5, much more so than the apparent difference between Perl 4 and Perl 5, that it should have had a different name, not just a different version number, out of the gate. Or maybe Perl 6 changed too much, too soon, to be just a single increment in the version number: the object model, the runtime, the syntax, more things were explicitly operators (I never really cared that .. was an operator in Perl 5, but Perl 6 seemed to make such a big deal about operators that now you had to be concerned much more with precedence, for example), more operators, more symbol/punctuation use, different idioms, a language-agnostic interpreter.

Compare the Python 2 to Python 3 transition/marketing. Calling it "Python 3000", even as just a nickname, early on may have helped indicate that it was a major change. Tools like 2to3 end up producing code that is still easily recognizable as Python, because the idioms are not wildly different. People will be able to call themselves "Python programmers" independent of their experience with either version. But it's still not a smooth transition, and people are still asking "which version of Python should I start with", though the answers are less dire than with Perl.

The ambitiousness of Perl 6 really warranted a different name for the language, and a more explicit break with Perl's established, in-production legacy. This would have helped both projects, I think. Even working on (vs. in) Perl 5 is somewhat considered a dead end, because it has Perl 6 looming over it. It ends up doing a disservice to the livelihood of both projects.


I had a long-winded reply but it got lost and I don't feel like rewriting it. Too-long-won't-rewrite version: Perl6 was conceived in 2000, and it still doesn't exist in 2013. For a number of reasons involving competing languages, cloud computing, and the end of the MHz race, it's very tough to imagine Perl6 as anything other than a dead end.


Saying that it doesn't exist in 2013 is either disingenuous or ignorant. You can make any number of arguments as to its relevance, or whether it has achieved certain goals and milestones, but its existence is not in question.

It is very actively developed. Some (small group of) people use it every day. Some (very, very small group of) people even use it in small portions of their business.


I'll concede that. I meant it more in the way people say "Rust doesn't exist." Yes there is software that actively runs, but by no means are either generally considered ready for "production".

My main issue though is that static languages are improving to the point where I start to wonder about the relevance of any dynamic language.


> My main issue though is that static languages are improving to the point where I start to wonder about the relevance of any dynamic language.

IMO static typing can be cumbersome for exploratory programming. That said, I really like the added benefits it can impart. I think an optionally typed language may be the best of both worlds, but my experience with them is minimal.


I like type inference, as in Haskell. You gain the benefits of static typing, but you do not have to explicitly write the types all over the place. I find that the exploratory phase goes more quickly with the compiler there to provide a sanity check.


Ah, that old chestnut again. The argument falls down when we see what really happened: Perl 6 had and continues to have a huge positive influence on Perl 5, the future belongs to both as there turned out to be no discontinuous move, and Perl 5 stays up-to-date.


I'm sure there are people who see it that way; I can only speak for myself and for every other Perl developer I ever worked with.


Even if it is only the way things are perceived, and not the actual truth, it can still explain the decay of the community.


What decay? The Perl community still continues to grow - https://news.ycombinator.com/item?id=6738619


It doesn't really matter if it turned out not to be discontinuous. The fact that it looked like it was going to be for a long, long time is still a big problem.


This is just a personal opinion, but I also wrote Perl (back in the 90s), and I think that TIMTOWTDI was the reason that it fell out of favor.

TIMTOWTDI makes it hard to share code, even between masters. I believe it's the source of the "Perl is unreadable" meme. Because everyone is capable of defining their own personal dialect of Perl, no solid, common subset emerged.

Compare with the Zen of Python: "There should be one – and preferably only one – obvious way to do it." In Python, it's generally very easy for two similarly-skilled Python hackers to share code.

It looks like a Wikipedia editor shares this view: http://en.wikipedia.org/wiki/There%27s_more_than_one_way_to_...


> no solid, common subset emerged

That's wrong. This common subset converged through consensus and finds expression in e.g. the Modern Perl movement, the module collection Task::Kensho and the updated motto "There is more than one way to do it, but sometimes consistency is not a bad thing either".


All of those things emerged long after Perl had ceded its position in the language marketplace. It's great that Perl has them now, but the process of working them out took a loooong time.


Isn't that the case with every New Jersey language?


Ruby flourished while Perl was going down, even though it also shares TIMTOWTDI.


Yup. Which is why I'm personally a bit uneasy about Ruby's long-term future. It's attractive for the same reasons that Perl was attractive, but that means it also invites some of the same issues that Perl experienced.

Wild speculation follows:

There's always a need for a solid "just get shit done" language, but maybe any good language in this space is doomed to die by its own success. Crufty, hard to maintain code accumulates over time, because quickly banging out code you don't expect to still be maintaining in 10 years is the whole point. As that cruft accumulates, people start noticing it and inevitably blame the language. That leads to casting about for the next "just get shit done" language, and the cycle repeats.


Ruby flourished largely due to Rails, which ends up being pretty opinionated about how to do things.


It didn't fall out of favour; Perl usage is growing every year. The "market" expanded as a whole, admitting newer languages. http://blog.timbunce.org/2008/03/08/perl-myths/


While I sympathize with that position, I think it glosses over some important implications. When the market expands rapidly but your share does not, you are either missing new entrants, losing existing adherents, or some combination thereof. Most importantly, your relevance is shrinking even if your market share is not. That said, it is an important morale booster to know it's not actually shrinking.

Note: "you" and "your" in this case are meant generally, and most definitely include me.


> When the market expands rapidly, but your share does not...

My collection of Programming Books Which I Will Discard Before I Ever Move Them Again includes a book about CGI programming with the Korn shell. I am not making this up.

As you well know, back in the day when CGI was magic and server-side includes were amazing, there weren't a lot of options for server side programming. You had C, if you were a Unixy systems programmer and didn't mind doing string processing in C. You had Tcl, if you wanted to go the AOLserver route. You had whatever shell script you wanted, if you wanted to pipe and stream together Unixy commands. After a time, you even had server-side JavaScript, if you spent a lot of money on Netscape server products.

You also had Perl, which ran on just about every Unix under the sun and behaved the same way pretty much everywhere. Even if you couldn't rely on one Unix utility behaving the same on every Unix variant (unless you somehow managed to get the GNU tools installed), you could rely on Perl being just about everywhere. Better yet, it had a fair library of code being developed and published because people were sharing it.

In the early days, you might have to use a Perl fork to connect to an Oracle or a Sybase installation, but you even had that option. (Yeah, you could do that in C too, but that would be borrowing pain far beyond string handling in C.)

It's pretty easy to see why Perl ended up with a lot of that market. It fit that niche which hadn't previously existed, and there wasn't much competition. (Korn shell. Korn. Shell.)

The market expanded rapidly. Then the competition expanded rapidly because the market expanded rapidly. It's innumerate to suggest that the relative position of market share will remain the same or even grow when the market grows so rapidly and so much competition appears.


> The market expanded rapidly. Then the competition expanded rapidly because the market expanded rapidly. It's innumerate to suggest that the relative position of market share will remain the same or even grow when the market grows so rapidly and so much competition appears.

I used very, very poor wording, and on re-reading what I wrote, it looks like I'm trying to say something I'm not.

"When the market expands rapidly, but your share does not" should have been "when the market expands rapidly, but your users do not at a similar rate".

My main point just being that Perl has less relevance to this market than it did before. This is obvious and expected with much more competition. On the other hand, it wasn't a foregone conclusion that Perl would cede a high position of relevance (if not the top position) once the competition heated up, which it has. I think that's important and it's useful to recognize that (as obvious as it may be to some) so we don't become complacent by falling back on platitudes of "we are growing, so everything is okay". We've lost a lot, and I still want that to be a motivator for doing great things and telling people about them.


Oh, man, oraperl and sybperl. Brings back memories.


Perl has stagnated; it's as simple as that. There are so many alternatives out there that have picked up where Perl left off, or gone in new directions and covered some of the same ground, that the competitive value of Perl has diminished over time.

I consider Ruby, for example, a more evolved Perl in a lot of ways.

Perl used to be the go-to language for scripting and automation (practically what it was designed for), but now that niche is also filled by Python, Ruby, Node, PowerShell, and improved automation tools like Puppet, Chef, or Ansible.

Perl also used to be the go-to language for dynamic web applications, but it fell behind as other languages gained ground, and especially as sophisticated frameworks like Rails put Perl development at more and more of a comparative disadvantage.


> I consider Ruby, for example, a more evolved Perl in a lot of ways.

Hopefully not in strength of module system, automated testing, extension mechanisms that aren't monkeypatching, quality of core development practices, metaprogramming (if you count Moose or p5-MOP), documentation, lexical scoping, or Unicode support.


Along with the other excellent answers, one thing Perl never got was a faster runtime. This drove me into the arms of V8 (via Node.js).

Ruby got JRuby and Rubinius, PHP got the excellent work Facebook did, Python got PyPy, and many cool languages started to appear on the JVM.

Perl got stuck with /usr/bin/perl.

Having said that, the internals of Perl continue to improve in the current interpreter, and there's some long standing amazing work such as the incredible performance of the DBI drivers (look at the techempower benchmarks to see how well Perl shines even versus Node.js when it comes to database access).

I still use Perl for small scripts, but I'm just not building anything big in it these days. Perl6 was a huge lost opportunity, and a massive failure in vision and leadership.


Programming languages are too often judged by popularity and by the "big products" built with them in the last few years. Being "old", especially in the startup world, is already a big negative. This happened to the numerous proprietary languages that were popular for web development during the '90s, and it will happen again to the current popular choices. Perl by itself has very little to do with it. I've seen "worse" languages become popular and fade away.


Interesting to think about the sorts of cultural markers that other languages have. Maybe this is just another way of saying 'code smell', but there must be similar elements for JS/Ruby/Python (for example) where the code isn't strictly 'wrong' (or even debatably wrong), but is still clearly indicative of a particular cultural approach towards coding in that language.

Like overriding Array.prototype.push in JS - it might be just fine, but I tend to pause and re-evaluate my attitude towards the code when I see that going on, because it's a very different approach to coding in JS than I personally use.


I tend to see much clearer indicators in JavaScript code. Global variables? Lots of functions outside any closure? Old skool js coder ahead.

You can also tell javascript only programmers a mile off. Anonymous functions everywhere? Functions have multiple concerns? Use 'var aFunction = function()'? Mono linguist and spaghetti code ahead.


A lot more JS coders could benefit from adding "use strict" to their code too. It's frankly amazing it isn't used more.


Honestly, in practice it falls short and doesn't do nearly enough. :(


True. It does far too little at compile time. I don't understand why. Maybe we need a "use compile strict" ;)


In Ruby, using explicit "return"s or ternary operators can be a hint that someone is newer to Ruby. On the other hand, using symbol-to-proc syntax indicates some experience (i.e. list_of_objects.map(&:method)).


I just don't understand why anyone developing a language would think dynamic scope would be a good idea.


Richard Stallman gave an excellent use and rationale for dynamic scope in the EMACS manual:

http://www.gnu.org/software/emacs/emacs-paper.html#SEC17

One of my favourite things about perl and CL is the ability to have lexical (my/let) and dynamic (local/special) scope in the same program depending on what is clearer.

In fact, it comes up often enough that programming in a language lacking dynamic scope (e.g. BASIC, JavaScript, or Python) feels limiting sometimes, and I end up emulating it (settings + extend, observables, etc.).
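
A minimal Perl sketch of the two kinds of scope side by side (the sub and variable names are purely illustrative):

    use strict;
    use warnings;

    our $log_level = 'info';   # package variable: can be dynamically scoped with "local"
    my  $prefix    = 'app';    # lexical ("my"): bound to this file's scope at compile time

    sub log_msg {
        my ($msg) = @_;
        # $prefix resolves lexically (the enclosing file scope);
        # $log_level resolves dynamically (the nearest "local" on the call stack).
        print "$prefix [$log_level] $msg\n";
    }

    sub noisy_section {
        local $log_level = 'debug';   # dynamic: automatically restored on return
        log_msg('inside noisy_section');
    }

    log_msg('before');    # app [info] before
    noisy_section();      # app [debug] inside noisy_section
    log_msg('after');     # app [info] after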


That rationale may be a bit revisionist, here is Olin Shivers' take on RMS's justification for elisp's dynamic scoping:

"Some context: Common Lisp did not exist (the effort was just getting underway). MIT Scheme did not exist. Scheme was a couple of AI Lab tech reports and a master's thesis. We're talking the tiniest seed crystal imaginable, here. There was immense experience in the lisp community on optimising compiled implementations of dynamically-scoped languages -- this, to such an extent, that it was a widely held opinion at the time that "lexical scope is interesting, theoretically, but it's inefficient to implement; dynamic scope is the fast choice." I'm not kidding. To name two examples, I heard this, on different occasions, from Richard Stallman (designer & implementor of emacs lisp) and Richard Fateman (prof. at Berkeley, and the principal force behind franz lisp, undoubtedly the most important lisp implementation built in the early Vax era -- important because it was delivered and it worked). I asked RMS when he was implementing emacs lisp why it was dynamically scoped and his exact reply was that lexical scope was too inefficient. So my point here is that even to people who were experts in the area of lisp implementation, in 1982 (and for years afterward, actually), Scheme was a radical, not-at-all-accepted notion. And outside the Lisp/AI community... well, languages with GC were definitely not acceptable. (Contrast with the perl & Java era in which we live. It is no exaggeration, thanks to perl, to say in 2001 that billions of dollars of services have been rolled out to the world on top of GC'd languages.)"

http://www.paulgraham.com/thist.html

(emphasis not my own)

That whole page is a good read, particularly if you are a fan of Shivers' writing style.


Cool, I hadn't read Olin's take.

However, the cited rationale can't really be revisionist of what RMS told Olin, because it was published in 1981, about 14 months before Olin came to MIT in 1982.

It appears verbatim in RMS's paper, "EMACS: The Extensible, Customizable, Self-Documenting Display Editor," A.I. Memo 519a, ftp://publications.ai.mit.edu/ai-publications/pdf/AIM-519A.pdf (March 26, 1981). He doesn't mention an efficiency justification in there.

The whole document is also fascinating for including RMS's nascent but not-fully-baked early thinking about free software.


Hmm, good catch, I didn't notice when that was published.


When you're writing an interpreter, dynamic scope is easier to implement (see the sketch below). And it isn't necessarily all that problematic for simple script-type programs, since they don't tend to do much nesting of lexical or execution contexts.

Also it does offer some flexibility, though personally I've never been fond of programmer convenience as an excuse for bug farming.
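
To illustrate that first point, a toy sketch in Perl (not how any particular interpreter actually does it): dynamic scope needs nothing more than one global stack of call frames searched newest-first, with no closures capturing defining environments.

    use strict;
    use warnings;

    my @frames = ({});                      # one global stack of binding frames

    sub enter_call { push @frames, {} }     # push a fresh frame on every call
    sub leave_call { pop @frames }          # pop it again on return

    sub set_var {                           # bind a name in the current frame
        my ($name, $value) = @_;
        $frames[-1]{$name} = $value;
    }

    sub get_var {                           # dynamic lookup: newest binding wins
        my ($name) = @_;
        for my $frame (reverse @frames) {
            return $frame->{$name} if exists $frame->{$name};
        }
        die "unbound variable: $name\n";
    }

    # Simulate a nested call whose frame rebinds 'indent'; the new binding
    # shadows the outer one until the frame is popped.
    set_var(indent => 0);
    enter_call();
    set_var(indent => 4);
    print get_var('indent'), "\n";          # 4: the binding from the nearest caller
    leave_call();
    print get_var('indent'), "\n";          # 0 again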


Global variables in C are dynamically scoped. They tend to be useful from time to time.


I'm not very familiar with dynamic scoping, so maybe I've misunderstood it, but that doesn't appear to be true. Under dynamic scoping, which variable a name binds to is determined entirely at runtime. That isn't the case for C globals: you cannot end up accessing a variable in your caller unless they explicitly pass in a reference to it, and it's not possible to write a function that refers to a single global variable in the code but can refer to different variables at runtime depending on the caller.


You are correct. C is lexically (a.k.a. statically) scoped, through and through.


As the child of anthropologists, I think there's a lot of room for cultural analysis of technologies. Languages, for instance, have cultures as well as affordances, and a careful comparative analysis would be really interesting.




