
Shell scripts are extremely portable and should be the preferred method for a large set of tasks. Properly written, a shell script can run on a 20-year-old Solaris machine, any version of Windows (with tools like Cygwin installed), and any modern Unix variant... claiming Node.js is more portable is ridiculous on so many levels.

The problem is that so many programmers don't take the time (or care to take the time) to learn the well-thought-out design of Unix tools, opting instead to see every problem as a nail corresponding to the latest trend in hammers (programming languages).

This has led a lot of programming types to create advanced tools for managing Unix systems that largely ignore the design of Unix.




It's hard to write a portable shell script. My dotfiles need to be portable, and they involve a lot of shell. Every time I introduce a new OS, I have to make changes. Various oddities get you. These, for example, look really innocent but aren't portable:

  find -iname 'foo*'  # [1]
  ... | sed -e 's/ab\+c//'  # [2]
  ... | sed -i -e 's/abc//'  # [3]
  tar -xf some-archive.tar.gz  # [4]
  python -c 'anything'  # [5]
Things like messing around with /proc are more obvious, but things like curl (is curl installed? what do we do if it isn't? try wget?) can be hard too.

[1]: find doesn't assume CWD on all POSIX OSs.

[2]: "+" isn't POSIX. You have to \{1,\} that.

[3]: -i requires an argument on some OSs.

[4]: This is stretching the definition of portable a bit; I've worked on machines where you had to specify -z to tar, given a compressed archive. (tar has been able to figure out compression on extraction for well over a decade now, so -z is usually optional, but some places are really slow to upgrade.)

[5]: Unless anything is a Python 2/3 polyglot, you'd better hope that you guess correctly that Python 2 was installed. (And it's really hard here: python is either Python 2 or 3 on some systems, depending on age & configuration, with python2 and python3 pointing to that exact version, but on some machines python2 doesn't exist even if Python is installed, despite PEP 394.)
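
For what it's worth, more portable spellings of those would look something like this (a rough sketch, not verified on every platform; the last line just assumes a python2 binary exists, which is exactly the problem from [5]):

  find . -iname 'foo*'                                   # [1] spell out the start directory
  ... | sed -e 's/ab\{1,\}c//'                           # [2] POSIX interval instead of \+
  sed -e 's/abc//' file > file.tmp && mv file.tmp file   # [3] skip -i entirely
  gzip -dc some-archive.tar.gz | tar -xf -               # [4] decompress explicitly
  python2 -c 'anything'                                  # [5] assumes python2 exists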


It's a well-known fact that GNU tools have plenty of extra features which you have to be careful about using if you want portability, AND that many of the legacy commercial Unix implementations have positively ancient implementations and feature sets. I wouldn't really say it is so very difficult though.

Not every script you write is going to be portable, but it's not much of a stretch to endeavor to keep your script simple, not make assumptions, and be mindful of the features that some implementations may be missing.

I take a special objection to [5], `python -V` isn't difficult at all to run, hoping and guessing are not necessary.

There's a good guide here: http://www.gnu.org/software/autoconf/manual/autoconf.html#Po...


Portability is a red herring anyway. If you pursue it, you'll always end up chasing the lowest common denominator.

YOU can control where the app is deployed (this is largely true even if you're selling your app, simply by having installation requirements or by selling appliances instead of installable apps).


> I take a special objection to [5], `python -V` isn't difficult at all to run, hoping and guessing are not necessary.

I mostly meant that, given a simple statement like:

  python -c "code"
…you're probably forced to assume that it's Python 2 (or write 2/3 polyglot code) and hope that your assumption is right. You can't run `python -V`: you're a script! The point is that it's automated, or we wouldn't be having this discussion.

Of course, you can inspect the output of python -V (or just import sys and look at sys.version_info.major) and figure it out, but now you need to do that, which requires more code, more thought, testing…
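
A minimal sketch of what that dance looks like (assuming there is some `python` binary on PATH at all):

  # ask whatever "python" is for its major version (works on both 2 and 3)
  major=$(python -c 'import sys; print(sys.version_info[0])') || exit 1
  if [ "$major" -eq 2 ]
  then
    echo "python is Python 2"
  else
    echo "python is Python ${major}"
  fi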


I'd argue that you should probably stick to one subset of things in your bootstrap script -- and I'd say grep, awk, sed and (ba)sh go together; anything "higher level" like python/ruby/perl/tcl does not fit within that. You might want to check for python with a combination of "python -V" and the dance described above -- and, as part of bootstrapping, make a symlink (or a copy, if you need to support Windows and/or a filesystem without symlink support) to e.g. python2. Save that tidbit as "assert-python2.sh", then run "assert-python2.sh" first, "check-bootstrap-deps.sh" next, and finally "bootstrap.sh" :-)
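
A rough, untested sketch of such an assert-python2.sh (the ./bin location and the fall-back from python2 to python are just one way to do it):

    #!/bin/sh
    # assert-python2.sh -- untested sketch; puts a python2 symlink in ./bin
    set -e
    mkdir -p bin
    if command -v python2 > /dev/null
    then
      ln -sf "$(command -v python2)" bin/python2
    elif python -c 'import sys; sys.exit(sys.version_info[0] != 2)' 2> /dev/null
    then
      # "python" itself turned out to be Python 2
      ln -sf "$(command -v python)" bin/python2
    else
      echo "no Python 2 interpreter found" >&2
      exit 1
    fi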


Interesting. For 1 and 4, I immediately assumed those might break. (As for tar, I'd generally prefer something like zcat -- or, for scripts, gzip -dc -- piped into tar -x; it makes it easier to change formats, both gzip to lzma and tar to cpio, as in the lines below.) For 2 and 3, I'd be wary of sed in general for anything that needs to be portable. For 3, it seems prudent to use a suffix with -i anyway; explicit is better than implicit most of the time.
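
E.g. (a quick sketch, assuming the relevant decompressors are installed):

    gzip -dc archive.tar.gz | tar -xf -    # swap gzip for xz/lzma,
    xz -dc archive.cpio.xz | cpio -id      # or tar for cpio, without touching the rest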

As for 5: how many systems have Python 2 installed, but no python2 binary/symlink? (I've never had to consider this use case in production.)

Note that a slight benefit of splitting tar into zcat | tar and replacing python with python2 is that you'll get a nice "command not found" error. You could of course do a dance at the top of your script to check for dependencies with "command -v" [1]. If nothing else, such a section will serve as documentation of the dependencies.

Something like:

    # NOT TESTED IN PRODUCTION ;-)
    checkdeps() {
      depsmissing=0
      for d in "${@}"
      do
          if ! command -v "${d}" > /dev/null
          then
            depsmissing=$(( depsmissing + 1 ))
            if [ ${depsmissing} -gt 126 ]
            then
              depsmissing=126 # error values > 126 may be special
            fi
            echo missing dependency: "${d}"
          # debug output
          #else
            #echo "${d}" found
          fi
      done
      return ${depsmissing}
    }

    deps="echo zcat foobarz python2"
    checkdeps ${deps}
    missing=${?}

    if [ "${missing}" -gt 0 ]
    then
      echo "${missing} or more missing deps"
      exit 1
    else
      echo "Deps ok."
    fi

    # And you could go nuts checking for alts, along the lines of
    # pythons="python2 python python3"
    # and at some point have a partial implementation of half of
    # autotools ;-)
[1] https://stackoverflow.com/questions/762631/find-out-if-a-com...


Ahh yes, the much vaunted Hammer Factory Factory.

http://discuss.joelonsoftware.com/default.asp?joel.3.219431....


Completely off topic, but very interesting -- the author of that post wrote a book about surviving the Costa Concordia! http://www.amazon.com/gp/product/B00AUYIKNK/ref=as_li_qf_sp_...

Seems like he really did need some tools.


This fills me with great joy and sadness.


> The problem is that so many programmers don't take the time (or care to take the time) to learn the well-thought-out design of Unix tools, opting instead to see every problem as a nail corresponding to the latest trend in hammers (programming languages).

Unix tools are arguably the best tools available to a modern user. That, however, does not mean that the Unix tools are well designed; many would argue that the Unix tools are extremely poorly designed or have no discernible design at all. S-expressions are a much more powerful and useful abstraction than a "stream of bytes". POSIX was hacked on many years later in an attempt to make sense of the mess that shell commands had become. Shell scripts are very fragile and have never been truly portable across the various *nixes, although the situation is better than it was twenty years ago, when it was enormously difficult to port scripts across the various commercial Unix installations, because they would break in many different and subtle ways.

I recommend reading the out-of-date but still useful Unix-Haters Handbook: http://pdf.textfiles.com/books/ugh.pdf



