The alias is closer to:

    alias wcurl='curl -# -O $1 --location --remote-name --remote-time --retry 10 --retry-max-time 10 --continue-at -'
The main reason I use wget is that it automatically retries downloads, which is vital on my not-so-great internet connection. It's an option I wish curl enabled by default, since half the world uses curl and then tries to download 400 MB files in one shot, which I have trouble finishing.

Then, if you look at the script, it's basically that, with some minor cleaning up and nicer error handling.

FWIW, you can add some of those options to .curlrc if you want them applied every time you invoke curl.
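
For example, a minimal ~/.curlrc might look like this sketch (long option names, one per line, without the leading dashes; pick only the options you actually want applied globally):

    # ~/.curlrc -- read by curl on every invocation unless -q is given
    location
    remote-time
    retry = 10
    retry-max-time = 10
    continue-at = -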

correct; .curlrc for the win!

And you can use `-q` (it has to be curl's first argument) to have it skip the curlrc and use only the args you pass to it. curl has such an amazing amount of power, but that also means it has a lot of options for things folks take for granted (exactly like retries, etc.)


Makes me think that a lot of unattended shell scripts out there should probably use -q, in case someone has a .curlrc altering curl's behaviour and thus breaking the script's expectations.

Maybe it should even be a ShellCheck rule.
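
Something like this sketch, say (the URL and output path are made up for illustration):

    #!/bin/sh
    # -q must be curl's first argument, or ~/.curlrc is still read
    curl -q --fail --silent --show-error --location \
        --output /tmp/release.tar.gz \
        'https://example.com/release.tar.gz'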


The same can be said for wget with --no-config (and really for any app that runs on any system): if you're in the business of automating it, those config features must not be ignored (or should be outright overridden to your script's liking). In Docker containers it's safer to assume no pre-loaded rc files are embedded (but you 100% want to check your source container for things like that); when running in some user's workspace, though, the need to be careful is real.
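
The wget equivalent of the curl -q pattern would be roughly this, again with a made-up URL:

    # --no-config tells wget to skip /etc/wgetrc and ~/.wgetrc
    wget --no-config --output-document=/tmp/file.bin 'https://example.com/file.bin'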

Most of the time, though, those options should be safe; about the only time you really need auto-retries off is when trying to get super accurate data (like when observing odd behavior: you don't want things to simply "auto-work" while fine-tuning something, or while triaging an upstream server or DNS issue).

I often write my scripts invoking calls like `\curl -q ...`, so that I bypass any user's alias of curl and ensure consistent no-config behaviour across my system and others (although GNU curl vs. Mac curl, and other binaries of GNU vs. Mac vs. busybox, are always the real fun part). (If the user has their own bash function named curl, then it's on them to know their system will be special compared to others and results may be inconsistent.)
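
For the record, the backslash only bypasses aliases; the `command` builtin is a sketch of how to also bypass shell functions (neither skips ~/.curlrc, which is still -q's job):

    \curl -q --version          # skips an alias named curl, but not a function
    command curl -q --version   # skips both aliases and shell functions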


Oh wow, I thought only functions could take parameters!

It must be relatively recent, because those old answers do not mention it: https://stackoverflow.com/questions/7131670/make-a-bash-alia... https://stackoverflow.com/questions/34340575/zsh-alias-with-...

On the other hand, I tested it in both bash and zsh, and it works in both.
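
One way to see what is actually going on (a sketch with made-up URLs): the shell simply appends the remaining arguments after the alias expansion, and `$1` expands to the shell's own positional parameter, which is usually empty at an interactive prompt.

    $ alias wcurl='echo curl -O $1 --location'
    $ wcurl https://example.com/a https://example.com/b
    curl -O --location https://example.com/a https://example.com/b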


Instead of -O you want to use

  --remote-name-all
otherwise you have to specify -O for every URL given. For the same reason, remove $1 and rely on all parameters being appended at the end of the command line. The example above will only download the first URL.
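
Putting those two fixes together, the alias from the top of the thread would become something like this sketch:

    alias wcurl='curl -# --location --remote-name-all --remote-time --retry 10 --retry-max-time 10 --continue-at -'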

(I personally think this should have been the default from the beginning: -O should have set the behaviour for all following parameters until changed. But it is too late to change that now.)

There is also --remote-header-name (-J), which takes the remote file name from the header instead of the URL, which is what wget does.
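
As a sketch (made-up URL), -J is used together with -O, so the Content-Disposition filename is preferred when the server sends one:

    curl --location --remote-name --remote-header-name 'https://example.com/download?id=42'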


> There is also --remote-header-name (-J) which takes the remote file name from the header instead of the URL, which is what wget does.

I don't think that's the case; that behavior is opt-in (via --content-disposition), as indicated in wget's manpage:

> This can currently result in extra round-trips to the server for a "HEAD" request, and is known to suffer from a few bugs, which is why it is not currently enabled by default.
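
So the opt-in wget equivalent of curl's -J is roughly this (made-up URL):

    wget --content-disposition 'https://example.com/download?id=42'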
