Hacker News

The analogy sounds great and catchy (and sufficiently contrarian to get knee-jerk upvotes from the site), but it oversimplifies things and misses the mark on what types of problems fuzzing catches, and why that's incredibly important in a project as popular as curl.



Indeed. It's more of an argument not to execute scripts downloaded from the web. Which is valid, but not the point of fortifying curl.


Would it make sense for cURL to detect when it's being piped into `sh`? It could then refuse to proceed unless the site is on a whitelist or the user has added an --i-accept-the-risk option.


That's not possible in general, I think — can you even get process information out of a file descriptor? And the people who write the scripts that embed such curl commands will just add the option directly. Others will put it in the command to copy, or insert | cat | sh into the pipeline. And there's also wget.

I think it's not something curl can tackle by itself. Instead, you'd need a file descriptor that informs the reader about its source, and a shell that refuses non-whitelisted sources.
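To illustrate why curl can't identify its downstream consumer: from inside a pipeline, the writing process can only observe properties of its own stdout (for example, that it is not a terminal); the result is identical whether the reader is sh, cat, or a longer chain. The `probe` function below is hypothetical, standing in for whatever check curl might perform.

```shell
# Hypothetical probe standing in for a check curl might perform.
# `[ -t 1 ]` is the shell's interface to isatty(1).
probe() {
  if [ -t 1 ]; then echo "terminal"; else echo "not a terminal"; fi
}

# Both pipelines look identical from probe's side -- the extra `cat`
# stage makes the eventual consumer indistinguishable to the writer.
probe | cat
probe | cat | cat
```

Both invocations print "not a terminal", which is why a `| cat | sh` suffix defeats any pipe-detection heuristic on the curl side.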


You're right that it can be subverted in general. Still, isatty() can tell whether output is being piped, and when it is, special safety rules could kick in. Of course there would be false positives, but then there could be an override option (like I mentioned above).
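For illustration, a minimal sketch of the detection half using the shell's `[ -t 1 ]` test, which calls isatty(1) on file descriptor 1 under the hood; the messages are placeholders for whatever safety behavior curl might adopt.

```shell
# `[ -t 1 ]` succeeds only when fd 1 (stdout) is a terminal;
# it fails when stdout is piped or redirected.
if [ -t 1 ]; then
  echo "stdout is a terminal"
else
  echo "stdout is piped or redirected: safety rules could kick in"
fi
```

Run interactively, this prints the first message; run as part of a pipeline (or with output captured), it prints the second — which is exactly the false-positive problem mentioned above, since redirecting to a file trips the same check as piping into sh.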


I agree that fuzzing is a very important technique, and as I said it's worth fixing any issues found, but I still think that many of the most common uses of curl are very problematic. As you can see, I didn't get any knee-jerk upvotes.



