
I ran into this decades ago with file servers. Half a million files, and a simple for loop took too long because it called out to 3rd-party processes like awk/sed that I was only using to format/search text. Breaking it down to mostly bash-only scripts reduced the run time and ended all the pauses.
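Roughly the kind of change I mean, as a minimal sketch (the paths and the .log suffix are just illustrative):

    #!/bin/bash
    # Slow version: forks a sed process every iteration --
    # ~500k forks for half a million files
    for f in /srv/files/*.log; do
        base=$(echo "$f" | sed 's/\.log$//')
        printf '%s\n' "$base"
    done

    # Fast version: bash parameter expansion, zero extra processes
    for f in /srv/files/*.log; do
        base=${f%.log}    # builtin suffix strip
        printf '%s\n' "$base"
    done

The work per file is identical; the difference is only whether each step pays for a fork+exec.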


I was going to argue that it would be better to simply not use a shell for that, but

> decades ago

Frankly, I can only imagine what the environment was like back then. Thinking back with your current experience, what would you do if you had to fix it again?


Things have changed so much, from how applications lay out data, to the file system/storage side, to hardware speeds, that people can be much lazier with applications because of those environmental improvements.

I used to make lists first, then process the lists, and I still do this sometimes since it's faster. If you have to run a query every time, you're probably doing it wrong, but for small stuff everything is so fast now that I can just chain GNU apps and be done. I'm not a programmer, I'm a sysadmin, so I mostly deal with fixing things like auditing or repairing data on a file system (or maybe a DB).
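The list-first pattern, sketched with made-up paths (these are GNU find/du/xargs flags):

    # Walk the tree once (the expensive part) and save the list
    find /srv/data -type f -name '*.log' -print0 > /tmp/loglist0

    # Pass 1: audit total size without re-walking half a million files
    du -ch --files0-from=/tmp/loglist0 | tail -n 1

    # Pass 2: reuse the same list to find files that need fixing
    xargs -0 grep -l 'ERROR' < /tmp/loglist0

You pay for the directory traversal once, then every later pass is just reading a flat file.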


And the funny thing is that some of these systems are still in use for mission-critical work.

Sometimes even bash isn't an option; you have to deal with older shells like ksh.
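Luckily the POSIX parameter expansions are old enough that they work in ksh too, so you can often avoid both the forks and the bashisms. A rough sketch (paths illustrative):

    #!/bin/sh
    # Runs in old ksh / POSIX sh: no arrays, no [[ ]], no process substitution
    for f in /srv/files/*.log; do
        base=${f%.log}              # suffix strip, no sed fork
        case $base in               # pattern match, no grep fork
            *error*) printf '%s\n' "$f" ;;
        esac
    done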



