Hacker News

Maybe this is naive, but wouldn't it have made more sense to do a bunch of smaller cp commands? Like sweep through the directory structure and do one cp per directory? Or find some other way to limit the number of files copied per command?



No, because then it wouldn't have replicated the hardlink structure of the original tree. That was the goal, and also the bit that causes the high resource consumption.
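The difference is easy to demonstrate. A minimal sketch, assuming GNU coreutils (`cp -a` implies `--preserve=links`, and `stat -c %h` prints a file's hardlink count): `cp` can only recreate a hardlink when both names are seen within the same invocation, so per-directory copies break links that span directories.

```shell
# Two directories whose files are hardlinks to the same inode.
mkdir -p tree/a tree/b
echo data > tree/a/file
ln tree/a/file tree/b/file      # same inode, link count 2

# One recursive cp sees both names in a single run, so it can
# recreate the hardlink in the copy:
cp -a tree whole
stat -c %h whole/a/file         # link count 2: hardlink preserved

# Per-directory copies run independently; neither invocation knows
# about the other name, so each copy gets its own inode:
mkdir split
cp -a tree/a split/a
cp -a tree/b split/b
stat -c %h split/a/file         # link count 1: now two separate files
```

Tracking which inodes have already been copied is also where the memory cost comes from: `cp` has to remember every multiply-linked inode it has seen for the duration of the run.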




