
So I just added a 2.4GB ISO and, other than taking longer, nothing happened. Push and clone also seem to work.



Right :) I forgot that everyone is on 64-bit systems these days. That changes a lot. Anyway, Git uses a single blob for a single file's content. That is fine if we are talking about source code and text files, as those are not large. Things change for arbitrary binary files.
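
To make that concrete, here is a minimal sketch (Python, assuming Git's default SHA-1 object format) of how a blob id is derived from a file's raw bytes: identical content always maps to the same object, but touching a single byte produces an entirely new blob.

    import hashlib

    def git_blob_id(content: bytes) -> str:
        # Git hashes the header "blob <size>\0" plus the raw bytes
        # (SHA-1 in the default object format) to get the object id.
        header = b"blob %d\x00" % len(content)
        return hashlib.sha1(header + content).hexdigest()

    data = b"hello\n"
    print(git_blob_id(data))           # same bytes -> same object id, every time
    print(git_blob_id(data + b"!"))    # one extra byte -> a completely new blob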


AFAIK, not much of anything changes (perhaps name some of these changes?).

And what is wrong with a single blob? This seems to work great for deduplication if, say, you were using git to manage a photo library and happened to add something twice.


There are several issues depending on where you run git. For example, mmap() can be problematic with large files on a 32-bit OS. On Cygwin it gets even worse.
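
As a rough illustration (not Git itself, just a Python sketch of the address-space limit): mapping a file larger than a 32-bit process can address tends to fail, while the same call is a non-event on a 64-bit system.

    import mmap, os, sys, tempfile

    # Sparse ~3 GB file: larger than a 32-bit address space, but it does not
    # actually write 3 GB of data to disk on most filesystems.
    size = 3 * 1024 ** 3
    with tempfile.NamedTemporaryFile(delete=False) as f:
        f.truncate(size)
        path = f.name

    print("interpreter:", "64-bit" if sys.maxsize > 2**32 else "32-bit")
    with open(path, "rb") as f:
        try:
            # A 32-bit process has only ~2-3 GB of usable virtual address
            # space, so mapping the whole file in one piece typically fails
            # here; on a 64-bit system the same call just works.
            m = mmap.mmap(f.fileno(), size, access=mmap.ACCESS_READ)
            print("mapped", size, "bytes in one mapping")
            m.close()
        except (OverflowError, ValueError, OSError) as e:
            print("mmap failed:", e)
    os.remove(path)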

As for deduplication, I do not think so. If you have a single blob, let's say 1GB, and you change just 1B, the whole blob changes and there is no more dedup. If you use the basic method of a static block size, let's say 512KB, this works much better. Further, there are more advanced techniques to handle dedup, like rolling checksums, to carve out even smaller sub-blocks.
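
A toy sketch of the two approaches (plain Python, with a simplistic rolling sum standing in for a real rolling checksum like the rsync/Rabin-style ones): after inserting one byte near the front of a file, fixed 512KB blocks barely dedup at all, while content-defined chunks mostly survive.

    import hashlib, os

    def fixed_chunks(data, size=512 * 1024):
        # Fixed-size blocks: an in-place overwrite only dirties one block,
        # but inserting a byte shifts every later boundary.
        return [data[i:i + size] for i in range(0, len(data), size)]

    def rolling_chunks(data, window=48, mask=0x0FFF):
        # Toy content-defined chunking: keep a rolling sum of the last
        # `window` bytes and cut wherever (sum & mask) == 0, so boundaries
        # follow the content instead of absolute offsets.
        chunks, start, s = [], 0, 0
        for i in range(len(data)):
            s += data[i]
            if i - start >= window:
                s -= data[i - window]
            if i - start + 1 >= window and (s & mask) == 0:
                chunks.append(data[start:i + 1])
                start, s = i + 1, 0
        if start < len(data):
            chunks.append(data[start:])
        return chunks

    def reused(old, new):
        seen = {hashlib.sha1(c).digest() for c in old}
        return sum(1 for c in new if hashlib.sha1(c).digest() in seen)

    old = os.urandom(4 * 1024 * 1024)
    new = old[:100] + b"X" + old[100:]   # insert one byte near the front
    print(reused(fixed_chunks(old), fixed_chunks(new)),
          "of", len(fixed_chunks(new)), "fixed blocks reused")
    print(reused(rolling_chunks(old), rolling_chunks(new)),
          "of", len(rolling_chunks(new)), "rolling chunks reused")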


But you're always arguing the odd case:
- 32-bit systems
- Cygwin git on Windows instead of native
- expecting deduplication on different files (???)


Uh, hold on. Are we trying to discuss this objectively? If not, and you just insist out of fanboyism that Git is the best for everything and should be used everywhere... I am not interested.

I pointed out that Git has some rough cases and it's good to be aware of them. I said it already and will say it again: choose the right tool for the task.

If your workflow with Git works, use it. No need to discourage other solutions. Yes, I use ancient platforms and I want a DVCS on them too.


Well, you said something would happen with git and big files, but it seems not much happens for the majority of users.




