I don't think we should underestimate how much performance differences can color opinions here, especially for something like CLI tools that are used all the time. Little cuts add up. At work I use both git and bazaar, and bazaar's sluggishness makes me tend to avoid it when possible. I recall Mercurial recently announced an attempt to rewrite core(?) parts in Rust, because python was just not performant enough.
The speed of branching in git, compared to subversion, was a huge part of convincing me to move. Not an entirely "fair" comparison given that subversion is a centralized VCS, but speed is very important.
This is the interface you're presented with, but it's actually a "cheap copy" underneath. So if you write "svn cp https://svn.myserver.com/trunk https://svn.myserver.com/branches/foo" that takes about 1-2 seconds in my experience (no matter how many files, how long the history, or how many binary files you have, etc.).
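For comparison, git's branches are similarly cheap: a branch is just a tiny ref holding a commit hash, so creating one takes constant time regardless of history size. A quick local sketch (throwaway repo, all names are placeholders):

```shell
# set up a throwaway repository with a single commit
git init -q demo && cd demo
git -c user.name=you -c user.email=you@example.com \
    commit -q --allow-empty -m "initial commit"

# creating a branch writes one tiny ref; cost is independent of repo size
git branch foo

# the branch resolves to the same commit as HEAD -- it is just a pointer
git rev-parse foo
```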
Likewise, Git has been steadily improving on the user interface front for years. It's much better than 1.0 was, but it's still not the easiest DVCS to learn.
In my experience, saving a little on runtime doesn't make up for having to crack open the manual even once. UI is a big cut.
From an implementation POV, it's also generally easier to rewrite core parts in a lower-level language, than it is to redesign a (scriptable, deployed) UI.
> From an implementation POV, it's also generally easier to rewrite core parts in a lower-level language, than it is to redesign a (scriptable, deployed) UI.
But only if the data structures are simple and work well for the problem domain. Bazaar (a DVCS from Ubuntu; I mean bzr, not baz) had a much simpler and more consistent UI, but it went through several revisions of its data structures, each one quite painful, and it was slow; they were planning a rewrite in a faster language but never got to it. (Mercurial also used Python and wasn't remotely as slow as bzr - the data structures matter more than the language.)
>In my experience, saving a little on runtime doesn't make up for having to crack open the manual even once.
Please don't ask me to give up features so you don't have to read the documentation. It is the most basic step of being a good citizen in a software ecosystem.
> In my experience, saving a little on runtime doesn't make up for having to crack open the manual even once. UI is a big cut.
Maybe, but I used to work in a multi-GB hg repo, and I would have given up any amount of manual cracking to get git's speed. Generally you only open the manual a few times, but you can sync many times a day. I'd give a lot to get big speedups in daily operations for something I use professionally.
"Large binary checkins" is not the actual issue. Git degrades as repository size increases. Large binary checkins make it much easier / faster to reach this situation, but you can also reach it just fine with regular text-based repositories if they're big and have a significant history (long-lived and many contributors).
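For what it's worth, once a repository has already grown that big, git does ship partial workarounds; shallow clones and partial clones are the usual ones (the URL below is a placeholder):

```shell
# shallow clone: fetch only the most recent commit, skipping deep history
git clone --depth 1 https://example.com/big-repo.git

# partial clone: fetch commits and trees, but download file blobs on demand
git clone --filter=blob:none https://example.com/big-repo.git
```

Neither fixes the underlying scaling behavior, but they cut the amount of history you have to carry around locally.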
Unless you mean another DVCS, P4 can run circles around git on large repos (mostly on account of not conceptually trying to copy the entire state of the entire repo at every commit).
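To make the "entire state at every commit" point concrete: each git commit points at a tree object describing a full snapshot of the working tree (deduplicated by content hash, so unchanged files aren't stored twice). You can inspect this directly in a throwaway repo:

```shell
# throwaway repository with one tracked file
git init -q snap && cd snap
echo hello > file.txt
git add file.txt
git -c user.name=you -c user.email=you@example.com commit -q -m "add file"

# a commit object records a tree hash: a complete snapshot, not a diff
git cat-file -p HEAD

# the tree lists every path in the snapshot
git cat-file -p 'HEAD^{tree}'
```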