Why though? The whole point of git is that it's distributed source control - you _always_ have a local copy of the source and the history. That's one of the biggest wins of using git over a centralised source control system in the first place.
But you don't necessarily have an up-to-date copy. If you had a local mirror, you could queue everyone's commits until the GitHub servers become available again.
> But you don't necessarily have an up-to-date copy.
You would if you designed your build process that way. I think it's a good idea to eliminate all third-party build-time dependencies. In practice this means keeping a Git clone of everything you use for official builds, or of anything that can't risk a third party being unavailable when you need it most.
You could set one up in approximately no time at all using your local copies.
I mean, having a proper local master backup is a great idea, but you could survive without it in a case like this. Just set up a new repository somewhere everyone can reach and have everyone push to it.
If people have their whole workflow in git, kernel-dev style, sure. But the reality is that for a lot of people their workflow is GitHub. Leaving them with raw git is equivalent to leaving them with nothing.
I've worked at places with enormous enterprisey SVN checkouts that might take an hour or two to slowly download off the internet, even though the developer at the next desk may have the exact same bits available without a round trip to the internet and back. Just solving that problem with better Git tooling could mean a better experience than the current re-centralised decentralised version control...
Well yes, but if you need to set up the codebase on another computer, for example, or are used to using GitHub for "extra things" like browsing and following commits, it will disrupt the workflow.