I am working with a git repository that is very large (> 10 GB). It contains many large binary files (> 100 MB each), with many versions of each. The reasons for this are beyond the scope of this question.
Currently, it is no longer possible to clone the repo properly: the server runs out of memory (it has 12 GB) and sends a failure code. I would paste the error here, but it takes well over an hour to reach the point of failure.
Are there any methods by which I can make a clone succeed, even one that grabs only a partial copy of the repo? Or a way to clone in bite-sized chunks that won't make the server choke?
Use rsync to copy the entire repo by pointing it at the top-level directory that contains .git. Then change the remote URLs in .git/config to point back to the original. That's the only key off the top of my head that needs to be changed in .git/config, but I would scan through it for anything else that is host-specific; most of the settings are pretty self-explanatory.