Nowadays there is no difference: `git gc --aggressive` operates according to the suggestion Linus made in 2007; see below. As of version 2.11 (Q4 2016), git defaults to a depth of 50. A window of size 250 is good because it scans a larger section of each object, but a depth of 250 is bad because it makes every chain refer to very deep old objects, which slows down all future git operations for marginally lower disk usage.

Linus suggested (see below for the full mailing list post) using `git gc --aggressive` only when you have, in his words, "a really bad pack" or "really horribly bad deltas"; however, "almost always, in other cases, it's actually a really bad thing to do." The result may even leave your repository in worse condition than when you started!

The command he suggests for doing this properly, after having imported "a long and involved history," is

git repack -a -d -f --depth=250 --window=250

But this assumes you have already removed unwanted gunk from your repository history, and that you have followed the checklist for shrinking a repository found in the git filter-branch documentation:

git filter-branch can be used to get rid of a subset of files, usually with some combination of `--index-filter` and `--subdirectory-filter`. People expect the resulting repository to be smaller than the original, but you need a few more steps to actually make it smaller, because Git tries hard not to lose your objects until you tell it to. First make sure that:

- You really removed all variants of a filename, if a blob was moved over its lifetime. `git log --name-only --follow --all -- filename` can help you find renames.
- You really filtered all refs: use `--tag-name-filter cat -- --all` when calling git filter-branch.

Then there are two ways to get a smaller repository. A safer way is to clone, which keeps your original intact:

- Clone it with `git clone file:///path/to/repo`. The clone will not have the removed objects. (Note that cloning with a plain path just hardlinks everything!)

If you really don't want to clone it, for whatever reasons, check the following points instead (in this order). This is a very destructive approach, so make a backup or go back to cloning it:

- Remove the original refs backed up by git-filter-branch: say `git for-each-ref --format="%(refname)" refs/original/ | xargs -n 1 git update-ref -d`.
- Expire all reflogs with `git reflog expire --expire=now --all`.
- Garbage collect all unreferenced objects with `git gc --prune=now` (or, if your git gc is not new enough to support arguments to `--prune`, use `git repack -ad; git prune` instead).

Here is the relevant part of the mailing list post (the first paragraph is the question Linus is replying to):

> > Actually, it turns out that git-gc --aggressive does this dumb thing to pack files sometimes regardless of whether you converted from another SCM or not.
>
> Absolutely. It's really only useful for the case of "I know I have a really bad pack, and I want to throw away all the bad packing decisions I have done."
>
> To explain this, it's worth explaining (you are probably aware of it, but let me go through the basics anyway) how git delta-chains work, and how they are so different from most other systems.
>
> In other SCMs, a delta-chain is generally fixed. It might be "forwards" or "backwards," and it might evolve a bit as you work with the repository, but generally it's a chain of changes to a single file represented as some single delta-chain.
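The three destructive cleanup steps from the checklist above can be sketched end to end on a throwaway repository. Everything here (`big.bin`, the commit messages, the scratch directory) is made up for the demo; never run this against a repository you care about.

```shell
#!/bin/sh
# Sketch: remove a file from history with filter-branch, then apply the
# three cleanup steps so the blob is actually pruned from disk.
set -e
export FILTER_BRANCH_SQUELCH_WARNING=1  # skip filter-branch's warning delay
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
git config user.email demo@example.invalid
git config user.name demo

# History containing an unwanted large blob.
dd if=/dev/zero of=big.bin bs=1024 count=256 2>/dev/null
git add big.bin && git commit -qm 'add big file'
echo hello > keep.txt
git add keep.txt && git commit -qm 'add keep.txt'
sha=$(git hash-object big.bin)

# Rewrite history to drop the blob from every commit.
git filter-branch -f --index-filter \
  'git rm --cached --ignore-unmatch big.bin' \
  --tag-name-filter cat -- --all >/dev/null

# The three cleanup steps from the checklist:
git for-each-ref --format="%(refname)" refs/original/ |
  xargs -n 1 git update-ref -d            # 1. drop backup refs
git reflog expire --expire=now --all      # 2. expire reflogs
git gc --quiet --prune=now                # 3. prune unreferenced objects

if git cat-file -e "$sha" 2>/dev/null; then
  status="blob still present"
else
  status="blob pruned"
fi
echo "$status"
```

Until the backup refs and reflog entries are gone, the blob remains reachable and `git gc` will refuse to prune it; that is the "Git tries hard not to lose your objects" behavior the checklist works around.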
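The first checklist item, finding every name a file had over its lifetime, can be sketched like this; the file names are made up for the demo.

```shell
#!/bin/sh
# Sketch: a file added under one name and later renamed; git log --follow
# reports the path as it was named in each commit, exposing old variants
# that a filter would otherwise miss.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email demo@example.invalid
git config user.name demo

echo data > old-name.txt
git add old-name.txt && git commit -qm 'add file'
git mv old-name.txt new-name.txt
git commit -qm 'rename file'

# --format= suppresses the commit headers so only paths are printed;
# both new-name.txt and old-name.txt appear in the output.
names=$(git log --name-only --follow --format= --all -- new-name.txt)
echo "$names"
```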
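The repack parameters discussed above can be compared directly. This sketch builds a small synthetic history (entirely made up for the demo) and repacks it twice; on a toy repository the size difference is negligible, so the point is the commands themselves, not the numbers.

```shell
#!/bin/sh
# Sketch: default repack vs. the -f --window=250 --depth=250 invocation
# suggested in the mailing list post, measured via size-pack (in KiB).
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email demo@example.invalid
git config user.name demo

# A few dozen revisions of one file, so there is something to delta.
i=0
while [ "$i" -lt 40 ]; do
  seq 1 200 > data.txt
  echo "revision $i" >> data.txt
  git add data.txt
  git commit -qm "revision $i"
  i=$((i + 1))
done

git repack -a -d -q
default=$(git count-objects -v | awk '/^size-pack/ {print $2}')

# -f recomputes all deltas from scratch instead of reusing existing ones.
git repack -a -d -f -q --window=250 --depth=250
tuned=$(git count-objects -v | awk '/^size-pack/ {print $2}')

echo "size-pack with defaults: ${default} KiB"
echo "size-pack with --window=250 --depth=250: ${tuned} KiB"
```

The trade-off described above applies here: a larger `--window` lets git consider more candidate objects when computing deltas (better compression, slower repack), while a larger `--depth` allows longer delta chains (slightly smaller packs, slower access to old objects forever after).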