
"It is really hard to lose stuff"

Indeed. This also means that garbage keeps piling up in git repos.

This is how I make sure I do lose stuff eventually:

https://github.com/no-gravity/git-gc-all-repos.sh

A script that goes through all my repos and performs garbage collection.
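I haven't reproduced the script here, but the idea can be sketched in a few lines of shell. The `~/repos` layout is an assumption; point the glob at wherever your checkouts live:

```shell
#!/bin/sh
# Hypothetical sketch: walk every checkout under ~/repos
# and run a plain `git gc` against each one's object database.
for repo in "$HOME"/repos/*/.git; do
    git --git-dir="$repo" gc --quiet
done
```

Plain `git gc` respects `gc.pruneExpire` (two weeks by default), so recently dereferenced objects survive a while before they are actually dropped.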




I know your link says you don't want to do `gc --aggressive --prune=now`, but for anyone who does: a) that still leaves objects referenced by the reflog, so you may want to run `reflog expire --expire=now --all` first if your reflog is full of rebases and other history you don't need, and b) you may want to run gc twice, because the second pass compacts a tiny bit more than the first.
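Put together, the sequence looks like this (run inside the repo you want to shrink; this permanently discards anything not reachable from a ref, so be sure you don't need it):

```shell
# Drop every reflog entry, so nothing is kept alive by old HEAD history.
git reflog expire --expire=now --all
# Repack aggressively and prune all unreachable objects immediately.
git gc --aggressive --prune=now
# A second pass can compact the pack slightly further.
git gc --aggressive --prune=now
```

After this, commits you reset or rebased away are genuinely gone, not just hidden.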


Disk space is cheaper than lost data.


True, but wouldn't the reduction in size make it a little quicker to transfer over slow networks?


You might be interested in the `git maintenance` subcommand that was introduced in Git 2.31. It can do per-repository automatic background garbage collection on a configurable schedule, among other things.
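For anyone curious, a minimal way to try it out, run inside an existing repo:

```shell
# Run a single maintenance pass on demand; `gc` is one of the
# built-in tasks (others include commit-graph and incremental-repack).
git maintenance run --task=gc
# To instead schedule recurring background maintenance for this repo,
# which registers it and installs a platform scheduler entry:
# git maintenance start
```

`git maintenance start` is left commented out above because it edits your global config and your cron/launchd/systemd schedule, which you may not want from a copy-pasted snippet.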



