I am going to use git/GitHub to track a writing project I'm doing. I later hope to use the repo's commit history for research purposes.
To get more observations/snapshots of the work in progress, I was thinking of writing a script that checks for changes every 10 minutes, and auto-adds, commits, and pushes them (assume no merge issues).
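Something like this minimal sketch would do it (assuming the script runs from the repository root, the remote is already configured, and pushes never conflict, per the question's premise):

```sh
#!/bin/sh
# Every 10 minutes, snapshot any work in progress:
# stage everything, commit with a timestamp, and push.
while true; do
    # `git status --porcelain` prints one line per changed or untracked
    # file, so non-empty output means there is something to commit.
    if [ -n "$(git status --porcelain)" ]; then
        git add -A
        git commit -m "auto-snapshot $(date +'%Y-%m-%d %H:%M:%S')"
        git push
    fi
    sleep 600  # 10 minutes
done
```

A cron entry (or systemd timer) running the same three git commands every 10 minutes would work just as well and survives reboots.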
If I work an average of 10 hours a week on the project, for 6 months, that's roughly 1,500 small commits (about 260 hours of work, at one commit per 10 minutes). Would I start to experience any sort of bottleneck or performance decline in git due to having that many total commits?
If not there, is there any breaking point? Millions of commits?
The Linux kernel git repository has over 500,000 commits, so you should be fine. Performance issues with a git repository have more to do with the cumulative size of the committed files than with the number of commits. See this answer for more details.
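If you want to watch how your own commit count grows (or verify the kernel's), git can report it directly:

```sh
# Total commits reachable from the current branch
git rev-list --count HEAD
```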