https://git-annex.branchable.com/ #git-annex #commandline #tools #largefiles #sync_tool #HackerNews #ngated
The Future of Large Files in Git Is Git
https://tylercipriani.com/blog/2025/08/15/git-lfs/
#HackerNews #Git #LargeFiles #GitFuture #VersionControl #TechTrends #SoftwareDevelopment
#Git #LargeFiles #Storage #GameDev
New blog post about one way to keep your large-file storage size down when using git.
https://dbat.codeberg.page/posts/git-large-file-technique.html
I hope it's not bad advice. Please let me know of bugs etc.
Also, if anyone has other techniques, I'll be happy to try them and add them to the post.
🦇
#Codeberg #git #git-lfs #GameDev #LargeFiles
I opened an "issue" on Codeberg to try to get some discussion going about my recent git-lfs experiences and the ideas floating around in my head for a better way to handle large binary files while using git.
I'd appreciate any feedback etc. It's here:
https://codeberg.org/Codeberg/Community/issues/1910
I'm a git-nitwit, so factor that in! 🤭
🦇
### Comment

I hope this is okay, to open an issue as a community discussion. I don't know where else to go. I tried on Mastodon, but it's not working there. Please let me know!

# Why git-lfs is an "illusion"

I have been building game-dev tools for a while; this usually means big blender f...
Rough idea:
touch big.blend
git rm --cached big.blend
git commit
git push
git add big.blend
git commit
git push
...but scripted somehow, so that it only happens when big.blend has changed.
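The "scripted somehow" part could look roughly like this; a minimal sketch, assuming POSIX sh and `sha256sum`, where the `.sha256` state file and the commit messages are my own placeholders (not anything from the post, and not battle-tested):

```shell
#!/bin/sh
# file_changed FILE: return 0 (true) only when FILE's content differs
# from the hash recorded in FILE.sha256, updating the record as it goes.
file_changed() {
    f=$1
    state="$f.sha256"
    new=$(sha256sum "$f" | cut -d' ' -f1)
    old=$(cat "$state" 2>/dev/null)
    if [ "$new" = "$old" ]; then
        return 1    # unchanged: skip the rm/re-add dance entirely
    fi
    printf '%s\n' "$new" > "$state"
    return 0
}

# push_big FILE: the rough idea from the post, run only on change.
push_big() {
    f=$1
    file_changed "$f" || return 0
    git rm --cached "$f"
    git commit -m "drop old $f blob"
    git push
    git add "$f"
    git commit -m "add new $f"
    git push
}
```

Note the old blob still lives in history on the remote after this; the script only avoids committing an *unchanged* file again, which is the part git itself won't do for you.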
3/
LFS does not do it. Git-annex does not seem to do it.
Neither `assume-unchanged` nor `skip-worktree` really does it.
Is there some way you can think of?
2/
It is just *bewildering* to me that git does not have a way to ignore changes to a tracked file.
Game dev use-case:
repo/
codefile001
codefile002
binary_files/
huge001.blend
huge002.png
huge003.wav
I want to add/commit/push as usual, because I *want* all those files online (as backups) and for others to fetch.
I do NOT want those binary files to duplicate. When I push a new version, it should *replace* the binary files that changed!
😕 🦇
1/
I recently needed to read a very large file in Python (around 150 million lines): the script would read 50 lines, do some work, and carry on. But it had a habit of crashing out (normally when I forgot about it and closed my laptop). So I created a stateful reader; that way, when I started the execution again, it would resume from the last saved checkpoint.
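A stateful reader along those lines can be sketched like this; the class name, the `.offset` side file, and the batch-of-50 default are my own assumptions about how the checkpointing might work, not the author's actual code:

```python
class CheckpointedReader:
    """Read a file in fixed-size line batches, persisting progress.

    On restart, resumes from the position saved in `state_path`
    (a hypothetical side file next to the data file).
    """

    def __init__(self, path, state_path=None, batch=50):
        self.path = path
        self.state_path = state_path or path + ".offset"
        self.batch = batch

    def _load_offset(self):
        try:
            with open(self.state_path) as f:
                return int(f.read().strip())
        except (FileNotFoundError, ValueError):
            return 0  # no checkpoint yet: start from the top

    def _save_offset(self, offset):
        with open(self.state_path, "w") as f:
            f.write(str(offset))

    def batches(self):
        with open(self.path) as f:
            f.seek(self._load_offset())
            while True:
                lines = []
                for _ in range(self.batch):
                    line = f.readline()
                    if not line:
                        break
                    lines.append(line)
                if not lines:
                    break
                yield lines
                # checkpoint only after the caller has consumed the batch,
                # so a crash mid-batch replays that batch (at-least-once)
                self._save_offset(f.tell())
```

Using `readline()` rather than `for line in f` matters here: iterating a text file disables `tell()`, which the checkpoint relies on.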
Recently I needed to read a very large file of input data and, for each record, perform an HTTP request to send that data off somewhere else. There are a few ways to do this. The first and simplest is to read all of the lines from the file and then iterate over them, but this has one big drawback. In testing it worked fine; when you scale it up to a few million records, though, you start to run into memory limitations. Okay, I need to look at another option.
How To Remove Large Files on Your Windows PC
https://www.techmarkettips.com/technology/how-to-remove-large-files-on-your-windows-pc/