This happens when you have the backup, of the backup, of the backup... And #rmlint comes to the rescue

Saturday night craziness: implementing things in shell that are already available in C.

**Tool for detecting duplicate folders**

https://github.com/ilario/finddirdupes

Beware: it is much slower than `rmlint -D`.
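The shell approach boils down to fingerprinting each directory by the hashes of the files inside it. A minimal sketch of that idea (illustrative only; finddirdupes and `rmlint -D` are more thorough than this):

```shell
# Fingerprint a directory: hash every file, sort for a stable order,
# then hash the resulting list. Identical fingerprints = duplicate dirs.
dir_fingerprint() {
    (cd "$1" && find . -type f -exec sha256sum {} + | sort) \
        | sha256sum | cut -d' ' -f1
}

# Demo on two directories with identical contents.
tmp=$(mktemp -d)
mkdir -p "$tmp/a" "$tmp/b"
printf 'hello\n' > "$tmp/a/f"
printf 'hello\n' > "$tmp/b/f"

fp_a=$(dir_fingerprint "$tmp/a")
fp_b=$(dir_fingerprint "$tmp/b")
[ "$fp_a" = "$fp_b" ] && echo "duplicate directories"
```

Hashing every file in full is exactly why this is slow; rmlint avoids most of that work by comparing sizes and partial checksums first.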

#dupes #linux #shell #bash #duplicates #lint #rmlint


Recommended #opensource #file #duplicates detection and deletion: #rmlint

Why? - Extremely fast · #CLI · Candidate file filtering by #name, #size, #modification #time · Configurable criteria for determining the original file · Paranoia mode offered (byte-by-byte comparison) · Flexible output #formats, including #bash deletion script, #json, #CSV · Excellent #documentation and #tutorials
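A few of those knobs on the command line, as a guarded sketch — the flags are from the rmlint manual, the target directory is a made-up stand-in, and nothing runs unless rmlint is actually installed:

```shell
# Hypothetical scan target; defaults to a fresh temp dir for the demo.
SCAN_DIR=${SCAN_DIR:-$(mktemp -d)}

if command -v rmlint >/dev/null 2>&1; then
    # --types=duplicates : only look for duplicate files
    # -p                 : paranoid mode (byte-by-byte comparison)
    # -S ma              : original = lowest mtime, ties broken alphabetically
    # -o sh / -o json    : emit both a deletion script and a JSON report
    rmlint --types=duplicates -p -S ma \
           -o sh:cleanup.sh -o json:report.json "$SCAN_DIR"
fi
```

The `-S` criteria string is what decides which copy survives, which matters a lot more than raw speed once you point it at real backups.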

https://github.com/sahib/rmlint

More recommendations: https://tuxwise.net/recommended-software/


After having several troubles with #backuppc, which used to be my tried-and-true backup server software, I am looking at other ways of doing things.

I have learned that the #btrfs file system can make copies that take up no extra disk space, and that #rmlint can be run on a schedule to de-duplicate.
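Those no-extra-space copies are btrfs reflinks, and GNU coreutils can request one explicitly; `--reflink=auto` degrades to a normal copy on filesystems without reflink support, so this runs anywhere:

```shell
tmp=$(mktemp -d)
printf 'some data\n' > "$tmp/original"

# On btrfs/XFS this shares data extents with the original instead of
# duplicating them; --reflink=auto falls back gracefully elsewhere.
cp --reflink=auto "$tmp/original" "$tmp/copy"

cmp -s "$tmp/original" "$tmp/copy" && echo "identical"
```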

I think some scripting, cron, and rsync on top of btrfs may be a more reliable long-term solution for how I'd like to do backups than specialized software.
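One way that combination could look — the paths, schedule, and script name are all made up, and the snapshot line assumes a btrfs destination and root privileges:

```shell
# Mirror SRC into DEST with rsync (-a archive, -H preserve hardlinks,
# --delete so removals propagate).
run_backup() {
    rsync -aH --delete "$1"/ "$2"/
}

# A crontab entry like this could drive it nightly (illustrative):
#   0 3 * * * /usr/local/bin/backup.sh
# On a btrfs destination, snapshot first so each run keeps a cheap
# point-in-time copy:
#   btrfs subvolume snapshot -r "$DEST" "$DEST-$(date +%F)"

# Demo with temp dirs, guarded in case rsync is not installed.
src=$(mktemp -d); dest=$(mktemp -d)
printf 'payload\n' > "$src/file"
if command -v rsync >/dev/null 2>&1; then
    run_backup "$src" "$dest"
fi
```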

@10leej Hey, I know that you use and like #btrfs. I haven't used it yet, but am considering it as it may take up less storage space with its copy and de-duplicating abilities (paired with #rmlint)

What is the short list of reasons you like BTRFS?

Today, #rsync is backing up 8TB of data to a USB 7TB hard drive. How do you fit 8TB in 7TB of space? Heavy #btrfs compression and reflink file deduplication with #rmlint.
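A hedged sketch of that setup: the mount point is hypothetical, compression is a mount option rather than a command, and rmlint's `sh:clone` handler (which reflinks duplicates instead of deleting them) only makes sense on btrfs/XFS:

```shell
# Stand-in for the real btrfs mount; replace with your actual path.
BACKUP_DIR=${BACKUP_DIR:-$(mktemp -d)}

# Transparent compression is enabled at mount time, e.g.:
#   mount -o compress=zstd /dev/sdX /mnt/backup

if command -v rmlint >/dev/null 2>&1; then
    # -c sh:clone makes the generated script reflink duplicates, so
    # identical files share extents instead of occupying space twice.
    (cd "$BACKUP_DIR" && rmlint -c sh:clone . && sh rmlint.sh -d)
fi
```

Compression squeezes each file; reflink deduplication collapses identical files — together they are how 8TB can fit into 7TB.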

@schlink It would be a little bit silly, because there are so many out there already:

https://www.virkki.com/jyri/articles/index.php/duplicate-finder-performance-2018-edition/

https://github.com/topics/duplicate-files

But then again, at first glance none of them are written in Rust or Go! =)

A fun exercise if nothing else, with a couple of real-world concerns coming together.

#dupd #jdupes #rdfind #fdupes #rmlint #duff #fslint

@brandon

#rmlint is a filesystem junk cleaner.

rmlint is a simple tool that scans a directory or a set of files for duplicates, empty files/directories, orphaned files, and several other problematic things. After scanning, rmlint creates a #JSON report and a shell script listing the files to be removed. Running the shell script deletes these files after confirmation.
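That workflow end-to-end, guarded so it only runs where rmlint is installed (the temp dir and file names are just for illustration):

```shell
tmp=$(mktemp -d)
printf 'same bytes\n' > "$tmp/keep"
printf 'same bytes\n' > "$tmp/dupe"

if command -v rmlint >/dev/null 2>&1; then
    # Scan, emitting the shell script and JSON report described above...
    (cd "$tmp" && rmlint -o sh:rmlint.sh -o json:rmlint.json .)
    # ...then run the script; -d skips the interactive confirmation.
    (cd "$tmp" && sh rmlint.sh -d)
fi
```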

Website 🔗️: https://rmlint.readthedocs.io/

apt 📦️: rmlint

#free #opensource #foss #fossmendations


Tip: with `-c sh:link` it deduplicates at the file level; useful when you want to keep procrastinating on cleaning up your mess and just need some space #rmlint #data
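The `sh:link` handler tries reflinks first and falls back to hard or symbolic links; the hardlink case boils down to something like this hand-rolled sketch (rmlint itself does it far more carefully):

```shell
tmp=$(mktemp -d)
printf 'big blob\n' > "$tmp/original"
printf 'big blob\n' > "$tmp/duplicate"

# Replace the duplicate with a hardlink to the original, but only after
# a byte-by-byte comparison confirms the two really are identical.
if cmp -s "$tmp/original" "$tmp/duplicate"; then
    ln -f "$tmp/original" "$tmp/duplicate"
fi

# Both names now point at the same inode: the space is reclaimed.
[ "$tmp/original" -ef "$tmp/duplicate" ] && echo "deduplicated"
```

The catch, and why it's a procrastination tool: editing one name now edits "both" files, since they share an inode.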