(Hopefully last) Status update:

I pulled all the pictures and home movies from the original #raid setup on the mini #Debian file server. With #jdupes I was able to clear out a lot of duplicates, but I did have to do some manual review, as I had several duplicates of pictures at smaller-than-original sizes.

I have organized pictures into main folders and will continue to further organize. I have all pictures and videos backed up on two drives.

As an aside, oh my lord, there were so many files on my mini server. While sifting through everything, I found so many random pictures of stuff I do not recall taking. I did get the drives refurbished, so maybe they were not wiped properly, or maybe the devs bundled a lot of random stuff in with the games I backed up?

As for the mini Debian file server, I have reattached the original RAID drives. Since there was so much garbage on them, I am in the process of overwriting them with random data to clear them out, and then I will encrypt everything. Once that is complete I can put my various backups back on the mini Debian file server.
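For the curious, the wipe step boils down to something like the sketch below. It is demoed against a scratch file rather than a real disk; the `/dev/sdX` name is a placeholder, and the LUKS commands are shown only as comments since they need root and a real device:

```shell
# Stand-in for the old drive: a 4 MiB scratch file (a real wipe would
# target the raw device, e.g. /dev/sdX -- placeholder name, run as root).
target=/tmp/wipe-demo.img
dd if=/dev/zero of="$target" bs=1M count=4 status=none
# One pass of random data over the whole target:
dd if=/dev/urandom of="$target" bs=1M count=4 conv=notrunc status=none
# After wiping, the real drive would get encrypted, e.g. with LUKS:
#   cryptsetup luksFormat /dev/sdX
#   cryptsetup open /dev/sdX backup && mkfs.ext4 /dev/mapper/backup
stat -c %s "$target"   # prints 4194304 (the 4 MiB were overwritten in place)
```

One random pass is generally considered enough for modern drives; multiple passes mostly just cost time.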

Wish me luck 😅

When using #jdupes, sometimes you want to remove duplicate files in directory `B` without also removing the same files where they appear in the reference directory `A`.

Enter the `-I` (`--isolate`) option. Quite handy today.

`jdupes -IO -R A B`
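For context, here is a minimal coreutils sketch of the effect (delete files in `B` that also exist in `A`, never touching `A` itself), with made-up demo paths, in case you want to see the idea without jdupes installed:

```shell
# Toy setup: A is the reference dir, B holds one dupe and one unique file.
rm -rf /tmp/iso-demo && mkdir -p /tmp/iso-demo/A /tmp/iso-demo/B
echo "same content" > /tmp/iso-demo/A/ref.txt
echo "same content" > /tmp/iso-demo/B/copy.txt    # duplicates A/ref.txt
echo "only in B"    > /tmp/iso-demo/B/unique.txt
# Collect checksums of everything in A, then delete any file in B
# whose checksum matches one of them; files inside A are never removed.
md5sum /tmp/iso-demo/A/* | awk '{print $1}' | sort > /tmp/iso-demo/a-sums
for f in /tmp/iso-demo/B/*; do
  sum=$(md5sum "$f" | awk '{print $1}')
  grep -q "$sum" /tmp/iso-demo/a-sums && rm "$f"
done
ls /tmp/iso-demo/B   # only unique.txt remains
```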

Updated the Grimoire page on how to find duplicate files on your hard drive:
https://grimoire.d12s.fr/2020/find_duplicate_files.html

Added #fclones, an alternative to #jdupes written in #Rust.

#grimcom

Find duplicate files (to remove them and save space)

Find duplicate files so you can delete them and free up storage. Modern Rust solution: fclones

$ fclones group --cache /tmp > dupes.txt   (1)
$ fclones link < dupes.txt                 (2)

(1) Creates the list of duplicated files
(2) Replaces clones by links; other operations are: move, remove, dedupe (via native filesystem deduplication capabilities) …

Grimoire-Command.es
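As a side note, the `fclones link` step boils down to keeping one copy per duplicate group and replacing the others with hard links to it. A tiny hand-rolled sketch of that idea, with throwaway demo paths:

```shell
rm -rf /tmp/link-demo && mkdir -p /tmp/link-demo
printf 'same bytes\n' > /tmp/link-demo/a
cp /tmp/link-demo/a /tmp/link-demo/b       # a byte-identical duplicate
# Keep `a`, turn `b` into a hard link to it: both names now share
# one inode, so the duplicate's disk space is reclaimed.
ln -f /tmp/link-demo/a /tmp/link-demo/b
stat -c %h /tmp/link-demo/a                # prints 2 (link count)
```

The caveat with hard links is that editing the file through either name changes both, so this suits archival data more than working copies.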

Currently running #jdupes to replace duplicate files in the #Backup with hard links. It is said to be up to 7x faster than #rdfind or #fdupes. #opensource

https://github.com/jbruchon/jdupes

GitHub - jbruchon/jdupes: A powerful duplicate file finder and an enhanced fork of 'fdupes'.

@worldsendless I would first reach for #jdupes
@schlink It would be a little bit silly, because there are so many out there already:

https://www.virkki.com/jyri/articles/index.php/duplicate-finder-performance-2018-edition/

https://github.com/topics/duplicate-files

But then again, at first glance none of them are written in Rust or Go! =)

A fun exercise if nothing else, with a couple of real-world concerns coming together.

#dupd #jdupes #rdfind #fdupes #rmlint #duff #fslint

@brandon
Duplicate finder performance (2018 edition) | stdout

@ugeek #jdupes is also worth trying. It is not fully compatible with fdupes, but it is faster.

#fslint-gui works very well: in a few clicks it freed up 6 GB on my SSD (which I thought was well organized), and with no data loss!

It can also list empty directories, broken symbolic links, files that look temporary, debug symbols left behind in binaries… even redundant whitespace in text files!

(That must cause carnage for people who write #Python without tabs.) #fslint #rdfind #fdupes #jdupes
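A couple of those checks are easy to replicate with plain GNU find if fslint isn't handy; the paths below are throwaway demo paths:

```shell
# Toy tree with one empty dir and one dangling symlink:
rm -rf /tmp/fslint-demo && mkdir -p /tmp/fslint-demo/empty-dir
ln -s /tmp/fslint-demo/missing /tmp/fslint-demo/dangling
# List empty directories:
find /tmp/fslint-demo -mindepth 1 -type d -empty
# List broken symbolic links (-xtype follows the link before testing):
find /tmp/fslint-demo -xtype l
```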