Currently downloading my Google Photos data with their Takeout tool, and will put dupeGuru to good use (sorting ~20 GB): https://dupeguru.voltaicideas.net/

#LeaveGoogle #dupeguru

dupeGuru

dupeGuru is a cross-platform (Linux, OS X, Windows) GUI tool to find duplicate files in a system. It’s written mostly in Python 3 and has the peculiarity of using multiple GUI toolkits, all using the same core Python code. On OS X, the UI layer is written in Objective-C and uses Cocoa. On Linux & Windows, it’s written in Python and uses Qt5.
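Under the hood, duplicate finders like dupeGuru group files by their contents rather than their names. A minimal sketch of that general technique in Python (this is illustrative only, not dupeGuru's actual core or API; the function name and hashing choice are my own assumptions):

```python
# Minimal sketch of content-hash duplicate detection, the general idea
# behind tools like dupeGuru (dupeGuru's real scanner is more sophisticated,
# with fuzzy matching for pictures and music; names here are illustrative).
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Group files under `root` by the SHA-256 hash of their contents."""
    groups = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            try:
                with open(path, "rb") as f:
                    # Read in 1 MiB chunks so huge files don't fill RAM.
                    for chunk in iter(lambda: f.read(1 << 20), b""):
                        h.update(chunk)
            except OSError:
                continue  # unreadable file: skip it
            groups[h.hexdigest()].append(path)
    # Only hashes seen more than once are duplicate sets.
    return {k: v for k, v in groups.items() if len(v) > 1}
```

Each returned group is a list of paths whose bytes are identical, which is the exact-match case; dupeGuru additionally offers similarity-based scans that a hash comparison like this can't do.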


A BIG MISTAKE ON MY PART!

When I got my desktop server last summer, it was for a couple of different purposes, but one was to keep a copy of all of my important files.* I gathered them up from every device I had, plus every memory card and stick, and dumped them on the new hard drive.

Today I am running #dupeguru to find dupes, as I'm sure there must be a few. Er...if 200,00, give or take a dozen, can be considered a few...

*to me, EVERY file is important.

I finished getting ~4TB of my data from Amazon Photos last night (it's just one part of the whole, though). The transfer had been going on since last Friday and finally finished during the night. So, I am going over the lot with #dupeGuru to weed out #duplicates and whittle those ~4TB down to a smaller number.
The first scan isn't finished yet, but damn, that's a lot of duplicates already!
Any dupeguru experts out there?
#dupeguru

Anyone experienced with #dupeGuru? I'm using it on an external HDD and it seems to be stuck at 33%.

The program isn't freezing, so maybe it's just a big file or something, but there hasn't been any progress in a while now.

My question is: besides waiting, can I do anything? It would be nice to at least have a list of the duplicates it has already found.


dupeGuru: remove duplicate files from your computer – JustGeek

dupeGuru is open-source software that lets you detect and delete duplicate files on your computer in just a few clicks.
