Nikow

@nikow@mastodon.online
I saw somebody with a #Signal badge today. I can't be outdone, so now I have one too!

Second day of copying data. I moved from #tar and #mbuffer to #rclone for handling the files. The first error ruined the whole run, so I switched to something dedicated. It's slower though. I think one of the files is damaged and not readable from the source.

Sadly it's the biggest file on the NAS - the clone image of a drive. Lesson learned - next time I need to split this 700GB+ monster into 1GB parts and secure them with #par2.
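
For the next attempt, a minimal sketch of that kind of splitting in #Python - the paths, part size and naming are just placeholders, and par2 would be run over the resulting parts afterwards:

# Split a huge image into fixed-size parts, so one bad block
# doesn't take the whole 700GB+ file with it (paths are placeholders).
from pathlib import Path

PART_SIZE = 1024 ** 3   # 1 GiB per part
BLOCK = 64 * 1024 ** 2  # copy in 64 MiB blocks to keep RAM usage low

def split_file(src: str, dst_dir: str) -> None:
    dst = Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    part = 0
    with open(src, "rb") as f:
        while True:
            out_path = dst / f"{Path(src).name}.{part:05d}"
            written = 0
            with open(out_path, "wb") as out:
                while written < PART_SIZE:
                    block = f.read(min(BLOCK, PART_SIZE - written))
                    if not block:
                        break
                    out.write(block)
                    written += len(block)
            if written == 0:
                out_path.unlink()  # drop the empty trailing part
                break
            part += 1

split_file("/volume1/backup/drive-image.img", "/volume1/backup/parts")
# then run par2 over the parts to add recovery data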

In fact, it's actually a bit more complicated than that, because the place where the #NAS stands and the place where I have the disks are two different places (which explains the use of rclone). On a virtual machine next to the #NAS I mounted the directory in question using #rclone. At home there is a #Gigabyte #Brix to which the USB disks are connected. The machines see each other over a #ZeroTier #VPN.

My drives are slow SMR drives, so I copy files to them with a simple bash one-liner similar to:

tar cv --files-from /path/to/filelist | mbuffer -m 1G | tar xv -C /path/to/drive

Thank you #mbuffer and #tar creators for awesome tools.

The second step is slower though. I mounted my NAS using #rclone mount as `/tmp/aleksandria-001` for easy access to it. Then, with a second #Python script, I read the whole database into memory and sorted the files by size from biggest to smallest. The next step was a 'metadata' map which keeps only the label of the drive to which each file has been exported. Then I manually connect a drive, copy the df output with its free space in kilobytes, and export the lines with the paths to a text file.
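
A minimal sketch of that packing step, assuming a simple files(path, size, drive) table - the schema, database name and the way the free space is passed in are simplified here, not the exact script:

# Greedily pick not-yet-exported files (biggest first) that still fit
# on the freshly connected drive, then dump their paths for tar.
import sqlite3

def pack_for_drive(label: str, free_kb: int, out_list: str) -> None:
    con = sqlite3.connect("files.db")  # placeholder database name
    rows = con.execute(
        "SELECT path, size FROM files WHERE drive IS NULL ORDER BY size DESC"
    ).fetchall()
    free = free_kb * 1024  # df reports kilobytes, sizes are stored in bytes
    picked = []
    for path, size in rows:
        if size <= free:
            picked.append(path)
            free -= size
    # remember which drive every file went to
    con.executemany("UPDATE files SET drive = ? WHERE path = ?",
                    [(label, p) for p in picked])
    con.commit()
    with open(out_list, "w") as f:
        f.write("\n".join(picked) + "\n")

# free space pasted from `df -k` output for the connected drive
pack_for_drive("drive-007", 4_800_000_000, "drive-007.txt")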

With less than 50 lines of #Python code I counted all my files and added them to a database. I had no clue how much RAM would be needed to keep the whole list of files or how long it would take to iterate over them. The worry was overkill though - the whole process took one hour and needed just 1GB to keep everything in memory.
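
The core of it boils down to something like the sketch below - the table layout and database name are assumptions on my part, the real thing is in the gist:

# Walk the rclone mount of the NAS and record every file with its size.
import os
import sqlite3

MOUNT = "/tmp/aleksandria-001"
con = sqlite3.connect("files.db")  # placeholder database name
con.execute("CREATE TABLE IF NOT EXISTS files "
            "(path TEXT PRIMARY KEY, size INTEGER, drive TEXT)")
count = 0
for root, _dirs, names in os.walk(MOUNT):
    for name in names:
        full = os.path.join(root, name)
        try:
            size = os.path.getsize(full)
        except OSError:
            continue  # skip entries the mount refuses to stat
        con.execute("INSERT OR REPLACE INTO files (path, size) VALUES (?, ?)",
                    (full, size))
        count += 1
con.commit()
print(f"indexed {count} files")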

https://gist.github.com/nikow/d17e9720598b8977ff0da2c4a3ce4602

Synology NAS Rescure Part 1 - Count The Files


My #Synology NAS decided to go into ReadOnly mode for no reason. The problem is that I need to evacuate 50TB of data to 5TB drives. I can not connect all those 5TB drives at once, and they're not quite reliable - some crash after writing some data to them.

Is there a ready-made program which can help me?

So far my plan is to write a #Python script to index all those files into an #sqlite3 database and then prepare lists of files which fit into the free space on the external drives.

You were asking whether you can see what I talked about at SCS? Well, you can. Courtesy of @nikow, who recorded it with his phone. Here is the link. https://www.youtube.com/watch?v=Jvxc-G0ZUHk
My talk at Security Case Study 2023

Will officials find out what we buy? Changes to receipts are raising concerns

The 2023 changes to receipts are raising concerns among many. The solution, set to appear as early as 15 September 2023, is intended to make life easier for customers. However, tying an electronic receipt to a specific person raises doubts about the processing of personal data. Is there anything to worry about?

INTERIA.PL
Updating #ZeroTier to 12.1 was a mistake. Sadly, there are no logs to investigate.