Second day of copying data. I moved from #tar and #mbuffer to #rclone for handling it. The first error ruined all progress, so I switched to a dedicated tool. It's slower, though. I think one of the files is damaged and not readable from the source.
Sadly it's the biggest file on the NAS: the clone image of the drive. Lesson learned: next time I need to split this 700GB+ monster into 1GB parts and secure them with #par2.
With less than 50 lines of #Python code I counted all my files and added them to a database. I had no clue how much RAM would be needed to keep the whole list of files, or how long it would take to iterate over them. It was overkill, though: the whole process took one hour and needed just 1GB to keep everything in memory.
https://gist.github.com/nikow/d17e9720598b8977ff0da2c4a3ce4602
My #Synology NAS decided to go into read-only mode for no reason. The problem is that I need to evacuate 50TB of data to 5TB drives. I can't connect all those 5TB drives at once, and they're not exactly reliable: some crash after writing some data to them.
Is there a ready-made program which can help me?
So far my plan is to write a #Python script to index all those files into a #sqlite3 database and then prepare lists of files that fit into the space on the external drives.
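The "prepare lists that fit" step is a bin-packing problem; a simple first-fit-decreasing heuristic is one way to sketch it (function name and data shapes are my own, assuming `(path, size)` pairs pulled from the database):

```python
def pack_into_drives(files, drive_capacity):
    """Greedy first-fit decreasing: group (path, size) pairs into
    batches that each fit within `drive_capacity` bytes."""
    drives = []  # each: {"free": remaining bytes, "files": [paths]}
    for path, size in sorted(files, key=lambda f: f[1], reverse=True):
        if size > drive_capacity:
            raise ValueError(f"{path} is larger than a whole drive")
        for drive in drives:
            if drive["free"] >= size:
                drive["files"].append(path)
                drive["free"] -= size
                break
        else:
            # no existing batch has room; start a new drive
            drives.append({"free": drive_capacity - size, "files": [path]})
    return drives
```

First-fit decreasing isn't optimal, but for a one-off evacuation it wastes little space and keeps the script trivial.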
The 2023 changes to receipts are worrying many people. The solution, due to arrive as early as 15 September 2023, is meant to make customers' lives easier. However, tying an electronic receipt to a specific person raises doubts about how personal data is processed. Is there anything to worry about?