Want to upload a 5 GB file, but the Internet connection is slow, so it takes hours? Why not split the file into chunks, transfer the chunks in parallel, and reassemble them on the target machine?

mkdir file.split
split -n 10 file file.split/x # creates 10 chunks named xaa, xab, ... inside the folder file.split
rclone copy -P --transfers 32 --checkers 64 file.split :sftp,host=example.com,user=myuser:/path/to/dest/

On the target machine, once the transfer completes:

cd /path/to/dest
cat file.split/x* > file

#bash #linux #rclone #network

Bonus: transferring a folder with a bazillion small files (e.g., a git repo) can also be very slow, even over a good connection. Package it into a single file first, then apply the trick above:

tar -czvf folder.tar.gz folder
mkdir folder.split
split -n 10 folder.tar.gz folder.split/x
...