Copy your external HD to Google Drive using Linux


Searching for a way to synchronize my local files with my Google Drive, I found that there isn't an official client for Linux, but Google Drive can be accessed directly from Nautilus on Ubuntu by adding an Online Account.

That is enough if I just need to browse my files and get what I want, but if I need to move more than 300 GB from my external HD to my Google Drive using Nautilus, it is awful.

During the transfer, a lot of errors can be thrown and the progress bar does not reflect the real progress. I needed something better.

Some of the available tools aren't free, although they do have beautiful user interfaces. Digging through the list of available tools, I found Rclone.

Rclone is actually more than just a Google Drive client. It is a command-line program to sync files and directories to and from various cloud storage services, including Google Drive, Dropbox, Amazon S3, OneDrive, etc. 

In the beginning, I was too lazy to fight against a command-line program to do the job, but after a cup of coffee, I felt confident enough to meet the challenge. It was really worth it and much easier than expected!

The installation and usage documentation can be found on their official website. There is detailed documentation for Google Drive.
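
For reference, the setup on my machine looked roughly like this (the install script is the one documented on rclone.org, and "usal" is just the name I picked for my remote in the interactive wizard):

$ curl https://rclone.org/install.sh | sudo bash
$ rclone config        # interactive wizard: "n" for a new remote, pick Google Drive, authorize in the browser
$ rclone listremotes   # lists the configured remotes, e.g. "usal:"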

Once I had installed it, I tested some commands for copying the content of my external HD to Google Drive, and this is what I finally used:

/usr/bin/rclone copy --update --progress --transfers 30 --checkers 8 \
  --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 \
  --stats 1s --log-level "INFO" --log-file /home/rafam/rclone_logs \
  --stats-log-level "INFO" --stats-one-line \
  "/media/rafam/Seagate/My Data" "usal:My Data"

This is the meaning of each argument:
  • copy: Copy files from source to dest, skipping already copied.
  • --update: Skip files that are newer on the destination.
  • --progress: Show progress during transfer.
  • --transfers: Number of file transfers to run in parallel.
  • --checkers: Number of checkers to run in parallel.
  • --contimeout: Connect timeout.
  • --timeout: IO idle timeout.
  • --retries: Retry operations this many times if they fail.
  • --low-level-retries: Number of low level retries to do.
  • --stats: Interval between printing stats, e.g. 500ms, 60s, 5m.
  • --log-level: Log level DEBUG|INFO|NOTICE|ERROR.
  • --log-file: Log everything to this file.
  • --stats-log-level: Log level to show --stats output DEBUG|INFO|NOTICE|ERROR.
  • --stats-one-line: Make the stats fit on one line.
  • "/media/rafam/Seagate/My Data": Source path of files from my laptop, in this case, it is an external HD.
  • "usal:My Data": usal is the name of my remote, that is a Google Drive. I set this value during Rclone configuration. "My Data" is the path where the files will be stored in Google Drive.

You can check the flags documentation for more details.
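
If you are unsure about a command, rclone supports a global --dry-run flag that reports what would be transferred without actually copying anything. It is a quick way to sanity-check the source and destination paths before a long run:

$ /usr/bin/rclone copy --dry-run "/media/rafam/Seagate/My Data" "usal:My Data"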

And this is the result:

$ /usr/bin/rclone copy --update --progress --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s --log-level "INFO" --log-file /home/rafam/rclone_logs --stats-log-level "INFO" --stats-one-line "/media/rafam/Seagate/My Data" "usal:My Data"

140.755G / 173.03 GBytes, 81%, 4.514 MBytes/s, ETA 2h1m38s
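
When the copy finishes, rclone also provides a check command that compares the files in the source and the destination and reports any differences, which is a convenient way to confirm that everything arrived intact:

$ /usr/bin/rclone check "/media/rafam/Seagate/My Data" "usal:My Data"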

During the process, you can check which files have already been copied:

$ tail -f /home/rafam/rclone_logs 

2020/09/22 15:21:50 INFO  : Analisis Malware - W7x32SP0.zip: Copied (new)
2020/09/22 16:50:36 INFO  : Analisis Malware - W7x64SP1.zip: Copied (new)
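
Since each successful transfer is logged as "Copied (new)", a simple grep over the log file gives a running count of how many files have been uploaded so far:

$ grep -c "Copied (new)" /home/rafam/rclone_logs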

So each time I add new content to my external HD, I can back it up easily.

In addition, you can add a cron job to run it periodically, as in the sketch below.
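
As a sketch, a crontab entry for a nightly run at 02:00 could look like this; I dropped --progress and the per-second stats because no terminal is attached when cron runs the job (adjust the schedule, paths, and remote name to your own setup):

$ crontab -e

# m h dom mon dow  command
0 2 * * * /usr/bin/rclone copy --update --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --log-level "INFO" --log-file /home/rafam/rclone_logs "/media/rafam/Seagate/My Data" "usal:My Data"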

I hope this helps! Let me know how you use this wonderful tool.


=)~
