Synchronize Files With rsync
Synchronizing files from one server to another is quite awesome. You can use it for backups, for keeping web servers in sync, and much more. It's fast and it doesn't take up as much bandwidth as normal copying would. And the best thing is, it can be done with only 1 command. Welcome to the wonderful world of rsync.
First, install rsync (on Debian/Ubuntu):
$ aptitude -y install rsync
Simple - one command
Let's copy our local /home/holyguard/source to /home/holyguard/destination, which resides on the remote server:
$ rsync -az --progress --size-only /home/holyguard/source/* server.example.com:/home/holyguard/destination/
-a: archive mode; recurses into directories and preserves attributes like ownership, timestamps, and permissions
-z: compress; saves bandwidth but is harder on your CPU, so use it for slow/expensive connections only
--progress: shows you the progress of all the files that are being synced
--size-only: compare files based on their size only, skipping the usual modification-time check (less CPU, so faster)
Note that this sync excludes hidden files, because the bash glob * does not match them. If you want to include hidden files, drop the glob and give the source a trailing slash, like this:
$ rsync -az --progress --size-only /home/holyguard/source/ server.example.com:/home/holyguard/destination/
With a trailing slash on the source, rsync copies the directory's entire contents, hidden files included.
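The hidden-file caveat is purely a shell behavior, not an rsync one, and you can see it with plain bash (the paths below are throwaway examples):

```shell
# Show that the shell glob `*` skips dotfiles -- this is why
# `rsync ... source/*` misses hidden files.
dir=$(mktemp -d)
touch "$dir/visible.txt" "$dir/.hidden"

(cd "$dir" && echo *)    # the glob matches only: visible.txt
(cd "$dir" && ls -A)     # but both files exist

rm -rf "$dir"
```

Since the glob is expanded by bash before rsync ever runs, rsync never even sees the hidden files.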
Well, that's it! But read on if you want to learn how to automate this.
Advanced - automatic syncing with SSH keys
Alright, so syncing files on Linux is pretty easy. But what if we want to automate this? How can we prevent rsync from asking for a password every time?
There are different ways to go about this, but the one I mostly use is installing SSH keys. By installing your public SSH key on the destination server, it will recognize you in the future and grant access without a password. That way we can automate the synchronization with rsync.
Open a terminal, generate a key pair if you don't already have one, and copy the public key to the server:
$ ssh-keygen
$ ssh-copy-id server.example.com
Now test the connection:
$ ssh server.example.com
It should not ask you for any password. Great! This means we can also run rsync directly without logging in!
Let's create a sync script
So now just create a script
$ $EDITOR /root/bin/syncdata.bash
that contains your rsync command:
#!/usr/bin/env bash
rsync -az --delete /home/holyguard/source/* server.example.com:/home/holyguard/destination/
Save the file, exit your editor, and make it executable like this:
$ chmod a+x /root/bin/syncdata.bash
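One thing to keep in mind before putting this on a schedule: an hourly cron job can launch a new sync while a slow one is still running. A sketch of a slightly hardened wrapper (the lock path is an arbitrary example, and `flock` is the standard util-linux tool):

```shell
#!/usr/bin/env bash
# Cron-safe wrapper sketch: flock prevents two syncs from overlapping.
# The lock file path below is an arbitrary choice; adjust to taste.
set -euo pipefail

LOCK=/tmp/syncdata.lock

exec 9>"$LOCK"
if ! flock -n 9; then
    echo "previous sync still running, skipping this run" >&2
    exit 0
fi

# The real job would be the rsync command from the script above, e.g.:
#   rsync -az --delete /home/holyguard/source/* server.example.com:/home/holyguard/destination/
echo "sync would run here"
```

The lock is held on file descriptor 9 for the lifetime of the script and released automatically when it exits, so no cleanup step is needed.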
Schedule it to run every hour
And to have your data synchronized every hour, open up your crontab editor:
$ crontab -e
and add this line:
0 * * * * /root/bin/syncdata.bash
That's it! New files are now automatically synced to server.example.com:/home/holyguard/destination/ every hour. Files that are deleted from /home/holyguard/source/ are also deleted at the destination, thanks to the --delete option.
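For reference, cron's five fields are minute, hour, day of month, month, and day of week. A few variations on the schedule above (using the same script path):

```
# m   h  dom mon dow  command
0     *  *   *   *    /root/bin/syncdata.bash    # every hour, on the hour
*/15  *  *   *   *    /root/bin/syncdata.bash    # every 15 minutes
30    2  *   *   *    /root/bin/syncdata.bash    # daily at 02:30
```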
Some extra rsync command line options
Some extra arguments that might come in handy for customizing your synchronization job:
--delete: delete files at the destination that no longer exist locally
--dry-run: show what would have been transferred, but do not transfer anything
--max-delete=10: don't delete more than 10 files in one run; a safety precaution
--delay-updates: put all updated files into place at the end of the transfer; very useful for live systems
--compress-level=9: explicitly set compression level 9; 0 disables compression
--exclude-from=/root/sync_exclude: read exclude patterns (one per line) from /root/sync_exclude; filenames matching these patterns will not be transferred
--bwlimit=1024: limit the transfer rate to a maximum of 1024 kilobytes per second
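Several of these combine naturally. A sketch of a cautious rehearsal run, with an exclude file whose patterns are invented examples:

```shell
# Build a hypothetical exclude file: one pattern per line.
cat > /tmp/sync_exclude <<'EOF'
*.tmp
cache/
.git/
EOF

# A safe dry run of the job would then look like this (not executed here,
# since it needs the remote server from the article):
#   rsync -az --delete --max-delete=10 --dry-run \
#       --exclude-from=/tmp/sync_exclude --bwlimit=1024 \
#       /home/holyguard/source/ server.example.com:/home/holyguard/destination/

cat /tmp/sync_exclude
```

Once the --dry-run output looks right, drop that flag and let cron take over.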