So you need to transfer almost a GB of files…

and the source server is too slow to tar -z all these files without having the tar process killed (and the thought of waiting several hours to trickle 50,000 files down the pipe is too much)…

rsync is your friend!

With compression, efficiency and resumability on its side, rsync took only a few minutes to shrink and send 700 MB.

Background: a website running on a dual-core Atom server all the way over in the US needed to be migrated to a faster host elsewhere. Several attempts at compressing the files with tar -zcf failed with ‘Process killed’ – presumably the host was cutting off any task that hogged too much processing time. Rsync, as well as being a synchronising superstar, is fantastic at transferring large numbers of files. I also use rsync to pull manual updates over to live sites (leaving Bazaar for the test site).
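
A command along these lines does the job, run from the new host; the hostnames and paths below are placeholders rather than the actual servers:

# Pull the site from the old box, compressing data in transit (-z),
# preserving permissions and timestamps (-a), and keeping partially
# transferred files (--partial) so an interrupted run can resume.
# 'olduser@old-server.example.com' and the paths are examples only.
$ rsync -avz --partial --progress olduser@old-server.example.com:/var/www/mysite/ /var/www/mysite/

If the connection drops, re-running the same command carries on where it left off: files that already arrived intact are skipped, and --partial means a half-transferred file is completed rather than started from scratch.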

Caveat: both sides need the rsync command installed, plus SSH access (or a running rsync daemon). The more files to transfer, the more free RAM rsync needs – roughly 100 MB for every million files 🙂
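
Before kicking off a big transfer it is worth a quick sanity check that rsync really is available on both ends (the remote hostname below is, again, just an example):

# Local side
$ rsync --version

# Remote side – runs rsync --version over SSH on the old server
$ ssh olduser@old-server.example.com rsync --version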
