Partial restoration of website

Jeremy Utley jeremy at
Mon Nov 15 00:35:13 PST 2004

On Sun, November 14, 2004 11:53 pm, Jeremy Utley said:

> I'm also going to experiment with a script that will once per day tar up
> the website directory, and scp it off to my personal server - want to see
> how effective that is, and how much of a performance hit it takes on the
> server.
> -J-

Replying to myself.  I developed a very quick and dirty script to perform
this type of backup.  The website backup, when bzip2'd, is approximately
16MB in size, and the archiving and transfer to my server took about
5 minutes.  Looking at my server's bandwidth utilization, I'm not sure I
could handle this much traffic daily.
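For anyone curious, a quick-and-dirty script of this type would look
roughly like the sketch below.  The paths, the remote account, and the
fallback demo directory are all placeholder assumptions, not the actual
values used on the server:

```shell
#!/bin/sh
# Sketch of a daily website backup: tar + bzip2 the site directory,
# then scp the archive off-site.  SITE_DIR, BACKUP_DIR, and REMOTE
# are hypothetical defaults -- adjust them for the real server.
SITE_DIR=${SITE_DIR:-/var/www/html}     # website directory (assumed path)
BACKUP_DIR=${BACKUP_DIR:-/tmp/site-backups}
REMOTE=${REMOTE:-}                      # e.g. backup@example.org; empty skips the copy

# For illustration only: fall back to a tiny demo tree if the assumed
# site directory doesn't exist on this machine.
if [ ! -d "$SITE_DIR" ]; then
    SITE_DIR=$(mktemp -d)/website
    mkdir -p "$SITE_DIR" && echo "demo" > "$SITE_DIR/index.html"
fi

STAMP=$(date +%Y%m%d)
ARCHIVE="$BACKUP_DIR/website-$STAMP.tar.bz2"

mkdir -p "$BACKUP_DIR"
# -j selects bzip2 compression; -C keeps the archive paths relative.
tar -cjf "$ARCHIVE" -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"

# Off-site copy; key-based ssh auth lets cron run this unattended.
if [ -n "$REMOTE" ]; then
    scp -q "$ARCHIVE" "$REMOTE:backups/"
fi
```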

Justin, are you still willing to house a daily backup?  If so, I'll modify
my script here to scp to your server instead of mine, and put it on a cron
to run daily.  I'd need an account to house the backups in, and to get you
my ssh public key to facilitate the automated copy.
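For reference, the cron entry would be something along these lines -- the
schedule, script path, and log location here are placeholders, not the
real values:

```shell
# Hypothetical crontab entry: run the backup script at 03:30 every day.
# Adjust the time and path to the actual script location.
30 3 * * * /usr/local/bin/website-backup.sh >> /var/log/website-backup.log 2>&1
```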

I also have an idea of running a weekly svn dump of all subversion repos,
and backing that data up as well, but that can come later.
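The weekly svn dump could follow the same pattern.  A rough sketch,
assuming a hypothetical /var/svn parent directory for the repositories
(the real layout may differ), would be:

```shell
#!/bin/sh
# Sketch of a weekly dump of all subversion repos under REPO_ROOT,
# bzip2'd like the site backup.  REPO_ROOT and DUMP_DIR are assumptions.
REPO_ROOT=${REPO_ROOT:-/var/svn}
DUMP_DIR=${DUMP_DIR:-/tmp/svn-dumps}
STAMP=$(date +%Y%m%d)

mkdir -p "$DUMP_DIR"
# Only attempt the dumps if svnadmin is installed and the repo parent exists.
if command -v svnadmin >/dev/null 2>&1 && [ -d "$REPO_ROOT" ]; then
    for repo in "$REPO_ROOT"/*; do
        [ -d "$repo" ] || continue
        svnadmin dump -q "$repo" | bzip2 \
            > "$DUMP_DIR/$(basename "$repo")-$STAMP.svndump.bz2"
    done
fi
```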
