
Hourly Production Server Database And File Backups Discussion

General • Asked by Chris Oliver

Thank you for this! I should get this up and running on a site I run for a complex fantasy football league...I've been meaning to with Cron jobs and whatnot, but it's just such a hassle and, so far, there hasn't been much site traffic, but still...a catastrophe could be hiding around an upcoming corner!

Glad I could help! :)

I've been horribly slow about getting this implemented (had to do a bunch of work on the site first) and I went about doing this stuff last night. The backup script is working brilliantly and I've restored from one of the backups to confirm it, so thank you for the writeup!

However, the Cron portion is proving to be quite the headache. I'm at Digital Ocean with an Ubuntu 12.10 server. Running crontab -e, I was getting errors about nano permissions but found out how to get rid of that nonsense (I can't find the link now, but it was removing a file and commenting out a line about logging stuff). Anyway, the cron job is in my file:

0 * * * * /bin/bash -l -c '/home/kkerley/.rvm/gems/ruby-1.9.3-p392/bin/backup perform -t sqwid_backup'

(got this by running crontab -l just now) but it's never firing. I thought maybe it was due to not having a newline at the end, but I've put one in three times. I also ran pgrep cron to confirm that it's running and it is (returned 598). I just don't understand what I'm doing wrong/why this isn't firing off hourly.
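For anyone debugging the same thing: on a stock Ubuntu install, cron logs every job it starts to syslog, so a quick way to confirm whether the entry is being invoked at all (standard Ubuntu behaviour, not anything specific to this setup) is:

grep CRON /var/log/syslog

Appending something like >> /home/kkerley/backup.log 2>&1 to the end of the crontab entry will also capture the command's output and any errors; the log path here is just an example.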

Are there any gotchas that I'm just not aware of? This is the first time I've messed with cron.

Thanks!

One thing (and I updated the line) is that it should be perform -t production_backup to match the name of the backup that you created earlier.

That should be it I imagine.
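In other words, the full crontab entry would look something like this (same RVM path as above, only the trigger name changed to match the backup model from the guide):

0 * * * * /bin/bash -l -c '/home/kkerley/.rvm/gems/ruby-1.9.3-p392/bin/backup perform -t production_backup'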

Ahhh! I didn't even notice. I saw that last night and thought it was weird when I was typing it in but figured it was just some weird cron command. :)

I've updated my cronjob and will hopefully know in 51 minutes if it worked or not.

Thanks for the quick reply and again for the tutorial!

It's working like a charm now. Thank you again for this guide!


s3.path = "/production/database"
Is this the endpoint or the path within the bucket? I get: "The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint."

Does one need to set any permission on the bucket?

Solved, just needed to fix my bucket region.
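For anyone else hitting the same endpoint error: s3.path is the path within the bucket, and the error usually means the region configured in the store_with S3 block doesn't match the region the bucket was actually created in. A minimal sketch of the relevant block from the Backup model (credentials, region, and bucket name are placeholders):

store_with S3 do |s3|
  s3.access_key_id     = "YOUR_ACCESS_KEY_ID"
  s3.secret_access_key = "YOUR_SECRET_ACCESS_KEY"
  s3.region            = "eu-west-1"  # must match the region the bucket was created in
  s3.bucket            = "your-bucket-name"
  s3.path              = "/production/database"
end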


For anyone getting this warning:

[fog][WARNING] fog: the specified s3 bucket name(BUCKET_NAME) is not a valid dns name, which will negatively impact performance.

Fog does not like dashes in the BUCKET_NAME. It's best to use BUCKETNAME.

Good find! Thanks for sharing this.


Thanks Chris for this very useful guide. I just implemented the Backup gem and the cron job. And it is successfully storing a backup of the database to Amazon S3.

At the end of the guide you say:

"Always be sure to test your backups and make sure you can safely restore from them!"

So how do you restore your database with a backup on Amazon S3?

greetings,

Anthony

Depends on what you're backing up, but for example, you can restore a mysql database with a sql file like this: http://stackoverflow.com/a/...
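For a MySQL dump that boils down to something like this (user, database, and file name are placeholders):

mysql -u dbuser -p dbname < production_backup.sql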

I'm backing up a PostgreSQL database. I believe the pg_dump utility has a feature for restoring databases from a remote location.

I tried to restore the database with:

sudo su - postgres
psql -1 posplus < production_backup

but nothing happens
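One likely reason nothing happens is that the file the Backup gem uploads is a tar archive containing a gzipped dump, not a plain SQL file, so it needs to be unpacked before feeding it to psql. Roughly like this (the archive layout and database name are assumptions based on the gem's defaults, not taken from this thread):

tar -xf production_backup.tar
gunzip production_backup/databases/PostgreSQL.sql.gz
psql -d posplus -f production_backup/databases/PostgreSQL.sql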


Hello Chris, thanks for this useful website! I am getting the following error with Amazon S3: AuthorizationHeaderMalformed: "The authorization header is malformed; the authorization header requires three components: Credential, SignedHeaders, and Signature."

I have tried Googling it without success. I have the paperclip gem working fine in my Rails app with the same S3 credentials and the same bucket, but the backup gem is not! Can you help me, please?

Interesting. I can't think of much except for maybe having a typo somewhere.



Just keep it simple with https://backup.foundation


Is there a way to use the backup gem on a Heroku instance, and how would you set it up? Thanks!

Use PG Backups. It's easier on Heroku.
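For example, with the Heroku CLI you can capture, schedule, and download Postgres backups directly; the commands below are the usual PG Backups invocations (app name is a placeholder, and the exact flags may vary with your CLI version):

heroku pg:backups:capture --app your-app
heroku pg:backups:schedule DATABASE_URL --at '02:00 UTC' --app your-app
heroku pg:backups:download --app your-app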


Hey Chris, Backup gem link is broken, new link should go here: https://github.com/backup/b...


[fog][WARNING] fog: followed redirect to my-bucket.s3-eu-west-1.amaz..., connecting to the matching region will be more performant
[info] CloudIO::Error: Retry #1 of 10
[info] Operation: PUT 'path/production_backup.tar'
[info] --- Wrapped Exception ---
[info] Excon::Errors::BadRequest: Expected(200) <=> Actual(400 Bad Request)
[info] excon.error.response
[info] :body=> "\n<error>IncompleteBody<message>The request body terminated unexpectedly<

Any Idea?


Hi Chris, my application's database is 84 GB. If the backup runs every hour, won't it hamper the application? What would be the ideal way to handle this?

Lots of options. Depending on how long it takes, you might want to do this every 4 hours or so instead. You might also want to run a live replica database on another server for realtime backups and then archive a copy of it nightly.

You should use database replicas and make partial and transaction-log backups. I did that on MS SQL Server, but I don't know how that would be achieved on Postgres or MySQL.


Hey Chris, this is interesting. But I need the restore functionality. Do you think there's a way around that?

You've just got to download the backup, extract it, and reimport your database and do the same for whatever else you may have backed up.


Hey Chris! Are you still using this method in production? I haven't been able to get the backup gem to work with ruby 2.4.x, have you gotten it to work with rails 5.1 and a recent ruby?

gem install backup -v5.0.0.beta.1 will work with Ruby 2.4.x


Hey Chris, how do you do an incremental backup? Are there good tools for that, or a procedure using Postgres features like WAL archiving?
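For reference, incremental backup in Postgres is usually done with WAL archiving on top of a periodic base backup taken with pg_basebackup. The postgresql.conf side of it looks roughly like this (the archive destination is just an example, and the exact wal_level value depends on your Postgres version):

wal_level = replica
archive_mode = on
archive_command = 'cp %p /mnt/wal_archive/%f'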


