
I'm trying to implement a database backup cron job (other solutions welcome) at work, but I have a small problem:

I have a large database, over 10 GB in size, and the current VM doesn't have enough space to store the temporary file that mysqldump creates.

I know I can use mysqldump with a host parameter, but my question is: when doing that, does the temporary file generated by mysqldump stay on the machine that is running it, or does it stay on the database server?
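For clarity, this is roughly the invocation I have in mind, run from the backup VM (hostname, credentials and database name are placeholders):

# run on the backup VM; -h points at the remote database server
mysqldump -h db.example.com -u backup_user -p my_database > /backups/my_database.sql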

UPDATE: I forgot to mention that I'm trying to back up a network of websites, and that some of them are behind a firewall (needing VPN access) while others need server hopping to reach the database server.

  • I'm 99% sure that the file is on the server that the mysqldump process is running on, not the server that the MySQL server is on. This is exactly the process I use to dump large databases when the server in question doesn't have enough disk space. Commented Feb 10, 2016 at 12:20
  • @Eborbob do you run mysqldump on the archive machine? Commented Feb 10, 2016 at 12:23
  • Yes, something like mysqldump -h livedb.example.com -u user -ppassword database_name > dump.sql Commented Feb 10, 2016 at 12:27
  • Hmm, that would mean that you have direct access to MySQL's port, which isn't the case in our current setup, so I would probably need to go the SSH route mentioned below. Commented Feb 10, 2016 at 12:32
  • The easiest way, if this is a permanent setup and not just a one-off backup, is probably to set up a VPN between the two servers. Commented Feb 10, 2016 at 12:36

2 Answers


You can run a shell script from an archive host where you've exchanged passwordless SSH keys with the database server. This lets you transfer the dump directly over SSH, without creating any temp files on the remote database server:

ssh -C myhost.com mysqldump -u my_user --password=bigsecret \
  --skip-lock-tables --opt database_name > local_backup_file.sql

Obviously there are ways to secure that password on the command line, but this is a method that could accomplish what you want. One advantage of this method is that it doesn't require the archive host to have access to port 3306 on the remote host.
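One such way, as a sketch (file paths and credentials are placeholders), is a MySQL option file on the database host, so the password never appears on the command line or in the process list; note that --defaults-extra-file has to be the first option:

# one-time setup on the database host: restricted option file with the credentials
cat > /home/my_user/.backup.cnf <<'EOF'
[mysqldump]
user=my_user
password=bigsecret
EOF
chmod 600 /home/my_user/.backup.cnf

# from the archive host; --defaults-extra-file must come before other options
ssh -C myhost.com mysqldump --defaults-extra-file=/home/my_user/.backup.cnf \
  --skip-lock-tables --opt database_name > local_backup_file.sql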

This guy's version is cool because it also compresses the data on the fly before transferring it over the network, and then decompresses it before loading it into a local database.

ssh me@remoteserver 'mysqldump -u user -psecret production_database | \
  gzip -9' | gzip -d | mysql local_database

But that's why my version uses ssh -C, which enables SSH's own compression and avoids the extra gzip pipes.
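Since the question mentions cron, here is a minimal wrapper sketch built on the same ssh -C approach (hostnames, paths and the option file are placeholders from the examples above); the stream is compressed in transit by ssh -C, and gzipped on the archive host only for storage:

#!/bin/sh
# nightly-dump.sh - run from cron on the archive host (placeholder names throughout)
set -e

BACKUP_DIR=/var/backups/mysql
STAMP=$(date +%Y%m%d)
mkdir -p "$BACKUP_DIR"

ssh -C myhost.com mysqldump --defaults-extra-file=/home/my_user/.backup.cnf \
  --skip-lock-tables --opt database_name \
  | gzip > "$BACKUP_DIR/database_name-$STAMP.sql.gz"

# keep only the 14 most recent dumps
ls -1t "$BACKUP_DIR"/database_name-*.sql.gz | tail -n +15 | xargs -r rm -f

A crontab entry such as 0 3 * * * /usr/local/bin/nightly-dump.sh would then run it nightly.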


1 Comment

In my case I would need to do a full database server backup (-A on mysqldump, iirc). Executing mysqldump on the remote machine and then piping the results back (albeit compressed) would mean that the temp file is on the remote host, no? Couldn't I use -h with mysqldump on the archive server instead?
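For what it's worth, a sketch of the full-server variant (same placeholder host and option file as above): mysqldump writes to its stdout here, so nothing is staged on the remote host; the redirect is evaluated on the archive machine and the dump lands there. The -h form behaves the same way, provided the archive host can reach port 3306.

# run on the archive host; --all-databases (-A) dumps the whole server,
# and the output file is created locally, not on the database server
ssh -C myhost.com mysqldump --defaults-extra-file=/home/my_user/.backup.cnf \
  --all-databases > full_server_backup.sql

# equivalent -h form, if port 3306 on the database server is reachable directly
# (here the option file lives on the archive host itself):
# mysqldump --defaults-extra-file=/home/my_user/.backup.cnf -h livedb.example.com \
#   --all-databases > full_server_backup.sql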

Depending on the circumstances, it might be a better idea to use MySQL replication. Set up MySQL on your backup server and configure it as a slave of your production database (see http://dev.mysql.com/doc/refman/5.7/en/replication-howto.html). You can then dump the slave database easily.
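A rough sketch of the wiring described in the linked how-to (server IDs, hostnames, credentials and binlog coordinates are placeholders; the real coordinates come from SHOW MASTER STATUS on the production server):

# on the production (master) server, my.cnf needs binary logging and a unique id:
#   [mysqld]
#   server-id = 1
#   log-bin   = mysql-bin
# the backup (slave) server also needs its own distinct server-id in my.cnf

# create a replication user on the master (placeholder credentials)
mysql -u root -p -e "CREATE USER 'repl'@'backup.example.com' IDENTIFIED BY 'replpass';
                     GRANT REPLICATION SLAVE ON *.* TO 'repl'@'backup.example.com';"

# on the backup server, point it at the master and start replicating
mysql -u root -p -e "CHANGE MASTER TO MASTER_HOST='livedb.example.com',
                     MASTER_USER='repl', MASTER_PASSWORD='replpass',
                     MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=4;
                     START SLAVE;"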

An advantage of this approach is that you're not transferring 10GB each time you want to back up; you're only transferring the changes to the database as and when they occur.

You'll need to keep an eye on the replication though, because if it fails your slave database will become stale.
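A minimal check you could run from cron on the backup server to catch that (the lag threshold is an arbitrary placeholder; it assumes credentials come from ~/.my.cnf):

# prints a warning and exits non-zero if the slave threads are stopped
# or replication is more than an hour behind
mysql -e "SHOW SLAVE STATUS\G" | awk '
  /Slave_IO_Running:|Slave_SQL_Running:/ && $2 != "Yes" {bad=1}
  /Seconds_Behind_Master:/ && ($2 == "NULL" || $2+0 > 3600) {bad=1}
  END { if (bad) { print "WARNING: backup slave is broken or lagging"; exit 1 } }'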

2 Comments

The one thing that makes me unsure about this approach is that I have a network of sites on different setups (some behind a firewall, some that need SSH tunnelling to get there) and I don't know if a single replication server would be enough. (adding this info to the main question)
Rather than SSH tunnelling I find it easier to set up a VPN. That way you can make the live and archive machines appear on the same network and the mysqldump command will just work without any messing around.
