Good, universal way to export/import all database tables for backups?

I need a way to back up and restore the entire contents of an app, both for routine backups and restores and to make it easier to upgrade or migrate to a new server.

This project started as a ‘small’ project for ‘a while’, which, obviously, got a lot bigger and will probably stick around for longer than it was meant to.

I did find a few backup modules, but they seem to focus exclusively on MySQL, and actually generate MySQL-specific queries, while I need this to work with other DBMSs as well.

Most also seem to assume that the database is the only thing that needs to be backed up, but this application has some other files that also need to be included in the backup.

So I was thinking of using ActiveRecord and/or ActiveQuery to build the queries I need, and just saving the table data in a more generic format (like JSON or XML) within the ZIP file.

Saving data for tables that have a model is easy, as then I can just do this:




$zipFile = new \ZipArchive();

if ($zipFile->open('path/to/backup.zip', \ZipArchive::CREATE) === true) {
    $zipFile->addFromString('data/model_name.json', \yii\helpers\Json::encode(ModelName::find()->all()));
    $zipFile->close();
}



And then repeat that, of course, for each model the application uses.
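For the extra files I mentioned, ZipArchive can pack a whole directory tree into the same archive. A minimal, untested sketch (the helper name and the `files/` prefix are just my own illustration):

```php
// Add every regular file under $dir to the zip, preserving relative
// paths under a "files/" prefix inside the archive.
// $dir is assumed to have no trailing slash.
function addDirectoryToZip(\ZipArchive $zip, string $dir, string $prefix = 'files/'): void
{
    $iterator = new \RecursiveIteratorIterator(
        new \RecursiveDirectoryIterator($dir, \FilesystemIterator::SKIP_DOTS)
    );
    foreach ($iterator as $file) {
        if ($file->isFile()) {
            $relative = substr($file->getPathname(), strlen($dir) + 1);
            $zip->addFile($file->getPathname(), $prefix . $relative);
        }
    }
}
```

Called once per directory that needs backing up, after opening the same `$zipFile` as above.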

But that won’t work for the migration and RBAC tables, or the extra link table I have, so how could I export those in the same format as the model data?
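One idea I had for those tables is to bypass the models and read them through PDO directly, which also keeps the dump DBMS-agnostic (untested sketch; the function name is my own, and in Yii the underlying PDO handle is available on the DB connection):

```php
// Dump any table to JSON without needing a model class.
function exportTableAsJson(\PDO $pdo, string $table): string
{
    // NOTE: $table is interpolated into the SQL, so only pass
    // trusted, hard-coded table names here.
    $stmt = $pdo->query('SELECT * FROM ' . $table);
    $rows = $stmt->fetchAll(\PDO::FETCH_ASSOC);
    return json_encode($rows, JSON_PRETTY_PRINT);
}
```

Then something like `$zipFile->addFromString('data/migration.json', exportTableAsJson($pdo, 'migration'));` for each model-less table.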

Another issue is that I haven’t yet found the best way to import it again; for models I would assume I can just do something like this:




foreach (Json::decode($zipFile->getFromName('data/model_name.json')) as $record) {
    $recordModel = new ModelName();
    // load() expects form-wrapped data, so pass '' as the form name;
    // alternatively: $recordModel->setAttributes($record);
    $recordModel->load($record, '');
    $recordModel->save(false);
}



I haven’t tested this yet, as I don’t have a complete backup to work with, but again this only covers models, not the extra tables that don’t have one.
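For the tables without a model, I imagine the reverse of the PDO export could be done with plain prepared inserts (again an untested sketch with a made-up function name; in Yii, `createCommand()->batchInsert()` would be another option):

```php
// Re-import a JSON table dump (an array of assoc rows) into $table.
// Returns the number of rows inserted.
function importTableFromJson(\PDO $pdo, string $table, string $json): int
{
    $rows = json_decode($json, true);
    if (empty($rows)) {
        return 0;
    }
    $columns = array_keys($rows[0]);
    $placeholders = implode(',', array_fill(0, count($columns), '?'));
    // NOTE: table and column names are interpolated -- trusted input only.
    $sql = 'INSERT INTO ' . $table
         . ' (' . implode(',', $columns) . ') VALUES (' . $placeholders . ')';
    $stmt = $pdo->prepare($sql);
    foreach ($rows as $row) {
        $stmt->execute(array_values($row));
    }
    return count($rows);
}
```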

Well…

This is a complex topic… one with no easy answer.

The issues are:

It needs to be as fast as possible or your clients/users are going to scream at you…

You really cannot use PHP for this. Or if you do, it will be a semi-automated manual process.

PHP script timeouts will keep you from doing a lot of work in one go. Plus there is the issue of the other files.

This is really a problem for system admin, not application development.

You are going to have to bite the bullet and write a DB specific backup and a file system backup.

MySQL can create a fairly quick export of its table definitions and data, so you will have to figure out how to do the same for the DB you are using. You did not specify what that is, so no one can give you any help with it.

Then, once you have your database backup, you could basically zip the entire app, copy it to the new server, unzip it, restore the database, and fix up any details that have changed between servers.

MY EXAMPLE


I use mariadb (mysql) for my database.

I use backblaze for my offsite storage

I use duplicati to perform the backups

Google them to find out more.

Basically, I use mysqldump to create a file that contains all the info from the database.

This file is stored in my application tree, so it gets backed up along with all the files that my app creates while running.

Then I use duplicati to back it all up to Backblaze.

:( I guess I should test the whole restore process one day… ;)

Probably the main thing you would need to figure out is the DB-specific export and import.

-John




# Entirely unsupported example code. It may cause your server to melt. For example only.... YMMV....


#!/bin/sh


cd /home/nginx/domains/example.com/backup


echo Backing up example.com


date


#build sql filename and backup database

# build sql filename and back up database (quote the variable in case
# the path ever contains spaces)
sql_file="$(date +%y)_$(date +%m)_$(date +%d)example.sql"

rm -f "$sql_file"

mysqldump --user=root --password=password example > "$sql_file"


# copy all files to backblaze B2 using duplicati

duplicati-cli backup \

	"b2://MyAccount/full/?auth-username=username&auth-password=password" \

	/home/nginx/domains/example.com/ \

	--backup-name=example_backup --dbpath=/root/.config/Duplicati/CNKYJHFH.sqlite \

	--encryption-module=aes --compression-module=zip --dblock-size=5MB \

	--passphrase='jkasDF$#%' --retention-policy="1W:1D,4W:1W,12M:1M" \

	--disable-module=console-password-input --backup-test-samples=0 \

	--log-level=Profiling --log-file=/var/log/duplicati.log \

	--thread-priority=high


echo Backing up example.com complete

date



I just agree with John.

Write several shell scripts to back up your MySQL db and the app content files, and run them periodically using cron. … This is a must-do job for the administrator of every web app.

As for your idea of a “DBMS-independent db backup”, I don’t think it’s easy to implement, and it won’t pay off the cost. And, you ain’t gonna need it. :) Even though you would then have to write a one-off tool to convert your old MySQL tables to the new DBMS’s format when you actually change the underlying DBMS, that will be much easier and cheaper.