How to HOT Backup Database (MongoDB, MySQL, ES …) to AWS S3

There are many ways to back up a database: rsync, mongodump for MongoDB, the S3 backup plugin for Elasticsearch, and so on. This post shows the approach I used in my project. It may not be perfect for every case, but in mine it works very well.


I am running a project with a microservice architecture. All databases and services run in Docker containers. My plan is to back up all the databases every night.

At the beginning, I tried to use the tar command to compress the data, and then aws s3 cp to copy the backup to S3. It seemed to work, but running tar directly on the data directory made MongoDB stop responding. After some googling, I found that the solution is the rsync command.

The backup process is implemented in three steps:

  • Use the rsync command to copy the data to another location
  • Compress the copy with the tar command
  • Move the compressed data to AWS S3

The script should be:

Thanks for reading.

Install MongoDB and Mongo PHP in Ubuntu

I am using Ubuntu Server 14.04 as my working environment. All the steps below are the ones I carried out successfully on my machine. Now I will show you how to install MongoDB and the Mongo PHP driver.


Firstly, install MongoDB. As you can see, it takes one and only one command.
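On Ubuntu 14.04 that single command installs the mongodb package from the standard repositories (server, shell, and tools):

```shell
# Installs mongod plus the mongo shell from the Ubuntu 14.04 repos
sudo apt-get install -y mongodb
```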

To make sure that MongoDB is running, you can use the command below.
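On 14.04 the package registers a system service, so one way to check (assuming the default service name) is:

```shell
# Check the service status
sudo service mongodb status
# Or connect with the shell and ask the server for its version
mongo --eval 'db.version()'
```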

To install the Mongo PHP driver, use the command below.
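With PECL available, the legacy mongo extension (the driver this post targets, which provides the MongoClient class) is installed with:

```shell
# Builds and installs the legacy "mongo" PHP extension via PECL
sudo pecl install mongo
```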

The command above uses PECL, so if your machine does not have it installed yet, you can install it easily with the command below.
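PECL ships with the php-pear package, and compiling the extension needs the PHP development headers (package names assume PHP 5 on Ubuntu 14.04):

```shell
# php-pear provides pecl; php5-dev provides the headers needed to build extensions
sudo apt-get install -y php-pear php5-dev
```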

And now you have to enable the Mongo PHP extension, either by adding “extension=mongo.so” to your php.ini file or by using the commands below.
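Instead of editing php.ini by hand, you can use Ubuntu's mods-available mechanism (paths and commands assume the PHP 5 layout on 14.04, with Apache serving PHP):

```shell
# Register the extension and enable it for all PHP SAPIs
echo "extension=mongo.so" | sudo tee /etc/php5/mods-available/mongo.ini
sudo php5enmod mongo
# Restart the web server so mod_php picks up the change
sudo service apache2 restart
```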

Almost done; you just need to check whether everything works.
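A quick check is to ask PHP which modules it has loaded; if the driver is installed correctly, "mongo" should appear in the output:

```shell
# Lists loaded PHP modules and filters for the mongo extension
php -m | grep -i mongo
```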

Then write a PHP script that creates a database and a collection and inserts a document into MongoDB.
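A minimal test script using the legacy MongoClient API might look like the one below — the database and collection names are made up for illustration, and it assumes mongod is listening on localhost:27017:

```shell
# Write a small PHP test script via a heredoc, then run it with the CLI
cat > /tmp/mongo_test.php <<'PHP'
<?php
// Connect to the local mongod on the default port
$m = new MongoClient();
// Selecting a database and creating a collection happen lazily
$db  = $m->selectDB('testdb');
$col = $db->createCollection('people');
// Insert one document, then read it back
$col->insert(array('name' => 'John', 'age' => 30));
var_dump($col->findOne(array('name' => 'John')));
PHP
php /tmp/mongo_test.php
```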

If it succeeds, the output will be the same as the one below.