Install and get started with Ansible on Ubuntu 14.04

This post will show you how to install Ansible and use it to create an EC2 instance in AWS.

There are several ways to install Ansible. In this post, I choose the one that uses pip.


If people are not laughing at your goals, your goals are too small

Install PIP, Python Boto:

We need python-boto so that Ansible is able to work with AWS.
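On Ubuntu 14.04 this might look like the following (package names assume the stock apt and pip repositories):

```shell
# Install pip and the boto library that Ansible's EC2 modules rely on
sudo apt-get update
sudo apt-get install -y python-pip python-dev
sudo pip install boto
```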

Install Ansible:
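With pip available, installing Ansible is a single command:

```shell
sudo pip install ansible
```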

 

Now, we need to add an AWS Access Key so that Boto can access your AWS account:
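One way to do this is a `~/.boto` file; replace the placeholder values with your own keys:

```ini
[Credentials]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY
```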

In the next step, you need to create a folder for your project:
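For example (the folder name here is just an illustration):

```shell
mkdir ~/ansible-ec2
cd ~/ansible-ec2
```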

Create hosts file with the content below:
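Since the EC2 provisioning task runs from your own machine, a minimal inventory like this is enough (group name is an assumption):

```ini
[local]
localhost
```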

And create a provisioning file named ec2_launch.yml with the content:
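A sketch of such a playbook, using Ansible's ec2 module, might look like this; the key pair, AMI ID, region, and security group are placeholders you must replace:

```yaml
---
- hosts: local
  connection: local
  gather_facts: false
  tasks:
    - name: Launch an EC2 instance
      ec2:
        key_name: my-keypair        # your EC2 key pair name
        instance_type: t2.micro
        image: ami-xxxxxxxx         # replace with an AMI ID valid in your region
        region: us-east-1
        group: default              # security group name
        count: 1
        wait: yes
      register: ec2
```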

Please customize the source code above for your own case, following this guide:

Finally, just run one command to create the EC2 instance:
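Assuming the inventory file is named hosts, the run command is:

```shell
ansible-playbook -i hosts ec2_launch.yml
```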

If you want to run a series of commands on all the instances you created, create a run_test.yml file with the content:
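A minimal sketch, assuming you have added the new instances' IPs to the hosts file under an [ec2] group:

```yaml
---
- hosts: ec2
  remote_user: ubuntu            # default login user for Ubuntu AMIs
  tasks:
    - name: Check uptime
      command: uptime
    - name: Check disk usage
      command: df -h
```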

And run the test with ansible-playbook:
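For example:

```shell
ansible-playbook -i hosts run_test.yml
```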

 

Centralize Docker logs with FluentD, ElasticSearch and Kibana

Besides monitoring, logging is also an important issue we need to address. In this post, I will show the way to centralize Docker logs using FluentD, Elasticsearch and Kibana.


Try not to become a man of success, but rather try to become a man of value

Scenario

We will install FluentD, ElasticSearch and Kibana in the same machine.

  • FluentD: collects and transfers log data to Elasticsearch
  • Elasticsearch: stores and indexes log data to support searching/filtering
  • Kibana: a web UI for searching, filtering, and visualizing the log data

Prerequisites

  • A machine running Ubuntu 14.04 with IP 192.168.1.191
  • Docker and wget already installed

Now, I will show you step by step how to get started centralizing log data with FluentD.

Elasticsearch

Download:
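For example, with the 2.x tarball layout (the version number here is only an example, substitute the latest release):

```shell
wget https://download.elastic.co/elasticsearch/release/org/elasticsearch/distribution/tar/elasticsearch/2.1.1/elasticsearch-2.1.1.tar.gz
```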

You should check the latest version at https://www.elastic.co/downloads/elasticsearch

Uncompress:
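Extract the tarball (adjust the file name to the version you downloaded):

```shell
tar -xzf elasticsearch-2.1.1.tar.gz
```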

Run:
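Start Elasticsearch in the foreground:

```shell
cd elasticsearch-2.1.1
./bin/elasticsearch
```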

Or run as daemon:
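The -d flag detaches the process and runs it in the background:

```shell
./bin/elasticsearch -d
```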

Now, we have Elasticsearch running on port 9200.

FluentD

Add the lines below to the /etc/security/limits.conf file:
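These raise the open-file-descriptor limit, which FluentD's installation guide recommends:

```
root soft nofile 65536
root hard nofile 65536
* soft nofile 65536
* hard nofile 65536
```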

Open a new terminal and run the command below; make sure the output is correct:
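Check the limit from the new session:

```shell
ulimit -n
# should print 65536 if the new limits took effect
```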

Install FluentD (using the Treasure Data distribution, td-agent):
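Treasure Data provides a one-line installer script per Ubuntu release; for 14.04 (Trusty) and td-agent 2 it looks like:

```shell
curl -L https://toolbelt.treasuredata.com/sh/install-ubuntu-trusty-td-agent2.sh | sh
```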

For other Ubuntu version, please read: http://docs.fluentd.org/articles/install-by-deb

Now, we need to install the Elasticsearch plugin for FluentD:
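td-agent ships its own gem command for installing plugins:

```shell
sudo td-agent-gem install fluent-plugin-elasticsearch
```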

Add the content below to /etc/td-agent/td-agent.conf to configure Fluentd to forward all Docker logs to Elasticsearch:
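A minimal configuration along these lines listens for the Docker fluentd log driver on port 24224 and forwards anything tagged docker.* to Elasticsearch (host, port, and tag pattern are assumptions to adapt):

```
<source>
  type forward
  port 24224
</source>

<match docker.**>
  type elasticsearch
  host localhost
  port 9200
  logstash_format true
  flush_interval 5s
</match>
```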

And restart FluentD:
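For example:

```shell
sudo service td-agent restart
```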

Docker

Now, we change the Docker configuration file to use Fluentd as the log driver. Open /etc/default/docker and add the line below:
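A sketch of that line; the tag option is an assumption that makes container logs match a docker.* pattern on the FluentD side:

```
DOCKER_OPTS="--log-driver=fluentd --log-opt fluentd-address=localhost:24224 --log-opt tag=docker.{{.ID}}"
```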

And restart Docker to apply the change:
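For example:

```shell
sudo service docker restart
```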

Kibana

We will run Kibana in a Docker container with the command:
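A sketch using the official kibana image, pointing it at our Elasticsearch instance (image tag and container name are up to you):

```shell
docker run -d --name kibana -p 5601:5601 \
  -e ELASTICSEARCH_URL=http://192.168.1.191:9200 \
  kibana
```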

Now, you can access http://192.168.1.191:5601 to see Docker Logs in Kibana.

Tips

Delete an index in Elasticsearch:
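For example (the index name here is a placeholder, substitute the one you want to remove):

```shell
curl -XDELETE 'http://localhost:9200/logstash-2016.01.01'
```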

List all indices in Elasticsearch:
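Using the _cat API:

```shell
curl 'http://localhost:9200/_cat/indices?v'
```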

Monitor Nginx with CollectD, InfluxDB and Grafana

Monitoring is the best way to know whether your system is working well. If your system is complicated, you will have many things that need to be monitored. In this post, I will show you a simple way to monitor Nginx with CollectD, InfluxDB and Grafana.

Before reading this post, make sure you have taken a look at Monitor server with CollectD, InfluxDB and Grafana to get started with CollectD, InfluxDB and Grafana.

Nginx

To monitor Nginx, it has to be built with http_stub_status_module enabled. First, check whether your Nginx includes http_stub_status_module with the command:
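nginx -V prints the compile-time options, so grepping it shows whether the module is present:

```shell
nginx -V 2>&1 | grep -o with-http_stub_status_module
```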

By default, if you are using Ubuntu 14.04 and installed Nginx with apt-get, you do not need to worry about the step above.

And now, add the configuration below to your Nginx config:
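A typical stub_status location block, placed inside your server block (restricting access to localhost is a common precaution):

```
location /nginx_status {
    stub_status on;
    access_log off;
    allow 127.0.0.1;
    deny all;
}
```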

And restart Nginx:
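For example:

```shell
sudo service nginx restart
```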

Now, you can get the Nginx status via the URL http://127.0.0.1/nginx_status:
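For example, with curl:

```shell
curl http://127.0.0.1/nginx_status
```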

The output should be:
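On a mostly idle server, it looks roughly like this (your numbers will differ):

```
Active connections: 1
server accepts handled requests
 10 10 10
Reading: 0 Writing: 1 Waiting: 0
```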

CollectD

In the next step, we will enable the Nginx plugin of CollectD in /opt/collectd/etc/collectd.conf:
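The relevant lines point the plugin at the stub_status URL we just set up:

```
LoadPlugin nginx

<Plugin nginx>
  URL "http://127.0.0.1/nginx_status"
</Plugin>
```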

Restart your CollectD:
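If CollectD was installed with an init script, a restart looks like this (adjust the command if you built CollectD from source into /opt/collectd):

```shell
sudo service collectd restart
```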

InfluxDB

After restarting CollectD, wait a minute, then check InfluxDB to make sure that the Nginx monitoring data is being stored.
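One way to check is InfluxDB's HTTP query API; the database name "collectd" is an assumption based on a typical CollectD input configuration:

```shell
curl -G 'http://localhost:8086/query' \
  --data-urlencode "db=collectd" \
  --data-urlencode "q=SHOW MEASUREMENTS"
```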

Grafana

In the last step, we create a Graph panel in Grafana to monitor requests per second. Switch the editor mode and enter the query below:
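The exact measurement and tag names depend on how your CollectD data lands in InfluxDB; a query along these lines turns the cumulative request counter into a per-second rate (treat nginx_value and nginx_requests as assumptions to verify against your own data):

```sql
SELECT derivative(mean("value"), 1s) FROM "nginx_value"
WHERE "type" = 'nginx_requests' AND $timeFilter
GROUP BY time($interval)
```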

If you do not know how to “Switch editor mode”, you should see the image below:


Grafana – Switch editor mode

Finally, you will get the graph below:


Nginx – Request Per Second

You can also use the data stored in InfluxDB to create other outputs in your own way.

 

Backup Postgres 9.4 to S3 with WAL-E in Ubuntu 14.04

If you are using a Postgres 9.4 database for your project, you are probably thinking about backing it up every day. So in this post I will show you how to back up Postgres 9.4 to S3.


A sign of a good leader is not how many followers you have, but how many leaders you create.

Install Dependencies:
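WAL-E's typical dependencies on Ubuntu 14.04 look like this; lzop and pv handle compression and throttling, and daemontools provides the envdir utility used later:

```shell
sudo apt-get install -y python-pip python-dev lzop pv daemontools
```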

Using PIP to install WAL-E:
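For example:

```shell
sudo pip install wal-e
```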

Using PIP to upgrade requests:
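For example:

```shell
sudo pip install --upgrade requests
```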

Using PIP to upgrade six:
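For example:

```shell
sudo pip install --upgrade six
```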

If you do not upgrade them, you may hit an error like the one below when you run a WAL-E backup:

And we should change permissions on the pip packages so that the postgres user is able to use them:
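Something like the following; the path assumes pip placed the packages under the system dist-packages directory:

```shell
sudo chmod -R o+rX /usr/local/lib/python2.7/dist-packages
```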

Edit postgresql.conf to perform backups with the wal-push command:
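The standard WAL-E archiving settings look like this; the wal-e binary path and envdir location are assumptions matching the setup below:

```
wal_level = archive
archive_mode = on
archive_command = 'envdir /etc/wal-e.d/env /usr/local/bin/wal-e wal-push %p'
archive_timeout = 60
```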

Now, we restart postgres to apply the changes:
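For example:

```shell
sudo service postgresql restart
```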

Backup Everyday

Assuming you have created a bucket on S3 and have your AWS credentials, store them in configuration files with the commands below:
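WAL-E conventionally reads its settings from an envdir; replace the placeholder keys and bucket name with your own:

```shell
# One file per environment variable, read by envdir at run time
sudo mkdir -p /etc/wal-e.d/env
echo 'YOUR_ACCESS_KEY' | sudo tee /etc/wal-e.d/env/AWS_ACCESS_KEY_ID
echo 'YOUR_SECRET_KEY' | sudo tee /etc/wal-e.d/env/AWS_SECRET_ACCESS_KEY
echo 's3://your-bucket/postgres' | sudo tee /etc/wal-e.d/env/WALE_S3_PREFIX
sudo chown -R root:postgres /etc/wal-e.d
```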

Now, we will run the first backup to S3. First, switch to the postgres user:
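For example:

```shell
sudo su - postgres
```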

Run backup command:
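The data directory path below assumes a default Postgres 9.4 install on Ubuntu:

```shell
envdir /etc/wal-e.d/env /usr/local/bin/wal-e backup-push /var/lib/postgresql/9.4/main
```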

The output should be:

Finally, we add the command to crontab to run the backup at 5 AM every day:
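An entry along these lines in the postgres user's crontab (crontab -e) would do it; paths match the earlier steps:

```
0 5 * * * envdir /etc/wal-e.d/env /usr/local/bin/wal-e backup-push /var/lib/postgresql/9.4/main
```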

And your work here is done!