#elasticsearch

Logstash Elasticsearch data retention

Using Elasticsearch to centralize your log data is a great solution. However, after a few months you will have a huge amount of log data stored on your server's hard disk, so you have to clean out the old log data that you are sure you will not need in the future.

To delete the log data from 10 days ago, we can use the script below:
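A minimal sketch, assuming Elasticsearch listens on localhost:9200 and Logstash writes the default logstash-YYYY.MM.DD daily indices:

    #!/bin/bash
    # delete-old-logstash-index.sh
    # Delete the Logstash index from 10 days ago (GNU date syntax).
    INDEX="logstash-$(date --date='10 days ago' +%Y.%m.%d)"
    curl -XDELETE "http://localhost:9200/${INDEX}"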

We should run the script above every night to delete old data, so we schedule it with a cron job:
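For example, to run it at 1:00 AM every day (the script path here is just an example), add this line with crontab -e:

    0 1 * * * /opt/scripts/delete-old-logstash-index.sh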

Thanks for reading.


How to HOT Backup Database (MongoDB, MySQL, ES …) to AWS S3

There are many ways to back up your database: you can use rsync, mongodump for MongoDB, or the S3 backup plugin for Elasticsearch. However, this post will show you the approach I used in my project. It may not be perfect for every case, but in my case it worked perfectly.


I am running a project with a microservice architecture, where all databases and services run in Docker containers. My plan is to back up all the databases every night.

At the beginning, I tried to use the tar command to compress the data directly and then aws s3 cp to copy the backup to S3. It seemed to work, but running tar on the live data files made MongoDB stop responding. I googled for a solution and found one: copy the data with the rsync command first.

The backup process is implemented in three steps:

  • Use the rsync command to copy the data to another location
  • Compress the copy with the tar command
  • Move the compressed data to AWS S3

The script should be:
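A sketch, assuming a MongoDB data directory at /data/mongodb and an S3 bucket named my-backup-bucket (both are examples; the same pattern applies to MySQL or Elasticsearch data directories):

    #!/bin/bash
    DATE=$(date +%Y%m%d)
    # 1. Copy the live data directory to a staging location
    rsync -a /data/mongodb/ /backup/mongodb/
    # 2. Compress the copy
    tar -czf "/backup/mongodb-${DATE}.tar.gz" -C /backup mongodb
    # 3. Move the compressed data to AWS S3
    aws s3 cp "/backup/mongodb-${DATE}.tar.gz" "s3://my-backup-bucket/mongodb-${DATE}.tar.gz"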

Thanks for reading.

Monitor Nginx Response Time with FluentD, Kibana and Elasticsearch

In my previous post, I showed you how to centralize Nginx logs. Now I will use FluentD, Kibana and Elasticsearch to collect the Nginx response time.

To implement this, we have to change the Nginx log format, because by default Nginx does not write the response time to the access.log file. So we change nginx.conf as below:
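One way to do it (the format name timed_combined is my own choice; $request_time and $upstream_response_time are standard Nginx variables), inside the http block:

    log_format timed_combined '$remote_addr - $remote_user [$time_local] '
                              '"$request" $status $body_bytes_sent '
                              '"$http_referer" "$http_user_agent" '
                              '$request_time $upstream_response_time';

    access_log /var/log/nginx/access.log timed_combined;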

After reloading Nginx, you can tail access.log. The result should look like this:
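An illustrative line (the values are made up); note the two new timing fields at the end:

    192.168.1.10 - - [26/Feb/2016:18:00:01 +0700] "GET /index.html HTTP/1.1" 200 1234 "-" "curl/7.35.0" 0.052 0.048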

Now we create a regex that matches the log format above:
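A sketch of such a regex, using named captures (the field names are my own choice):

    ^(?<remote>[^ ]*) - (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+) (?<path>[^"]*) \S+" (?<code>[^ ]*) (?<size>[^ ]*) "(?<referer>[^"]*)" "(?<agent>[^"]*)" (?<request_time>[^ ]*) (?<upstream_response_time>[^ ]*)$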

And insert it into td-agent.conf as below:
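A sketch of the source section (the tag and pos_file path are examples; a match section like the one in the previous post forwards the nginx.access tag to Elasticsearch):

    <source>
      type tail
      path /var/log/nginx/access.log
      pos_file /var/log/td-agent/nginx-access.pos
      tag nginx.access
      format /^(?<remote>[^ ]*) - (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+) (?<path>[^"]*) \S+" (?<code>[^ ]*) (?<size>[^ ]*) "(?<referer>[^"]*)" "(?<agent>[^"]*)" (?<request_time>[^ ]*) (?<upstream_response_time>[^ ]*)$/
      time_format %d/%b/%Y:%H:%M:%S %z
    </source>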

After restarting td-agent, wait a minute and then check Kibana. The new request_time and upstream_response_time fields should appear on the Nginx log records.


For basic installation, please refer to https://sonnguyen.ws/centralize-docker-logs-with-fluentd-elasticsearch-and-kibana/

Centralize Nginx Logs with FluentD, Kibana and Elasticsearch

As you know, FluentD is a great tool for collecting logs, Elasticsearch stores and indexes the log data for searching, and Kibana lets you view and search the logs in a web interface.

Nginx logs are a great help when you monitor, debug, and troubleshoot your application. So in this post, I will show you how to centralize Nginx logs with FluentD, Kibana and Elasticsearch.

Before reading this post, please read Centralize Docker Logs with FluentD, Kibana and Elasticsearch to learn how to install FluentD, Kibana and Elasticsearch on Ubuntu 14.04.

In the next step, we add the content below to the /etc/td-agent/td-agent.conf file:
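A sketch, assuming the default Ubuntu Nginx log path and Elasticsearch on localhost (nginx is a parser format built into FluentD; the tag and pos_file path are my own choices):

    <source>
      type tail
      format nginx
      path /var/log/nginx/access.log
      pos_file /var/log/td-agent/nginx-access.pos
      tag nginx.access
    </source>

    <match nginx.**>
      type elasticsearch
      host localhost
      port 9200
      logstash_format true
      flush_interval 10s
    </match>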

And restart FluentD:
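    sudo service td-agent restart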

For debugging, the td-agent log is the first place to look:
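    tail -f /var/log/td-agent/td-agent.log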

If you see a permission-denied error for /var/log/nginx/access.log there:

we just need to add the td-agent user to the adm group, which is allowed to read the Nginx logs:
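    # allow td-agent to read files in /var/log/nginx, then restart
    sudo usermod -aG adm td-agent
    sudo service td-agent restart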

Finally, access Kibana to view the logs.


Centralize Docker logs with FluentD, ElasticSearch and Kibana

Besides monitoring, logging is also an important issue we need to take care of. In this post, I describe how to centralize Docker logs using FluentD, Elasticsearch and Kibana.


Scenario

We will install FluentD, Elasticsearch and Kibana on the same machine.

  • FluentD: collects and transfers log data to Elasticsearch
  • Elasticsearch: stores and indexes the log data to support searching and filtering
  • Kibana: a web UI that lets you search, filter, and visualize the log data

Prerequisites

  • A machine running Ubuntu 14.04 with IP 192.168.1.191
  • Docker and wget already installed

Now I will show you, step by step, how to get started centralizing the log data with FluentD.

Elasticsearch

Download:

You should check the latest version at https://www.elastic.co/downloads/elasticsearch
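For example, for the 1.7 series that was current at the time (substitute the exact version and URL from that page):

    wget https://download.elastic.co/elasticsearch/elasticsearch/elasticsearch-1.7.2.tar.gz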

Uncompress:
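    # same version as downloaded above
    tar -xzf elasticsearch-1.7.2.tar.gz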

Run:
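    cd elasticsearch-1.7.2
    ./bin/elasticsearch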

Or run as daemon:
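    # -d detaches Elasticsearch from the terminal
    ./bin/elasticsearch -d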

Now we have Elasticsearch running on port 9200.

FluentD

Add the lines below to the /etc/security/limits.conf file to raise the limit on open file descriptors, then log in again so the change takes effect:
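These are the values suggested by the FluentD pre-installation guide:

    root soft nofile 65536
    root hard nofile 65536
    * soft nofile 65536
    * hard nofile 65536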

Open a new terminal and type the command below; make sure the output shows the new limit:
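The output should be 65536:

    ulimit -n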

Install FluentD (via Treasure Data's td-agent package):
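The FluentD documentation provides a one-line installer for each Ubuntu release; the Trusty (14.04) one looks like this (check the link below for the current script):

    curl -L https://toolbelt.treasuredata.com/sh/install-ubuntu-trusty-td-agent2.sh | sh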

For other Ubuntu versions, please read: http://docs.fluentd.org/articles/install-by-deb

Now we need to install the Elasticsearch plugin for FluentD:
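    sudo td-agent-gem install fluent-plugin-elasticsearch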

Add the content below to /etc/td-agent/td-agent.conf to set up FluentD to transfer all Docker logs to Elasticsearch:
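A sketch: the Docker fluentd log driver ships container logs to FluentD's forward input on port 24224, tagged with the container ID under docker.* by default, and the match section sends them on to Elasticsearch:

    <source>
      type forward
      port 24224
      bind 0.0.0.0
    </source>

    <match docker.**>
      type elasticsearch
      host localhost
      port 9200
      logstash_format true
      flush_interval 10s
    </match>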

And restart FluentD:
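    sudo service td-agent restart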

Docker

Now we change the Docker configuration file to use FluentD as the log driver. Open /etc/default/docker and add the line below:
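The fluentd log driver requires Docker 1.8 or newer; the address points at the forward port configured above:

    DOCKER_OPTS="--log-driver=fluentd --log-opt fluentd-address=localhost:24224"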

And restart Docker to apply the change:
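    sudo service docker restart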

Kibana

We will run Kibana in a Docker container with the command:
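A sketch using the official kibana image; ELASTICSEARCH_URL points the container at our Elasticsearch instance (adjust the IP for your machine):

    docker run -d --name kibana -p 5601:5601 \
      -e ELASTICSEARCH_URL=http://192.168.1.191:9200 \
      kibana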

Now you can access http://192.168.1.191:5601 to see the Docker logs in Kibana.

Tips

Delete an index in Elasticsearch:
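The index name here is an example:

    curl -XDELETE 'http://localhost:9200/logstash-2016.02.15'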

List all indexes in Elasticsearch:
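    curl 'http://localhost:9200/_cat/indices?v'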