#fluentd

Monitor Nginx Response Time with FluentD, Kibana and Elasticsearch

In my previous post, I showed you how to centralize Nginx logs. Now, I will use FluentD, Kibana and Elasticsearch to collect the Nginx response time.

To implement this, we have to change the Nginx log format, because by default Nginx does not write the response time to the access.log file. So we change nginx.conf as below:
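The original snippet is not reproduced here, so this is only a minimal sketch: a custom log format (named timed as an example) that appends $request_time to the usual combined fields, inside the http block of nginx.conf:

    http {
        # Same fields as the default "combined" format, plus $request_time
        # (request processing time in seconds, e.g. 0.005) as the last field.
        log_format timed '$remote_addr - $remote_user [$time_local] '
                         '"$request" $status $body_bytes_sent '
                         '"$http_referer" "$http_user_agent" '
                         '$request_time';

        access_log /var/log/nginx/access.log timed;
    }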

After reloading Nginx, you can tail access.log. The result should look like the example below:
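For example (the log line shown is only an illustration of the expected shape; the response time is the trailing field):

    sudo nginx -s reload
    tail -f /var/log/nginx/access.log
    # Example output line, note the request time in seconds at the end:
    # 192.168.1.191 - - [26/Feb/2016:18:00:01 +0700] "GET / HTTP/1.1" 200 612 "-" "curl/7.35.0" 0.005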

Now, we will create a regex string that matches the log format above:
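The exact expression from the original post is not shown here; a sketch that captures the fields of the format above, including request_time as the last field, could look like this:

    ^(?<remote>[^ ]*) - (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^ ]*) +\S*)?" (?<code>[^ ]*) (?<size>[^ ]*) "(?<referer>[^"]*)" "(?<agent>[^"]*)" (?<request_time>[^ ]*)$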

And insert it into td-agent.conf as below:
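A sketch of the relevant td-agent.conf sections, assuming the Elasticsearch output used in the earlier posts (host, port and tag names are examples):

    <source>
      type tail
      path /var/log/nginx/access.log
      pos_file /var/log/td-agent/nginx-access.pos
      tag nginx.access
      # The regex from the previous step, plus the Nginx timestamp format
      format /^(?<remote>[^ ]*) - (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^ ]*) +\S*)?" (?<code>[^ ]*) (?<size>[^ ]*) "(?<referer>[^"]*)" "(?<agent>[^"]*)" (?<request_time>[^ ]*)$/
      time_format %d/%b/%Y:%H:%M:%S %z
    </source>

    <match nginx.**>
      type elasticsearch
      host localhost
      port 9200
      logstash_format true
      flush_interval 10s
    </match>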

After restarting td-agent, wait a minute and then check Kibana; each Nginx record should now include the request_time field.


For basic installation, please refer to https://sonnguyen.ws/centralize-docker-logs-with-fluentd-elasticsearch-and-kibana/

Centralize Nginx Logs with FluentD, Kibana and Elasticsearch

As you know, FluentD is a great tool for collecting logs, Elasticsearch stores and indexes the log data so it can be searched, and Kibana lets you view and search the logs in a web interface.

Nginx logs are a great help when you monitor, debug, and troubleshoot your application. So in this post, I will show you how to centralize Nginx logs with FluentD, Kibana and Elasticsearch.

Before reading this post, please read Centralize Docker Logs with FluentD, Kibana and Elasticsearch to learn how to install FluentD, Kibana and Elasticsearch on Ubuntu 14.04.

In the next step, we add the content below to the /etc/td-agent/td-agent.conf file:
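The original configuration block is not included here; a minimal sketch, using Fluentd's built-in nginx parser for the default access log format and the Elasticsearch output plugin, would be:

    <source>
      type tail
      path /var/log/nginx/access.log
      pos_file /var/log/td-agent/nginx-access.pos
      tag nginx.access
      format nginx
    </source>

    <source>
      type tail
      path /var/log/nginx/error.log
      pos_file /var/log/td-agent/nginx-error.pos
      tag nginx.error
      format none
    </source>

    <match nginx.**>
      type elasticsearch
      host localhost
      port 9200
      logstash_format true
      flush_interval 10s
    </match>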

And restart FluentD:
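For example, on Ubuntu 14.04:

    sudo service td-agent restart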

For Debugging:
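Watching the td-agent log is usually enough to spot configuration or permission problems:

    tail -f /var/log/td-agent/td-agent.log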

If you get an error saying that td-agent cannot read /var/log/nginx/access.log (a permission problem, since on Ubuntu the Nginx log files are readable only by root and the adm group):

We just need to add the td-agent user to the adm group:
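For example:

    sudo usermod -aG adm td-agent
    sudo service td-agent restart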

Finally, access Kibana to view the logs.


Centralize Docker logs with FluentD, ElasticSearch and Kibana

Besides monitoring, logging is also an important issue we need to care about. In this post, I describe how to centralize Docker logs using FluentD, Elasticsearch and Kibana.


Try not to become a man of success, but rather try to become a man of value

Scenario

We will install FluentD, Elasticsearch and Kibana on the same machine.

  • FluentD: Collects and transfers log data to Elasticsearch
  • Elasticsearch: Stores and indexes the log data to support searching/filtering
  • Kibana: A web UI that lets you search/filter and visualize the log data

Prerequisites

  • We have a machine running Ubuntu 14.04 with IP 192.168.1.191
  • Docker and wget are already installed

Now, I will show you step by step how to get started centralizing the log data with FluentD.

Elasticsearch

Download:
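The exact URL depends on the version; as an example for Elasticsearch 2.2.0 (get the current link from the downloads page referenced below):

    wget https://download.elastic.co/elasticsearch/release/org/elasticsearch/distribution/tar/elasticsearch/2.2.0/elasticsearch-2.2.0.tar.gz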

You should check the latest version at https://www.elastic.co/downloads/elasticsearch

Uncompress:
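For the archive downloaded above:

    tar -xzf elasticsearch-2.2.0.tar.gz
    cd elasticsearch-2.2.0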

Run:
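From the extracted directory:

    ./bin/elasticsearch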

Or run as daemon:
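The -d flag detaches the process and runs it in the background:

    ./bin/elasticsearch -d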

Now we have Elasticsearch running on port 9200.
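A quick check that it is up:

    curl http://localhost:9200/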

FluentD

Add the lines below to the /etc/security/limits.conf file to raise the open-file limit:
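These are the file-descriptor limits recommended in the Fluentd pre-installation guide:

    root soft nofile 65536
    root hard nofile 65536
    * soft nofile 65536
    * hard nofile 65536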

Log out and back in (or open a new terminal) and run the command below; make sure the output shows the new limit (65536):
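    ulimit -n
    # should print 65536 once the new limit is in effect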

Install FluentD (using the Treasure Data td-agent package):
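For Ubuntu 14.04 (Trusty), the installer script from the Fluentd documentation looks like this (see the link below for other distributions):

    curl -L https://toolbelt.treasuredata.com/sh/install-ubuntu-trusty-td-agent2.sh | sh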

For other Ubuntu versions, please read: http://docs.fluentd.org/articles/install-by-deb

Now, we need to install the Elasticsearch plugin for FluentD:
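With td-agent 2, the plugin is installed with the bundled gem command:

    sudo td-agent-gem install fluent-plugin-elasticsearch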

Add the content below to /etc/td-agent/td-agent.conf to set up Fluentd to transfer all Docker logs to Elasticsearch:
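A minimal sketch: accept Docker's log events over the forward protocol on port 24224 and send everything tagged docker.* to Elasticsearch (the docker. tag prefix assumes the tag option set in the Docker section below):

    <source>
      type forward
      port 24224
      bind 0.0.0.0
    </source>

    <match docker.**>
      type elasticsearch
      host localhost
      port 9200
      logstash_format true
      flush_interval 10s
    </match>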

And restart FluentD:
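    sudo service td-agent restart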

Docker

Now, we change the Docker configuration file to use Fluentd as the log driver. Open /etc/default/docker and add the line below:
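Assuming Docker still reads its options from /etc/default/docker (as it did on Ubuntu 14.04), something like the line below; the fluentd-tag option is an example that makes container logs arrive with a docker. prefix so they match the Fluentd config above:

    # Send container logs to the local Fluentd forward input
    DOCKER_OPTS="--log-driver=fluentd --log-opt fluentd-address=localhost:24224 --log-opt fluentd-tag=docker.{{.ID}}"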

And restart Docker to apply the change:
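    sudo service docker restart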

Kibana

We will run Kibana in a Docker container with the command:
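A sketch using the official Kibana image; ELASTICSEARCH_URL points the container at the Elasticsearch instance on the host, and you should pin an image tag that matches your Elasticsearch version:

    sudo docker run -d --name kibana -p 5601:5601 \
      -e ELASTICSEARCH_URL=http://192.168.1.191:9200 \
      kibana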

Now, you can access http://192.168.1.191:5601 to see the Docker logs in Kibana.

Tips

Delete an index in Elasticsearch:
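For example, to delete one day's Logstash-style index (the index name is an example):

    curl -XDELETE 'http://localhost:9200/logstash-2016.02.16'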

List all indexes in Elasticsearch:
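    curl 'http://localhost:9200/_cat/indices?v'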