Centralize Docker logs with FluentD, Elasticsearch and Kibana

Besides monitoring, logging is also an important issue we need to care about. In this post, I describe a way to centralize Docker logs using FluentD, Elasticsearch and Kibana.


Try not to become a man of success, but rather try to become a man of value


We will install FluentD, Elasticsearch and Kibana on the same machine.

  • FluentD: Collects and transfers log data to Elasticsearch
  • Elasticsearch: Stores and indexes the log data to support searching/filtering it
  • Kibana: A web UI that lets you search/filter and visualize the log data


  • We have a machine running Ubuntu 14.04 with a known IP address
  • We have already installed Docker and Wget

Now, I will show you, step by step, how to get started centralizing the log data with FluentD.



First, download and install Elasticsearch. You should check the latest version at https://www.elastic.co/downloads/elasticsearch
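A minimal sketch of the download and install steps, assuming the tarball distribution; the version (1.7.1) is only an example, so substitute whatever you find on the downloads page:

```
# Download and unpack Elasticsearch (1.7.1 is only an example version)
wget https://download.elastic.co/elasticsearch/elasticsearch/elasticsearch-1.7.1.tar.gz
tar -xzf elasticsearch-1.7.1.tar.gz
cd elasticsearch-1.7.1

# Run Elasticsearch in the foreground
./bin/elasticsearch
```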



Or run as daemon:
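For example, with the tarball install sketched above:

```
# -d detaches Elasticsearch and runs it in the background
./bin/elasticsearch -d
```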

Now, we have Elasticsearch running on port 9200.
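A quick way to verify it is listening (assuming it runs on the same machine):

```
curl http://localhost:9200
# Should return a small JSON document with the cluster name and version
```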


Before installing FluentD, increase the maximum number of open file descriptors. Add the lines below to the /etc/security/limits.conf file:
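These are the file-descriptor limits typically recommended before installing FluentD; 65536 is a common choice, but the exact value is up to you:

```
root soft nofile 65536
root hard nofile 65536
*    soft nofile 65536
*    hard nofile 65536
```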

Open a new terminal (the new limit only applies to new sessions) and type the command below, making sure the output is correct:
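Assuming the limits above, the check would look like this:

```
$ ulimit -n
65536
```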

Install FluentD (using the Treasure Data distribution, td-agent):
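For Ubuntu 14.04 (Trusty), a sketch based on the Treasure Data install script for td-agent 2; see the FluentD documentation linked below if your release differs:

```
curl -L https://toolbelt.treasuredata.com/sh/install-ubuntu-trusty-td-agent2.sh | sh
```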

For other Ubuntu versions, please read: http://docs.fluentd.org/articles/install-by-deb

Now, we need to install the Elasticsearch plugin for FluentD:
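With td-agent 2, the plugin (fluent-plugin-elasticsearch) is installed with the bundled gem command:

```
sudo td-agent-gem install fluent-plugin-elasticsearch
```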

Add the content below to /etc/td-agent/td-agent.conf to set up FluentD to transfer all Docker logs to Elasticsearch:
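A minimal sketch of such a configuration, assuming FluentD listens on the default forward port 24224 and Elasticsearch runs locally on port 9200; the logstash_prefix value (docker) and the flush interval are arbitrary choices:

```
# Accept logs forwarded by the Docker fluentd log driver
<source>
  type forward
  port 24224
  bind 0.0.0.0
</source>

# Push everything received into the local Elasticsearch, in Logstash-style indices
<match **>
  type elasticsearch
  host localhost
  port 9200
  logstash_format true
  logstash_prefix docker
  flush_interval 5s
</match>
```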

And restart FluentD:
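On Ubuntu 14.04 this is typically:

```
sudo service td-agent restart
```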


Now, we change the Docker configuration file to use FluentD as the log driver. Open /etc/default/docker and add the line below:
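A sketch of the option, assuming a Docker version that supports the fluentd log driver (1.8+) and FluentD listening on localhost:24224 as configured above:

```
DOCKER_OPTS="--log-driver=fluentd --log-opt fluentd-address=localhost:24224"
```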

And restart Docker to apply the change:
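On Ubuntu 14.04:

```
sudo service docker restart
```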


We will run Kibana in a Docker container with the command below:
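A sketch using the official kibana image; the container name and port mapping are arbitrary, and <your-host-ip> stands for the machine's IP address (from inside the container, localhost would not reach the Elasticsearch instance running on the host):

```
docker run -d --name kibana -p 5601:5601 \
  -e ELASTICSEARCH_URL=http://<your-host-ip>:9200 \
  kibana
```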

Now, you can open Kibana in your browser to search, filter and visualize the Docker logs.


Delete an index in Elasticsearch:
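For example (replace <index-name> with the index you want to drop):

```
curl -XDELETE 'http://localhost:9200/<index-name>'
```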

List all indexes in Elasticsearch:
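Using the _cat API:

```
curl 'http://localhost:9200/_cat/indices?v'
```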