
Commit 257fff5

Updates: Container Deployments.
Signed-off-by: Azeem Sajid <azeem.sajid@gmail.com>
1 parent 578142c

4 files changed: +303 −260 lines changed
This file: 66 additions & 56 deletions
# Docker Logging via EFK (Elasticsearch + Fluentd + Kibana) Stack with Docker Compose

This article explains how to collect [Docker](https://www.docker.com/) logs and
propagate them to the EFK (Elasticsearch + Fluentd + Kibana) stack. The example
uses [Docker Compose](https://docs.docker.com/compose/) to set up multiple
containers.

![Kibana](/images/7.2_kibana-homepage.png)

[Elasticsearch](https://www.elastic.co/products/elasticsearch) is an open-source
search engine known for its ease of use.
[Kibana](https://www.elastic.co/products/kibana) is an open-source web UI that
makes Elasticsearch user-friendly for marketers, engineers and data scientists
alike.

By combining these three tools into the EFK (Elasticsearch + Fluentd + Kibana)
stack, we get a scalable, flexible and easy-to-use log collection and analytics
pipeline. In this article, we will set up four (4) containers:

- [Apache HTTP Server](https://hub.docker.com/_/httpd/)
- [Fluentd](https://hub.docker.com/r/fluent/fluentd/)
- [Elasticsearch](https://hub.docker.com/_/elasticsearch/)
- [Kibana](https://hub.docker.com/_/kibana/)

All the logs of `httpd` will be ingested into Elasticsearch + Kibana via
Fluentd.


Please download and install Docker / Docker Compose. Well, that's it :)

- [Docker Installation](https://docs.docker.com/engine/installation/)

## Step 0: Create `docker-compose.yml`

Create `docker-compose.yml` for [Docker
Compose](https://docs.docker.com/compose/overview/). Docker Compose is a tool
for defining and running multi-container Docker applications.

With the YAML file below, you can create and start all the services (in this
case, Apache, Fluentd, Elasticsearch and Kibana) with a single command:
``` {.CodeRay}
version: '3'
services:
  ...
      - "5601:5601"
```

The `logging` section (see the [Docker Compose
documentation](https://docs.docker.com/compose/compose-file/#/logging)) of the
`web` container specifies the [Docker Fluentd Logging
Driver](https://docs.docker.com/engine/admin/logging/fluentd/) as its default
container logging driver. All the logs from the `web` container will
automatically be forwarded to the `host:port` specified by `fluentd-address`.
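For reference, the relevant part of the `web` service definition looks roughly
like the fragment below. This is a sketch only: the `httpd` image name, the
port mapping and the `httpd.access` tag are illustrative assumptions, so adjust
them to match your own `docker-compose.yml`.

```yaml
web:
  image: httpd            # Apache HTTP Server (assumed image name)
  ports:
    - "80:80"
  logging:
    driver: "fluentd"     # Docker Fluentd Logging Driver
    options:
      fluentd-address: localhost:24224  # matches the forward input port in fluent.conf
      tag: httpd.access                 # illustrative tag attached to every log event
```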

## Step 1: Create the Fluentd Image with your Config + Plugin

Create `fluentd/Dockerfile` with the following content. It uses the Fluentd
[official Docker image](https://hub.docker.com/r/fluent/fluentd/) as a base and
installs the Elasticsearch output plugin:

``` {.CodeRay}
# fluentd/Dockerfile

FROM fluent/fluentd:v1.6-debian-1
USER root
RUN ["gem", "install", "fluent-plugin-elasticsearch", "--no-document", "--version", "3.5.2"]
USER fluent
```

Then, create the Fluentd configuration file `fluentd/conf/fluent.conf`. The
[`forward`](/plugins/input/forward.md) input plugin receives logs from the
Docker logging driver, and the `elasticsearch` output plugin forwards these
logs to Elasticsearch:

``` {.CodeRay}
# fluentd/conf/fluent.conf

<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<match *.**>
  @type copy

  <store>
    @type elasticsearch
    host elasticsearch
    ...
    tag_key @log_name
    flush_interval 1s
  </store>

  <store>
    @type stdout
  </store>
</match>
```
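To make the data flow concrete, here is a small Python sketch (an illustration,
not Fluentd code) of the shape of one log event: the Fluentd logging driver
wraps each line a container writes into a record with `container_id`,
`container_name`, `source` and `log` fields; because of `tag_key @log_name`,
the `elasticsearch` output also stores the Fluentd tag in the indexed document,
and events end up in daily indices matching the `fluentd-*` pattern used in
Step 4. The `httpd.access` tag and the date-based index naming are assumptions
for illustration.

```python
from datetime import datetime

def docker_fluentd_record(container_id, container_name, line, source="stdout"):
    # Simplified shape of the record the Docker Fluentd logging driver
    # emits for each line a container writes to stdout/stderr.
    return {
        "container_id": container_id,
        "container_name": container_name,
        "source": source,
        "log": line,
    }

def to_elasticsearch_doc(tag, record, when):
    # Sketch of what gets indexed under this fluent.conf: the record plus
    # the Fluentd tag stored under @log_name (from `tag_key @log_name`),
    # written to a daily index matching the `fluentd-*` pattern of Step 4.
    doc = dict(record)
    doc["@log_name"] = tag
    index = "fluentd-" + when.strftime("%Y.%m.%d")
    return index, doc

record = docker_fluentd_record(
    "9fe2d02cff41", "/docker_web_1",
    '172.18.0.1 - - [10/Jul/2019:12:00:00 +0000] "GET /1 HTTP/1.1" 404 196')
index, doc = to_elasticsearch_doc("httpd.access", record, datetime(2019, 7, 10))
print(index)             # fluentd-2019.07.10
print(doc["@log_name"])  # httpd.access
```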

## Step 2: Start the Containers

Let's start the containers with a single command:

``` {.CodeRay}
$ docker-compose up
```

Use the `docker ps` command to verify that the four (4) containers are up and
running:

``` {.CodeRay}
$ docker ps
...
bc5bcaedb282        kibana:7.2.0                                          "/usr/...
9fe2d02cff41        docker.elastic.co/elasticsearch/elasticsearch:7.2.0   "/usr/local/bin/dock…"   20 seconds ago      Up 18 seconds       0.0.0.0:9200->9200/tcp, 9300/tcp   docker_elasticsearch_1
```

## Step 3: Generate `httpd` Access Logs

Use the `curl` command to generate some access logs:

``` {.CodeRay}
$ curl http://localhost:80/[1-10]
...
<html><body><h1>It works!</h1></body></html>
```

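`curl`'s `[1-10]` globbing expands into ten sequential `GET` requests, each of
which produces an `httpd` access-log line. A minimal Python equivalent (a
sketch, assuming the stack from Step 2 is running on `localhost:80`; failures
are recorded instead of raised so it degrades gracefully when the stack is
down):

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

# The ten URLs that `curl http://localhost:80/[1-10]` would request.
urls = [f"http://localhost:80/{i}" for i in range(1, 11)]

results = []
for url in urls:
    try:
        with urlopen(url, timeout=5) as resp:
            results.append((url, resp.status))
    except HTTPError as exc:
        # e.g. 404 for paths httpd does not serve; the request is still logged.
        results.append((url, exc.code))
    except URLError as exc:
        # Stack not running locally.
        results.append((url, f"unreachable: {exc.reason}"))

print(len(results))  # 10
```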
## Step 4: Confirm Logs from Kibana

Browse to `http://localhost:5601/` and set up the index name pattern for
Kibana. Specify `fluentd-*` in `Index name or pattern` and click `Create`.

![Kibana Index](/images/7.2_efk-kibana-index.png)
![Kibana Timestamp](/images/7.2_efk-kibana-timestamp.png)

Then, go to the `Discover` tab to check the logs. As you can see, the logs are
properly collected into Elasticsearch + Kibana via Fluentd.

![Kibana Discover](/images/7.2_efk-kibana-discover.png)

## Code

The sample code is available at
<https://github.com/digikin/fluentd-elastic-kibana>.

## Learn More

- [Fluentd: Architecture](https://www.fluentd.org/architecture)
- [Fluentd: Get Started](/overview/quickstart.md)
- [Downloading Fluentd](http://www.fluentd.org/download)


------------------------------------------------------------------------

If this article is incorrect or outdated, or omits critical information, please
[let us know](https://github.com/fluent/fluentd-docs-gitbook/issues?state=open).
[Fluentd](http://www.fluentd.org/) is an open-source project under the [Cloud
Native Computing Foundation (CNCF)](https://cncf.io/). All components are
available under the Apache 2 License.
