# Docker Logging via EFK (Elasticsearch + Fluentd + Kibana) Stack with Docker Compose

This article explains how to collect [Docker](https://www.docker.com/) logs and propagate them to an EFK (Elasticsearch + Fluentd + Kibana) stack. The example uses [Docker Compose](https://docs.docker.com/compose/) for setting up multiple containers.

![Kibana](/images/7.2_kibana-homepage.png)
[Elasticsearch](https://www.elastic.co/products/elasticsearch) is an open-source search engine known for its ease of use. [Kibana](https://www.elastic.co/products/kibana) is an open-source Web UI that makes Elasticsearch user-friendly for marketers, engineers and data scientists alike.

By combining these three tools (EFK: Elasticsearch + Fluentd + Kibana), we get a scalable, flexible and easy-to-use log collection and analytics pipeline. In this article, we will set up four containers:
- [Apache HTTP Server](https://hub.docker.com/_/httpd/)
- [Fluentd](https://hub.docker.com/r/fluent/fluentd/)
- [Elasticsearch](https://hub.docker.com/_/elasticsearch/)
- [Kibana](https://hub.docker.com/_/kibana/)

All the logs of `httpd` will be ingested into Elasticsearch + Kibana, via Fluentd.
## Prerequisites: Docker

Please download and install Docker / Docker Compose. Well, that's it :)

- [Docker Installation](https://docs.docker.com/engine/installation/)
## Step 0: Create `docker-compose.yml`

Create `docker-compose.yml` for [Docker Compose](https://docs.docker.com/compose/overview/). Docker Compose is a tool for defining and running multi-container Docker applications.

With the YAML file below, you can create and start all the services (in this case, Apache, Fluentd, Elasticsearch and Kibana) with one command:
``` {.CodeRay}
version: '3'
services:
  web:
    image: httpd
    ports:
      - "80:80"
    links:
      - fluentd
    logging:
      driver: "fluentd"
      options:
        fluentd-address: localhost:24224
        tag: httpd.access

  fluentd:
    build: ./fluentd
    volumes:
      - ./fluentd/conf:/fluentd/etc
    links:
      - "elasticsearch"
    ports:
      - "24224:24224"
      - "24224:24224/udp"

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.2.0
    environment:
      - "discovery.type=single-node"
    expose:
      - "9200"
    ports:
      - "9200:9200"

  kibana:
    image: kibana:7.2.0
    links:
      - "elasticsearch"
    ports:
      - "5601:5601"
```
The `logging` section (see the [Docker Compose documentation](https://docs.docker.com/compose/compose-file/#/logging)) of the `web` container specifies the [Docker Fluentd Logging Driver](https://docs.docker.com/engine/admin/logging/fluentd/) as its default container logging driver. All the logs from the `web` container will automatically be forwarded to the `host:port` specified by `fluentd-address`.
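To make the hand-off concrete: each log line from the `web` container arrives at Fluentd as a tagged event whose record carries the line plus container metadata. A minimal sketch of that shape (the field names follow the Docker Fluentd logging driver; every value below is made up for illustration):

```python
# Sketch of one event as the Docker Fluentd logging driver forwards it.
# Field names (container_id, container_name, source, log) are the driver's
# standard ones; all values here are illustrative, not real output.
event_tag = "httpd.access"  # set with the driver's "tag" logging option
event_record = {
    "container_id": "bc5bcaedb282...",  # hypothetical, truncated ID
    "container_name": "/docker_web_1",  # hypothetical container name
    "source": "stdout",                 # httpd writes access lines to stdout
    "log": '192.168.0.1 - - [02/Jul/2019:10:00:00 +0000] "GET /1 HTTP/1.1" 200 45',
}
print(event_tag, sorted(event_record))
```

The tag (not the record) is what Fluentd's `<match>` directives route on, which is why the configuration in Step 1 matches on tag patterns.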
## Step 1: Create the Fluentd Image with your Config + Plugin

Create `fluentd/Dockerfile` with the following content, using the Fluentd [official Docker image](https://hub.docker.com/r/fluent/fluentd/) as the base and installing the Elasticsearch plugin on top:

``` {.CodeRay}
# fluentd/Dockerfile

FROM fluent/fluentd:v1.6-debian-1
USER root
RUN ["gem", "install", "fluent-plugin-elasticsearch", "--no-document", "--version", "3.5.2"]
USER fluent
```
Then, create the Fluentd configuration file `fluentd/conf/fluent.conf`. The [`forward`](/plugins/input/forward.md) input plugin receives logs from the Docker logging driver, and the `elasticsearch` output plugin forwards these logs to Elasticsearch.

``` {.CodeRay}
# fluentd/conf/fluent.conf

<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<match *.**>
  @type copy

  <store>
    @type elasticsearch
    host elasticsearch
    port 9200
    logstash_format true
    logstash_prefix fluentd
    logstash_dateformat %Y%m%d
    include_tag_key true
    type_name access_log
    tag_key @log_name
    flush_interval 1s
  </store>

  <store>
    @type stdout
  </store>
</match>
```
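The `<match *.**>` pattern above matches any tag, including the driver's `httpd.access`. Fluentd tag patterns are dot-aware: `*` matches exactly one dot-separated tag part, while `**` matches zero or more parts. A rough, simplified sketch of that matching logic (not Fluentd's actual implementation):

```python
def tag_matches(pattern: str, tag: str) -> bool:
    """Simplified Fluentd-style tag matching: '*' matches exactly one
    dot-separated part, '**' matches zero or more parts."""
    def match(pp, tp):
        if not pp:
            return not tp  # pattern exhausted: match only if tag is too
        head, rest = pp[0], pp[1:]
        if head == "**":
            # '**' may consume zero or more of the remaining tag parts
            return any(match(rest, tp[i:]) for i in range(len(tp) + 1))
        if not tp:
            return False
        if head == "*" or head == tp[0]:
            return match(rest, tp[1:])
        return False
    return match(pattern.split("."), tag.split("."))

print(tag_matches("*.**", "httpd.access"))    # the tutorial's tag matches
print(tag_matches("httpd.*", "httpd.access.extra"))
```

So `*.**` catches `httpd.access` (and any other tag), and `@type copy` then duplicates each event to both Elasticsearch and stdout.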
## Step 2: Start the Containers

Let's start all the containers with a single command:

``` {.CodeRay}
$ docker-compose up
```
Use the `docker ps` command to verify that the four containers are up and running:

``` {.CodeRay}
$ docker ps
CONTAINER ID        IMAGE                                                 COMMAND                  CREATED             STATUS              PORTS                              NAMES
bc5bcaedb282        kibana:7.2.0                                          "/usr/
9fe2d02cff41        docker.elastic.co/elasticsearch/elasticsearch:7.2.0   "/usr/local/bin/dock…"   20 seconds ago      Up 18 seconds       0.0.0.0:9200->9200/tcp, 9300/tcp   docker_elasticsearch_1
```
## Step 3: Generate `httpd` Access Logs

Use the `curl` command to generate some access logs:

``` {.CodeRay}
$ curl http://localhost:80/[1-10]
<html><body><h1>It works!</h1></body></html>
```
## Step 4: Confirm Logs from Kibana

Browse to `http://localhost:5601/` and set up the index name pattern for Kibana. Specify `fluentd-*` for `Index name or pattern` and click `Create`.
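Why `fluentd-*`? When the Elasticsearch output plugin runs with `logstash_format` enabled (a common configuration for this setup, assumed here), it writes events into date-stamped indices named `<prefix>-<date>`, so the wildcard pattern picks up every day's index. A small sketch of that naming scheme (prefix and date format are assumptions, not read from your config):

```python
from datetime import date

# Sketch of the logstash-style index naming used by the Elasticsearch
# plugin when logstash_format is on; prefix and dateformat are assumed.
def index_name(prefix: str, day: date, dateformat: str = "%Y%m%d") -> str:
    return f"{prefix}-{day.strftime(dateformat)}"

print(index_name("fluentd", date(2019, 7, 2)))  # fluentd-20190702
```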
![Kibana Index](/images/7.2_efk-kibana-index.png)
![Kibana Timestamp](/images/7.2_efk-kibana-timestamp.png)
Then, go to the `Discover` tab to check the logs. As you can see, the logs are properly collected into Elasticsearch + Kibana, via Fluentd.

![Kibana Discover](/images/7.2_efk-kibana-discover.png)
## Code

The code is available at <https://github.com/digikin/fluentd-elastic-kibana>.
## Learn More

- [Fluentd: Architecture](https://www.fluentd.org/architecture)
- [Fluentd: Get Started](/overview/quickstart.md)
- [Downloading Fluentd](http://www.fluentd.org/download)

------------------------------------------------------------------------

If this article is incorrect or outdated, or omits critical information, please [let us know](https://github.com/fluent/fluentd-docs-gitbook/issues?state=open). [Fluentd](http://www.fluentd.org/) is an open-source project under the [Cloud Native Computing Foundation (CNCF)](https://cncf.io/). All components are available under the Apache 2 License.