
Multiple grok patterns with multiline not parsing the log #36

Closed
cassador opened this issue Nov 24, 2017 · 8 comments

@cassador

Hi.

I am having a problem with fluentd-0.14.23. The logs are not being parsed even though I followed the documentation and your README. Can you help me solve this issue? It is running in a Docker container created by a Kubernetes DaemonSet.

I uploaded a zip file containing an example of the log I am trying to parse, along with my configuration:
data.zip

Thanks in advance.

@okkez
Collaborator

okkez commented Nov 27, 2017

Use the grok parser in the <source> section, because example.log does not contain JSON-formatted logs.

For example (the grok patterns are the same as in your configuration):

<source>
  @type tail
  path /var/log/containers/*.log
  exclude_path ["/var/log/containers/kube-*.log", "/var/log/containers/calico-*.log", "/var/log/containers/heapster-*.log", "/var/log/containers/etcd-*.log", "/var/log/containers/kubernetes-*.log", "/var/log/containers/fluentd-*.log", "/var/log/containers/monitoring-*.log"]
  pos_file /var/log/fluentd-containers.log.pos
  time_format %Y-%m-%dT%H:%M:%S.%NZ
  tag kubernetes.*  
  read_from_head true
  <parse>
    @type multiline_grok
    grok_failure_key grokfailure
    multiline_start_regexp ^(?>\d\d){1,2}-(?:0?[1-9]|1[0-2])-(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])[T ](?:2[0123]|[01]?[0-9]):?(?:[0-5][0-9])(?::?(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?))?(?:Z|[+-](?:2[0123]|[01]?[0-9])(?::?(?:[0-5][0-9])))?
    <grok>
      pattern ^-*%{TIMESTAMP_ISO8601:timestamp}\s*-*\s*%{LOGLEVEL:level}\s+%{NUMBER:pid}\s+---\s+\[\s*%{USERNAME:thread}\s*\]\s+%{DATA:class}\s*:\s*\[%{DATA:rootProcessInstanceId};\s*%{DATA:activityId};\s*%{DATA:transactionId}\]\s*%{DATA:message}(?:\n%{GREEDYDATA:stack})?\n*$
    </grok>
    <grok>
      pattern ^-*%{TIMESTAMP_ISO8601:timestamp}\s*-*\s*%{LOGLEVEL:level}\s+%{NUMBER:pid}\s+---\s+\[\s*%{USERNAME:thread}\s*\]\s+%{DATA:class}\s*:\s*\[%{DATA:rootProcessInstanceId};\s*%{DATA:activityId}\]\s*%{DATA:message}(?:\n%{GREEDYDATA:stack})?\n*$
    </grok>
    <grok>
      pattern ^-*%{TIMESTAMP_ISO8601:timestamp}\s*-*\s*%{LOGLEVEL:level}\s+%{NUMBER:pid}\s+---\s+\[\s*%{USERNAME:thread}\s*\]\s+%{DATA:class}\s*:\s*%{DATA:message}(?:\n%{GREEDYDATA:stack})?\n*$
    </grok>
  </parse>  
</source>

You may also need multiline_flush_interval in your configuration to flush the buffered logs in example.log.
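Before wiring the pattern into Fluentd, it can help to check the start condition in isolation: multiline_start_regexp decides which lines open a new record, and everything else (e.g. a stack-trace line) is appended to the current one. A minimal Ruby sketch, using a simplified timestamp regex as a stand-in for the full multiline_start_regexp above:

```ruby
# Simplified stand-in for the multiline_start_regexp in the config above:
# a line beginning with an ISO-8601-style timestamp starts a new record.
START = /^\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}/

lines = [
  "2017-11-27T07:32:53.240+0000 ERROR 76 --- [nio-8080-exec-1] ...",
  "\tat com.company.ui.aggregates.exception.AggregatesException",
]

lines.each do |line|
  kind = START.match?(line) ? "start" : "continuation"
  puts "#{kind}: #{line[0, 40]}"
end
```

The first line matches and starts a record; the indented stack-trace line does not, so the concat logic would append it to the previous record.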

@cassador
Author

Hi. I double-checked, and I don't know why the dashboard was not showing the logs in JSON format. The correct file, from which fluentd is reading the logs, is attached; it is in JSON format.

I apologize for the inconvenience.

output_in_json.log

@okkez
Collaborator

okkez commented Nov 27, 2017

In this case, you can use https://github.com/fluent-plugins-nursery/fluent-plugin-concat.

For example:

<source>
  @type tail
  path /var/log/containers/*.log
  exclude_path ["/var/log/containers/kube-*.log", "/var/log/containers/calico-*.log", "/var/log/containers/heapster-*.log", "/var/log/containers/etcd-*.log", "/var/log/containers/kubernetes-*.log", "/var/log/containers/fluentd-*.log", "/var/log/containers/monitoring-*.log"]
  pos_file /var/log/fluentd-containers.log.pos
  time_format %Y-%m-%dT%H:%M:%S.%NZ
  tag kubernetes.*  
  read_from_head true
  multiline_flush_interval 3s
  <parse>
    @type json    
  </parse>
  @label @INPUT
</source>

<label @INPUT>
  <filter kubernetes.**>
    @type concat
    key log
    multiline_start_regexp ^(?>\d\d){1,2}-(?:0?[1-9]|1[0-2])-(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])[T ](?:2[0123]|[01]?[0-9]):?(?:[0-5][0-9])(?::?(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?))?(?:Z|[+-](?:2[0123]|[01]?[0-9])(?::?(?:[0-5][0-9])))?
    continuous_line_regexp ^\s+
    separator ""
    flush_interval 3s
    timeout_label @PARSE
  </filter>
  <match>
    @type relabel
    @label @PARSE
  </match>
</label>

<label @PARSE>
  <filter kubernetes.**>
    @type parser
    key_name log
    inject_key_prefix log.
    <parse>
      @type multiline_grok
      grok_failure_key grokfailure
      <grok>
        pattern ^-*%{TIMESTAMP_ISO8601:timestamp}\s*-*\s*%{LOGLEVEL:level}\s+%{NUMBER:pid}\s+---\s+\[\s*%{USERNAME:thread}\s*\]\s+%{DATA:class}\s*:\s*\[%{DATA:rootProcessInstanceId};\s*%{DATA:activityId};\s*%{DATA:transactionId}\]\s*%{DATA:message}(?:\n%{GREEDYDATA:stack})?\n*$
      </grok>
      <grok>
        pattern ^-*%{TIMESTAMP_ISO8601:timestamp}\s*-*\s*%{LOGLEVEL:level}\s+%{NUMBER:pid}\s+---\s+\[\s*%{USERNAME:thread}\s*\]\s+%{DATA:class}\s*:\s*\[%{DATA:rootProcessInstanceId};\s*%{DATA:activityId}\]\s*%{DATA:message}(?:\n%{GREEDYDATA:stack})?\n*$
      </grok>
      <grok>
        pattern ^-*%{TIMESTAMP_ISO8601:timestamp}\s*-*\s*%{LOGLEVEL:level}\s+%{NUMBER:pid}\s+---\s+\[\s*%{USERNAME:thread}\s*\]\s+%{DATA:class}\s*:\s*%{DATA:message}(?:\n%{GREEDYDATA:stack})?\n*$
      </grok>
    </parse>  
  </filter>
  <match>
    @type relabel
    @label @MAIN
  </match>
</label>

<label @MAIN>
# <filter kubernetes.**>
#   @type kubernetes_metadata
# </filter> 
  <match kubernetes.**>
    @type stdout
  </match>
</label>

This will generate the following event for the last ERROR log:

2017-11-27 17:54:47.106052737 +0900 kubernetes.output_in_json.log: {"log.timestamp":"2017-11-27T07:32:53.240+0000","log.level":"ERROR","log.pid":"76","log.thread":"nio-8080-exec-1","log.class":"a.u.a.s.GlobalControllerExceptionHandler","log.message":"Handling exception with reference 2ebf5418-eeee-4b68-bb0c-e824e9e75ab7","log.stack":"com.company.ui.aggregates.exception.AggregatesException: null\n\tat com.company.ui.aggregates.command.controller.CommandAggregateController.read(CommandAggregateController.java:79)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\n\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.lang.reflect.Method.invoke(Method.java:498)\n\tat org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)\n\tat org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)\n\tat org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97)\n\tat org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:827)\n\tat org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:738)\n\tat org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)\n\tat org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)\n\tat org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)\n\tat org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)\n\tat org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)\n\tat 
javax.servlet.http.HttpServlet.service(HttpServlet.java:635)\n\tat org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)\n\tat javax.servlet.http.HttpServlet.service(HttpServlet.java:742)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.springframework.boot.web.filter.ApplicationContextHeaderFilter.doFilterInternal(ApplicationContextHeaderFilter.java:55)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat com.company.ui.aggregates.transaction.TransactionOptionalFilter.doFilterInternal(TransactionOptionalFilter.java:57)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.springframework.boot.actuate.trace.WebRequestTraceFilter.doFilterInternal(WebRequestTraceFilter.java:110)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat 
org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.springframework.web.filter.HttpPutFormContentFilter.doFilterInternal(HttpPutFormContentFilter.java:108)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:81)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:197)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.springframework.boot.actuate.autoconfigure.MetricsFilter.doFilterInternal(MetricsFilter.java:106)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat 
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198)\n\tat org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96)\n\tat org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:478)\n\tat org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)\n\tat org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:80)\n\tat org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)\n\tat org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:342)\n\tat org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:799)\n\tat org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)\n\tat org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:868)\n\tat org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1457)\n\tat org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)\n\tat java.lang.Thread.run(Thread.java:748)\n"}
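For reference, grok patterns like the third one above expand to named-capture regular expressions. A rough Ruby equivalent, where each grok macro is replaced by a simplified illustrative sub-expression (not the exact macro definitions):

```ruby
# Rough equivalent of the third grok pattern above, with
# TIMESTAMP_ISO8601, LOGLEVEL, NUMBER, USERNAME and DATA replaced
# by simplified sub-expressions (illustrative only).
PATTERN = /
  ^-*(?<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+\+\d{4})
  \s*-*\s*(?<level>[A-Z]+)\s+(?<pid>\d+)\s+---\s+
  \[\s*(?<thread>\S+)\s*\]\s+(?<class>\S+)\s*:\s*(?<message>.*)$
/x

line = "2017-11-27T07:32:53.240+0000 ERROR 76 --- [nio-8080-exec-1] " \
       "a.u.a.s.GlobalControllerExceptionHandler : Handling exception"
m = PATTERN.match(line)
puts m[:level]   # "ERROR"
puts m[:class]   # "a.u.a.s.GlobalControllerExceptionHandler"
```

Testing the regex this way against one real log line makes it much easier to see which of the three patterns will match before deploying the Fluentd config.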

@cassador
Author

Thanks for the help. Some minor changes were required, but the logs are now parsed and everything is working, so you can close this issue.
Thanks again for the quick response; you saved me several working days.

@azman0101

There were some minor changes required

What kind of minor changes did you make?

Regards

@xuanyuanaosheng

xuanyuanaosheng commented Apr 19, 2018

I have also encountered this problem (see https://stackoverflow.com/questions/49915805/using-fluentd-to-parse-the-docker-json-file-log-which-is-in-the-var-lib-docker). The logs cannot be parsed even though I followed the documentation and your README.

The original log is:

 {"log":"2018-04-19 14:19:57,915 INFO  [FixedTimeScheduler] com.testjavatest.fastdemo.ws.WebSocketClientManager: send ws message -{TAB} for envirnem -{{\"res\":\"heartbeat\"}}\n","stream":"stdou
 t","time":"2018-04-19T06:19:57.916259717Z"}
 {"log":"2018-04-19 14:19:57,915 INFO  [FixedTimeScheduler] com.testjavatest.fastdemo.ws.WebSocketClientManager: send ws message -{TAB} for envirnem -{{\"res\":\"heartbeat\"}}\n","stream
 ":"stdout","time":"2018-04-19T06:19:57.916265977Z"}
 {"log":"2018-04-19 14:20:43,446 ERROR [FixedTimeScheduler] com.testjavatest.fastdemo.task.JobTask: Connect to cloud.testjavatest.com:5002 [cloud.testjavatest.com/10.111.2.77] failed: Connection timed o
 ut (Connection timed out)\n","stream":"stdout","time":"2018-04-19T06:20:43.448436321Z"}
 {"log":"org.apache.http.conn.HttpHostConnectException: Connect to cloud.testjavatest.com:5002 [cloud.testjavatest.com/10.111.2.77] failed: Connection timed out (Connection timed out)\n","stream":"s
 tdout","time":"2018-04-19T06:20:43.448475801Z"}
 {"log":"\u0009at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:151)\n","stream":"stdout","time":"2018-04-19T06:20:43.4484860
 6Z"}
 {"log":"\u0009at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)\n","stream":"stdout","time":"2018-04-19T06:20:43.448492586
 Z"}
 {"log":"\u0009at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)\n","stream":"stdout","time":"2018-04-19T06:20:43.448498085Z"}
 {"log":"\u0009at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)\n","stream":"stdout","time":"2018-04-19T06:20:43.448503302Z"}
 {"log":"\u0009at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)\n","stream":"stdout","time":"2018-04-19T06:20:43.4485085Z"}
 {"log":"\u0009at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)\n","stream":"stdout","time":"2018-04-19T06:20:43.448527373Z"}
 {"log":"\u0009at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)\n","stream":"stdout","time":"2018-04-19T06:20:43.448532363Z"}
 {"log":"\u0009at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)\n","stream":"stdout","time":"2018-04-19T06:20:43.44853704Z"}
 {"log":"\u0009at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)\n","stream":"stdout","time":"2018-04-19T06:20:43.448541864Z"}
 {"log":"\u0009at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)\n","stream":"stdout","time":"2018-04-19T06:20:43.448546452Z"}
 {"log":"\u0009at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)\n","stream":"stdout","time":"2018-04-19T06:20:43.448551288Z"}
 {"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.send(MyHttpClient.java:308)\n","stream":"stdout","time":"2018-04-19T06:20:43.448555115Z"}
 {"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.send(MyHttpClient.java:264)\n","stream":"stdout","time":"2018-04-19T06:20:43.448559028Z"}
 {"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.recvBytes(MyHttpClient.java:470)\n","stream":"stdout","time":"2018-04-19T06:20:43.448563048Z"}
 {"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.recvString(MyHttpClient.java:474)\n","stream":"stdout","time":"2018-04-19T06:20:43.448567524Z"}
 {"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.recvJSON(MyHttpClient.java:483)\n","stream":"stdout","time":"2018-04-19T06:20:43.448571872Z"}
 {"log":"\u0009at com.testjavatest.fastdemo.task.JobTask.run(JobTask.java:70)\n","stream":"stdout","time":"2018-04-19T06:20:43.448576282Z"}
 {"log":"\u0009at com.testjavatest.fastdemo.task.timer.FixedTimeScheduler.run(FixedTimeScheduler.java:43)\n","stream":"stdout","time":"2018-04-19T06:20:43.44858108Z"}
 {"log":"Caused by: java.net.ConnectException: Connection timed out (Connection timed out)\n","stream":"stdout","time":"2018-04-19T06:20:43.448585665Z"}
 {"log":"\u0009at java.net.PlainSocketImpl.socketConnect(Native Method)\n","stream":"stdout","time":"2018-04-19T06:20:43.448590045Z"}
 {"log":"\u0009at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)\n","stream":"stdout","time":"2018-04-19T06:20:43.448596151Z"}
 {"log":"\u0009at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)\n","stream":"stdout","time":"2018-04-19T06:20:43.448600888Z"}
 {"log":"\u0009at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)\n","stream":"stdout","time":"2018-04-19T06:20:43.448605225Z"}
 {"log":"\u0009at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)\n","stream":"stdout","time":"2018-04-19T06:20:43.448609704Z"}
 {"log":"\u0009at java.net.Socket.connect(Socket.java:589)\n","stream":"stdout","time":"2018-04-19T06:20:43.448614111Z"}
 {"log":"\u0009at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:337)\n","stream":"stdout","time":"2018-04-19T06:20:43.448618537Z"}
 {"log":"\u0009at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)\n","stream":"stdout","time":"2018-04-19T06:20:43.4486230
 34Z"}
 {"log":"\u0009... 17 more\n","stream":"stdout","time":"2018-04-19T06:20:43.448627326Z"}
 {"log":"2018-04-19 14:20:57,934 INFO  [FixedTimeScheduler] com.testjavatest.fastdemo.config.WsJobTask: ws heartbeat run\n","stream":"stdout","time":"2018-04-19T06:20:57.934646438Z"}

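A note on the format above: each line of a Docker json-file log is a standalone JSON object, so the record has to be JSON-decoded first (format json in the <source>) and the multiline concatenation is then applied to the "log" field. A small Ruby sketch, using a line shortened from the log above:

```ruby
require "json"

# One (shortened) line from the json-file log above: a complete JSON
# object whose "log" field holds the actual application log line.
raw = '{"log":"2018-04-19 14:20:43,446 ERROR [FixedTimeScheduler] ' \
      'JobTask: Connection timed out\n","stream":"stdout",' \
      '"time":"2018-04-19T06:20:43.448436321Z"}'

rec = JSON.parse(raw)
puts rec["stream"]  # "stdout"
# A "log" value starting with a timestamp opens a new multiline record:
puts rec["log"].match?(/^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2},\d+/)  # true
```

This is why a grok parser applied directly to the raw file fails: the timestamp is buried inside the JSON "log" field rather than at the start of the line.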
my fluentd config:

  <source>
   @id fluentd-containers.log
   @type tail
   from_encoding UTF-8
   encoding UTF-8
   path /var/lib/docker/containers/52b26345771ff24b76840a57f26808562aea0e917d72746d48aac6f9870b901b/52b26345771ff24b76840a57f26808562aea0e917d72746d48aac6f9870b901b-json.log
   pos_file /var/log/fluentd-containers.log.pos
   time_format %Y-%m-%dT%H:%M:%S.%NZ
   tag raw.docker.*
   format json
   read_from_head true
 </source>
 <filter raw.docker.**> 
   @type grep
   regexp1 stream stdout
   @type concat
   key log
   multiline_start_regexp /^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\,\d+/ 
   continuous_line_regexp /^\s+/ 
   separator ""
   flush_interval 3s
 </filter>
 <filter raw.docker.**>
   @type parser
   key_name log
   inject_key_prefix log.
   <parse>
     @type multiline_grok
     grok_failure_key grokfailure
     <grok>
       pattern /%{TIMESTAMP_ISO8601:log_time}%{SPACE}%{LOGLEVEL:log_level}%{SPACE}\[%{DATA:threadname}\]%{SPACE}%{DATA:classname}%{SPACE}:%{SPACE}%{GREEDYDATA:log_message}/ 
     </grok>
   </parse>
 </filter>
 <match raw.docker.**>
   @type copy
   <store>
     @type gelf
     protocol udp
     host 10.111.2.4
     port 12204
     flush_interval 5s
   </store>
   <store>
     @type stdout
   </store>
   <store>
     @type file
     path /var/log/test
   </store> 
 </match>    

I have tested a lot, but I cannot get it to work. Can you help me solve this issue? Thanks in advance.
@cassador @okkez

@okkez
Collaborator

okkez commented Apr 20, 2018

@xuanyuanaosheng

@okkez Thanks for your help; your plugin works very well for parsing multiline Docker logs. Thanks a lot.

I hope my config will help someone. My fluentd config:

  <source>
   @id fluentd-containers.log
   @type tail
   from_encoding UTF-8
   encoding UTF-8
   path /var/lib/docker/containers/52b26345771ff24b76840a57f26808562aea0e917d72746d48aac6f9870b901b/52b26345771ff24b76840a57f26808562aea0e917d72746d48aac6f9870b901b-json.log
   pos_file /var/log/fluentd-containers.log.pos
   time_format %Y-%m-%dT%H:%M:%S.%NZ
   tag raw.docker.*
   format json
   read_from_head true
 </source>
 <filter raw.docker.**> 
   @type grep
   regexp1 stream stdout
   @type record_transformer
   <record>
   </record>
 </filter>
 <filter raw.docker.**>  
   @type concat 
   key log
   stream_identity_key container_id
   multiline_start_regexp /^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\,\d+/  
 </filter>
 <match raw.docker.**>
   @type copy
   <store>
     @type gelf
     protocol udp
     host 10.111.2.4
     port 12204
     flush_interval 5s
   </store>
 </match>
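As a side note on the filter blocks in this thread: a <filter> block takes a single @type, so the grep and concat stages are normally written as two separate filters. A sketch, reusing the parameters from the config above:

```
<filter raw.docker.**>
  @type grep
  regexp1 stream stdout
</filter>
<filter raw.docker.**>
  @type concat
  key log
  stream_identity_key container_id
  multiline_start_regexp /^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2},\d+/
</filter>
```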
