
Exception with simple log delivery #6

Closed
rverma-nikiai opened this issue Jul 23, 2019 · 2 comments

rverma-nikiai commented Jul 23, 2019

Copyright (C) Treasure Data

time="2019-07-23T04:57:59Z" level=info msg="[firehose] plugin parameter delivery_stream = 'niki-staging-logs'\n"
time="2019-07-23T04:57:59Z" level=info msg="[firehose] plugin parameter region = 'ap-south-1'\n"
time="2019-07-23T04:57:59Z" level=info msg="[firehose] plugin parameter data_keys = ''\n"
time="2019-07-23T04:57:59Z" level=info msg="[firehose] plugin parameter role_arn = 'arn:aws:iam::xxx:role/niki-staging-fluentd'\n"
time="2019-07-23T04:57:59Z" level=info msg="[firehose] plugin parameter endpoint = ''\n"
[2019/07/23 04:57:59] [ info] [storage] initializing...
[2019/07/23 04:57:59] [ info] [storage] in-memory
[2019/07/23 04:57:59] [ info] [storage] normal synchronization mode, checksum disabled, max_chunks_up=128
[2019/07/23 04:57:59] [ info] [engine] started (pid=1)
[2019/07/23 04:57:59] [ warn] [filter_kube] merge_json_log is deprecated, enabling 'merge_log' option instead
[2019/07/23 04:57:59] [ info] [filter_kube] https=1 host=kubernetes.default.svc port=443
[2019/07/23 04:57:59] [ info] [filter_kube] local POD info OK
[2019/07/23 04:57:59] [ info] [filter_kube] testing connectivity with API server...
[2019/07/23 04:57:59] [ info] [filter_kube] API server connectivity OK
[2019/07/23 04:57:59] [ info] [http_server] listen iface=0.0.0.0 tcp_port=2020
[2019/07/23 04:57:59] [ info] [sp] stream processor started
time="2019-07-23T04:58:02Z" level=error msg="[firehose] PutRecordBatch request returned with no records successfully recieved\n"
time="2019-07-23T04:58:03Z" level=error msg="[firehose] PutRecordBatch request returned with no records successfully recieved\n"
time="2019-07-23T04:58:03Z" level=error msg="[firehose] PutRecordBatch request returned with no records successfully recieved\n"
time="2019-07-23T04:58:13Z" level=error msg="[firehose] PutRecordBatch request returned with no records successfully recieved\n"

Firehose destination settings

Amazon S3 destination
S3 bucket: niki-staging-logs
Prefix: raw/!{timestamp:yyyy/MM-dd}/
Error prefix: error/!{firehose:error-output-type}/!{timestamp:yyyy/MM-dd}/
Buffer conditions: 30 MB or 600 seconds
Compression: Disabled
Encryption: Disabled

No record format conversion

Fluent Bit conf

  fluent-bit.conf: |
    [SERVICE]
        Flush         1
        Log_Level     info
        Daemon        off
        Parsers_File  parsers.conf
        HTTP_Server   On
        HTTP_Listen   0.0.0.0
        HTTP_Port     2020

    @INCLUDE input-kubernetes.conf
    @INCLUDE filter-grep.conf
    @INCLUDE filter-kubernetes.conf
    @INCLUDE output-firehose.conf
  input-kubernetes.conf: |
    [INPUT]
        Name              tail
        Tag               app.*
        Path              /var/log/containers/*_app_*.log
        Parser            docker
        DB                /var/log/flb_kube_ns.db
        Mem_Buf_Limit     30MB
        Skip_Long_Lines   On
        Refresh_Interval  10
  filter-grep.conf: |
    [FILTER]
        Name     grep
        Match    *
        Exclude  log /health
  filter-kubernetes.conf: |
    [FILTER]
        Name                kubernetes
        Match               app.*
        Kube_URL            https://kubernetes.default.svc:443
        Kube_CA_File        /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
        Kube_Token_File     /var/run/secrets/kubernetes.io/serviceaccount/token
        Regex_Parser        ns_k8s_parser
        Merge_JSON_Log      On
  output-firehose.conf: |
    [OUTPUT]
        Name firehose
        Match app.*
        region ap-south-1
        role_arn arn:aws:iam::xxx:role/niki-staging-fluentd
        delivery_stream niki-staging-logs
  parsers.conf: |
    [PARSER]
        Name        first_line
        Format      regex
        Regex       ^{"log":"(?!\\u0009)(?<log>\S(?:(\\")|[^"]){9}(?:(\\")|[^"])*)"

    [PARSER]
        Name        nested_json
        Format      json
        Time_Keep   true
        Time_Key    time
        Time_Format %Y-%m-%dT%H:%M:%S.%L
        Decode_Field_As json log do_next
        Decode_Field_As escaped message

    [PARSER]
        Name        ns_k8s_parser
        Format      regex
        Regex       (?<tag>[^.]+)?\.?(?<pod_name>[^_]+)_(?<namespace_name>[^_]+)_(?<container_name>.+-(?<docker_id>[a-z0-9]{64}))\.log$

    [PARSER]
        Name        docker
        Format      json
        Time_Key    time
        Time_Format %Y-%m-%dT%H:%M:%S.%L
        Time_Keep   On
        Decode_Field_As    json     log

What am I doing wrong?
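
For reference, a standalone write to the same delivery stream (outside Fluent Bit, assuming boto3 and credentials that can assume the role from output-firehose.conf) can isolate the plugin from the stream and IAM side; a rough sketch:

    import boto3

    # Assume the same role the plugin uses (ARN copied from output-firehose.conf,
    # account id elided as in the config).
    creds = boto3.client("sts").assume_role(
        RoleArn="arn:aws:iam::xxx:role/niki-staging-fluentd",
        RoleSessionName="firehose-smoke-test",
    )["Credentials"]

    firehose = boto3.client(
        "firehose",
        region_name="ap-south-1",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

    # A single record; if this also fails or is throttled, the problem is on the
    # Firehose/IAM side rather than in the Fluent Bit configuration.
    resp = firehose.put_record(
        DeliveryStreamName="niki-staging-logs",
        Record={"Data": b'{"msg": "smoke test"}\n'},
    )
    print("RecordId:", resp["RecordId"])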

rverma-nikiai (Author) commented:

Had to get limits increased by the AWS team; the issue was resolved after that.

from20020516 commented:

Don't let others make the same mistake... :/

It should be noted that most FireLens users will need to request limit increases in order to use Kinesis Data Firehose for their logs.

https://aws.amazon.com/jp/blogs/containers/under-the-hood-firelens-for-amazon-ecs-tasks/
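
For anyone hitting this today: the region's Kinesis Data Firehose quotas (records/second, MiB/second, requests/second per delivery stream) can be inspected with the Service Quotas API before filing an increase request. A rough sketch, assuming boto3 and that the Service Quotas service code for Kinesis Data Firehose is "firehose":

    import boto3

    # List the Kinesis Data Firehose quotas in the region used by the delivery stream.
    quotas = boto3.client("service-quotas", region_name="ap-south-1")
    resp = quotas.list_service_quotas(ServiceCode="firehose")

    # Note: the response may be paginated; follow NextToken for the full list.
    for q in resp["Quotas"]:
        print(f'{q["QuotaName"]}: {q["Value"]}')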
