Memory leak #3302

Closed
svk-28 opened this issue Mar 30, 2021 · 6 comments

Labels: fixed, Stale, waiting-for-user

Comments


svk-28 commented Mar 30, 2021

Bug Report

Describe the bug
Memory leaks are observed when using the tail input and regexp parsers.

To Reproduce
td-agent-bit.conf

[SERVICE]
    flush        5
    daemon       Off
    log_level    info
    parsers_file parsers.conf
    plugins_file plugins.conf
    http_server  Off
    http_listen  0.0.0.0
    http_port    2020
    storage.metrics on

[INPUT]
    Name        tail
    Path        /opt/zimbra/log/mailbox.log,/opt/zimbra/log/audit.log
    Path_Key    On

[FILTER]
    Name            parser
    Match           *
    # Parser          java-exception
    Parser          mailboxlog2
    Parser          mailboxlog3
    Parser          mailboxlog4
    Parser          mailboxlog5
    Parser          mailboxlog6
    Parser          mailboxlog7
    #Preserve_Key    On
    Reserve_Data    On
    Key_Name        log

[OUTPUT]
    Name  es
    Match *
    Host elastic
    Port 9200
    tls On
    tls.verify Off
    HTTP_User username
    HTTP_Passwd pass
    Index mailboxlog-beat-%Y-%m-%d

OOM killer:

[root@mailbox1 td-agent-bit]# dmesg -T | grep -i 'killed process'
[Fri Mar 26 18:40:03 2021] Killed process 95435 (td-agent-bit) total-vm:39015560kB, anon-rss:32188472kB, file-rss:0kB, shmem-rss:0kB
[Fri Mar 26 18:40:03 2021] Killed process 95436 (flb-pipeline) total-vm:39015560kB, anon-rss:32188472kB, file-rss:0kB, shmem-rss:0kB
[Mon Mar 29 14:03:23 2021] Killed process 142984 (td-agent-bit) total-vm:39015560kB, anon-rss:30719484kB, file-rss:0kB, shmem-rss:0kB
[Mon Mar 29 14:03:23 2021] Killed process 142985 (flb-pipeline) total-vm:39015560kB, anon-rss:30719492kB, file-rss:0kB, shmem-rss:0kB
[Tue Mar 30 12:00:03 2021] Killed process 117068 (td-agent-bit) total-vm:39015560kB, anon-rss:30388940kB, file-rss:0kB, shmem-rss:0kB
[Tue Mar 30 12:00:03 2021] Killed process 117069 (flb-pipeline) total-vm:39015560kB, anon-rss:30388992kB, file-rss:0kB, shmem-rss:0kB

and a lot of messages like this:

Mar 29 11:00:46 mailbox1 td-agent-bit: [2021/03/29 11:00:46] [error] [upstream] connection #23 to elastic:9200 timed out after 10 seconds
Mar 29 11:00:46 mailbox1 td-agent-bit: [2021/03/29 11:00:46] [error] [upstream] connection #24 to elastic:9200 timed out after 10 seconds
Mar 29 11:00:46 mailbox1 td-agent-bit: [2021/03/29 11:00:46] [error] [upstream] connection #35 to elastic:9200 timed out after 10 seconds
Mar 29 11:00:46 mailbox1 td-agent-bit: [2021/03/29 11:00:46] [error] [upstream] connection #24 to elastic:9200 timed out after 10 seconds
Mar 29 11:00:46 mailbox1 td-agent-bit: [2021/03/29 11:00:46] [error] [upstream] connection #23 to elastic:9200 timed out after 10 seconds
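
Note: these timeouts mean the es output cannot drain data as fast as tail reads it, so chunks pile up in memory on top of whatever the leak itself consumes. Independent of the fix discussed below, memory growth under output backpressure can be bounded with Mem_Buf_Limit and filesystem buffering; a minimal sketch of the relevant additions to the config above (the limit and the storage path are example values, not taken from this report):

[SERVICE]
    # spill buffered chunks to disk when the output is slow or unreachable
    storage.path  /var/lib/td-agent-bit/storage

[INPUT]
    Name           tail
    Path           /opt/zimbra/log/mailbox.log,/opt/zimbra/log/audit.log
    Path_Key       On
    # pause reading once in-memory chunks reach this size (example value)
    Mem_Buf_Limit  50MB
    # keep overflow on disk instead of RAM
    storage.type   filesystem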

Your Environment

  • Version used: 1.7.2
  • Configuration: see above
  • Environment name and version (e.g. Kubernetes? What version?): system daemon
  • Server type and version: Hyper-V virtual machine
  • Operating System and version: CentOS 7
  • Filters and plugins:

parsers.conf.txt
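
Note: the attached parsers.conf is not reproduced here. For reference, a regexp parser consumed by the filter above is defined roughly as follows; the pattern and time handling are hypothetical placeholders, not the author's actual mailboxlog definitions:

[PARSER]
    Name        mailboxlog2
    Format      regex
    # hypothetical pattern for lines such as "2021-03-29 11:00:46,123 INFO ..."
    Regex       ^(?<time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}),\d+ +(?<level>\w+) +(?<message>.*)$
    Time_Key    time
    Time_Format %Y-%m-%d %H:%M:%S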

nokute78 (Collaborator) commented

The patch 1938ec8 will be included starting from v1.7.3 (not released yet).

Similar to #3192

agup006 (Member) commented Apr 6, 2021

@svk-28 with 1.7.3 released, are you able to test and validate?
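
Note: the installed build can be confirmed with the standard Fluent Bit version flag, assuming the td-agent-bit packaging keeps it:

    td-agent-bit --version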

agup006 added the waiting-for-user and fixed labels on Apr 6, 2021
svk-28 (Author) commented Apr 6, 2021

> @svk-28 with 1.7.3 released, are you able to test and validate?

Yes, I will try.

svk-28 (Author) commented Apr 17, 2021

Hi!
I ran the new version on a test system and the memory leak is no longer reproducible. I will try to run it on a production instance next week.

github-actions bot commented

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions bot added the Stale label on May 18, 2021
github-actions bot commented

This issue was closed because it has been stalled for 5 days with no activity.
