
Memory leak with 0.12.10 #231

Closed

eriedl-kiban opened this issue Nov 14, 2018 · 9 comments

@eriedl-kiban

Our daily automatic deployments failed this morning due to instances running out of memory. We identified the amazon-kinesis-producer library as the only part that has changed in the last 48 hours. Downgrading to 0.12.9 fixed the issue.

The application logs repeat the following two error messages on application startup until the instance runs out of memory:

2018-11-14 17:46:12.650  WARN 811 --- [ost-startStop-2] o.a.c.loader.WebappClassLoaderBase       : The web application [ROOT] appears to have started a thread named [kpl-daemon-0001] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
 sun.nio.fs.UnixNativeDispatcher.open0(Native Method)
 sun.nio.fs.UnixNativeDispatcher.open(UnixNativeDispatcher.java:71)
 sun.nio.fs.UnixChannelFactory.open(UnixChannelFactory.java:257)
 sun.nio.fs.UnixChannelFactory.newFileChannel(UnixChannelFactory.java:136)
 sun.nio.fs.UnixChannelFactory.newFileChannel(UnixChannelFactory.java:148)
 sun.nio.fs.UnixFileSystemProvider.newFileChannel(UnixFileSystemProvider.java:175)
 java.nio.channels.FileChannel.open(FileChannel.java:287)
 java.nio.channels.FileChannel.open(FileChannel.java:335)
 com.amazonaws.services.kinesis.producer.Daemon.connectToChild(Daemon.java:347)
 com.amazonaws.services.kinesis.producer.Daemon.access$1000(Daemon.java:63)
 com.amazonaws.services.kinesis.producer.Daemon$6.run(Daemon.java:457)
 java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
 java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
 java.lang.Thread.run(Thread.java:748)

and

2018-11-14 17:46:08.238  INFO 811 --- [kpl-daemon-0000] c.a.s.kinesis.producer.KinesisProducer   : Restarting native producer process.
2018-11-14 17:46:09.322 ERROR 811 --- [kpl-daemon-0000] c.a.s.kinesis.producer.KinesisProducer   : Error in child process

com.amazonaws.services.kinesis.producer.IrrecoverableError: Child process exited with code 1
    at com.amazonaws.services.kinesis.producer.Daemon.fatalError(Daemon.java:537)
    at com.amazonaws.services.kinesis.producer.Daemon.fatalError(Daemon.java:509)
    at com.amazonaws.services.kinesis.producer.Daemon.startChildProcess(Daemon.java:487)
    at com.amazonaws.services.kinesis.producer.Daemon.access$100(Daemon.java:63)
    at com.amazonaws.services.kinesis.producer.Daemon$1.run(Daemon.java:133)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
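
(Side note: the classloader warning above is Tomcat noticing that a kpl-daemon thread is still alive when the web context stops. In normal operation that warning goes away if the producer is flushed and destroyed on shutdown, roughly as in the sketch below; the class and field names are placeholders, and this does not address the restart loop itself.)

// Sketch only: tear down the KPL producer when the webapp is undeployed.
// Class name, attribute key, and region are placeholders, not from the report above.
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

import com.amazonaws.services.kinesis.producer.KinesisProducer;
import com.amazonaws.services.kinesis.producer.KinesisProducerConfiguration;

@WebListener
public class KplLifecycleListener implements ServletContextListener {

    private KinesisProducer producer;

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        KinesisProducerConfiguration config = new KinesisProducerConfiguration()
                .setRegion("us-east-1"); // example region
        producer = new KinesisProducer(config);
        sce.getServletContext().setAttribute("kinesisProducer", producer);
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        if (producer != null) {
            producer.flushSync();  // wait for buffered records to drain
            producer.destroy();    // stops the kpl-daemon threads and the native child process
        }
    }
}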
@zackb

zackb commented Nov 14, 2018

Just started happening here too.

Linux 4.9.0-040900rc3-generic #201610291831 SMP Sat Oct 29 22:32:46 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
stretch/sid

[kpl-daemon-0000] INFO com.amazonaws.services.kinesis.producer.KinesisProducer - Restarting native producer process.
[kpl-daemon-0003] WARN com.amazonaws.services.kinesis.producer.LogInputStreamReader - /tmp/amazon-kinesis-producer-native-binaries/kinesis_producer_D626AFB58CB68D8C81E7F8671FE0048978DAA5B6: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.25' not found (required by /tmp/amazon-kinesis-producer-native-binaries/kinesis_producer_D626AFB58CB68D8C81E7F8671FE0048978DAA5B6)
[kpl-daemon-0000] ERROR com.amazonaws.services.kinesis.producer.KinesisProducer - Error in child process
com.amazonaws.services.kinesis.producer.IrrecoverableError: Child process exited with code 1
at com.amazonaws.services.kinesis.producer.Daemon.fatalError(Daemon.java:537)
at com.amazonaws.services.kinesis.producer.Daemon.fatalError(Daemon.java:509)
at com.amazonaws.services.kinesis.producer.Daemon.startChildProcess(Daemon.java:487)
at com.amazonaws.services.kinesis.producer.Daemon.access$100(Daemon.java:63)
at com.amazonaws.services.kinesis.producer.Daemon$1.run(Daemon.java:133)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[kpl-daemon-0000] ERROR com.amazonaws.services.kinesis.producer.KinesisProducer - Error in child process
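
The GLIBC_2.25 error above means the native kinesis_producer binary bundled with 0.12.10 was built against a newer glibc than the host provides, so the child process exits immediately and the Java daemon keeps respawning it. Besides downgrading, one possible workaround is to point the KPL at a binary built for your own platform, assuming your KPL version exposes KinesisProducerConfiguration.setNativeExecutable; a rough sketch with a placeholder path:

// Sketch of a possible workaround: use a locally built kinesis_producer binary
// instead of the one extracted from the 0.12.10 jar. The path and region are examples.
import com.amazonaws.services.kinesis.producer.KinesisProducer;
import com.amazonaws.services.kinesis.producer.KinesisProducerConfiguration;

public class CustomNativeBinaryExample {
    public static void main(String[] args) {
        KinesisProducerConfiguration config = new KinesisProducerConfiguration()
                .setRegion("us-east-1")                            // example region
                .setNativeExecutable("/opt/kpl/kinesis_producer"); // binary built against this host's glibc
        KinesisProducer producer = new KinesisProducer(config);
        // ... addUserRecord(...) as usual ...
        producer.destroy();
    }
}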

@zackb

zackb commented Nov 14, 2018

reverting to 0.12.9 fixed it here too

@knap1930

having the same issue after upgrading to 0.12.10... seems like a pretty big issue. :(

@jeremysears
Contributor

We are seeing this issue as well. @ankugar, is there an ETA for a fix?

@ankugar
Contributor

ankugar commented Dec 4, 2018

Thank you for reporting this. I will look into the issue.

@jeremysears
Contributor

#232 is a possible duplicate of the failure that @zackb mentioned. The failed dependency and the memory leak may be separate issues.

@ankugar
Contributor

ankugar commented Dec 5, 2018

@eriedl-kiban @jeremysears @zackb @knap1930 I have released a new version, v0.12.11, which I hope will fix the issues described here. Please reopen this issue if you still see any of the failures mentioned here with the latest release, v0.12.11.

@jeremysears
Contributor

jeremysears commented Dec 7, 2018

@ankugar Unless I'm reading the commit history incorrectly, I don't see any changes in the 0.12.11 release, other than a release note and a version bump. Did you merge your changes? Also, I don't have permissions to re-open issues.
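
(Commit history aside, one quick way to confirm which KPL artifact a deployment actually loads is to read the jar's manifest version at runtime; a minimal sketch, which only works if the Implementation-Version attribute is set in the KPL jar manifest:)

// Sketch: print the version of the KPL jar that is actually on the classpath.
// Relies on the Implementation-Version manifest attribute being present in the jar.
import com.amazonaws.services.kinesis.producer.KinesisProducer;

public class KplVersionCheck {
    public static void main(String[] args) {
        Package pkg = KinesisProducer.class.getPackage();
        System.out.println("KPL version on classpath: "
                + (pkg != null ? pkg.getImplementationVersion() : "unknown"));
    }
}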

@ankugar
Contributor

ankugar commented Dec 7, 2018

@jeremysears You are definitely reading the commit history correctly. My suspicion is that this was a build issue, so I have released a new build with the correct libraries/dependencies and only a version bump. I have not been able to reproduce this on my end, so if you are still seeing this issue, I would appreciate any reproduction steps so I can root-cause it further. Thank you.
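
For anyone putting together reproduction steps, a bare-bones attempt could look roughly like the sketch below on a host whose glibc is older than 2.25; the stream name and region are placeholders. With 0.12.10 the "Restarting native producer process." line should repeat and the process memory should keep growing.

// Sketch of a minimal reproduction: on an affected host (glibc < 2.25), the native
// child process should exit immediately and the daemon keeps restarting it.
// Stream name and region are placeholders.
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

import com.amazonaws.services.kinesis.producer.KinesisProducer;
import com.amazonaws.services.kinesis.producer.KinesisProducerConfiguration;

public class KplRestartLoopRepro {
    public static void main(String[] args) throws InterruptedException {
        KinesisProducerConfiguration config = new KinesisProducerConfiguration()
                .setRegion("us-east-1");
        KinesisProducer producer = new KinesisProducer(config);

        producer.addUserRecord("test-stream", "pk-1",
                ByteBuffer.wrap("hello".getBytes(StandardCharsets.UTF_8)));

        // Leave the process running and watch the logs and process memory:
        // "Restarting native producer process." should repeat if the bug is present.
        Thread.sleep(10 * 60 * 1000L);
        producer.destroy();
    }
}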
