Memory leak / infinite recursion #181

Since upgrading to 2.11 I'm getting a memory leak when trying to push files. After disabling the GC overhead limit check with -XX:-UseGCOverheadLimit, the process went over 1 GB before I killed it, even though the files I'm sending are tiny.

Would appreciate any help, thanks.

Comments
It seems like it isn't due to 2.11 after all, as I just downgraded to 2.10 (which previously worked for me) and I'm still getting a memory leak. Any ideas? Thanks.
The recursive file listing routine seems to run away sometimes. Maybe we should replace it with [Apache Commons FileUtils.listFiles](https://commons.apache.org/proper/commons-io/apidocs/org/apache/commons/io/FileUtils.html#listFiles%28java.io.File,%20java.lang.String[],%20boolean%29) and see if the problem disappears. Would you like to try?
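For the record, a minimal sketch of what that replacement could look like, assuming the site directory is already resolved to a java.io.File (the helper name `listSiteFiles` is made up):

```scala
import java.io.File
import scala.collection.JavaConverters._
import org.apache.commons.io.FileUtils

// Hypothetical helper: list every file under siteDir with Commons IO instead
// of a hand-rolled recursive walk. A null extensions array means "accept all
// extensions", and `true` tells Commons IO to descend into subdirectories.
def listSiteFiles(siteDir: File): Seq[File] =
  FileUtils.listFiles(siteDir, null: Array[String], true).asScala.toSeq
```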
Sorry, I'm not very familiar with Scala or Java. Any ideas what might be triggering it, or something I can do to work around it? I'm not able to push any projects at the moment. Thanks for your work.
@code-tree try another version of Java? See the official download page at http://www.oracle.com/technetwork/java/javase/downloads/index.html.
@code-tree will this comment help you?
Thanks for your help |
Sorry, that was actually a misdiagnosis. It doesn't matter which Java version: it works on OpenJDK 6, 7 and 8. Instead, the problem occurs when there are too many files to list, and it depends on what the working directory is when executing s3_website. When executed from dir: …
So I believe s3_website is actually listing every file from the dir it is executed from (possibly in here), even though I have the --site option specified.
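For illustration, the failure mode described here matches an unbounded depth-first walk like the following (a hypothetical sketch, not the actual s3_website code):

```scala
import java.io.File

// Hypothetical sketch of a runaway listing: started from a huge tree such as
// the filesystem root, the accumulated Seq of every file below the working
// directory grows without bound, matching the observed memory blow-up.
def recursiveListFiles(dir: File): Seq[File] = {
  val entries = Option(dir.listFiles).map(_.toSeq).getOrElse(Seq.empty[File])
  entries ++ entries.filter(_.isDirectory).flatMap(recursiveListFiles)
}
```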
Thanks for reporting your valuable discovery! |
@code-tree please try out the new version 2.11.1. It contains a fix for this problem. |
Thanks for the fix, though unfortunately the problem still remains. The process went to 500 MB before I stopped it (using 2.11.1), running from root (/) with --site set to my project. In addition to limiting the recursion, it would be nice to stop it trying to list files from the working dir when --site is specified, as I think that might be the root cause of the issue. (I'm assuming this is the case, given it works when run from a file tree with little depth.)
@code-tree try 2.11.2, it contains a new fix. |
Thanks Lauri, almost there. It appears to work for `site` now. That said, even when testing with `--site` the process can still run away.
@code-tree thanks for the feedback. According to the implementation, the `--site` CLI argument is taken into account, so the working-directory autodetection should not kick in when it is given.
Sorry, what I meant was: the fix works when site is specified in the yaml config (s3_website does not recurse through the working dir) but does not seem to work when site is given as a CLI arg (s3_website still recurses through the working dir). It seems like the autodetection runs whenever `site` is missing from the yaml, even when `--site` is given:

```scala
def resolveSiteDir(implicit yamlConfig: S3_website_yml, config: Config, cliArgs: CliArgs, workingDirectory: File): Either[ErrorReport, File] = {
  val siteFromAutoDetect = if (config.site.isEmpty) { autodetectSiteDir(workingDirectory) } else { None }
  val errOrSiteFromCliArgs: Either[ErrorReport, Option[File]] = Option(cliArgs.site) match {
    // …
```
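If that reading is right, a fix could guard the autodetection on both sources. A sketch against the snippet above, where only the extra check on cliArgs.site is new:

```scala
// Sketch: fall back to autodetecting the site dir only when neither
// s3_website.yml nor the --site CLI argument provides one. The check on
// cliArgs.site is the assumed change; the rest is from the snippet above.
val siteFromAutoDetect =
  if (config.site.isEmpty && Option(cliArgs.site).isEmpty)
    autodetectSiteDir(workingDirectory)
  else
    None
```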
You are right in your reasoning. However, the code seems not to recurse in the case where one defines the site via the CLI arg. Hence I wonder what could possibly cause the out-of-memory error in that situation. |
@code-tree you can also try to work around the problem like this: first, add into s3_website.yml the following line: `site: <%= ENV['SITE_DIR'] %>`. Then invoke s3_website with the SITE_DIR environment variable set, for example `SITE_DIR=<path-to-your-site> s3_website push`. If there is a bug in the way s3_website handles the `--site` CLI argument, this workaround should sidestep it.
Yes, using ENV in the yaml config has fixed it for the time being, thanks |
Hey there, experiencing the same issue, but setting the site dir via ENV in the yaml config does not fix it for me. Any additional insight into the problem? I'm currently using v2.12.2.
As a workaround, you can try the pure Ruby implementation at https://github.com/laurilehmijoki/s3_website/tree/1.x |
I'm experiencing the very same behavior, even with the ENV workaround in place. I am using … Thanks!
Hi, |
@Nihahs do you mean the root directory (/)?

@laurilehmijoki are there any new clues on what may be causing this? In my case, the only reason I'd see is that one of my buckets has a big chunk of logs, although I have those excluded. Thanks all!