
plugin sometimes stops uploading #56

Open
shaharmor opened this issue Nov 22, 2015 · 9 comments

Comments

@shaharmor

Hi,

We have a recurring bug that has been happening for a while now.
Every once in a while the plugin just stops uploading files to S3.
The files are still written to disk, so the disk keeps filling up while the uploader does nothing.

Any ideas what it could be, or how I can debug it? There are no warnings or errors in the log file.

@LarsFronius

Can you double-check whether it's related to #55?
Do you have time_file and/or size_file configured?
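For reference, a minimal sketch of the two rotation settings being asked about (the values here are placeholders, not recommendations):

```
output {
  s3 {
    # bucket, credentials, etc. omitted
    size_file => 2048   # rotate the temp file once it reaches this many bytes
    time_file => 5      # rotate the temp file after this many minutes
  }
}
```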

@shaharmor
Author

I have time_file set to 1 minute.
There are no errors whatsoever in the logs, so I can't tell you exactly what happens, but I can tell you for sure that the plugin keeps running without uploading.
The plugin keeps writing events to the temp file, but never uploads them.
I think that if the connection to S3 breaks while the script is running, the plugin doesn't reconnect and the error isn't caught.
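A hypothetical sketch of the kind of defensive retry this theory suggests: surface (or retry) S3 errors instead of letting a dead connection silently stall uploads. Nothing here is the plugin's actual code; the uploader is passed in as a block.

```ruby
# Hypothetical retry wrapper; `uploader` stands in for the real S3 call.
def upload_with_retry(file, attempts: 3, &uploader)
  tries = 0
  begin
    tries += 1
    uploader.call(file)
  rescue StandardError => e
    retry if tries < attempts
    warn "upload of #{file} failed after #{tries} attempts: #{e.message}"
    false
  end
end

# Example: an uploader that fails twice (as a flaky connection might),
# then succeeds on the third attempt.
failures = 2
result = upload_with_retry("events.txt") do |f|
  if failures > 0
    failures -= 1
    raise IOError, "connection reset"
  end
  "uploaded #{f}"
end
```

With the simulated flaky connection above, `result` ends up as `"uploaded events.txt"` after two retries.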

@zot420

zot420 commented Dec 17, 2015

I am seeing the same behavior, happens with both time_file and size_file.

I have tried each individually and also together (a small size_file with a longer time_file, so that if the size trigger missed, the time trigger would catch it), but I'm still having the issue.

@zot420

zot420 commented Dec 18, 2015

So it looks like the problem is in how the worker threads are set up: they don't stay alive after they process an upload. I tested this by setting the number of upload workers, and every time I get exactly that many files uploaded before things start queuing up on disk.

logstash 2.1.1
logstash-output-s3-2.0.3
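The failure mode described above (workers that die after one upload, after which nothing drains the queue) can be sketched as follows. This is an illustrative toy, not the plugin's actual worker code:

```ruby
queue    = Queue.new
uploaded = []

# Buggy pattern (as diagnosed above): a worker that pops a single job and
# then terminates, e.g.  Thread.new { uploaded << queue.pop }
# After `workers` such threads finish, files pile up on disk untouched.

# Correct pattern: the worker loops until the queue is closed.
worker = Thread.new do
  loop do
    job = queue.pop
    break if job.nil?   # Queue#pop returns nil once a closed queue is empty
    uploaded << job     # stand-in for the actual S3 upload
  end
end

3.times { |i| queue << "part#{i}.txt" }
queue.close
worker.join
# uploaded is now ["part0.txt", "part1.txt", "part2.txt"]
```

A long-lived looping worker keeps draining the queue no matter how many files rotate, which is the behavior the reporters expected.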

@Tratnis

Tratnis commented Feb 10, 2016

We seem to have the exact same problem reported by zot420.

Is there any advice here? At least a workaround?

Thanks!

@arunsanna

Is this issue resolved?

@Tratnis

Tratnis commented Nov 16, 2016

Unfortunately I have no idea about the bug or its resolution. However, a quick workaround for me was to restart Logstash whenever the folder started filling up more than expected.

I have nothing more to share, as we are currently migrating to another solution and therefore wrote no automation (aside from standard disk monitoring) to manage it.

Hope this helps at least a little.
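The restart workaround could be automated with something like the sketch below. The temp path, file-count threshold, and restart command are all assumptions for illustration, not anything the plugin documents:

```ruby
# Hypothetical watchdog: restart Logstash when the plugin's temp
# directory accumulates more files than expected.
TMP_DIR   = "/tmp/logstash"  # assumed temporary_directory
MAX_FILES = 500              # assumed threshold

# True if the directory holds more than `max` regular files.
def needs_restart?(dir, max)
  Dir.glob(File.join(dir, "**", "*")).count { |p| File.file?(p) } > max
end

# Run from cron every few minutes:
if needs_restart?(TMP_DIR, MAX_FILES)
  system("systemctl", "restart", "logstash")  # or however Logstash is managed
end
```

Note this only papers over the bug; anything still buffered in the temp directory relies on the plugin re-uploading it after restart.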

@cypai

cypai commented Dec 21, 2017

My team is also running into this bug consistently. I don't think it is related to workers though - we have multiple identical machines, but one of them stopped uploading at 998 files, and the other one is at 1400 files but is still working correctly. My guess is that the S3 connection returns some sort of unexpected error, and the plugin doesn't handle it correctly. It must be the S3 upload part - the file rotation still works as expected, just that the file doesn't get uploaded.

Restarting logstash does work, but is obviously not a good solution. We will likely look into alternatives until this is fixed.

@richard-mauri

I am testing Logstash with this plugin under Docker, against LocalStack.
I ran a test case and found files under /tmp/logstash that were not uploaded to S3.
When I restarted Logstash (kill -TERM), /tmp/logstash was emptied, and checking the S3 bucket showed the documents had indeed been uploaded. It seems like this plugin has some state/buffering/flushing problem.

Here is my output config file:

output {
    s3 {
        endpoint                            => "http://localstack:4566"
        access_key_id                       => "test"
        secret_access_key                   => "test"
        additional_settings                 => { "force_path_style" => true }
        validate_credentials_on_root_bucket => false
        region                              => "us-east-1"
        bucket                              => "em-top-archive-us-east-1-local-localstack"
        codec                               => "json"
        canned_acl                          => "private"
        prefix => "year=%{[@metadata][index_year]}/month=%{[@metadata][index_month]}/day=%{[@metadata][index_day]}/type=%{[@metadata][index]}/project=%{[project.name]}/environment=%{[ecsenv]}"
    }
}


7 participants