
Flushing the buffer fails with AWS::S3::Errors::Forbidden #92

Closed · hitochan777 opened this issue Aug 6, 2015 · 5 comments
@hitochan777

Whenever a file already exists in S3, fluentd fails to flush the buffer with the following error message.

2015-08-06 13:39:01 +0900 [warn]: temporarily failed to flush the buffer. next_retry=2015-08-06-13:39:17 +0900 error_class="AWS::S3::Errors::Forbidden" error="AWS::S3::Errors::Forbidden" plugin_id="object:3fa6da00f774"
2015-08-06 13:39:01 +0900 [warn]: suppressed same stacktrace

However, if I delete the existing file, the flush works fine. I am guessing that the plugin does not correctly set the index for a new file. How can I fix this problem? Below is a snippet of the fluentd configuration file.

<match rails>
  type copy
  <store>
    type s3
    aws_key_id AWS_KEY_ID
    aws_sec_key AWS_SEC_KEY
    s3_bucket S3_BUCKET
    s3_region ap-northeast-1
    path logs/
    buffer_path /var/log/td-agent/buffer/s3
    s3_object_key_format %{path}%{time_slice}_%{index}.%{file_extension}
    time_slice_format %Y-%m-%d/%H
    flush_interval 1s
  </store>
</match>
@repeatedly
Member

I'm not sure, because I can't reproduce this problem.
Maybe something is wrong in your S3 settings for the object-check API.
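
(For context: before uploading a chunk, the plugin resolves %{index} by probing whether an object already exists at each candidate key. That probe is a HEAD request, which needs s3:GetObject; and without s3:ListBucket, S3 reports a missing key as 403 Forbidden rather than 404 Not Found. Below is a minimal sketch of that kind of probe, assuming boto3 purely for illustration; the plugin itself uses the Ruby AWS SDK, and the bucket and key names are hypothetical placeholders.)

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def next_free_index(bucket, prefix, ext, start=0):
    """Return the first index whose object key does not exist yet.

    Sketch of the existence check done before each upload: HeadObject
    needs s3:GetObject, so a user with only s3:PutObject gets a 403
    Forbidden here instead of the 404 that signals a free slot.
    """
    index = start
    while True:
        key = "{}_{}.{}".format(prefix, index, ext)
        try:
            s3.head_object(Bucket=bucket, Key=key)  # exists: try the next index
            index += 1
        except ClientError as err:
            if err.response["ResponseMetadata"]["HTTPStatusCode"] == 404:
                return index, key                   # free slot found
            raise                                   # 403 surfaces as Forbidden

# Hypothetical usage matching the reporter's s3_object_key_format:
# next_free_index("S3_BUCKET", "logs/2015-08-06/13", "gz")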

@hitochan777
Author

[screenshot: S3 bucket ACL settings]

What I did was just add an Access Control List (ACL), as shown above.
I gave authenticated users (those with the correct AWS_KEY_ID and AWS_SEC_KEY) permission to list, upload, and delete resources, but this didn't work.
If I add a bucket policy instead, it works fine. I don't know why the ACL doesn't work; maybe I am misunderstanding something important.
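
(For reference, a bucket policy granting the actions discussed here might look like the sketch below. This is not the reporter's actual policy; the bucket name, path, and IAM user ARN are hypothetical placeholders, and the policy is attached with boto3's put_bucket_policy purely for illustration.)

import json
import boto3

# Hypothetical policy for the log-shipping user: upload (PutObject),
# existence checks (GetObject), and bucket listing (ListBucket).
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::123456789012:user/fluentd"},
            "Action": ["s3:PutObject", "s3:GetObject"],
            "Resource": "arn:aws:s3:::S3_BUCKET/logs/*",
        },
        {
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::123456789012:user/fluentd"},
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::S3_BUCKET",
        },
    ],
}

boto3.client("s3").put_bucket_policy(Bucket="S3_BUCKET", Policy=json.dumps(policy))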

@repeatedly
Member

I'm closing this issue.
I believe this problem is not related to this plugin, but if someone hits the same error, please reopen the issue.

@hitochan777
Author

@repeatedly I am sorry for the late notice, but this was indeed NOT due to your plugin; some AWS settings were wrong. Anyway, thank you so much for your help!

@abijitn

abijitn commented Jul 16, 2018

@repeatedly : I have a similar issue with Fluentd v1.0.2.
What I have observed is that even for uploading logs to S3, fluentd requires s3:GetObject when it retries a failed Put operation. If it is not a retry, it works fine with only the PutObject permission. For retries, it looks like it needs GetObject, ListBucket, and PutObject.
My goal is to limit the IAM user to the PutObject permission only.
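
(A minimal IAM policy covering the retry path described above might look like the sketch below, attached as an inline user policy with boto3's put_user_policy. The user name, policy name, bucket, and path are hypothetical placeholders. If the plugin version supports it, setting check_object false in the store section is, as far as I recall, the usual way to get by with PutObject alone, since it skips the existence probe.)

import json
import boto3

# Hypothetical inline policy: PutObject covers the happy path, while
# GetObject and ListBucket cover the existence checks fluentd performs
# when retrying a failed upload.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject"],
            "Resource": "arn:aws:s3:::S3_BUCKET/logs/*",
        },
        {
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::S3_BUCKET",
        },
    ],
}

boto3.client("iam").put_user_policy(
    UserName="fluentd",              # placeholder user
    PolicyName="fluentd-s3-upload",  # placeholder policy name
    PolicyDocument=json.dumps(policy),
)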
