S3 input error: The AWS Access Key Id you provided does not exist in our records #3193

Closed
karasmeitar opened this issue May 7, 2015 · 7 comments

@karasmeitar

I'm trying to get some log files from an S3 bucket and put them into Elasticsearch.
My config file is:
input {
  s3 {
    bucket => "dist-platform-qa"
    prefix => "es_export_data"
    credentials => "/home/dev/logstash-1.4.2/Aws.config"
    region_endpoint => "us-east-1"
  }
}
output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
    port => "9200"
    index => "all"
  }
}

My Aws.config file:

AWS_ACCESS_KEY_ID = "blabla"
AWS_SECRET_ACCESS_KEY = "blabla"

But I'm still getting errors for my AWS access key ("The AWS Access Key Id you provided does not exist in our records").
When I check the permissions with s3cmd, I can get files from the bucket and everything is OK.
Any idea?
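
For example, a check along these lines (using the bucket and prefix from the config above) lists the files without error:

s3cmd ls s3://dist-platform-qa/es_export_data/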

@ph
Contributor

ph commented May 7, 2015

You should use YAML as the configuration format. Can you try this format?

:access_key_id: '1234'
:secret_access_key: secret

@karasmeitar
Author

Now I'm getting:
Missing AWS credentials

@karasmeitar
Author

When I write the AWS ID and secret directly in the config file, I get Forbidden.
I'm using Ubuntu in VirtualBox, if that changes anything.

@ph
Contributor

ph commented May 7, 2015

Sorry, I was a bit confused by your file.

In Logstash 1.4.2 the credentials option accepts an array, so your input configuration should look similar to this:

input {
  s3 {
    bucket => "dist-platform-qa"
    prefix => "es_export_data"
    credentials => ["ID", "SECRET"]
    region_endpoint => "us-east-1"
  }
}

@karasmeitar
Author

When I write the credentials as you said, I get "Error: AWS::S3::Errors::Forbidden".
But I'm positive that the ID and secret are correct.

@karasmeitar
Author

Maybe it's a problem with my VM; I'll try this from a different VM. Do you know how I can change the Logstash Elasticsearch index based on column values in each record? Something like createddate.
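
For reference, the elasticsearch output supports event field interpolation in the index option, so a sketch like the following (assuming every record carries a createddate field) would route each document to an index derived from that value:

output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
    port => "9200"
    index => "all-%{createddate}"
  }
}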

@jordansissel
Contributor

For Logstash 1.5.0, we've moved all plugins to individual repositories, so I have moved this issue to logstash-plugins/logstash-input-s3#40. Let's continue the discussion there! :)
