Spark-UI docker container: setting AWS security credentials throws #75
Comments
Seeing the same error. I fixed it by adding:
I had the same issue. It was because I had set the log dir to `s3://` instead of `s3a://`:

instead of
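The original snippets were lost in extraction, but the fix described above can be sketched as follows; the bucket name and path are placeholders, and `spark.history.fs.logDirectory` is the standard Spark History Server property:

```shell
# Correct: the s3a:// scheme is served by hadoop-aws's S3AFileSystem
SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=s3a://my-bucket/spark-events"

# Incorrect: recent Hadoop builds ship no filesystem implementation for the
# plain s3:// scheme, so the history server throws at startup
SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=s3://my-bucket/spark-events"
```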
@Robinspecteur Does pointing the variable to s3a require some special configuration on the S3 destination, or is setting a Glue job to write to regular S3 adequate? I tried s3a in the address but got a credential error.

I also addressed an error (the documentation is out of date) by setting the key handlers to

Edit: For anyone who has this issue, using OP's posted command worked. I am not sure why, but you need to pass the secret access key to both the s3a and s3n handlers.
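The exact command is not preserved in the thread, but a sketch of passing credentials to both the s3a and s3n handlers, assuming the standard `hadoop-aws` property names (the key values are placeholders):

```shell
SPARK_HISTORY_OPTS="$SPARK_HISTORY_OPTS \
  -Dspark.hadoop.fs.s3a.access.key=<access-key-id> \
  -Dspark.hadoop.fs.s3a.secret.key=<secret-access-key> \
  -Dspark.hadoop.fs.s3n.awsAccessKeyId=<access-key-id> \
  -Dspark.hadoop.fs.s3n.awsSecretAccessKey=<secret-access-key>"
```

`fs.s3a.*` are the current S3A properties; `fs.s3n.*` are the legacy s3n ones. Duplicating them covers whichever filesystem implementation the image ends up resolving.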
BTW, when you use:
Let me close this since there have been no updates for several months.
For anyone stumbling upon this from Google, this is what worked for me.

The docker command:

Importantly, notice the environment vars set explicitly in the docker container in place of the following options in the readme:

I may make a pull request after playing with it some more, but in case I don't, I just wanted to leave this here for anyone.
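The posted command did not survive extraction, but a minimal sketch of the approach described above, assuming the credentials are exported in the calling shell; the image name, port, and bucket path are placeholders:

```shell
docker run -it \
  -e AWS_ACCESS_KEY_ID="$AWS_ACCESS_KEY_ID" \
  -e AWS_SECRET_ACCESS_KEY="$AWS_SECRET_ACCESS_KEY" \
  -e SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=s3a://my-bucket/spark-events" \
  -p 18080:18080 \
  spark-history-server
```

Passing the credentials as environment variables works because S3A's default credential chain includes an environment-variable provider that reads `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`, so no `-D` key properties are needed.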
After building the docker image and attempting to start it (env vars are exported), I get: