ActiveStorage should not attempt public access on create / update #37843

Closed

collimarco opened this issue Nov 30, 2019 · 8 comments

Steps to reproduce

Rails 6.1 introduces public access for buckets.

Once you have a public bucket, you usually want to add a CDN / WAF (like Cloudflare or CloudFront) in front of it to reduce latency, reduce cost, and increase security / protection.

If you call variant.service_url.sub('my.s3.host', 'my.cdn.host') you can easily serve files through a CDN, without patching Rails. Once you have added a CDN, the natural next step to increase security (as recommended by Amazon) is to allow public read access to the bucket only from the CDN. You can do that easily by turning off public access in the bucket settings. At that point the images are still served properly through the CDN, but you cannot update your images anymore!
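
A minimal sketch of that rewrite, assuming a hypothetical helper and treating my.s3.host / my.cdn.host as placeholders for the real S3 endpoint and CDN domain:

# app/helpers/cdn_helper.rb
# Hypothetical helper: rewrites the S3 host in a variant's URL to the CDN host.
module CdnHelper
  def cdn_variant_url(variant)
    # service_url returns the direct S3 URL; swap the host so the browser
    # fetches the file through the CDN instead.
    variant.service_url.sub("my.s3.host", "my.cdn.host")
  end
end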

The problem is this:

Aws::S3::Errors::AccessDenied (Access Denied):
app/controllers/posts_controller.rb:20:in `update`

Basically, when you call @post.update() on a model that has an attached image, Rails attempts unauthenticated, public access to the bucket.
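
For context, a minimal setup that hits this path could look like the following sketch (the Post model and controller below are assumptions for illustration, not the reporter's actual code):

# app/models/post.rb
class Post < ApplicationRecord
  has_one_attached :image  # stored on the S3 service marked public: true in storage.yml
end

# app/controllers/posts_controller.rb
class PostsController < ApplicationController
  def update
    @post = Post.find(params[:id])
    # Replacing the attachment here is what raises Aws::S3::Errors::AccessDenied
    # in the report above, once public access to the bucket is blocked.
    @post.update(params.require(:post).permit(:title, :image))
  end
end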

Public access should be intended only for end users: when the bucket is marked as public in storage.yml, Rails should simply stop signing the URLs it hands out to browsers. At the same time, Rails should keep authenticating its own requests (including GET) when they are for internal purposes, such as the update method or fetching a file to generate a variant.

This would allow anyone to add a CDN in front of the public bucket and then restrict access to the bucket. As it stands, you cannot restrict access to the bucket, because Rails keeps sending unauthenticated, public read requests directly to it.

Expected behavior

All requests (including GET) sent directly by the Rails app to a bucket marked as public in storage.yml should use authentication. "Directly" means "not intermediated by a user browser".
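
To make that distinction concrete, here is a rough sketch (assuming a Rails 6.1 blob attached via the public S3 service; the method is purely illustrative):

# A sketch of the two access paths, given any ActiveStorage::Blob on the public S3 service.
def show_both_access_paths(blob)
  # Browser-facing: the URL handed to the end user. With public: true this is an
  # unsigned public URL, which is fine because the browser reaches it via the CDN.
  browser_url = blob.service_url

  # App-side: Rails itself reads the file, e.g. to generate a variant. This goes
  # through the configured aws-sdk-s3 client and should stay authenticated rather
  # than relying on the bucket allowing anonymous reads.
  file_data = blob.download

  [browser_url, file_data.bytesize]
end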

Actual behavior

When you call update() or other methods on a model that has an attachment, Rails sends unauthenticated HTTP requests directly to the bucket, resulting in AccessDenied errors when public access to the bucket is restricted to a CDN.

System configuration

Rails version: 6.1.alpha
Ruby version: 2.6.5

@janko (Contributor) commented Nov 30, 2019

Could you include the full stack trace of the Aws::S3::Errors::AccessDenied exception (or the relevant line inside Active Storage)? I would like to know where it's coming from, because at the moment I don't see where Active Storage is making the unauthenticated requests.

@collimarco (Author)

@janko Yes, sure, here's the full trace: https://gist.github.com/collimarco/59d6c0c6ee54d68dec044e83267fedbe

@janko (Contributor) commented Nov 30, 2019

I think Active Storage is still authenticating uploads and other requests; the problem might instead be with the permissions after the changes you've made. Maybe it's related to blocking public access at the bucket level, but I couldn't reproduce it locally using raw aws-sdk-s3. I'm just speculating, though.

@collimarco (Author)

@janko If I go to the S3 bucket and turn on "Block all public access", I get the AccessDenied error. If I turn that off, everything works fine.
That proves that Rails is attempting some sort of public access when you call a normal create / update on the model (obviously remember to include an attachment).

If you want to reproduce that yourself:

  1. Add this bucket policy (see the sketch after this list for the general shape); remember to replace YOURUSER with your actual user and example.storage.com with your actual bucket name.
  2. Then turn on "Block all public access".
  3. Public access is now restricted to the Cloudflare IP ranges; however, Rails itself (not the end user) should still be able to create / update, since YOURUSER has the proper permissions.
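
The exact policy referenced in step 1 isn't reproduced in this thread; purely as a hypothetical illustration of the shape it might take (every ARN, user name, bucket name, and IP range below is a placeholder), applying such a policy with the AWS SDK could look like:

require "aws-sdk-s3"
require "json"

# Hypothetical bucket policy: full object access for the app's IAM user, plus
# anonymous GetObject restricted to a CDN IP range. All values are placeholders.
policy = {
  "Version" => "2012-10-17",
  "Statement" => [
    {
      "Effect" => "Allow",
      "Principal" => { "AWS" => "arn:aws:iam::123456789:user/YOURUSER" },
      "Action" => ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource" => "arn:aws:s3:::example.storage.com/*"
    },
    {
      "Effect" => "Allow",
      "Principal" => "*",
      "Action" => "s3:GetObject",
      "Resource" => "arn:aws:s3:::example.storage.com/*",
      "Condition" => { "IpAddress" => { "aws:SourceIp" => ["203.0.113.0/24"] } }
    }
  ]
}

# Assumes AWS credentials are configured in the environment.
Aws::S3::Client.new.put_bucket_policy(
  bucket: "example.storage.com",
  policy: JSON.generate(policy)
)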

@janko (Contributor) commented Dec 1, 2019

Thanks for the extra instructions! I'm too confused by the AWS UI to follow them (I didn't manage to make the policy valid and AWS wouldn't tell me why), but I believe you.

The only difference IMO when public: true is set is the additional acl: "public-read" parameter that's sent to the upload request. The upload request will still be authenticated (it's not possible to make an unauthenticated upload request), but perhaps that public bucket access setting doesn't allow for the additional acl: "public-read" parameter, or maybe it needs to be somehow allowed through the policy.
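
For reference, here is roughly what that upload looks like with raw aws-sdk-s3 (a sketch with placeholder bucket, key, and file names, assuming credentials are configured in the environment):

require "aws-sdk-s3"

# Placeholder bucket and object key.
object = Aws::S3::Resource.new.bucket("example.storage.com").object("some-blob-key")

# The PUT itself is signed with the app's credentials; acl: "public-read" is the
# extra piece that public: true adds, and it is what a bucket with blocked public
# access appears to reject.
object.put(body: File.open("image.jpg"), acl: "public-read")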

@collimarco (Author)

I didn't manage to make the policy valid

@janko You should replace that whole line, "arn:aws:iam::123456789:user/YOURUSER", and not just YOURUSER, as I wrote above. BTW yes, AWS policy errors are terrible...

perhaps that public bucket access setting doesn't allow for the additional acl: "public-read" parameter

That is a good guess!

I have just run a test and it seems that if I use my bucket policy above, which explicitly allows the Cloudflare IPs, I don't need the public ACL. So I could keep the default private behavior / ACL of Rails... and simply allow the Cloudflare IPs with a bucket policy.

The only problem: if I remove public: true from storage.yml, Rails will also start generating signed URLs for the end users, but I don't need that :( Indeed, the end users connect through Cloudflare (whose IPs are whitelisted) and thus don't need a signature.

If I keep Rails' private behavior / private bucket, is there any way to get a non-signed URL to a variant?

@collimarco (Author)

Solved! I can use the Rails default private bucket / private ACL and simply allow the Cloudflare IP ranges in the bucket policy.

Then, when I want to display the image from the CDN without signatures, I use:

"https://storage.example.com/" + variant.key

storage.example.com points to Cloudflare, which proxies to the actual bucket on S3.
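
Wrapped up as a helper, that looks roughly like this (a sketch; the helper name is hypothetical, and variant.processed is used here so the variant is generated and uploaded before it is linked to):

# app/helpers/storage_helper.rb
# Hypothetical helper: builds an unsigned URL served through the Cloudflare-proxied domain.
module StorageHelper
  def cdn_url_for(variant)
    # variant.processed uploads the variant to the bucket if it isn't there yet,
    # so the URL below resolves to an existing object behind the CDN.
    "https://storage.example.com/" + variant.processed.key
  end
end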

@andypearson

I've spent the afternoon bumping into this problem. As alluded to by @janko, it turns out that S3 storage for ActiveStorage on Rails 6.1 with public: true requires the s3:PutObjectAcl action to be allowed in the bucket policy.

I ended up with the following policy which seems to work:

{
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::BUCKET_NAME"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObjectAcl",
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::BUCKET_NAME/*"
      ]
    }
  ]
}

Warning! I'm not an S3 expert so my policy is probably not perfect.
