
can I takeover S3 bucket? #361

Closed
C0oki3s opened this issue Mar 16, 2023 · 13 comments
Labels
question Further information is requested

Comments

@C0oki3s

C0oki3s commented Mar 16, 2023

Service name

[image: AccessDenied error page served at xyz.domain.com]

  1. I have a subdomain xyz.domain.com that is hosted/pointed to an S3 bucket, but as you can see from the image above, the bucket returns an AccessDenied error.
  2. So I checked whether the bucket was already taken using aws-cli.

[image: aws-cli output of the bucket availability check]

  3. Seeing this, I claimed the S3 bucket and hosted it publicly, but xyz.domain.com still gives the exact same error message as in image 1.
  4. I also checked with nslookup, which showed the domain pointing to Fastly.

[image: nslookup output showing the domain pointing to Fastly]

  5. I tried claiming xyz.domain.com, but it was already claimed (Domain 'domain.com' is owned by another customer).

What I have tried:

  1. I added xyz.domain.com to Route 53 successfully, but it still shows the error from image 1.
  2. I updated both the CNAME and A records, appending www.xyz.domain.com.
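A takeover is only plausible when the subdomain's CNAME chain terminates at the bucket's own website endpoint, not at a CDN. A minimal sketch of that check (helper names are hypothetical):

```python
# Sketch: for a classic S3 subdomain takeover, the CNAME must point directly
# at the bucket's website endpoint. If it points at a CDN hostname (e.g.
# Fastly) instead, claiming a same-named bucket proves nothing.
def s3_website_endpoint(subdomain: str, region: str = "us-east-1") -> str:
    """Build the S3 static-website endpoint a matching bucket would serve from."""
    return f"{subdomain}.s3-website-{region}.amazonaws.com"

def cname_indicates_s3(cname_target: str) -> bool:
    """True only if the DNS CNAME target is an S3 website endpoint."""
    return ".s3-website-" in cname_target and cname_target.endswith(".amazonaws.com")

print(s3_website_endpoint("xyz.domain.com"))
# A Fastly target such as 'xyz.domain.com.global.prod.fastly.net' fails the check:
print(cname_indicates_s3("xyz.domain.com.global.prod.fastly.net"))
```

Here the nslookup result already pointed at Fastly, which is the first hint that the AccessDenied page is not coming from a bucket you can claim.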
@EdOverflow EdOverflow added the question Further information is requested label Mar 18, 2023
@knowthetech

How can you take over an already-claimed bucket?

@C0oki3s
Author

C0oki3s commented Mar 22, 2023

I claimed the bucket, not someone else, @knowthetech.

@GDATTACKER-RESEARCHER

> I claimed the bucket not someone else @knowthetech

If you have claimed it, that's simple. What's the issue? Just add a bucket policy and enable static hosting, and you're done. Why mess with other services? 😁

@C0oki3s
Author

C0oki3s commented Mar 22, 2023

Please read the comment again. Even after I added the bucket, nothing changed on xyz.domain.com; it still showed the same error, and I enabled static hosting too. As I stated in the comment, the CNAME is pointing to Fastly, not S3, yet the server header shows AmazonS3. This is an edge case, and I want to know what's happening. If you know, please let me know; I think there is a misconfiguration in the DNS server.

@GDATTACKER-RESEARCHER

If you add the bucket, the "bucket does not exist" error changes to "access denied". In the XML-format variant of an S3 bucket takeover, you then switch it to static hosting, but I gather you also missed adding the bucket policy. It doesn't matter whether they have Fastly, Cloudflare, or another edge CDN in front.
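The two error states can be told apart from the XML body S3 returns; a minimal sketch (the sample response bodies below are illustrative, not captured from this domain):

```python
import xml.etree.ElementTree as ET

def s3_error_code(body: str) -> str:
    """Extract the <Code> value from an S3 XML error response."""
    # S3 error bodies have the shape <Error><Code>...</Code>...</Error>
    return ET.fromstring(body).findtext("Code", default="")

# NoSuchBucket = dangling reference (takeover candidate);
# AccessDenied = the bucket exists and is claimed by someone.
dangling = "<Error><Code>NoSuchBucket</Code><BucketName>xyz.domain.com</BucketName></Error>"
claimed = "<Error><Code>AccessDenied</Code><Message>Access Denied</Message></Error>"

print(s3_error_code(dangling))  # NoSuchBucket
print(s3_error_code(claimed))   # AccessDenied
```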

@C0oki3s
Author

C0oki3s commented Mar 22, 2023

@GDATTACKER-RESEARCHER, can you please elaborate?

{
  "Version": "2012-10-17",
  "Id": "Policy167899115xxxx",
  "Statement": [
    {
      "Sid": "Stmt167899114xxxx",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::xyz.domain.com/*"
    }
  ]
}

Here is my policy. I don't know the term "XML format takeover"; what I'm assuming is that if someone already did an XML-format takeover, it shows a 403 error because they didn't update the S3 bucket policy. But I'm still confused about how I could take over the S3 bucket.

@C0oki3s C0oki3s closed this as completed Mar 22, 2023
@C0oki3s C0oki3s reopened this Mar 22, 2023
@knowthetech

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::xyz.domain.com/*"
    }
  ]
}

He means there are two types of errors in S3 takeovers: one shows the error in XML format, whereas the other is a plain white page with the error mentioned. Try this policy, it should work, and then try visiting the path to your uploaded file.

@knowthetech

But there needs to be an asterisk after xyz.domain.com/ in the Resource.
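Following up on the asterisk point: the wildcard has to appear in both the Principal and the Resource for public reads to work. A minimal sketch that generates such a policy (the bucket name is the example from this thread):

```python
import json

def public_read_policy(bucket: str) -> str:
    """Build a public-read bucket policy document as a JSON string."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": {"AWS": "*"},               # any caller may read
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",  # every object key in the bucket
        }],
    }
    return json.dumps(policy, indent=2)

print(public_read_policy("xyz.domain.com"))
```

The generated document can then be attached to the bucket, e.g. with `aws s3api put-bucket-policy --bucket xyz.domain.com --policy file://policy.json`.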

@C0oki3s
Author

C0oki3s commented Mar 22, 2023

The policy is indeed correct, and I'm able to access my HTML file at http://xyz.domain.com.s3-website-us-east-1.amazonaws.com

@C0oki3s
Author

C0oki3s commented Mar 22, 2023

Please tell me guys if

@C0oki3s C0oki3s closed this as completed Mar 22, 2023
@C0oki3s C0oki3s reopened this Mar 22, 2023
@GDATTACKER-RESEARCHER

> The policy is indeed correct and I'm able to access my html file at http://xyz.domain.com.s3-website-us-east-1.amazonaws.com

That's not a big issue; organizations will still accept it. It happens sometimes.

@C0oki3s
Author

C0oki3s commented Mar 22, 2023

@GDATTACKER-RESEARCHER, it's not about whether the report gets accepted; I just want to know what mistake the developer could have made to produce such a response. And by the way, the report got marked as a duplicate on H1, but I still want to know!

@C0oki3s
Author

C0oki3s commented Apr 7, 2023

@GDATTACKER-RESEARCHER, @knowthetech, @EdOverflow, @codingo
Hey there, I was able to figure out what's going on in this scenario thanks to Green-jam, who provided me with a lead that let me replicate it.
Here are my findings:

  1. First, I have an S3 bucket, abc.rhack.tech.s3-website-us-east-1.amazonaws.com, that is public and has static hosting enabled. In the original situation, however, the site is hosted in an unknown bucket that we cannot discover. Why? I'll explain below.
  2. The developer achieves this by making his S3 data buckets available via Fastly:
    2.1) Adding his subdomain xyz.rhack.tech to Fastly [where in the above scenario it's xyz.domain.com].
    2.2) Then, as the host, pointing it to the S3 bucket [abc.rhack.tech.s3-website-us-east-1.amazonaws.com] [where in the above scenario it's unknown].

[image: Fastly domain configuration for the subdomain]

[image: Fastly host/backend configuration pointing at the S3 website endpoint]

3) Lastly, adding a CNAME to DNS:
3.1) I added my Fastly DNS name [xyz.rhack.tech.global.prod.fastly.net] to the target field, which obfuscates the S3 bucket from dig, nslookup, reverse lookups, etc.

[image: DNS CNAME record targeting the Fastly hostname]

So, the above scenario is a false positive, and an S3 bucket takeover cannot be performed.

Green-jam: Fastly is connected to an S3 bucket that you don't have the details of. The S3 bucket you have claimed may effectively just be a random bucket that matches the subdomain name. The actual bucket Fastly connects to is one whose name you do not know, and it already looks claimed anyway, hence the 403 from S3.
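Green-jam's explanation can be pictured as two separate lookup tables: DNS exposes only the Fastly edge name, while the host-to-origin mapping lives privately inside the Fastly service configuration. A sketch using the names from my replication (the mapping itself is illustrative):

```python
# What DNS can see: the subdomain resolves only to a Fastly edge hostname.
DNS_CNAME = {
    "xyz.rhack.tech": "xyz.rhack.tech.global.prod.fastly.net",
}

# What only the CDN account can see: which origin each host is fetched from.
# This table is never exposed to dig/nslookup/reverse lookups.
FASTLY_BACKENDS = {
    "xyz.rhack.tech": "abc.rhack.tech.s3-website-us-east-1.amazonaws.com",
}

def visible_via_dns(host: str) -> str:
    """Return what a DNS query for the host reveals (empty if no record)."""
    return DNS_CNAME.get(host, "")

def origin_fetched_by_cdn(host: str) -> str:
    """Return the origin the CDN would fetch for the host (hidden from DNS)."""
    return FASTLY_BACKENDS.get(host, "")

print(visible_via_dns("xyz.rhack.tech"))        # only the Fastly name
print(origin_fetched_by_cdn("xyz.rhack.tech"))  # the hidden S3 origin
```

Because the real origin bucket name only exists in the second table, claiming a bucket that merely matches the subdomain name changes nothing about what the CDN serves.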

regards,
@C0oki3s

4 participants