
companion: add stronger validation for urls sent via URL plugin #2083

Merged: 8 commits merged into master on Feb 25, 2020

Conversation

@ifedapoolarewaju (Contributor) commented Feb 19, 2020

This PR allows for stronger validation of URLs sent via the URL plugin (for file downloads).

It blocks URLs like:

  • redis://domain.com/file-name (because of its bad protocol)
  • https://168.192.4.100/file-name (because the hostname is not a Fully Qualified Domain Name)
  • http://localhost:9000/file-name (because the hostname has no TLD, i.e. no .com-style suffix)

The last two are only blocked when companion is not running in debug mode, so that testers running the app locally can still test against local servers that use such URLs; a rough sketch of these checks follows.
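For illustration only, the blocking described above might look roughly like the sketch below in Node.js. The function name validateURL and the debug flag are made up for this example and are not companion's actual API.

const { URL } = require('url')

// a minimal sketch of the checks, assuming a hypothetical validateURL helper
function validateURL (value, debug = false) {
  let url
  try {
    url = new URL(value)
  } catch (err) {
    return false // not a parseable URL at all
  }

  // only allow plain http(s) downloads
  if (!['http:', 'https:'].includes(url.protocol)) return false

  if (!debug) {
    // reject bare IPv4 literals and hostnames without a TLD (e.g. "localhost")
    const isIPv4Literal = /^\d{1,3}(\.\d{1,3}){3}$/.test(url.hostname)
    const hasTLD = url.hostname.includes('.') && !isIPv4Literal
    if (!hasTLD) return false
  }

  return true
}

// validateURL('redis://domain.com/file-name')    // false: bad protocol
// validateURL('http://localhost:9000/file-name') // false: no TLD
// validateURL('https://example.com/file-name')   // true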

UPDATE:
I have also added a DNS lookup to validate that download URLs are not pointing to a local IP, in which case the request will be aborted.

A follow-up PR will add an option to companion where users can set a blacklist of URLs, similar to the whitelist available for upload URLs.

@arturi requested a review from @Acconut on February 19, 2020, 10:34
@Acconut (Member) commented Feb 19, 2020

I am not sure what the purpose of blocking IPs and requiring TLDs is, but it can easily be circumvented. For example, I can simply make my domain (evilshit.de) point to 127.0.0.1 and then use http://evilshit.de:8000/foo instead of http://localhost:8000/foo. Depending on your threat model, this may or may not be a problem.

@netcode commented Feb 20, 2020

@Acconut With the current prevention implementation you can definitely do the DNS pinning you described.

But I believe it can be fixed by resolving the hostname with dns.resolve before requesting the URL, something like this:

const dns = require('dns')
const { URL } = require('url')

const { hostname } = new URL(url) // `url` is the download URL string
dns.resolve(hostname, (err, dnsRecords) => {
  if (err) return // error in DNS resolution
  for (const record of dnsRecords) {
    // compare the record against blacklisted IPs,
    // e.g. 127.0.0.1, 0.0.0.0, some internal services, etc.
    // if it matches, mark this URL as infected
  }
})

What do you think?

@ifedapoolarewaju (Contributor, Author)

@Acconut @netcode I have added a DNS lookup to validate that download URLs are not pointing to a local IP; in that case the request is aborted. I tested with LOCAL_IP_ADDRESS.xip.io, but please let me know what you think.

@goto-bus-stop (Contributor)

We don't need the redirect prevention and TLD filtering anymore if we already do the filtering at the DNS level, right? There are legitimate use cases for redirects and for loading files from IPs.

@netcode commented Feb 20, 2020

@goto-bus-stop The redirect check is essential, because I can host a script that redirects to the IP:

evilwebsite.com/index.php

<?php
header('Location: http://127.0.0.1');

@goto-bus-stop (Contributor) commented Feb 20, 2020

@netcode The current implementation in this PR does the check after each DNS lookup, including when following redirects (instead of doing a separate lookup first). So if we prevent private IPs at that stage it should automatically address malicious redirects, while mundane ones still work, right?
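To make that idea concrete, here is a sketch (not the PR's actual code) of a custom lookup function that rejects non-public addresses for every connection, including connections opened while following redirects. The ipaddr.js range check is an assumption used purely for illustration; the PR itself pulls in the ip-address package for its IP handling.

const http = require('http')
const dns = require('dns')
const ipaddr = require('ipaddr.js') // assumed helper for classifying addresses

// a sketch: fail the connection when the resolved address is not publicly routable
function guardedLookup (hostname, options, callback) {
  dns.lookup(hostname, options, (err, address, family) => {
    if (err) return callback(err)
    const range = ipaddr.parse(address).range()
    if (range !== 'unicast') {
      // loopback, private, link-local, etc. -> refuse to open the socket
      return callback(new Error(`Refusing to connect to ${address} (${range})`))
    }
    callback(null, address, family)
  })
}

// every request made with these options (including redirect targets, if the
// client reuses them) resolves through guardedLookup before connecting
http.get({ host: 'example.com', path: '/file', lookup: guardedLookup }, (res) => {
  console.log(res.statusCode)
})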

@Acconut (Member) left a comment

I am not going to comment much on the Uppy code itself, but the IP filtering using the HTTP agent looks amazing! Would be nice to have a test for this, though!

@ifedapoolarewaju (Contributor, Author)

@netcode @goto-bus-stop @Acconut I tested with redirects enabled, and @goto-bus-stop is right: the DNS lookup (validation) is done for every URL, including the one being redirected to. So we can indeed enable redirects.

There are legitimate use cases for loading files from IPs

However, if we allow IPs (i.e. if we remove the FQDN validation), then no DNS lookup happens for them, so that validation is skipped. Unless we also do a check at a higher level to see whether the URL is an IP, and then validate that it isn't a private IP. But I'm not sure if there are other risks involved in allowing this, or how far we are willing to go to allow it.
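A minimal sketch of that higher-level check (hypothetical names, and the same assumed ipaddr.js helper as above) could look like this:

const net = require('net')
const { URL } = require('url')
const ipaddr = require('ipaddr.js')

// a sketch, not the PR's code: reject URLs whose host is a non-public IP literal
function isDisallowedIPLiteral (urlString) {
  const hostname = new URL(urlString).hostname.replace(/^\[|\]$/g, '') // strip IPv6 brackets
  if (net.isIP(hostname) === 0) return false // not an IP literal; FQDN validation applies instead
  // block loopback, private, link-local, and other non-public ranges
  return ipaddr.parse(hostname).range() !== 'unicast'
}

// isDisallowedIPLiteral('http://10.0.0.5/file-name') // true  (private range)
// isDisallowedIPLiteral('http://8.8.8.8/file-name')  // false (public address)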

@Acconut (Member) commented Feb 21, 2020

we try to do a validation on the higher level to see if the URL is an IP, and then we'd validate that it's not a private IP

👍 for that

@goto-bus-stop (Contributor)

Yeah, in that case we can also do that check inside createConnection directly, right? Since it will be called again for the redirect, but with an IP as the hostname.
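Roughly, that could look like the following hypothetical agent, which is only a sketch of the idea rather than the PR's implementation:

const http = require('http')
const net = require('net')
const ipaddr = require('ipaddr.js') // same assumed helper as in the sketches above

class FilteringAgent extends http.Agent {
  createConnection (options, callback) {
    const host = options.host
    // a redirect that targets a raw IP comes through here again,
    // so IP literals introduced by redirects are caught as well
    if (net.isIP(host) !== 0 && ipaddr.parse(host).range() !== 'unicast') {
      // surfaces as an 'error' event on the request instead of opening a socket
      callback(new Error(`Refusing to connect to non-public address ${host}`))
      return
    }
    return super.createConnection(options, callback)
  }
}

// usage sketch:
// http.get({ host: 'example.com', path: '/file', agent: new FilteringAgent() }, (res) => { /* ... */ })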

@ifedapoolarewaju (Contributor, Author)

I have updated the PR to allow IP addresses and redirects, but validate that the IP address is not a private one.

ipPrefix.push('0.')
// Add IP V4 prefix for private addresses
// See https://en.wikipedia.org/wiki/Private_network
ipPrefix.push('10.')

Member

I don't believe this is correct; I think there are actual WAN addresses that start with 10, and the 10. range should be more discriminating.

Member

I just checked but couldn't find any such claims online, so it seems I was wrong.

@goto-bus-stop (Contributor)

Looks like we're missing an @types/ dependency for something?

@goto-bus-stop (Contributor)

Oh, please remove packages/@uppy/companion/package-lock.json and run npm install in the repo root; then it should add ip-address to the root package-lock.json instead. It's kinda annoying that this is what we have to do, but oh well 🙃

@goto-bus-stop (Contributor) left a comment

Save for CI and the package-lock, this LGTM!

(Can we also add a test?)

@netcode left a comment

Looks good to me

@ifedapoolarewaju merged commit d983849 into master on Feb 25, 2020.
The delete-merged-branch bot deleted the validate-url branch on February 25, 2020, 11:51.
@kvz mentioned this pull request on Mar 22, 2021.
HeavenFox pushed a commit to docsend/uppy referencing this pull request on Jun 27, 2023: "companion: add stronger validation for urls sent via URL plugin".