
Allow strong filtering of linked URLs #156

Closed
wants to merge 3 commits

3 participants

@iamcal
iamcal commented Mar 2, 2014

Currently, apps using this library to turn user-entered Markdown into HTML are vulnerable to XSS attacks, because javascript: (and other dangerous schemes) are allowed as the protocol for links. For example:

[CLICK ME](javascript:alert(document.cookie))
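
Without filtering, that renders as roughly `<a href="javascript:alert(document.cookie)">CLICK ME</a>`, so the script runs as soon as a user clicks the link.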

This PR passes all link URLs through the filterUrl() method. By default, this function allows through URLs whose protocol is on a whitelist, plus local/relative URLs without a protocol. There is some heavy decoding involved to make sure URLs like javascript%3Aalert(document.cookie) don't slip through.
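
A minimal sketch of that whitelist idea (only the filterUrl() name comes from the patch; the body below is illustrative, not the actual diff):

```php
// Illustrative sketch of the whitelist approach, not the actual patch.
// Decode entity and percent encodings before inspecting the scheme, so
// inputs like "javascript%3Aalert(document.cookie)" don't slip through.
function filterUrl($url) {
    $allowed = array('http', 'https', 'ftp', 'mailto');
    $decoded = rawurldecode(html_entity_decode($url, ENT_QUOTES, 'UTF-8'));
    // Remove control characters and whitespace, which browsers ignore
    // when parsing a scheme (e.g. "java\tscript:").
    $decoded = preg_replace('/[\x00-\x20]+/', '', $decoded);
    if (preg_match('/^([a-zA-Z][a-zA-Z0-9+.-]*):/', $decoded, $m)) {
        // Explicit scheme: allow only whitelisted protocols.
        return in_array(strtolower($m[1]), $allowed) ? $url : '#';
    }
    // No scheme: treat as a local/relative URL and let it through.
    return $url;
}
```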

This change breaks two tests in the current suite ("Links, inline style" & "Quotes in attributes"), since both involve putting broken URLs inside links. Changing the tests to use URLs prefixed with http:// makes them pass correctly.

@michelf
Owner
michelf commented Mar 2, 2014

By default, PHP Markdown lets HTML snippets pass through unchanged. This is part of the design of Markdown. Markdown is not designed for dealing with unfiltered user input, nor is it a substitute for an XSS filter.

Assuming we add one, I don't think a URL filtering system should be active by default, because it'd break many current use cases (starting with my own website) where the input is trusted and uses relative links as well as HTML snippets.

But the question is: is this enough to prevent XSS? Assuming I wanted the feature, I wouldn't accept anything less than a complete solution; otherwise it'll lead to people thinking they're safe when they're not. Before we can claim to filter XSS properly, we should make sure we really do, ideally with some auditing. A glaring omission I see immediately in your patch is the filtering of image URLs. I wouldn't be surprised if there are other ways to leak JavaScript that I haven't thought about.

My recommendation is to run an external XSS filter on Markdown's output. That way, you know that whatever weird bug the parser has, you're still safe. Bolting XSS filtering onto a parser that does all those other unrelated, complicated string manipulations is a recipe for security holes. I believe it's better to keep XSS filtering as a separate step.

On the subject:
http://michelf.ca/blog/2010/markdown-and-xss/
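
A sketch of that separation of concerns, assuming HTML Purifier as the external filter and this library's Michelf\Markdown class:

```php
// Convert first, then sanitize the resulting HTML as an independent step.
require_once 'HTMLPurifier.auto.php';
use Michelf\Markdown;

$html = Markdown::defaultTransform($user_input);

$config    = HTMLPurifier_Config::createDefault();
$purifier  = new HTMLPurifier($config);
$safe_html = $purifier->purify($html); // strips javascript: URLs, event handlers, etc.
```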

@iamcal
iamcal commented Mar 2, 2014

All good points!

I'm using Markdown with markup and entities disabled, so regular HTML is not preserved. I can certainly understand not wanting to treat every possible transform exploit as a security vulnerability rather than just a plain old bug.
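
For reference, that setup looks roughly like this with the current library (no_markup and no_entities are existing config options):

```php
use Michelf\Markdown;

$parser = new Markdown;
$parser->no_markup   = true; // escape raw HTML tags instead of passing them through
$parser->no_entities = true; // escape raw HTML entities as well
$html = $parser->transform($user_input);
```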

Worth noting that the patch won't break your current usage - it just breaks some tests in the suite that don't use real-world URLs:

url://with spaces
/"style="color:red
/'style='color:red
@iamcal iamcal closed this Mar 2, 2014
@barryvdh barryvdh referenced this pull request in laravelio/laravel.io Jul 2, 2014
Closed

XSS vulnerability #120

@markseu
markseu commented Aug 22, 2014

I want to add my interest.

An external XSS filter like HTML Purifier is big and slow. I wonder what would happen if you switched off HTML code and applied a strong filter to all Markdown-generated links (anywhere JavaScript can leak through). Like Michel said, it must be a complete solution, not just one that pretends to do the job.

Is this something url_filter_func from issue #85 can be used for?

@michelf
Owner
michelf commented Aug 22, 2014

@markseu The new url_filter_func should allow transforming suspicious URLs into innocuous ones. Feel free to experiment.
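
A quick sketch of such an experiment, assuming url_filter_func accepts a callable that receives each URL and returns a (possibly rewritten) one:

```php
use Michelf\Markdown;

$parser = new Markdown;
$parser->url_filter_func = function ($url) {
    // Keep whitelisted schemes and scheme-less relative URLs;
    // rewrite everything else (javascript:, data:, vbscript:, ...) to "#".
    if (preg_match('/^\s*(https?|ftp|mailto):/i', $url) ||
        !preg_match('/^\s*[a-z][a-z0-9+.-]*:/i', $url)) {
        return $url;
    }
    return '#';
};
$html = $parser->transform($user_input);
```

The decoding caveats from the original patch still apply to anything passed through this hook.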
