Merge pull request #433 from willkg/357-doctest
Fix doctest failures
willkg committed Jan 9, 2019
2 parents: cabd665 + 245c21c, commit 948b745
Showing 2 changed files with 55 additions and 55 deletions.
46 changes: 23 additions & 23 deletions docs/clean.rst
@@ -63,10 +63,10 @@ For example:
>>> import bleach

>>> bleach.clean(
- ... u'<b><i>an example</i></b>',
+ ... '<b><i>an example</i></b>',
... tags=['b'],
... )
- u'<b>&lt;i&gt;an example&lt;/i&gt;</b>'
+ '<b>&lt;i&gt;an example&lt;/i&gt;</b>'


The default value is a relatively conservative list found in
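
The defaults can also be extended rather than replaced. A minimal sketch, assuming ``bleach.sanitizer.ALLOWED_TAGS`` is a plain list (as in Bleach 3.x); the tag added here is only illustrative:

>>> import bleach
>>> from bleach.sanitizer import ALLOWED_TAGS

>>> bleach.clean(
...     '<p>a paragraph</p>',
...     tags=ALLOWED_TAGS + ['p'],
... )
'<p>a paragraph</p>'
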
@@ -106,12 +106,12 @@ For example:
>>> import bleach

>>> bleach.clean(
- ... u'<p class="foo" style="color: red; font-weight: bold;">blah blah blah</p>',
+ ... '<p class="foo" style="color: red; font-weight: bold;">blah blah blah</p>',
... tags=['p'],
... attributes=['style'],
... styles=['color'],
... )
- u'<p style="color: red;">blah blah blah</p>'
+ '<p style="color: red;">blah blah blah</p>'


As a dict
@@ -135,11 +135,11 @@ and "class" for any tag (including "a" and "img"):
... }

>>> bleach.clean(
- ... u'<img alt="an example" width=500>',
+ ... '<img alt="an example" width=500>',
... tags=['img'],
... attributes=attrs
... )
- u'<img alt="an example">'
+ '<img alt="an example">'

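For the dict form, keys are tag names (with ``'*'`` matching any tag) and values are lists of allowed attribute names. Since the ``attrs`` definition itself sits outside this hunk, here is a sketch with an assumed allowed set consistent with the output above:

>>> import bleach

>>> attrs = {
...     '*': ['class'],
...     'a': ['href', 'rel'],
...     'img': ['alt'],
... }

>>> bleach.clean(
...     '<a href="http://example.com" style="color: red">link</a>',
...     tags=['a'],
...     attributes=attrs,
... )
'<a href="http://example.com">link</a>'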

Using functions
@@ -161,19 +161,19 @@ For example:
... return name[0] == 'h'

>>> bleach.clean(
- ... u'<a href="http://example.com" title="link">link</a>',
+ ... '<a href="http://example.com" title="link">link</a>',
... tags=['a'],
... attributes=allow_h,
... )
- u'<a href="http://example.com">link</a>'
+ '<a href="http://example.com">link</a>'

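The ``def`` line of ``allow_h`` falls outside this hunk; given the ``(tag, name, value)`` signature used by ``allow_src`` below, it presumably looks something like this sketch:

>>> def allow_h(tag, name, value):
...     # keep only attributes whose names start with "h" (e.g. href)
...     return name[0] == 'h'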

You can also pass a callable as a value in an attributes dict and it'll run for
attributes for specified tags:

.. doctest::

- >>> from urlparse import urlparse
+ >>> from six.moves.urllib.parse import urlparse
>>> import bleach

>>> def allow_src(tag, name, value):
@@ -185,13 +185,13 @@ attributes for specified tags:
... return False

>>> bleach.clean(
- ... u'<img src="http://example.com" alt="an example">',
+ ... '<img src="http://example.com" alt="an example">',
... tags=['img'],
... attributes={
... 'img': allow_src
... }
... )
- u'<img alt="an example">'
+ '<img alt="an example">'


.. versionchanged:: 2.0
@@ -223,12 +223,12 @@ For example, to allow users to set the color and font-weight of text:
>>> styles = ['color', 'font-weight']

>>> bleach.clean(
- ... u'<p style="font-weight: heavy;">my html</p>',
+ ... '<p style="font-weight: heavy;">my html</p>',
... tags=tags,
... attributes=attrs,
... styles=styles
... )
- u'<p style="font-weight: heavy;">my html</p>'
+ '<p style="font-weight: heavy;">my html</p>'


Default styles are stored in ``bleach.sanitizer.ALLOWED_STYLES``.
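
The default styles can be extended instead of replaced. A sketch, assuming ``bleach.sanitizer.ALLOWED_STYLES`` is a plain list; the extra property allowed here is illustrative:

>>> import bleach
>>> from bleach.sanitizer import ALLOWED_STYLES

>>> bleach.clean(
...     '<p style="color: red; font-size: 200px;">text</p>',
...     tags=['p'],
...     attributes=['style'],
...     styles=ALLOWED_STYLES + ['color'],
... )
'<p style="color: red;">text</p>'
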
@@ -252,7 +252,7 @@ For example, this sets allowed protocols to http, https and smb:
... '<a href="smb://more_text">allowed protocol</a>',
... protocols=['http', 'https', 'smb']
... )
u'<a href="smb://more_text">allowed protocol</a>'
'<a href="smb://more_text">allowed protocol</a>'


This adds smb to the Bleach-specified set of allowed protocols:
@@ -265,7 +265,7 @@ This adds smb to the Bleach-specified set of allowed protocols:
... '<a href="smb://more_text">allowed protocol</a>',
... protocols=bleach.ALLOWED_PROTOCOLS + ['smb']
... )
u'<a href="smb://more_text">allowed protocol</a>'
'<a href="smb://more_text">allowed protocol</a>'


Default protocols are in ``bleach.sanitizer.ALLOWED_PROTOCOLS``.
@@ -284,10 +284,10 @@ and invalid markup. For example:
>>> import bleach

>>> bleach.clean('<span>is not allowed</span>')
- u'&lt;span&gt;is not allowed&lt;/span&gt;'
+ '&lt;span&gt;is not allowed&lt;/span&gt;'

>>> bleach.clean('<b><span>is not allowed</span></b>', tags=['b'])
- u'<b>&lt;span&gt;is not allowed&lt;/span&gt;</b>'
+ '<b>&lt;span&gt;is not allowed&lt;/span&gt;</b>'


If you would rather Bleach stripped this markup entirely, you can pass
@@ -298,10 +298,10 @@ If you would rather Bleach stripped this markup entirely, you can pass
>>> import bleach

>>> bleach.clean('<span>is not allowed</span>', strip=True)
- u'is not allowed'
+ 'is not allowed'

>>> bleach.clean('<b><span>is not allowed</span></b>', tags=['b'], strip=True)
- u'<b>is not allowed</b>'
+ '<b>is not allowed</b>'


Stripping comments (``strip_comments``)
@@ -317,10 +317,10 @@ By default, Bleach will strip out HTML comments. To disable this behavior, set
>>> html = 'my<!-- commented --> html'

>>> bleach.clean(html)
- u'my html'
+ 'my html'

>>> bleach.clean(html, strip_comments=False)
- u'my<!-- commented --> html'
+ 'my<!-- commented --> html'


Using ``bleach.sanitizer.Cleaner``
@@ -353,7 +353,7 @@ Trivial Filter example:
.. doctest::

>>> from bleach.sanitizer import Cleaner
- >>> from html5lib.filters.base import Filter
+ >>> from bleach.html5lib_shim import Filter

>>> class MooFilter(Filter):
... def __iter__(self):
@@ -371,7 +371,7 @@ Trivial Filter example:
>>> cleaner = Cleaner(tags=TAGS, attributes=ATTRS, filters=[MooFilter])
>>> dirty = 'this is cute! <img src="http://example.com/puppy.jpg" rel="nofollow">'
>>> cleaner.clean(dirty)
- u'this is cute! <img rel="moo" src="moo">'
+ 'this is cute! <img rel="moo" src="moo">'


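The body of ``MooFilter`` is collapsed above. A minimal sketch of a filter in that spirit, assuming html5lib-style tokens whose ``data`` dict maps attribute keys to values (the original example's exact token handling is not shown in this hunk):

>>> from bleach.html5lib_shim import Filter

>>> class MooFilter(Filter):
...     def __iter__(self):
...         for token in Filter.__iter__(self):
...             # rewrite every attribute value on start/empty tags to "moo"
...             if token['type'] in ['StartTag', 'EmptyTag'] and token['data']:
...                 for attr in token['data']:
...                     token['data'][attr] = 'moo'
...             yield token
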
.. Warning::
64 changes: 32 additions & 32 deletions docs/linkify.rst
@@ -80,12 +80,12 @@ For example, you could add a ``title`` attribute to all links:
>>> from bleach.linkifier import Linker

>>> def set_title(attrs, new=False):
- ... attrs[(None, u'title')] = u'link in user text'
+ ... attrs[(None, 'title')] = 'link in user text'
... return attrs
...
>>> linker = Linker(callbacks=[set_title])
>>> linker.linkify('abc http://example.com def')
- u'abc <a href="http://example.com" title="link in user text">http://example.com</a> def'
+ 'abc <a href="http://example.com" title="link in user text">http://example.com</a> def'

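As these examples suggest, a callback receives the link's attributes as a dict keyed by ``(namespace, name)`` tuples, plus a special ``'_text'`` key holding the link text, and a ``new`` flag that is ``True`` for links created from bare text. A small sketch that only inspects what it gets (the callback name is illustrative):

>>> from bleach.linkifier import Linker

>>> def inspect_attrs(attrs, new=False):
...     # attrs maps (namespace, name) tuples to values; '_text' is the link text
...     for key, value in attrs.items():
...         print(key, value)
...     return attrs

>>> linker = Linker(callbacks=[inspect_attrs])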

This would set the value of the ``rel`` attribute, stomping on a previous value
@@ -96,21 +96,21 @@ an external link:

.. doctest::

- >>> from urlparse import urlparse
+ >>> from six.moves.urllib.parse import urlparse
>>> from bleach.linkifier import Linker

>>> def set_target(attrs, new=False):
- ... p = urlparse(attrs[(None, u'href')])
+ ... p = urlparse(attrs[(None, 'href')])
... if p.netloc not in ['my-domain.com', 'other-domain.com']:
- ... attrs[(None, u'target')] = u'_blank'
- ... attrs[(None, u'class')] = u'external'
+ ... attrs[(None, 'target')] = '_blank'
+ ... attrs[(None, 'class')] = 'external'
... else:
- ... attrs.pop((None, u'target'), None)
+ ... attrs.pop((None, 'target'), None)
... return attrs
...
>>> linker = Linker(callbacks=[set_target])
>>> linker.linkify('abc http://example.com def')
- u'abc <a class="external" href="http://example.com" target="_blank">http://example.com</a> def'
+ 'abc <a class="external" href="http://example.com" target="_blank">http://example.com</a> def'


Removing Attributes
@@ -127,17 +127,17 @@ sanitizing attributes.)
>>> def allowed_attrs(attrs, new=False):
... """Only allow href, target, rel and title."""
... allowed = [
- ... (None, u'href'),
- ... (None, u'target'),
- ... (None, u'rel'),
- ... (None, u'title'),
- ... u'_text',
+ ... (None, 'href'),
+ ... (None, 'target'),
+ ... (None, 'rel'),
+ ... (None, 'title'),
+ ... '_text',
... ]
... return dict((k, v) for k, v in attrs.items() if k in allowed)
...
>>> linker = Linker(callbacks=[allowed_attrs])
>>> linker.linkify('<a style="font-weight: super bold;" href="http://example.com">link</a>')
- u'<a href="http://example.com">link</a>'
+ '<a href="http://example.com">link</a>'


Or you could remove a specific attribute, if it exists:
@@ -147,15 +147,15 @@ Or you could remove a specific attribute, if it exists:
>>> from bleach.linkifier import Linker

>>> def remove_title(attrs, new=False):
- ... attrs.pop((None, u'title'), None)
+ ... attrs.pop((None, 'title'), None)
... return attrs
...
>>> linker = Linker(callbacks=[remove_title])
>>> linker.linkify('<a href="http://example.com">link</a>')
- u'<a href="http://example.com">link</a>'
+ '<a href="http://example.com">link</a>'

>>> linker.linkify('<a title="bad title" href="http://example.com">link</a>')
- u'<a href="http://example.com">link</a>'
+ '<a href="http://example.com">link</a>'


Altering Attributes
@@ -177,14 +177,14 @@ Example of shortening link text:
... if not new:
... return attrs
... # _text will be the same as the URL for new links
- ... text = attrs[u'_text']
+ ... text = attrs['_text']
... if len(text) > 25:
- ... attrs[u'_text'] = text[0:22] + u'...'
+ ... attrs['_text'] = text[0:22] + '...'
... return attrs
...
>>> linker = Linker(callbacks=[shorten_url])
>>> linker.linkify('http://example.com/longlonglonglonglongurl')
u'<a href="http://example.com/longlonglonglonglongurl">http://example.com/lon...</a>'
'<a href="http://example.com/longlonglonglonglongurl">http://example.com/lon...</a>'


Example of switching all links to go through a bouncer first:
@@ -196,7 +196,7 @@ Example of switching all links to go through a bouncer first:

>>> def outgoing_bouncer(attrs, new=False):
... """Send outgoing links through a bouncer."""
- ... href_key = (None, u'href')
+ ... href_key = (None, 'href')
... p = urlparse(attrs.get(href_key, None))
... if p.netloc not in ['example.com', 'www.example.com', '']:
... bouncer = 'http://bn.ce/?destination=%s'
@@ -205,10 +205,10 @@ Example of switching all links to go through a bouncer first:
...
>>> linker = Linker(callbacks=[outgoing_bouncer])
>>> linker.linkify('http://example.com')
- u'<a href="http://example.com">http://example.com</a>'
+ '<a href="http://example.com">http://example.com</a>'

>>> linker.linkify('http://foo.com')
u'<a href="http://bn.ce/?destination=http%3A//foo.com">http://foo.com</a>'
'<a href="http://bn.ce/?destination=http%3A//foo.com">http://foo.com</a>'

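The tail of ``outgoing_bouncer`` sits outside this hunk; judging by the output above, the missing lines URL-encode the original ``href`` into the bouncer URL. ``urllib.parse.quote`` (here via ``six.moves``, matching the imports this commit switches to) produces exactly the ``http%3A//foo.com`` form shown; a sketch of that last step:

>>> from six.moves.urllib.parse import quote

>>> bouncer = 'http://bn.ce/?destination=%s'
>>> bouncer % quote('http://foo.com')
'http://bn.ce/?destination=http%3A//foo.com'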

Preventing Links
@@ -230,7 +230,7 @@ write the following callback:
... return attrs
... # If the TLD is '.py', make sure it starts with http: or https:.
... # Use _text because that's the original text
- ... link_text = attrs[u'_text']
+ ... link_text = attrs['_text']
... if link_text.endswith('.py') and not link_text.startswith(('http:', 'https:')):
... # This looks like a Python file, not a URL. Don't make a link.
... return None
@@ -239,10 +239,10 @@ write the following callback:
...
>>> linker = Linker(callbacks=[dont_linkify_python])
>>> linker.linkify('abc http://example.com def')
- u'abc <a href="http://example.com">http://example.com</a> def'
+ 'abc <a href="http://example.com">http://example.com</a> def'

>>> linker.linkify('abc models.py def')
- u'abc models.py def'
+ 'abc models.py def'


.. _Crate: https://crate.io/
@@ -261,13 +261,13 @@ For example, this removes any ``mailto:`` links:
>>> from bleach.linkifier import Linker

>>> def remove_mailto(attrs, new=False):
- ... if attrs[(None, u'href')].startswith(u'mailto:'):
+ ... if attrs[(None, 'href')].startswith('mailto:'):
... return None
... return attrs
...
>>> linker = Linker(callbacks=[remove_mailto])
>>> linker.linkify('<a href="mailto:janet@example.com">mail janet!</a>')
- u'mail janet!'
+ 'mail janet!'


Skipping links in specified tag blocks (``skip_tags``)
@@ -308,7 +308,7 @@ instance.

>>> linker = Linker(skip_tags=['pre'])
>>> linker.linkify('a b c http://example.com d e f')
- u'a b c <a href="http://example.com" rel="nofollow">http://example.com</a> d e f'
+ 'a b c <a href="http://example.com" rel="nofollow">http://example.com</a> d e f'

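Conversely, URLs inside a skipped tag should be left untouched; a sketch of the expected behaviour (the output string is assumed, not taken from this diff):

>>> from bleach.linkifier import Linker

>>> linker = Linker(skip_tags=['pre'])
>>> linker.linkify('<pre>http://example.com</pre> and http://example.com')
'<pre>http://example.com</pre> and <a href="http://example.com" rel="nofollow">http://example.com</a>'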

.. autoclass:: bleach.linkifier.Linker
@@ -340,11 +340,11 @@ For example, using all the defaults:

>>> cleaner = Cleaner(tags=['pre'])
>>> cleaner.clean('<pre>http://example.com</pre>')
- u'<pre>http://example.com</pre>'
+ '<pre>http://example.com</pre>'

>>> cleaner = Cleaner(tags=['pre'], filters=[LinkifyFilter])
>>> cleaner.clean('<pre>http://example.com</pre>')
- u'<pre><a href="http://example.com">http://example.com</a></pre>'
+ '<pre><a href="http://example.com">http://example.com</a></pre>'


And passing parameters to ``LinkifyFilter``:
@@ -362,7 +362,7 @@ And passing parameters to ``LinkifyFilter``:
... )
...
>>> cleaner.clean('<pre>http://example.com</pre>')
- u'<pre>http://example.com</pre>'
+ '<pre>http://example.com</pre>'

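The middle of this example is collapsed; as the dangling ``... )`` above suggests, extra ``LinkifyFilter`` arguments are typically bound with ``functools.partial`` before the filter is handed to ``Cleaner``. A sketch of that wiring, with assumed argument values chosen to match the output shown:

>>> from functools import partial
>>> from bleach.sanitizer import Cleaner
>>> from bleach.linkifier import LinkifyFilter

>>> cleaner = Cleaner(
...     tags=['pre'],
...     filters=[partial(LinkifyFilter, skip_tags=['pre'])],
... )
>>> cleaner.clean('<pre>http://example.com</pre>')
'<pre>http://example.com</pre>'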

.. autoclass:: bleach.linkifier.LinkifyFilter
