Adds 429 Retry Support
All retry support is provided by the 'backoff' library which also
provides an asyncio compatible interface.

This updates the docs and docstrings to call out the new retry
functionality.

https://pulp.plan.io/issues/3421
closes #3421
Brian Bouterse committed Apr 10, 2018
1 parent 7931fa1 commit d8ea62f
Showing 4 changed files with 58 additions and 11 deletions.
28 changes: 18 additions & 10 deletions docs/plugins/plugin-api/download.rst
@@ -61,10 +61,10 @@ the downloader's `run()` method when the download is complete.
.. autoclass:: pulpcore.plugin.download.DownloadResult
:no-members:

.. _configuring-from-an-remote:
.. _configuring-from-a-remote:

Configuring from an Remote
----------------------------
Configuring from a Remote
-------------------------

When fetching content during a sync, the remote has settings like SSL certs, SSL validation, basic
auth credentials, and proxy settings. Downloaders commonly want to use these settings while
@@ -90,17 +90,25 @@ supported urls.
remote instance share an `aiohttp` session, which provides a connection pool, connection
reuse, and keep-alives shared across all downloaders produced by a single remote.
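The per-remote session sharing described above can be sketched with stand-in classes; `Remote`, `Downloader`, and `SharedSession` here are hypothetical illustrations, not Pulp's actual classes (the real pool is an `aiohttp.ClientSession` owned by the remote):

```python
class SharedSession:
    """Hypothetical stand-in for the aiohttp.ClientSession a remote owns."""
    def __init__(self):
        self.connections = 0


class Downloader:
    def __init__(self, url, session):
        self.url = url
        self.session = session


class Remote:
    def __init__(self):
        # One session (and therefore one connection pool) per remote.
        self._session = SharedSession()

    def get_downloader(self, url):
        # Every downloader produced by this remote reuses the same session,
        # so connections and keep-alives are shared across all of them.
        return Downloader(url, self._session)


remote = Remote()
a = remote.get_downloader("http://example.com/1")
b = remote.get_downloader("http://example.com/2")
print(a.session is b.session)  # True: one pool serves both downloaders
```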


.. _automatic-retry:

Automatic Retry
---------------

The :class:`~pulpcore.plugin.download.HttpDownloader` will automatically retry 10 times if the
server responds with one of the following error codes:

* 429 - Too Many Requests
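With 10 tries and an exponential schedule like `backoff.expo`, the nominal wait before each retry doubles; a quick check of that schedule (illustrative only — the library may also apply jitter, and the actual base is configurable):

```python
# Nominal exponential delays (seconds) before retries 2..10,
# as produced by a base-2 schedule: 2**0, 2**1, ..., 2**8.
delays = [2 ** n for n in range(9)]
print(delays)       # [1, 2, 4, 8, 16, 32, 64, 128, 256]
print(sum(delays))  # 511 seconds of waiting in the worst case
```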


.. _exception-handling:

Exception Handling
------------------

All downloaders are expected to handle recoverable errors automatically. For example, the
:class:`~pulpcore.plugin.download.HttpDownloader` is expected to retry if a server is too
busy or if a redirect occurs.

Unrecoverable errors of several types can be raised during downloading. One example is a
:ref:`validation exception <validation-exceptions>` that are raised if the content downloaded fails
:ref:`validation exception <validation-exceptions>` that is raised if the content downloaded fails
size or digest validation. There can also be protocol specific errors such as an
``aiohttp.ClientResponseError`` being raised when a server responds with a 400+ response such as an HTTP
403.
@@ -119,8 +127,8 @@ recorded as a non-fatal exception on the task. Plugin writers can also choose to
task by allowing the exception to go uncaught, which would mark the entire task as failed.

.. note::
The :class:`~pulpcore.plugin.download.HttpDownloader` will raise an exception for any
response code that is 400 or greater.
The :class:`~pulpcore.plugin.download.HttpDownloader` will automatically retry in some cases, but
if unsuccessful it will raise an exception for any HTTP response code that is 400 or greater.

.. _custom-download-behavior:

6 changes: 5 additions & 1 deletion plugin/pulpcore/plugin/download/base.py
@@ -1,13 +1,17 @@
import asyncio
from collections import namedtuple
import hashlib
import logging
import os
import tempfile

from pulpcore.app.models import Artifact
from .exceptions import DigestValidationError, SizeValidationError


log = logging.getLogger(__name__)


DownloadResult = namedtuple('DownloadResult', ['url', 'artifact_attributes', 'path', 'exception'])
"""
Args:
@@ -40,7 +44,7 @@ def attach_url_to_exception(func):
:class:`~pulpcore.plugin.download.BaseDownloader`
Returns:
A function that will attach the `url` to any exception emitted by `func`
A coroutine that will attach the `url` to any exception emitted by `func`
"""
async def wrapper(downloader):
try:
34 changes: 34 additions & 0 deletions plugin/pulpcore/plugin/download/http.py
@@ -1,8 +1,34 @@
import logging

import aiohttp
import backoff

from .base import attach_url_to_exception, BaseDownloader, DownloadResult


log = logging.getLogger(__name__)


logging.getLogger('backoff').addHandler(logging.StreamHandler())


def giveup(exc):
"""
Inspect a raised exception and determine if we should give up.
Do not give up when the status code is one of the following:
429 - Too Many Requests
Args:
exc (aiohttp.ClientResponseError): The exception to inspect
Returns:
True if the download should give up, False otherwise
"""
return exc.code != 429
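The predicate's effect can be exercised with a stand-in exception object carrying a `code` attribute; `FakeError` is hypothetical (the real callable receives an `aiohttp` exception), and the predicate is repeated here so the check is self-contained:

```python
class FakeError(Exception):
    """Hypothetical stand-in for the aiohttp exception passed to giveup()."""
    def __init__(self, code):
        self.code = code


def giveup(exc):
    # Keep retrying only on 429 Too Many Requests; give up on anything else.
    return exc.code != 429


print(giveup(FakeError(429)))  # False -> backoff keeps retrying
print(giveup(FakeError(503)))  # True  -> the exception propagates
```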


class HttpDownloader(BaseDownloader):
"""
An HTTP/HTTPS Downloader built on `aiohttp`.
@@ -50,6 +76,10 @@ class HttpDownloader(BaseDownloader):
>>> except Exception as error:
>>> pass # fatal exceptions are raised by result()
The HttpDownloader contains automatic retry logic for servers that respond with an HTTP 429.
The coroutine will automatically retry 10 times with exponential backoff before allowing a
final exception to be raised.
Attributes:
session (aiohttp.ClientSession): The session to be used by the downloader.
auth (aiohttp.BasicAuth): An object that represents HTTP Basic Authorization or None
@@ -118,10 +148,14 @@ async def _handle_response(self, response):
url=self.url, exception=None)

@attach_url_to_exception
@backoff.on_exception(backoff.expo, aiohttp.ClientResponseError, max_tries=10, giveup=giveup)
async def run(self):
"""
Download, validate, and compute digests on the `url`. This is a coroutine.
This method is decorated with a backoff-and-retry behavior to retry HTTP 429 errors. It
retries with exponential backoff 10 times before allowing a final exception to be raised.
This method provides the same return object type as documented in
:meth:`~pulpcore.plugin.download.BaseDownloader.run`.
"""
1 change: 1 addition & 0 deletions plugin/setup.py
@@ -4,6 +4,7 @@
'pulpcore',
'aiohttp',
'aiofiles',
'backoff',
]

with open('README.rst') as f:
