Warning

This library isn't ready for release yet. Feedback welcome!

Trimeter

Trio is a friendly Python library for async concurrency and networking. Trimeter is a simple but powerful job scheduler for programs using Trio, released under your choice of the MIT or Apache 2 licenses.

Trimeter's core purpose is to make it easy to execute lots of tasks concurrently, with rich options to control the degree of concurrency and to collect the task results.

Say you have 1000 urls that you want to fetch and process somehow:

# Old slow way
for url in urls:
    await fetch_and_process(url)

That's slow, so you want to do several at the same time... but to avoid overloading the network, you want to limit it to at most 5 calls at once. Oh, and there's a request quota, so you have to throttle it down to 1 per second. No problem:

# New and fancy way
await trimeter.run_on_each(
    fetch_and_process, urls, max_at_once=5, max_per_second=1
)
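Under the hood, a max_at_once-style limit boils down to a counting semaphore around each call. Here is a minimal sketch of that idea using stdlib asyncio so it runs standalone; limited_run_on_each and job are made-up names for illustration, not Trimeter's implementation:

```python
# Illustration only, not Trimeter's implementation: the "max_at_once" idea
# sketched with a stdlib asyncio.Semaphore.
import asyncio

async def limited_run_on_each(async_fn, items, max_at_once):
    sem = asyncio.Semaphore(max_at_once)

    async def worker(item):
        async with sem:  # at most max_at_once workers get past this line
            await async_fn(item)

    await asyncio.gather(*(worker(item) for item in items))

# Track peak concurrency to confirm the limit holds
peak = current = 0

async def job(item):
    global peak, current
    current += 1
    peak = max(peak, current)
    await asyncio.sleep(0.01)
    current -= 1

asyncio.run(limited_run_on_each(job, range(20), max_at_once=5))
assert peak <= 5
```

Trimeter does the equivalent on top of Trio, and layers the rate limit (max_per_second) on top of the concurrency limit.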

What if we don't know the whole list of urls up front? No worries, just pass in an async iterable instead, and Trimeter will do the right thing.

What if we want to get the result from each call as it finishes, so we can do something further with it? Just use amap (short for "async map"):

async with trimeter.amap(fetch_and_process, urls, ...) as results:
    # Then iterate over the return values, as they become available
    # (i.e., not necessarily in the original order)
    async for result in results:
        ...
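The "results as they become available" behavior has the same shape as stdlib asyncio.as_completed. A runnable sketch of that idea (fetch_and_process here is a stand-in that just upcases its input; Trimeter's amap adds throttling and streaming input on top):

```python
# Sketch of "iterate over results in completion order" using stdlib asyncio.
import asyncio

async def fetch_and_process(url):
    await asyncio.sleep(0.01)
    return url.upper()

async def main():
    tasks = [asyncio.create_task(fetch_and_process(u)) for u in ["a", "b", "c"]]
    results = []
    for fut in asyncio.as_completed(tasks):  # yields futures as they finish
        results.append(await fut)
    return results

results = asyncio.run(main())
assert sorted(results) == ["A", "B", "C"]
```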

Of course amap also accepts throttling options like max_at_once, max_per_second, etc.

What if we want to use the outcome library to capture exceptions, so one call crashing doesn't terminate the whole program? And also, we want to pass through the original url alongside each result, so we know which result goes with which url?

async with trimeter.amap(
    fetch_and_process,
    urls,
    capture_outcome=True,
    include_value=True,
) as outcomes:
    # Then iterate over the return values, as they become available
    # (i.e., not necessarily in the original order)
    async for url, outcome in outcomes:
        try:
            return_value = outcome.unwrap()
        except Exception as exc:
            print(f"error while processing {url}: {exc!r}")

What if we just want to call a few functions in parallel and then get the results as a list, like asyncio.gather or Promise.all?

return_values = await trimeter.run_all([
    async_fn1,
    async_fn2,
    functools.partial(async_fn3, extra_arg, kwarg="yeah"),
])
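run_all calls each entry with no arguments, which is why functools.partial appears above: it binds arguments in advance, producing a zero-argument callable. A plain-function demo of the binding (async_fn3 here is just a stand-in):

```python
# functools.partial pre-binds positional and keyword arguments, producing a
# zero-argument callable suitable for APIs like run_all.
import functools

def async_fn3(extra_arg, kwarg=None):
    return (extra_arg, kwarg)

bound = functools.partial(async_fn3, "extra", kwarg="yeah")
assert bound() == ("extra", "yeah")
```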

Of course, this takes all the same options as the other functions, so you can control the degree of parallelism, use capture_outcome to capture exceptions, and so forth.

For more details, see the fine manual.

Can you summarize that in iambic trimeter?

Iambic trimeter? No problem:

Trimeter gives you tools
for running lots of tasks
to do your work real fast
but not so fast you crash.

Code of conduct

Contributors are requested to follow our code of conduct in all project spaces.
