
This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →


Any plan to support BackgroundTask from Starlette? #79

Closed
oterrier opened this issue Mar 14, 2019 · 14 comments

@oterrier

Starlette allows attaching a list of background tasks to a response; they run only once the response has been sent.
If the task is not a coroutine, it is executed in an executor so that it does not block the event loop:

    if task.is_coroutine():
        future = asyncio.ensure_future(task())
    else:
        loop = asyncio.get_event_loop()
        future = await loop.run_in_executor(None, task.func)

Is there any possibility of adding this great feature to FastAPI?
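The dispatch logic quoted above can be mirrored with the plain standard library; `run_task` below is a hypothetical helper for illustration, not Starlette's actual API:

```python
import asyncio
import inspect

async def run_task(func, *args, **kwargs):
    # Mirror Starlette's dispatch: await coroutine functions directly,
    # push plain (blocking) functions onto the default thread pool.
    if inspect.iscoroutinefunction(func):
        return await func(*args, **kwargs)
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, lambda: func(*args, **kwargs))

def blocking_work():
    return "ran in a thread"        # plain function: executor path

async def async_work():
    return "ran on the event loop"  # coroutine: awaited directly

async def main():
    print(await run_task(blocking_work))
    print(await run_task(async_work))

asyncio.run(main())
```

Either way the caller just awaits the result; the coroutine check decides whether the event loop or a worker thread does the work.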

@oterrier oterrier added the question Question or problem label Mar 14, 2019
@euri10
Contributor

euri10 commented Mar 14, 2019 via email

@wshayes
Sponsor Contributor

wshayes commented Mar 17, 2019

You can already do this since it's built on Starlette. Make sure you return a JSONResponse(); otherwise the background tasks won't start.

Example:

from starlette.background import BackgroundTasks
from starlette.responses import UJSONResponse, JSONResponse

@router.post("/nanopubs/flushtosearch", tags=["Nanopubs"])
def flushtosearch():
    """
    Flush all nanopubs to search endpoint
    """

    tasks = BackgroundTasks()
    tasks.add_task(services.search.elasticsearch_index_all_nanopubs)

    message = "Reset Index and Flush to search submitted"
    return JSONResponse(message, background=tasks)

@tiangolo
Owner

Again, thanks for your help here guys @euri10 and @wshayes !

It is now integrated into FastAPI in a (probably) more intuitive way (in version 0.10.0).

The new docs are here: https://fastapi.tiangolo.com/tutorial/background-tasks/

In short:

from fastapi import BackgroundTasks, FastAPI

app = FastAPI()


def write_notification(email: str, message=""):
    with open("log.txt", mode="w") as email_file:
        content = f"notification for {email}: {message}"
        email_file.write(content)


@app.post("/send-notification/{email}")
async def send_notification(email: str, background_tasks: BackgroundTasks):
    background_tasks.add_task(write_notification, email, message="some notification")
    return {"message": "Notification sent in the background"}

@wshayes
Sponsor Contributor

wshayes commented Mar 24, 2019

Wow, that's a lot more intuitive, and it will avoid the "it's not working because you didn't return the JSONResponse()" issue from Starlette. Thank you!

@tiangolo
Owner

Awesome! I'm glad to hear you like the design/interface.

@madkote

madkote commented May 16, 2019

Hi all, one similar question:

  • Let's assume I would like a background task that should NOT be executed after the response has been sent, but before it.
  • E.g. in REST, I receive the first chunk of data (multipart/form-data) and would like to start processing it in the background (e.g. on a thread, since it is blocking). After the final chunk, I would await the result from the executor and return the response to the client.

Is it possible with fastapi?

@tiangolo
Owner

@madkote if your processing is in an async function (let's say, created with async def process_something(): ...), you can use an async function for your path operation too, and inside of it, use await process_something().

@app.get("/items/")
async def read_items():
    result = await process_something("argument 1", keyword_arg2="keyword argument")
    return result

If it's a normal function, you can use run_in_threadpool from Starlette, pass your standard function and await it, something like:

from starlette.concurrency import run_in_threadpool

@app.get("/items/")
async def read_items():
    result = await run_in_threadpool(process_something, "argument 1", keyword_arg2="keyword argument")
    return result
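Outside any framework, the overall shape madkote describes (start the blocking work on a thread as soon as the first chunk arrives, then await its result after the last chunk) can be sketched with the standard library alone; `consume` and the chunk list here are hypothetical stand-ins for the real multipart handling:

```python
import asyncio
import queue

def consume(q):
    # Hypothetical blocking worker: drain chunks until the None sentinel.
    parts = []
    while (chunk := q.get()) is not None:
        parts.append(chunk)
    return b"".join(parts).decode()

async def handle_upload(chunks):
    q = queue.Queue()
    loop = asyncio.get_running_loop()
    # Start the blocking consumer on a worker thread as soon as the
    # request begins; the event loop stays free to receive more chunks.
    result = loop.run_in_executor(None, consume, q)
    for chunk in chunks:  # stand-in for reading multipart chunks
        q.put(chunk)
    q.put(None)           # final chunk seen: signal end of stream
    # After the last chunk, await the worker's result and respond.
    return await result

print(asyncio.run(handle_upload([b"hello ", b"world"])))
```

The key point is that `run_in_executor` returns an awaitable immediately, so the handler can keep feeding the queue and only block (cooperatively) at the final `await`.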

If that doesn't solve your problem, please create a new issue for it so we can continue the discussion there.


As the original issue should be solved now with the support for BackgroundTasks, I'll close it now. But feel free to add more comments or create new issues.

@outofnames

Hi!
Is it possible to use BackgroundTasks before an HTTPException?
I have a scenario in which I need to send an email when an exception occurs, but BackgroundTasks doesn't seem to work when an HTTPException is raised.

@madkote

madkote commented Jul 8, 2019

@outofnames I guess I would use celery in this scenario.

@outofnames

Doesn't using celery only for sending emails seem overkill?

@madkote

madkote commented Jul 11, 2019

@outofnames well, otherwise you would need to extend HTTPException to be able to run background tasks, which might be difficult. Maybe also ask at the starlette project directly.
If you find a solution, please share; I would really appreciate that!

@tiangolo
Owner

@outofnames you can probably also use asyncio.ensure_future(awaitable_send_email()) inside of the exception handler, without needing to plug in a background task.

https://docs.python.org/3/library/asyncio-future.html#asyncio.ensure_future
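Stripped of any framework, that fire-and-forget shape looks like the sketch below; `send_email` and `handler` are stand-ins, not FastAPI APIs:

```python
import asyncio

sent = []

async def send_email(to, subject):
    # Stand-in for an async email client; a real one would await I/O here.
    sent.append((to, subject))

async def handler():
    try:
        raise ValueError("something went wrong")
    except ValueError as exc:
        # Schedule the notification without awaiting it; the error
        # response goes out immediately, the email is sent afterwards.
        asyncio.ensure_future(send_email("ops@example.com", str(exc)))
        return {"detail": str(exc)}

async def main():
    response = await handler()
    await asyncio.sleep(0)  # let the scheduled task run before the loop closes
    return response

print(asyncio.run(main()), sent)
```

One caveat of this pattern: nothing awaits the scheduled task, so if the event loop shuts down first (as it would without the final `sleep(0)` in this toy example), the task may be cancelled silently.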

@outofnames

@tiangolo thanks for the reply, I ended up adding more features to my project just to justify celery :D

@tiangolo tiangolo changed the title [QUESTION] Any plan to support BackgroundTask from Starlette? Any plan to support BackgroundTask from Starlette? Feb 24, 2023
@tiangolo tiangolo reopened this Feb 28, 2023
@github-actions
Contributor

Assuming the original need was handled, this will be automatically closed now. But feel free to add more comments or create new issues or PRs.

Repository owner locked and limited conversation to collaborators Feb 28, 2023
@tiangolo tiangolo converted this issue into discussion #8238 Feb 28, 2023
