
Make search_companies async #12

Merged · 2 commits · May 29, 2024

Conversation

sanders41 (Contributor)

The search_filers route is async; however, the call to Meilisearch for the search is not. This change makes the call to Meilisearch async so that it does not block the event loop.
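To illustrate the shape of the change (DummySearchClient below is a hypothetical stand-in, not the real Meilisearch client API): the point is only that inside an async route the search call should be awaited rather than executed synchronously.

```python
import asyncio

class DummySearchClient:
    """Hypothetical stand-in for an async search client."""

    async def search(self, query):
        await asyncio.sleep(0)  # simulated non-blocking I/O
        return {"hits": [{"cik": "0000320193", "query": query}]}

async def search_companies(query):
    client = DummySearchClient()
    # Awaiting the client call yields control back to the event
    # loop instead of blocking it for the duration of the request.
    response = await client.search(query)
    return response["hits"]

hits = asyncio.run(search_companies("apple"))
print(hits[0]["query"])  # → apple
```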


vercel bot commented May 29, 2024

@sanders41 is attempting to deploy a commit to the bruhbruhroblox's projects Team on Vercel.

A member of the Team first needs to authorize it.

Comment on lines -306 to -308
```python
results = []
for result in hits:
    results.append(result)
```
sanders41 (Contributor, Author)

I removed this because it doesn't appear to do anything: results is built but never used. Or maybe I'm missing something?

@leftmove (Owner)

You're correct in assuming the for loop iterating through hits is redundant. It's probably there because I forgot I wasn't using MongoDB.

As for async Meilisearch - is it necessary? The following is from the FastAPI docs.

> You can mix def and async def in your path operation functions as much as you need and define each one using the best option for you. FastAPI will do the right thing with them.
>
> Anyway, in any of the cases above, FastAPI will still work asynchronously and be extremely fast.

Forgive my amateur understanding of asynchronous Python if this is not the case, but won't FastAPI run asynchronously regardless of whether Meilisearch has an asynchronous client (especially with multiple workers)? If there is a benefit, though (even a nominal one), I'd be willing to merge this.

Thank you for contributing!

@sanders41 (Contributor, Author)

For a "regular" def, FastAPI runs it in a thread pool, while it awaits an async def. I have not looked through the FastAPI code to verify this, but the "normal" way to implement it would be to check the return value: if it is a coroutine, await it; otherwise take the alternate path (the thread pool in this case). If this is correct, it means search_filters is not run in the thread pool because it is an async def, but there is also nothing to await, so it just blocks.
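The blocking behaviour can be demonstrated with plain asyncio, no FastAPI needed (the handler names here are illustrative): ten handlers that await a sleep finish in roughly the time of one, while ten async def handlers that make a synchronous blocking call run one after another, because the event loop never gets control back.

```python
import asyncio
import time

async def blocking_handler():
    # async def, but the body makes a synchronous blocking call,
    # so the event loop cannot switch to another task meanwhile.
    time.sleep(0.1)

async def awaiting_handler():
    # async def whose call is awaited: the loop is free to run
    # other tasks while this one waits.
    await asyncio.sleep(0.1)

async def main():
    start = time.perf_counter()
    await asyncio.gather(*(awaiting_handler() for _ in range(10)))
    concurrent = time.perf_counter() - start

    start = time.perf_counter()
    await asyncio.gather(*(blocking_handler() for _ in range(10)))
    serial = time.perf_counter() - start
    return concurrent, serial

concurrent, serial = asyncio.run(main())
# concurrent is roughly 0.1s; serial is roughly 1.0s
print(f"awaited: {concurrent:.2f}s, blocking: {serial:.2f}s")
```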

I have not done any specific FastAPI benchmarking, but typically awaiting an async def is faster than a thread pool. I do have some benchmarks you can see here, and the async search was significantly faster when handling multiple requests.

That said, another option would be to change search_filters from an async def to a def. Under light load, my guess is you wouldn't see much difference, if any.

@leftmove (Owner)

Understood, thank you! I will run a test and merge this.

@leftmove (Owner) commented May 29, 2024

I got two different errors.

The first was easy to fix. In _prepare_meilisearch, the create_index function only takes a string for the primary_key. This is different from the main Meilisearch library, which takes an object.

The second problem was that task_uid was not an available property of the task variable. Looking at the code, I saw that the intention was basically to await the create_index function, so I just did that. It looks a little rough, but it works.

Here is the (minified) code before and after.

Before:

```python
def _prepare_meilisearch():
    if not indexes or "companies" not in [index.uid for index in indexes]:
        task = client.create_index("companies", {"primaryKey": "cik"})
        client.wait_for_task(task.task_uid, timeout=None)

_prepare_meilisearch()
```

After:

```python
import asyncio

async def _prepare_meilisearch():
    if not indexes or "companies" not in [index.uid for index in indexes]:
        await client.create_index("companies", primary_key="cik")

asyncio.new_event_loop().run_until_complete(_prepare_meilisearch())
```
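A side note on the loop management above: since Python 3.7, asyncio.run is the idiomatic way to run a one-shot coroutine at startup. It creates a fresh event loop, runs the coroutine to completion, and closes the loop afterwards, with the caveat that it raises if another loop is already running in the same thread. A minimal sketch (prepare is a hypothetical stand-in for the real _prepare_meilisearch coroutine):

```python
import asyncio

async def prepare():
    # hypothetical stand-in for the real _prepare_meilisearch coroutine
    await asyncio.sleep(0)
    return "ready"

# asyncio.run handles creating and closing the event loop itself,
# replacing the manual new_event_loop().run_until_complete() pattern.
result = asyncio.run(prepare())
print(result)  # → ready
```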

I'll add these commits and merge the whole thing, but I just want to make sure I'm following best practices.

Co-authored-by: Paul Sanders <psanders1@gmail.com>
leftmove merged commit 94d8ea9 into leftmove:main on May 29, 2024 (1 check failed).
sanders41 deleted the async-meilisearch branch on May 29, 2024 at 21:13.