First check
- I used the GitHub search to find a similar issue and didn't find it.
- I searched the FastAPI documentation, with the integrated search.
- I already searched in Google "How to X in FastAPI" and didn't find any information.
TL;DR
When trying to get a long (~200-300 items) list of entities with their relationships using response_model, the request is extremely slow (10-20 seconds). How can I avoid that?
Description
In an ML project I have an entity "intent" with examples of the user requests that represent this intent.
My current database models look like this:
import uuid

from sqlalchemy import Column, ForeignKey, String
from sqlalchemy.orm import relationship

# Base is the project's declarative base; GUID is its custom UUID column type.

class Intent(Base):
    __tablename__ = 'intent'
    id = Column(GUID(), primary_key=True, default=uuid.uuid4, unique=True, nullable=False, index=True)
    name = Column(String, unique=True)
    examples = relationship("IntentExample", back_populates="intent", cascade="all, delete-orphan")

class IntentExample(Base):
    __tablename__ = 'intent_example'
    text = Column(String, primary_key=True)
    intent_id = Column(GUID(), ForeignKey('intent.id'))
    intent = relationship("Intent", back_populates="examples")
I try to get the full list of intents with the following code (using crud_base.py from the full-stack-fastapi-postgresql project):
# Intent in response_model is the Pydantic schema shown further below,
# not the SQLAlchemy model of the same name.
@router.get("/", response_model=List[Intent])
def get_intents(db: Session = Depends(get_db)):
    data = crud.intent.list(db)
    print(data)
    return data
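For context, crud.intent.list is just a thin wrapper around a plain query, roughly equivalent to the sketch below (the actual method in crud_base.py may also take skip/limit parameters; the models module path is illustrative):

from sqlalchemy.orm import Session

def list_intents(db: Session):
    # Plain SELECT over the intent table only; the examples relationship is not
    # loaded here and stays lazy until something accesses .examples.
    # models.Intent is the SQLAlchemy model defined above (import path is illustrative).
    return db.query(models.Intent).all()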
In the example above I can see that the data is selected from Postgres and printed to stdout within about a second.
After that, something inside FastAPI / Pydantic / something else converts the response into the response_model, and that step takes most of the time.
If I use just @router.get("/") without a response_model, the response is returned very fast, but then it doesn't contain the list of examples, which I need.
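As far as I understand, with response_model FastAPI validates each returned ORM object against the Pydantic schema, which is roughly equivalent to calling Intent.from_orm() per item (Intent here being the Pydantic schema shown further below). A rough sketch of how to time that step in isolation, to confirm it is the conversion rather than the query that is slow (the extra route and its path are hypothetical; get_db and crud.intent.list are the same as above):

import time

@router.get("/timed")
def get_intents_timed(db: Session = Depends(get_db)):
    data = crud.intent.list(db)

    start = time.monotonic()
    # Roughly what response_model does: validate/convert every ORM object.
    # Each from_orm() call touches intent.examples, which can trigger a lazy load.
    converted = [Intent.from_orm(item) for item in data]
    print(f"conversion took {time.monotonic() - start:.2f}s for {len(converted)} intents")

    return converted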
Pydantic schemas for Intent:
from typing import List

from pydantic import BaseModel, UUID4

class IntentExample(BaseModel):
    text: str

    class Config:
        orm_mode = True

class Intent(BaseModel):
    id: UUID4
    name: str
    examples: List[IntentExample] = []

    class Config:
        orm_mode = True
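My suspicion is that each converted intent triggers a separate query for its examples (an N+1 pattern) during that conversion. If that is the cause, would eager loading of the relationship be the recommended fix? A sketch of what I mean, using SQLAlchemy's selectinload (not verified to actually solve the slowdown; the models module path is illustrative):

from sqlalchemy.orm import Session, selectinload

def list_intents_eager(db: Session):
    # Load all examples in one extra SELECT ... WHERE intent_id IN (...)
    # instead of one lazy query per intent during the Pydantic conversion.
    return (
        db.query(models.Intent)
        .options(selectinload(models.Intent.examples))
        .all()
    )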
Please help me find the correct way to do this with FastAPI.
Thanks in advance!