This issue was moved to a discussion.
You can continue the conversation there. Go to discussion →
FastApi & MongoDB - the full guide #1515
Comments
Firstly, nice work! As you said, this is a fully working solution for using MongoDB with FastAPI that I'm sure will benefit people going forward. If this is to become the "recommended" way of working with MongoDB, I would highly recommend that we recommend an ODM (object-document mapper) and show any potential issues with using one alongside Pydantic/FastAPI. The main reasons are:
The existing ODMs are not great. I don't think any of the major ones include type annotations or bulk-write support. But they are fairly lightweight, get us most of the way there, and let you reach down into raw Mongo queries when you need to. If we're going to put development effort into making Mongo easier to use with Pydantic/FastAPI, it would be best spent writing docs that are as accessible as possible, and maybe contributing to existing ODMs to clear up any sticking points. Obviously ODMs can be a contentious topic, but so can ORMs, and FastAPI does not shy away from showing them as the easier way to get started. In an ideal world, we'd present the more straightforward "here's an ODM, point and click" approach first, and the more advanced "DIY" approach after, for people who want to wander into the deep end.
Correct me if I'm wrong, but isn't the real missing piece in all of this the serializers/deserializers/validators for the Mongo/BSON datatypes in Pydantic? If Pydantic added support for all the extra datatypes, couldn't you just return a MongoEngine instance directly?
Interested to know how people are handling creation of indexes in MongoDB. Does anyone know of a suitable way to define an index on a Pydantic model?
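Since Pydantic has no built-in notion of indexes, one sketch of the idea - the `Meta.indexes` layout and `build_index_specs` helper below are assumptions for illustration, not an existing API - is to declare them next to the model and create them once at startup with pymongo/motor's `create_index`:

```python
# Hypothetical pattern: declare indexes beside the Pydantic model and build
# (keys, options) pairs from the declaration at startup.
from typing import Any, Dict, List, Tuple

IndexSpec = Tuple[List[Tuple[str, int]], Dict[str, Any]]

class UserMeta:
    # each entry: (list of (field, direction) pairs, index options)
    indexes: List[IndexSpec] = [
        ([("email", 1)], {"unique": True}),
        ([("last_name", 1), ("first_name", 1)], {}),
    ]

def build_index_specs(meta: type) -> List[IndexSpec]:
    """Collect declared index specs, ready to pass to Collection.create_index."""
    return list(getattr(meta, "indexes", []))

# At application startup you would then run something like (motor assumed):
#   for keys, options in build_index_specs(UserMeta):
#       await db.users.create_index(keys, **options)
specs = build_index_specs(UserMeta)
```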
Here it comes! I gave up on FastAPI's json_encoder and developed a handier one, specialized for MongoDB. Keep in mind that if the document already has an _id field, MongoDB won't generate an ObjectId for it, so it's better to always generate our own _id.
# -*- coding: utf-8 -*-
# -----------------------------------
# @CreateTime : 2020/7/25 0:27
# @Author : Mark Shawn
# @Email : shawninjuly@gmail.com
# ------------------------------------
import json
from datetime import datetime, date
from typing import Union
from uuid import UUID
from bson import ObjectId
from pydantic import BaseModel

def mongo_json_encoder(record: Union[dict, list, BaseModel]):
    """
    A json_encoder designed specially for dumping MongoDB records.
    It can deal with both a record item and a record list queried from MongoDB.
    You can extend the encoder's abilities in the recursive function `convert_type`;
    it currently covers the following datatypes: datetime, date, UUID, ObjectId.
    Contact me if you need further support.
    Attention: it mutates the raw record, so copy it before calling this function if necessary.

    Parameters
    ----------
    record: a dict, a list, or a BaseModel, like the documents queried from MongoDB.

    Returns
    -------
    JSON-serializable data.
    """
    def convert_type(data):
        if isinstance(data, (datetime, date)):
            # or ISO format: data.isoformat()
            return str(data)
        elif isinstance(data, (UUID, ObjectId)):
            return str(data)
        elif isinstance(data, list):
            return list(map(convert_type, data))
        elif isinstance(data, dict):
            return mongo_json_encoder(data)
        try:
            json.dumps(data)
            return data
        except TypeError:
            raise TypeError({
                "error_msg": "Serialization of this type is not supported yet",
                "value": data,
                "type": type(data),
            })

    # add support for BaseModel
    if isinstance(record, BaseModel):
        return mongo_json_encoder(record.dict(by_alias=True))
    elif isinstance(record, dict):
        for key, value in record.items():
            record[key] = convert_type(value)
        return record
    else:
        return list(map(mongo_json_encoder, record))
def mongo_json_encoder_decorator(func):
    """Decorator that converts the documents queried from MongoDB."""
    def wrapper(*args, **kwargs):
        res = func(*args, **kwargs)
        return mongo_json_encoder(res)
    return wrapper

And the test script passes as follows:
# -*- coding: utf-8 -*-
# -----------------------------------
# @CreateTime : 2020/7/25 0:47
# @Author : Mark Shawn
# @Email : shawninjuly@gmail.com
# ------------------------------------
import uuid
from uuid import UUID
from bson import ObjectId
from typing import List, Union
from pydantic import BaseModel, Field
from utils.json import mongo_json_encoder
class FriendBase(BaseModel):
    class Config:
        arbitrary_types_allowed = True
        allow_population_by_field_name = True

    id: Union[str, UUID, ObjectId] = Field(alias='_id')
    name: str

class Friend(FriendBase):
    friends: List[FriendBase] = []
f_1 = Friend(id='test', name='test')
f_2 = Friend(id=uuid.uuid1(), name='test', friends=[f_1])
f_3 = Friend(id=ObjectId(), name='test', friends=[f_1, f_2])
i_1 = f_1.dict(by_alias=True)
i_2 = f_2.dict(by_alias=True)
i_3 = f_3.dict(by_alias=True)
j_1 = mongo_json_encoder(i_1.copy())
j_2 = mongo_json_encoder(i_2.copy())
j_3 = mongo_json_encoder(i_3.copy())
j_all = [f_1, f_2, f_3]
assert i_1 == j_1
assert i_2 == j_2, "this should not pass"
assert i_3 == j_3, "this should not pass"

It just runs well!
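The docstring's warning about mutation is worth underlining: the encoder rewrites nested structures in place, and a shallow `.copy()` (as used in the test script) does not protect nested dicts the way `copy.deepcopy` does. A quick standard-library illustration:

```python
# Shallow vs. deep copies: a shallow copy shares nested containers with the
# original, so in-place conversion of nested values leaks into the source.
import copy

record = {"meta": {"tags": [1, 2]}}

shallow = record.copy()
shallow["meta"]["tags"].append(3)   # mutates record["meta"] too

deep = copy.deepcopy(record)
deep["meta"]["tags"].append(4)      # leaves record untouched
```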
I hope @tiangolo adapts FastAPI to require less boilerplate code when using MongoDB. This would be fantastic.
I recently wrote ODMantic to ease the integration of FastAPI/Pydantic and MongoDB. There is a FastAPI example in the documentation if you want to have a look 😃
@art049 Hey, this looks very promising as an all-in-one solution to the problems discussed in this thread. It would be great to get some buy-in from the major players in the Python world and see the project grow more mature. I'm always hesitant to pull relatively new libraries (looks like your project is ~6-7 months old) into production code until they are proven to be relatively mature and well maintained. Either way, this does look like it addresses pretty much all of the issues people have brought up. Looking forward to seeing how this progresses.
Dumb question: doesn't this problem go away if you just allow your Mongo engine to auto-place
This was my solution - just define a new 'out' schema with 'id' on there, then set it to '_id' from the object which comes out of the database on a query. It allowed me to use a standard response model.

class UserBase(BaseModel):
    id: Optional[PyObjectId] = Field(alias='_id')
    username: str

class UserOut(UserBase):
    id: Optional[PyObjectId]

@core.get('/user', response_model=users.UserOut)
async def userfake() -> users.UserOut:
    user = UserBase()
    result = await mdb.users.insert_one(user.dict())
    in_db = await mdb.users.find_one({'_id': result.inserted_id})
    out_db = users.UserOut(**in_db)
    out_db.id = in_db['_id']
    return out_db
I actually improved on that slightly so it kinda 'just works'. I've created a MongoBase class:

class MongoBase(BaseModel):
    id: Optional[PyObjectId] = Field(alias='_id')

    class Config(BaseConfig):
        orm_mode = True
        allow_population_by_field_name = True
        json_encoders = {
            datetime: lambda dt: dt.isoformat(),
            ObjectId: lambda oid: str(oid),
        }

class MongoOut(MongoBase):
    id: Optional[PyObjectId]

    def __init__(self, **pydict):
        super().__init__(**pydict)
        self.id = pydict.pop('_id', None)  # default avoids a KeyError when '_id' is absent

class UserOut(MongoOut, UserBase):
    pass

@core.get('/user', response_model=users.UserOut)
async def userfake():
    user = fake_user()
    result = await mdb.users.insert_one(user.dict())
    in_db = await mdb.users.find_one({'_id': result.inserted_id})
    return in_db
Sorry to add more, hopefully this is useful. The hackiest but simplest solution I've found is below - you don't actually need the alias when using the Motor engine. Motor automatically adds an ObjectId to every inserted document if it's not there, so you can actually drop the alias:

class MongoBase(BaseModel):
    id: Optional[PyObjectId]

    class Config(BaseConfig):
        orm_mode = True
        allow_population_by_field_name = True
        json_encoders = {
            datetime: datetime.isoformat,
            ObjectId: str
        }

    def __init__(self, **pydict):
        super().__init__(**pydict)
        self.id = pydict.get('_id')

class UserBase(MongoBase):
    username: str
    email: str = None
    first_name: str = None
    last_name: str = None

@core.get('/user', response_model=users.UserBase)
async def userfake():
    user = fake_user()
    result = await mdb.users.insert_one(user.dict())
    in_db = await mdb.users.find_one({'_id': result.inserted_id})
    return in_db

The downside (is it a downside?) is that in the DB there's a redundant 'id' field which isn't being used. Below is what ends up stored:

{'_id': ObjectId('5fb9f4c00d1263cc1555d197'), 'id': None, 'username': 'Denise Garcia'}
@NomeChomsky I use a mixin that works in a similar way, with Pydantic classes like this...
How do you paginate a MongoDB cursor object in FastAPI?
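One common answer, sketched here with a pure-Python helper so the arithmetic is easy to verify (the route wiring around it is assumed): translate 1-based page numbers into skip/limit values, then apply them to the cursor, e.g. with motor `await coll.find().skip(skip).limit(limit).to_list(limit)`.

```python
# Minimal skip/limit pagination helper; a plain list stands in for a cursor
# below so the slicing behaviour can be checked without a database.
def pagination_params(page: int, page_size: int, max_page_size: int = 100):
    """Translate a 1-based page number into Mongo skip/limit values."""
    if page < 1:
        raise ValueError("page numbers are 1-based")
    limit = min(page_size, max_page_size)
    skip = (page - 1) * limit
    return skip, limit

fake_docs = list(range(10))
skip, limit = pagination_params(page=2, page_size=3)
page_2 = fake_docs[skip:skip + limit]
```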
Both regular |
Hi guys, I have a document like:

{
    "_id": ObjectId("6031523be7ff2bb4e5294211"),
    "name": "mahdi"
}

Error:

I browsed other solutions, but they need changes across the entire project and its APIs.

# fix ObjectId & FastApi conflict
import pydantic
from bson.objectid import ObjectId
pydantic.json.ENCODERS_BY_TYPE[ObjectId] = str

This works fine for me, serializing ObjectId with native FastAPI methods.
And this is my user model, to support Pydantic and MongoDB:

from pydantic import BaseModel
import struct
import pydantic
from bson.objectid import ObjectId

class BeeObjectId(ObjectId):
    # fix for FastApi/docs
    __origin__ = pydantic.typing.Literal
    __args__ = (str, )

    @property
    def timestamp(self):
        timestamp = struct.unpack(">I", self.binary[0:4])[0]
        return timestamp

    @classmethod
    def __get_validators__(cls):
        yield cls.validate

    @classmethod
    def validate(cls, v):
        if not isinstance(v, ObjectId):
            raise ValueError("Not a valid ObjectId")
        return v

# fix ObjectId & FastApi conflict
pydantic.json.ENCODERS_BY_TYPE[ObjectId] = str
pydantic.json.ENCODERS_BY_TYPE[BeeObjectId] = str

class User(BaseModel):
    id: BeeObjectId
    name: str

    class Config:
        fields = {'id': '_id'}
Having an issue with FastAPI and Mongo, especially when it comes to the date format configured in the BaseModel.
Is there a way to work with formatted dates in FastAPI/Mongo? Any suggestions?
Mongo connection:
for input:
Routing:
@mannawar But when it comes to how to model your data, I would suggest a single generic model:

import enum
class AudioType(str, enum.Enum):
    song = "song"
    podcast = "podcast"
    audiobook = "audiobook"

class Audio(BaseModel):
    id: Annotated[str, Field(default_factory=lambda: uuid4().hex)]
    name: str = Field(..., exclusiveMaximum=10)
    duration: int = Field(...)
    uploaded_time: datetime = Field(...)
    type: AudioType = Field(...)

If the models need to be significantly different, you could also use
Hey guys, I created a small ORM that would solve this issue. It's based on Pydantic and wraps PyMongo, and it's nothing fancy yet, just something that I use in a small production environment. It's still in development, but hopefully the core API is stable enough now. It lacks some features like index creation, fancy query operations, etc., but would get the job done better than what's already in FastAPI-contrib and some other wrappers that I found. Docs (kind of): https://ramiawar.github.io/Mongomantic The inspiration for this was ditching mongoengine - which Mongomantic is nevertheless heavily inspired by. Using Pydantic and Mongoengine requires the definition of two schemas, one being a Pydantic model and the other a Mongoengine model. Mongomantic solves this problem by relying solely on one Pydantic model. Feel free to submit any issues!
If you want to use models the 'SQLAlchemy way' based on PyMongo + Pydantic, you can do as follows. It looks extremely simple.

from typing import Union
from abc import abstractmethod, ABCMeta
from datetime import datetime
from pydantic import BaseModel, Field
from bson.objectid import ObjectId
import os
from motor.motor_asyncio import AsyncIOMotorClient
from config import settings

client = AsyncIOMotorClient(settings.MONGO_URL)
db = client[settings.MONGO_DB]
class Collection(metaclass=ABCMeta):
    __collection__ = None

    @classmethod
    @property
    def collection(cls):
        if cls.__collection__ is None:
            raise ValueError('collection name is invalid')
        return cls.__collection__

    @abstractmethod
    def document(self) -> dict:
        raise NotImplementedError('you have to define document()')

    @classmethod
    async def find(cls) -> list:
        return await db[cls.collection].find().to_list(None)

    @classmethod
    async def get(cls, query: dict) -> dict:
        return await db[cls.collection].find_one(query)

    async def create(self) -> dict:
        await db[self.collection].insert_one(self.document())
        return self.document()

    async def update(self, id: ObjectId) -> dict:
        update = self.document()
        del update['_id'], update['updated_at']
        await db[self.collection].update_one({'_id': id}, {'$set': update})
        return await self.get({'_id': id})

    async def delete(self, id: ObjectId) -> None:
        return await db[self.collection].delete_one({'_id': id})

class Model(BaseModel):
    id: ObjectId = Field(default_factory=ObjectId, alias="_id")
    created_at: datetime = datetime.now()
    updated_at: datetime = datetime.now()

    def document(self) -> dict:
        return self.dict(by_alias=True)

    @classmethod
    def model(cls, document: Union[dict, list, None]):
        if not document:
            return document
        if type(document) is dict:
            return cls(**dict(document))
        if type(document) is list:
            return [cls(**dict(doc)) for doc in document]

    class Config:
        allow_population_by_field_name = True
        arbitrary_types_allowed = True
        json_encoders = {
            ObjectId: str,
            datetime: lambda dt: dt.isoformat()
        }

class Document(Model, Collection):
    pass
Motor and other async Mongo libs use PyMongo under the hood, which is not async in any way. Apparently, they all delegate the blocking calls to thread pools. You can do the same to get an async-looking mongoengine:

import asyncio
import concurrent.futures
import functools
import mongoengine

executor = concurrent.futures.ThreadPoolExecutor()

def aio(f):
    @functools.wraps(f)
    async def aio_wrapper(*args, **kwargs):
        f_bound = functools.partial(f, *args, **kwargs)
        loop = asyncio.get_running_loop()
        return await loop.run_in_executor(executor, f_bound)
    return aio_wrapper

class AsyncQuerySet(mongoengine.QuerySet):
    _get = mongoengine.QuerySet.get
    get = aio(mongoengine.QuerySet.get)
    _count = mongoengine.QuerySet.count
    count = aio(mongoengine.QuerySet.count)
    _first = mongoengine.QuerySet.first
    first = aio(mongoengine.QuerySet.first)

class Document(mongoengine.Document):
    meta = {
        'abstract': True,
        'queryset_class': AsyncQuerySet,
    }
    _save = mongoengine.Document.save
    save = aio(mongoengine.Document.save)
    _update = mongoengine.Document.update
    update = aio(mongoengine.Document.update)
    _modify = mongoengine.Document.modify
    modify = aio(mongoengine.Document.modify)
    _delete = mongoengine.Document.delete
    delete = aio(mongoengine.Document.delete)

class MyDoc(Document):
    ...

await MyDoc.objects(id='...').first()   # async version
# or
MyDoc.objects(id='...')._first()        # sync version

Notice, this doesn't cover all the cases yet. In particular, things like
Having tried a couple of alternatives myself, I have now settled on the ODM library Beanie. It's very much in line with FastAPI philosophies (async, Pydantic, etc.), well documented and actively maintained.
Can you compare it to ODMantic? I settled on ODMantic, but I don't like that the project is rather inactive (the latest release is relatively old) and it does not work with the latest Motor version.
I haven't done a thorough analysis, but from what I can tell, the two are very similar in how they work internally. I also tried ODMantic, but indeed the fact that there has been zero recent activity steered me away. I also prefer Beanie's API, as it allows you to do database interactions right from the document model (e.g.,
How did you solve the |
I think what you want is already the default behaviour for Beanie. All you need to do is inherit your model from
When converting the model to JSON (for storing in MongoDB, or for your JavaScript frontend), it gets automatically converted to
I don't like the idea of JSON-ing FastAPI models, so I came up with the following Pydantic validator:
Beanie is a great library, but it suffers from the same issue: while returning the document instance, it returns "_id" in the JSON response, whereas we expect "id" to be returned. Here is how I solved the problem, taking tips from all the issues: use a separate response model and set "id" explicitly in __init__ to avoid using the dict method.

class InvestorResponse(Investor):
    """
    Investor Response Model
    """
    class Config:
        fields = {'id': 'id'}

    def __init__(self, **pydict):
        super(InvestorResponse, self).__init__(**pydict)
        self.id = pydict.get('_id')
While using Beanie, you can get the correct "id" field instead of "_id" by disabling response_model_by_alias on the route:

@app.get("/todos", response_model=list[Todo], response_model_by_alias=False)
async def get_todos():
    ...
Simple and effective, thanks! |
You can improve on this solution by just adding a conditional when you initialize your MongoBase. I ended up going in this direction, where I let Mongo handle all "_id" creation at document-creation time (i.e. I don't pass any kind of ID on creation). The sample below is almost a full working sample. It only needs a Mongo client connector object - which is the client import below:

from datetime import datetime
from typing import Union
from bson import ObjectId
from pydantic import BaseModel, EmailStr
from fastapi import APIRouter, Depends, HTTPException, status
from app.services.mongo_db_client import client
router = APIRouter(prefix='/users')
# The 3 classes below would be in your `models.py`, or modules inside your `models` subpackage
class BaseMongoModel(BaseModel):
    def __init__(self, **data: dict):
        data = self._reformat_mongo_id_key(data)
        super(BaseMongoModel, self).__init__(**data)

    @staticmethod
    def _reformat_mongo_id_key(data):
        if not data:
            return data
        if '_id' in data and 'id' not in data:
            data['id'] = data.pop('_id', None)
        return data

class BaseUserModel(BaseMongoModel):
    first_name: str
    last_name: str
    email: EmailStr

    class Config:
        arbitrary_types_allowed = True  # needed so ObjectId can be used as a field type
        json_encoders = {ObjectId: str, datetime: str}

class ResponseUserModel(BaseUserModel):
    id: ObjectId
    date_created: datetime

# These classes would be in `services.py`, or in modules inside your `services` subpackage
class CrudBase:
    def __init__(self, db_name: str, collection_name: str) -> None:
        self.client = client.get_client()
        self.db = self.client[db_name]
        self.collection = self.db[collection_name]
        self._id_field = '_id'
        self._base_date_fields = ['date_created']

    async def create(self, data: dict) -> dict:
        data = self._assign_date_fields(data)
        await self.collection.insert_one(data)  # insert_one mutates `data`, adding the generated '_id'
        return data

    async def fetch_by_id(self, item_id: Union[ObjectId, str]) -> dict:
        item = await self.collection.find_one({self._id_field: ObjectId(item_id)})
        return item

    def _assign_date_fields(self, data: dict) -> dict:
        utc_now = datetime.utcnow()
        data.update({date_field: utc_now for date_field in self._base_date_fields})
        return data

    # Additional CRUD operations would go in this class, e.g. insert_many, delete, update_one, update_many, etc...

class CrudUser(CrudBase):
    # Inherits everything from CrudBase
    pass

# This class would be in your dependencies.py. Checks that the record exists in the db; if not, raises HTTPException.
class IdValidators:
    def __init__(self, crud_service: CrudUser):
        self.crud_service = crud_service

    async def validate_id(self, item_id: str) -> dict:
        item = await self.crud_service.fetch_by_id(item_id)
        if item is None:
            raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="<Your Error Message Here>")
        return item

# The functions below would be in your routers.py
crud = CrudUser(db_name='<Your DB Name>', collection_name='<Your Collection Name>')
validator = IdValidators(crud_service=crud)

@router.get("/{item_id}", response_model=ResponseUserModel)
async def get_user_by_id(user: dict = Depends(validator.validate_id)) -> dict:
    return user

@router.post("/", response_model=ResponseUserModel, status_code=status.HTTP_201_CREATED)
async def create_user(user: BaseUserModel) -> dict:
    user = user.dict()
    user = await crud.create(user)
    return user

The solution above covers all 6 points addressed in the original post:
In addition:
Unfortunately, OP's 2nd point is very true: there is quite a bit of boilerplate needed to make Mongo work "nicely" with FastAPI. Even if you go with my design of letting MongoDB handle all ID creation and only using the API to handle the response, you still need 3 different classes to define a response model, with one class overriding Pydantic's BaseModel constructor. I agree that it is a high barrier to entry. While much of the material I found in different online forums (and Mongo's own blog) was quite helpful, it still took me a couple of days to figure out a good working solution for my use case.
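The key trick in the sample above is small enough to isolate: renaming Mongo's "_id" key to "id" before model construction (the same logic as BaseMongoModel's helper, shown standalone here):

```python
# Standalone version of the '_id' -> 'id' remapping used by BaseMongoModel.
def reformat_mongo_id_key(data: dict) -> dict:
    """Rename Mongo's '_id' key to 'id', leaving other keys untouched."""
    if data and '_id' in data and 'id' not in data:
        data['id'] = data.pop('_id')
    return data

doc = {"_id": "5ed8b7eaccda20c1d4e95bb0", "name": "Joe"}
clean = reformat_mongo_id_key(doc)
```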
To solve the problem, I would point to the very convenient pydantic-mongo package. It implements the management of ObjectId fields and a repository pattern:

from pydantic import BaseModel
import os
from typing import List
from bson import ObjectId
from pydantic import BaseModel
from pydantic_mongo import AbstractRepository, ObjectIdField
from pymongo import MongoClient

class Foo(BaseModel):
    # example nested models (placeholders, as in the package's docs)
    count: int
    size: float = None

class Bar(BaseModel):
    apple: str = 'x'
    banana: str = 'y'

class Spam(BaseModel):
    id: ObjectIdField = None
    foo: Foo
    bars: List[Bar]

    class Config:
        # The ObjectIdField creates a bson ObjectId value, so it's necessary to set up the json encoding
        json_encoders = {ObjectId: str}

class SpamRepository(AbstractRepository[Spam]):
    class Meta:
        collection_name = 'spams'

client = MongoClient(os.environ["MONGODB_URL"])
database = client[os.environ["MONGODB_DATABASE"]]

spam = Spam(foo=Foo(count=1, size=1.0), bars=[Bar()])

spam_repository = SpamRepository(database=database)

# Insert / Update
spam_repository.save(spam)

# Delete
spam_repository.delete(spam)

# Find One By Id
result = spam_repository.find_one_by_id(spam.id)
Description
In this issue I'd like to gather all the information about the use of MongoDB, FastAPI and Pydantic. At this point this is a "rather complete" solution, but I'd like to gather feedback and comments from the community to see how it can be improved.
The biggest pain point that started this and several other threads when trying to use FastAPI with Mongo is the _id field. There are several issues here:
- the _id field is an ObjectId, which is not very JSON-friendly
- the _id field, by its naming, is not very Python-friendly (that is, written as-is in a Pydantic model, it would become a private field; many IDEs will point that out)
Below I'll try to describe the solutions I've found in different places and see which cases they cover and what's left unsolved.
Let's say we have Joe, a regular developer. Joe just discovered FastAPI and is familiar with Mongo (to the extent that he can create and fetch documents from the DB). Joe wants to build a clean and fast API that would:
1️⃣ Be able to define Mongo-compatible documents as regular Pydantic models (with all the proper validations in place):
2️⃣ Write routes that use native Pydantic models as usual:
3️⃣ Have the API return JSON like {"id": "5ed8b7eaccda20c1d4e95bb0", "name": "Joe"} (it's quite expected in the "outer world" to have an id field for the document rather than _id. And it just looks nicer.)
4️⃣ Have the Swagger and ReDoc documentation display the fields id (str) and name (str)
5️⃣ Be able to save Pydantic documents into Mongo with proper id field substitution:
6️⃣ Be able to fetch documents from Mongo with proper id matching:
Known solutions
Validating ObjectId
As proposed in #452, one can define a custom field type for ObjectId and apply validations to it. One can also create a base model that encodes ObjectId into strings:
Now we have:
Dealing with _id
Another suggested option is to use alias="_id" on the Pydantic model:
Now we are able to save to the DB using the User.id field as _id - that solves 5️⃣.
However, now Swagger and ReDoc show the id field as _id, and the JSON that is returned looks like this: {"_id":"5ed803afba6455fd78659988","name":"Joe"}. This is a regression for 3️⃣ and 4️⃣.
Now we have:
Hacking our way through
We can do some extra coding to keep the id field and make inserting into the DB work properly. Effectively, we're shuffling the id and _id fields in MongoModel upon dumping/loading. This brings back the documentation and proper output, and solves the insertion:
Looks like we're getting closer...
Fetching docs from DB
Now, let's try to fetch a doc from the DB and return it:
The workaround for this is to use User.from_mongo:
This seems to cover fetching from the DB. Now we have:
Conclusion and questions
Under the spoiler one can find the final code to make FastAPI work with Mongo in the most "native" way:
Full code
And the list of things that are sub-optimal with the given code:
- response_model validation. Have to use User.from_mongo with every return. This is somewhat a code duplication. Would be nice to get rid of this somehow.
- the id field, while all Mongo queries are built using _id. Afraid there is no way to get rid of this though... (I'm aware that MongoEngine and other ODM engines cover this, but I specifically decided to stay out of this subject and focus on "native" code.)
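As a footnote to the "Validating ObjectId" section above: the validation itself boils down to a 24-character hex check. A standard-library sketch of that check is below (the real custom field subclasses bson.ObjectId, and bson's ObjectId.is_valid classmethod performs this check for you):

```python
# Stdlib-only illustration of ObjectId string validation: an ObjectId's
# string form is exactly 24 hexadecimal characters.
import re

OBJECT_ID_RE = re.compile(r"\A[0-9a-fA-F]{24}\Z")

def is_valid_object_id(value: str) -> bool:
    """True if `value` looks like a Mongo ObjectId (24 hex characters)."""
    return bool(OBJECT_ID_RE.match(value))

# In a real PyObjectId field, __get_validators__ would yield a function
# that raises ValueError when this check fails.
```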