If you don't want to use pydantic's `BaseModel`, you can instead get the same data validation on standard
dataclasses (introduced in Python 3.7).
Dataclasses also work in Python 3.6 with the `dataclasses` backport package.
{!.tmp_examples/dataclasses_main.py!}
(This script is complete, it should run "as is")
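As a minimal sketch of what the included example demonstrates (the `User` class here is a hypothetical stand-in, not the exact example shipped with the docs):

```python
from pydantic import ValidationError
from pydantic.dataclasses import dataclass


@dataclass
class User:
    id: int
    name: str = 'John Doe'


# input values are coerced and validated on initialization
user = User(id='42')
assert user.id == 42

# invalid input raises a pydantic ValidationError
try:
    User(id='not a number')
    raised = False
except ValidationError:
    raised = True
assert raised
```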
!!! note
    Keep in mind that `pydantic.dataclasses.dataclass` is a drop-in replacement for `dataclasses.dataclass`
    with validation, not a replacement for `pydantic.BaseModel` (with a small difference in how
    initialization hooks work). There are cases where subclassing `pydantic.BaseModel` is the better choice.

    For more information and discussion see
    [samuelcolvin/pydantic#710](https://github.com/samuelcolvin/pydantic/issues/710).
You can use all the standard pydantic field types, and the resulting dataclass will be identical to the one
created by the standard library `dataclass` decorator.

The underlying model and its schema can be accessed through `__pydantic_model__`.
Also, fields that require a `default_factory` can be specified by a `dataclasses.field`.
{!.tmp_examples/dataclasses_default_schema.py!}
(This script is complete, it should run "as is")
`pydantic.dataclasses.dataclass`'s arguments are the same as the standard decorator, except one extra
keyword argument `config` which has the same meaning as `Config`.
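For illustration, a sketch of the `config` keyword argument in pydantic v1, using a class-based config as that version supports (the `Point` class and the chosen `validate_assignment` option are illustrative assumptions):

```python
from pydantic import ValidationError
from pydantic.dataclasses import dataclass


class Config:
    validate_assignment = True


@dataclass(config=Config)
class Point:
    x: int
    y: int


point = Point(x=1, y='2')  # '2' is coerced to 2
assert point.y == 2

# with validate_assignment enabled, assignments are validated too
try:
    point.x = 'not an int'
    raised = False
except ValidationError:
    raised = True
assert raised
```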
!!! warning
    After v1.2, the Mypy plugin must be installed to type check pydantic dataclasses.
For more information about combining validators with dataclasses, see dataclass validators.
Nested dataclasses are supported both in dataclasses and normal models.
{!.tmp_examples/dataclasses_nested.py!}
(This script is complete, it should run "as is")
Dataclasses attributes can be populated by tuples, dictionaries or instances of the dataclass itself.
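A sketch of nested dataclasses populated from plain dicts (the `Navbar`/`NavbarButton` names mirror the docs' nested example, here simplified to plain `str` fields):

```python
from typing import List

from pydantic.dataclasses import dataclass


@dataclass
class NavbarButton:
    href: str


@dataclass
class Navbar:
    # nested pydantic dataclasses are validated recursively
    buttons: List[NavbarButton]


# dicts (and, in pydantic v1, tuples) are converted into dataclass instances
navbar = Navbar(buttons=[{'href': 'https://example.com'}])
assert isinstance(navbar.buttons[0], NavbarButton)
```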
Stdlib dataclasses (nested or not) can be easily converted into pydantic dataclasses by just decorating
them with `pydantic.dataclasses.dataclass`.
{!.tmp_examples/dataclasses_stdlib_to_pydantic.py!}
(This script is complete, it should run "as is")
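A sketch of the conversion, applying the decorator as a plain function to an existing stdlib dataclass (the `Meta` class is a hypothetical example):

```python
import dataclasses

import pydantic


@dataclasses.dataclass
class Meta:
    modified_date: str
    seen_count: int


# wrapping the existing stdlib dataclass adds validation
ValidatedMeta = pydantic.dataclasses.dataclass(Meta)

meta = ValidatedMeta(modified_date='2020-01-01', seen_count='7')
assert meta.seen_count == 7  # coerced from str

try:
    ValidatedMeta(modified_date='2020-01-01', seen_count=['not', 'an', 'int'])
    raised = False
except pydantic.ValidationError:
    raised = True
assert raised
```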
Stdlib dataclasses (nested or not) can also be inherited and pydantic will automatically validate all the inherited fields.
{!.tmp_examples/dataclasses_stdlib_inheritance.py!}
(This script is complete, it should run "as is")
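For instance, a pydantic dataclass inheriting from stdlib dataclasses (the `X`/`Y`/`Z` hierarchy is an illustrative sketch):

```python
import dataclasses

import pydantic


@dataclasses.dataclass
class Z:
    z: int


@dataclasses.dataclass
class Y(Z):
    y: int = 0


@pydantic.dataclasses.dataclass
class X(Y):
    x: int = 0


# fields inherited from the stdlib dataclasses are validated as well
foo = X(x='1', y='2', z='3')
assert (foo.x, foo.y, foo.z) == (1, 2, 3)
```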
Bear in mind that stdlib dataclasses (nested or not) are automatically converted into pydantic
dataclasses when mixed with `BaseModel`!
{!.tmp_examples/dataclasses_stdlib_with_basemodel.py!}
(This script is complete, it should run "as is")
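A sketch of this conversion (the `File`/`Foo` names are illustrative assumptions):

```python
import dataclasses

from pydantic import BaseModel, ValidationError


@dataclasses.dataclass
class File:
    filename: str


class Foo(BaseModel):
    # the stdlib dataclass is converted to a pydantic dataclass here
    file: File


foo = Foo(file={'filename': 'thefile.txt'})
assert foo.file.filename == 'thefile.txt'

# invalid data for the dataclass field now raises a ValidationError
try:
    Foo(file={'filename': ['not', 'a', 'string']})
    raised = False
except ValidationError:
    raised = True
assert raised
```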
Since stdlib dataclasses are automatically converted to add validation, using
custom types may cause some unexpected behaviour.
In this case you can simply add `arbitrary_types_allowed` in the config!
{!.tmp_examples/dataclasses_arbitrary_types_allowed.py!}
(This script is complete, it should run "as is")
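A minimal sketch of the `arbitrary_types_allowed` config option, assuming pydantic v1 (the `ArbitraryType` class is a hypothetical custom type):

```python
import pydantic


class ArbitraryType:
    """A custom type pydantic knows nothing about."""

    def __init__(self, value):
        self.value = value


class Config:
    # without this, pydantic raises an error for the unknown type
    arbitrary_types_allowed = True


@pydantic.dataclasses.dataclass(config=Config)
class Model:
    a: ArbitraryType


model = Model(a=ArbitraryType(3))
assert model.a.value == 3
```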
When you initialize a dataclass, it is possible to execute code after validation
with the help of `__post_init_post_parse__`. This is not the same as `__post_init__`, which executes
code before validation.
{!.tmp_examples/dataclasses_post_init_post_parse.py!}
(This script is complete, it should run "as is")
Since version v1.0, any fields annotated with `dataclasses.InitVar` are passed to both `__post_init__` and
`__post_init_post_parse__`.
{!.tmp_examples/dataclasses_initvars.py!}
(This script is complete, it should run "as is")
Note that the `dataclasses.dataclass` from the Python stdlib implements only the `__post_init__` method
since it doesn't run a validation step.

When substituting usage of `dataclasses.dataclass` with `pydantic.dataclasses.dataclass`, it is
recommended to move the code executed in the `__post_init__` method to the `__post_init_post_parse__`
method, and only leave behind the part of the code which needs to be executed before validation.
Pydantic dataclasses do not feature a `.json()` function. To dump them as JSON, you will need to
make use of the `pydantic_encoder` as follows:
{!.tmp_examples/dataclasses_json_dumps.py!}
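A sketch of the approach, assuming pydantic v1 where `pydantic_encoder` lives in `pydantic.json` (the `User` class is a hypothetical example):

```python
import dataclasses
import json
from typing import List

from pydantic.dataclasses import dataclass
from pydantic.json import pydantic_encoder


@dataclass
class User:
    id: int
    name: str = 'John Doe'
    friends: List[int] = dataclasses.field(default_factory=lambda: [0])


user = User(id='42')
# pass pydantic_encoder as json.dumps' fallback encoder
dumped = json.dumps(user, indent=4, default=pydantic_encoder)
```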