I introduced dataclasses to replace Python dicts, in order to better leverage the static analyzer.
Now that they are in place (or even better, after #172), we could take one more step and make them pydantic.dataclasses.
They are designed as a drop-in replacement for the standard library ones, and they also perform type checks at runtime, giving more helpful errors when something goes wrong.
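To illustrate, here is a minimal sketch of what the runtime check buys us (the `Grid` class is a hypothetical example, not from our codebase; only the import line differs from a standard dataclass):

```python
from pydantic import ValidationError
from pydantic.dataclasses import dataclass  # drop-in for dataclasses.dataclass


@dataclass
class Grid:  # hypothetical example class
    size: int
    spacing: float


# A well-typed instance is constructed as usual.
g = Grid(size=10, spacing=0.5)

# A mistyped field is rejected at runtime with a descriptive error,
# instead of silently propagating a wrong value.
try:
    Grid(size="ten", spacing=0.5)
    rejected = False
except ValidationError:
    rejected = True
```

With a plain `dataclasses.dataclass`, the second call would succeed and the bad value would only surface later, far from its origin.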
Possible cons all stem from the consideration that "it is a further dependency, and one fewer is better than one more", but the motivations underlying that consideration are mainly:
reliability: but pydantic is reliable, since it is used by major Python packages and backed by known sponsors; it might not be as reliable as NumPy, but it is reliable enough
extra complexity: but being a drop-in replacement, it should not require any extra effort compared to usual dataclasses; we just need to change the imports
Another concern might be performance deterioration, but we never use dataclasses in performance-critical code (or Python constructs in general there), so this really does not apply.
On the other hand, the code where dataclasses are used is a genuinely complex part, so the benefit of better runtime errors is potentially huge.
Runtime errors would not be needed if static analysis were always in place, but I am rather convinced that there are several occasions in which static analysis is not actually performed (including during development). More about this in #178.