arbitrary_types_allowed not respected when validating stdlib dataclasses #2054
I had this same problem yesterday. It's not a bug, since it matches the past behaviour of models and dataclasses: a model didn't know where it was going to be used when it was created (which is when this check takes place), so it couldn't respect `arbitrary_types_allowed`. I think it might be possible to find a workaround for this, but it might not be trivial. @PrettyWood do you have any idea how hard it would be to respect `arbitrary_types_allowed` here? Also, are there any other …
I think the main issue in this specific example is that it is using the standard lib `dataclasses` decorator rather than `pydantic.dataclasses`. Extending the example from @ines:

```python
from typing import List
from dataclasses import dataclass

from pydantic import BaseModel


class ArbitraryType:
    def __init__(self, name: str):
        self.name = name


@dataclass
class Test:
    foo: ArbitraryType
    bar: List[ArbitraryType]


class TestModel(BaseModel):
    a: ArbitraryType  # this is fine
    b: Test  # this raises RuntimeError

    class Config:
        arbitrary_types_allowed = True


foo = ArbitraryType(name='Foo')
bar = [ArbitraryType(name='Bar'), ArbitraryType(name='Baz')]
test = Test(foo=foo, bar=bar)
test_model = TestModel(a=ArbitraryType(name='A'), b=test)
assert test_model.a.name == 'A'
assert test_model.b.bar[1].name == 'Baz'
```

Running this script with Pydantic < 1.7 passes, but it raises in 1.7. Interestingly, in Pydantic < 1.7, changing the line

```python
from dataclasses import dataclass
```

to

```python
from pydantic.dataclasses import dataclass
```

makes it raise the same way as it is raising now. I feel like this could be related to @PrettyWood's fix at #2051, but I tried locally with the code from his PR and sadly it didn't fix this specific case.
It's because prior to 1.7 you couldn't use dataclasses as field types (well, you could, but you had to add `arbitrary_types_allowed = True`, and the dataclass's own fields weren't validated). With 1.7, pydantic will inspect the dataclass and do full validation on the dataclass fields. The problem is that when the dataclass has unknown field types, the step of converting the standard library dataclass to a pydantic dataclass doesn't respect `arbitrary_types_allowed`.
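To make that failure mode concrete, here is a stdlib-only sketch of what the conversion step has to work with: a stdlib dataclass exposes its declared field types, including arbitrary classes pydantic has no validator for, and at that point the model's config is not in scope. All names below are illustrative, not pydantic internals.

```python
# Stdlib-only sketch: what introspecting a stdlib dataclass yields.
# The declared field types include arbitrary classes, but nothing here
# carries the enclosing model's `arbitrary_types_allowed` setting.
import dataclasses
from typing import List


class ArbitraryType:
    def __init__(self, name: str):
        self.name = name


@dataclasses.dataclass
class Test:
    foo: ArbitraryType
    bar: List[ArbitraryType]


# The conversion step walks the declared fields; without access to the
# model's config it cannot know that ArbitraryType should be allowed.
field_types = {f.name: f.type for f in dataclasses.fields(Test)}
print(field_types)
```

This is why the check trips during conversion: the field types are visible, but the config that would permit them is attached to `TestModel`, which hasn't entered the picture yet.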
That's awesome. Thank you.
Thanks for the clarification @samuelcolvin! I didn't know that was possible now, very cool! 🚀 🎉 And thanks @PrettyWood, you rock! 🎸
Checks
Bug
Output of
python -c "import pydantic.utils; print(pydantic.utils.version_info())"
:As of v1.7, Pydantic validates dataclasses and their contents, but it fails if the dataclass specifies arbitrary types (which is pretty common in our code bases). Here's an example:
Even though the `BaseModel` subclass sets `arbitrary_types_allowed` to `True`, this configuration doesn't seem to be taken into account when validating the dataclass fields.

Adding custom validation to our dataclasses isn't really an option, since we wouldn't want those Pydantic specifics to leak into the rest of the code base. I'm also wondering whether there should be an option to not validate stdlib dataclasses as Pydantic dataclasses and just use a basic instance check instead, like it previously did (?) before v1.7?