@joamatab thanks for opening an issue here for these questions.
Q - How do you recommend creating tables that can store pandas dataframes with arbitrary columns?
A - `DataBaseModel`s are a subclass of pydantic's `BaseModel`, and thus share the same constraints you might face when creating any pydantic model. Fortunately, custom classes, and consequently pandas `DataFrame` objects, can be used in combination with the `Config` class attribute `arbitrary_types_allowed = True`, as documented here
In a future release I am considering defaulting `arbitrary_types_allowed` to `True`, to avoid the need to set this yourself. Since `DataFrame` objects are serializable via pickle, they will be stored as bytes in a Binary field in the database.
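A minimal sketch of this pattern, assuming pydantic and pandas are installed; the `Dataset` model name is hypothetical, and with pydbantic you would subclass `DataBaseModel` instead of `BaseModel`:

```python
# Hypothetical model holding a pandas DataFrame, enabled by
# arbitrary_types_allowed in the pydantic Config class.
import pickle

import pandas as pd
from pydantic import BaseModel


class Dataset(BaseModel):
    name: str
    frame: pd.DataFrame  # arbitrary column layout

    class Config:
        arbitrary_types_allowed = True


df = pd.DataFrame({"a": [1, 2], "b": ["x", "y"]})
ds = Dataset(name="example", frame=df)

# pydbantic stores the DataFrame as pickled bytes in a Binary column;
# the round trip looks roughly like this:
blob = pickle.dumps(ds.frame)          # bytes written to the database
restored = pickle.loads(blob)          # DataFrame read back out
assert restored.equals(df)
```

The DataFrame never needs a fixed column schema: whatever columns it has are captured by the pickle blob.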
Q: How about metadata where the metadata has some JSON fields?
A: If you expect the metadata fields to change or vary, set the field type to `dict`. If the metadata will always follow a set schema, consider creating a `BaseModel` for it. Both will be stored in Binary database fields (de-serialized / serialized for you, of course).
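The two options above can be sketched as follows; the model and field names (`FlexibleRecord`, `StrictRecord`, `Provenance`) are illustrative, not part of pydbantic's API:

```python
# Two ways to model JSON-like metadata with pydantic models.
from pydantic import BaseModel


class FlexibleRecord(BaseModel):
    name: str
    metadata: dict  # free-form: keys may vary per record


class Provenance(BaseModel):
    # fixed metadata schema, validated on assignment
    source: str
    version: int


class StrictRecord(BaseModel):
    name: str
    metadata: Provenance  # nested model: schema is enforced


r1 = FlexibleRecord(name="a", metadata={"units": "nm", "tags": ["x"]})
r2 = StrictRecord(name="b", metadata=Provenance(source="lab", version=2))
assert r1.metadata["units"] == "nm"
assert r2.metadata.version == 2
```

The `dict` form trades validation for flexibility; the nested-model form gives you type checking and attribute access at the cost of a fixed schema.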
Thank you, Josh, for your work on pydbantic.
I want to store both data and metadata for my data
How do you recommend creating tables that can store pandas dataframes with arbitrary columns?
How about metadata where the metadata has some JSON fields?