However, there is no guarantee that values within the dictionary are JSON serializable (e.g. dictionaries of DataFrames will fail).
Although it may not be best practice to upload dictionaries of objects instead of uploading each object individually, there are good use cases for the former. It would be nice to have error handling that defaults to pickling objects if `auto_pickle` is true and standard data type uploading fails.
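The proposed fallback could look something like the sketch below. This is a hypothetical helper (not ClearML's actual implementation); a dict of sets stands in for a dict of DataFrames, since neither is JSON serializable:

```python
import json
import pickle


def serialize_artifact(obj, auto_pickle=True):
    """Try JSON first; fall back to pickle when auto_pickle is enabled.

    Hypothetical helper illustrating the proposal, not ClearML's code.
    Returns a (payload_bytes, format_name) tuple.
    """
    try:
        return json.dumps(obj).encode("utf-8"), "json"
    except TypeError:
        # json.dumps raises TypeError for unsupported types (sets,
        # DataFrames, arbitrary objects, ...).
        if not auto_pickle:
            raise
        return pickle.dumps(obj), "pickle"


payload, fmt = serialize_artifact({"a": {1, 2, 3}})  # fmt == "pickle"
payload, fmt = serialize_artifact({"a": [1, 2, 3]})  # fmt == "json"
```

With `auto_pickle=False` the original `TypeError` propagates, preserving the current behavior for callers who want strict JSON.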
This is a good idea. So let's assume we try to upload a dict that is not JSON serializable: we output a warning (e.g. "JSON serialization failed, falling back to pickle"), then we would just pickle the dict itself and upload the .pkl file as an artifact.
Is this what you had in mind?
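A minimal sketch of that flow, assuming a hypothetical `upload_dict_artifact` helper (the real upload step is elided; the function just returns the path of the temp file it would hand to the uploader):

```python
import json
import logging
import pickle
import tempfile

logger = logging.getLogger("clearml.artifacts")


def upload_dict_artifact(name, d):
    """Warn on JSON failure, pickle instead, and stage a .pkl file.

    Hypothetical sketch of the proposed behavior, not ClearML's code.
    """
    try:
        data = json.dumps(d).encode("utf-8")
        suffix = ".json"
    except TypeError:
        logger.warning("JSON serialization failed, falling back to pickle")
        data = pickle.dumps(d)
        suffix = ".pkl"
    with tempfile.NamedTemporaryFile(
        prefix=name + "_", suffix=suffix, delete=False
    ) as f:
        f.write(data)
        return f.name  # path handed to the artifact uploader
```

A serializable dict would still be staged as `.json`, so the fallback only changes behavior for the failing case.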
For reference, `upload_artifact` currently only checks whether an object is a `dict` before dumping to JSON:

clearml/clearml/binding/artifacts.py, line 380 in 51d70ef