Branch: 0.7.0 - RuntimeError: Cannot serialize type diffusers.schedulers #146

Open

ProRedCat opened this issue Sep 12, 2023 · 0 comments
Since version 0.7.0 switched the TimeGrad model to the diffusers library for its scheduler, model serialization has been broken: GluonTS cannot serialize the scheduler object.

predictor.serialize(Path("./model"))

Results in the following runtime error:

RuntimeError: Cannot serialize type diffusers.schedulers.scheduling_deis_multistep.DEISMultistepScheduler. See the documentation of the `encode` and
`validate` functions at

    http://gluon-ts.mxnet.io/api/gluonts/gluonts.html

and the Python documentation of the `__getnewargs_ex__` magic method at

    https://docs.python.org/3/library/pickle.html#object.__getnewargs_ex__

for more information how to make this type serializable
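For what it's worth, the error can probably be reproduced without a trained predictor at all, since GluonTS's serde `encode` does not appear to have a handler for diffusers scheduler objects. A minimal sketch (the constructor arguments are only illustrative):

from gluonts.core.serde import encode
from diffusers import DEISMultistepScheduler

scheduler = DEISMultistepScheduler(num_train_timesteps=150, beta_end=0.1)

# Expected to raise the same "Cannot serialize type ..." RuntimeError,
# because encode() does not know how to handle the scheduler instance.
encode(scheduler)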

Based on the Hugging Face documentation, there does appear to be a way to serialize the scheduler, but it is separate from how GluonTS saves its models.

scheduler.save_config(Path("./model"))
{
  "_class_name": "DEISMultistepScheduler",
  "_diffusers_version": "0.20.2",
  "algorithm_type": "deis",
  "beta_end": 0.1,
  "beta_schedule": "linear",
  "beta_start": 0.0001,
  "dynamic_thresholding_ratio": 0.995,
  "lower_order_final": true,
  "num_train_timesteps": 150,
  "prediction_type": "epsilon",
  "sample_max_value": 1.0,
  "solver_order": 2,
  "solver_type": "logrho",
  "steps_offset": 0,
  "thresholding": false,
  "timestep_spacing": "linspace",
  "trained_betas": null,
  "use_karras_sigmas": false
}

The config can then be loaded back in, but you need to know which scheduler class you want up front:

config = DEISMultistepScheduler.load_config(Path("./model"))
scheduler = DEISMultistepScheduler.from_config(config)

One workaround could be to unassign the scheduler when it is time to save the model, but that does not seem like an appropriate solution, and I am also unsure how to unassign it, since the predictor does not appear to expose the scheduler. Any suggestions on how to serialize the model would be appreciated.
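For reference, here is a rough sketch of the detach/reattach idea, assuming the scheduler were reachable on the prediction network. The `prediction_net.scheduler` attribute path is a guess and may not match the actual TimeGrad internals in 0.7.0:

from pathlib import Path
from diffusers import DEISMultistepScheduler

model_dir = Path("./model")

# Save the scheduler config separately via diffusers (hypothetical access path).
predictor.prediction_net.scheduler.save_config(model_dir)

# Detach the scheduler so GluonTS only has to serialize the rest.
predictor.prediction_net.scheduler = None
predictor.serialize(model_dir)

# Later, after deserializing, rebuild and reattach the scheduler.
config = DEISMultistepScheduler.load_config(model_dir)
predictor.prediction_net.scheduler = DEISMultistepScheduler.from_config(config)

Even if this worked, it would require knowing the scheduler class at load time, which is why a fix inside the library itself would be preferable.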
