Experimenting with included demo.jpeg #14
Comments
Please confirm whether `--disable-safe-unpickle` is included in your start command. Also, please confirm that the model was downloaded completely.
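On Windows that flag usually goes through `COMMANDLINE_ARGS` in `webui-user.bat`; a minimal sketch of the stock template with the flag added (not something specific to this extension):

```bat
@echo off
rem webui-user.bat — pass the flag through to webui.bat
set COMMANDLINE_ARGS=--disable-safe-unpickle
call webui.bat
```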
You can download the model from here, then put it in:
Hello again! I had to redownload all the models to be sure the problem is not with them, and restarted the WebUI once more. I forgot to report that it works with the "artistic" option activated, but it still throws an error without it:

2023-08-07 22:42:09 INFO [modules.shared] Starting job extras
2023-08-07 22:42:22 INFO [httpx] HTTP Request: POST http://127.0.0.1:7860/api/predict "HTTP/1.1 200 OK"
From the error message, this is indeed caused by an incomplete model file. You can try comparing the SHA256 of the model. For example, ColorizeArtistic_gen.pth's SHA256 is: 3f750246fa220529323b85a8905f9b49c0e5d427099185334d048fb
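To compare checksums locally, here is a minimal sketch that hashes a model file; the path below is an assumption based on the traceback, so adjust it to your install:

```python
import hashlib
from pathlib import Path

def file_sha256(path, chunk_size=1 << 20):
    """Hash the file in chunks so large .pth files are not read into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Assumed location; compare the result with the hash published on Hugging Face.
model = Path(r"D:\stable-diffusion-webui\models\deoldify\ColorizeArtistic_gen.pth")
if model.exists():
    print(file_sha256(model))
```

A mismatch (or a hash that changes between runs while the file is still downloading) means the file should be re-downloaded.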
Hugging Face shows a different hash for this model. The one I have locally matches Hugging Face's (the other 2 models match as well), but it's different from the hash you posted above (I think yours is truncated somehow).
Yes, mine got cut off; in that case your hash is correct. You can try testing with other black-and-white images.
Your problem looks like the same one. You can try deleting the model directory and restarting, so the models are downloaded again.
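As a rough sketch of that suggestion (the directory name is an assumption based on the traceback's `models_path`; check where the extension actually stores its weights before deleting anything):

```python
import shutil
from pathlib import Path

# Assumed location of the DeOldify weights under the WebUI models directory.
model_dir = Path(r"D:\stable-diffusion-webui\models\deoldify")

if model_dir.exists():
    shutil.rmtree(model_dir)  # remove possibly corrupted weights
    print("deleted; restart the WebUI so the extension re-downloads the models")
```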
Starting job extras
*** Error completing request
*** Arguments: (0, <PIL.Image.Image image mode=RGB size=793x468 at 0x190C2038730>, None, '', '', True, 0, 1, 512, 512, True, 'None', 'None', 0, 0, 0, 0, True, 35, False) {}
Traceback (most recent call last):
File "D:\stable-diffusion-webui\modules\call_queue.py", line 58, in f
res = list(func(*args, **kwargs))
File "D:\stable-diffusion-webui\modules\call_queue.py", line 37, in f
res = func(*args, **kwargs)
File "D:\stable-diffusion-webui\modules\postprocessing.py", line 62, in run_postprocessing
scripts.scripts_postproc.run(pp, args)
File "D:\stable-diffusion-webui\modules\scripts_postprocessing.py", line 130, in run
script.process(pp, **process_args)
File "D:\stable-diffusion-webui\extensions\sd-webui-deoldify\scripts\postprocessing_deoldify.py", line 63, in process
pp.image = self.process_image(pp.image, render_factor, artistic)
File "D:\stable-diffusion-webui\extensions\sd-webui-deoldify\scripts\postprocessing_deoldify.py", line 55, in process_image
vis = get_image_colorizer(root_folder=Path(paths_internal.models_path),render_factor=render_factor, artistic=artistic)
File "D:\stable-diffusion-webui\extensions\sd-webui-deoldify\deoldify\visualize.py", line 417, in get_image_colorizer
return get_stable_image_colorizer(root_folder=root_folder, render_factor=render_factor)
File "D:\stable-diffusion-webui\extensions\sd-webui-deoldify\deoldify\visualize.py", line 426, in get_stable_image_colorizer
learn = gen_inference_wide(root_folder=root_folder, weights_name=weights_name)
File "D:\stable-diffusion-webui\extensions\sd-webui-deoldify\deoldify\generators.py", line 19, in gen_inference_wide
learn.load(weights_name)
File "D:\stable-diffusion-webui\extensions\sd-webui-deoldify\fastai\basic_train.py", line 271, in load
state = torch.load(source, map_location=device)
File "D:\stable-diffusion-webui\modules\safe.py", line 108, in load
return load_with_extra(filename, *args, extra_handler=global_extra_handler, **kwargs)
File "D:\stable-diffusion-webui\modules\safe.py", line 156, in load_with_extra
return unsafe_torch_load(filename, *args, **kwargs)
File "D:\stable-diffusion-webui\venv\lib\site-packages\torch\serialization.py", line 815, in load
return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
File "D:\stable-diffusion-webui\venv\lib\site-packages\torch\serialization.py", line 1051, in _legacy_load
typed_storage._untyped_storage._set_from_file(
RuntimeError: unexpected EOF, expected 7913014 more bytes. The file might be corrupted.
2023-08-07 18:32:29 INFO [httpx] HTTP Request: POST http://127.0.0.1:7860/api/predict "HTTP/1.1 200 OK"
2023-08-07 18:32:29 INFO [httpx] HTTP Request: POST http://127.0.0.1:7860/reset "HTTP/1.1 200 OK"
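For what it's worth, the `unexpected EOF, expected 7913014 more bytes` message means the deserializer hit the end of the file before reading everything the header promised, which is exactly what an interrupted download looks like. A plain-pickle sketch reproduces the same class of failure without needing torch:

```python
import pickle

payload = pickle.dumps({"weights": list(range(1000))})
truncated = payload[: len(payload) // 2]  # simulate an incomplete download

try:
    pickle.loads(truncated)
except Exception as exc:
    # The loader runs out of data early, just like torch's legacy loader above.
    print(type(exc).__name__)
```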