
Failed to load the extension "coqui_tts" #4718

Closed
TerryZWZ opened this issue Nov 24, 2023 · 1 comment
Labels
bug Something isn't working

Comments


TerryZWZ commented Nov 24, 2023

Describe the bug

I am getting an error when starting start_windows.bat with the coqui_tts extension enabled. I have already tried redownloading a fresh copy of text-generation-webui, and I have tested TTS from the command line, which produces .wav files normally.

The error is RuntimeError: PytorchStreamReader failed reading zip archive: failed finding central directory.

What steps do I need to take to fix this error?
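For context, a "failed finding central directory" error from PytorchStreamReader usually means the checkpoint file is a truncated or corrupted download, because checkpoints written by torch.save() are zip archives. A minimal sketch for checking the file before re-downloading (the cache path below is an assumption; Coqui TTS stores models in a per-user data directory that varies by OS):

```python
import os
import zipfile

def checkpoint_looks_valid(path: str) -> bool:
    """Return True if `path` exists and is a readable zip archive.

    Checkpoints written by torch.save() are zip files, so a missing
    "central directory" usually means the file was cut off mid-download.
    """
    return os.path.isfile(path) and zipfile.is_zipfile(path)

# Hypothetical cache location -- adjust to wherever XTTS stored model.pth:
model_file = os.path.join(
    os.environ.get("LOCALAPPDATA", ""),
    "tts", "tts_models--multilingual--multi-dataset--xtts_v2", "model.pth",
)
print(checkpoint_looks_valid(model_file))
```

If this prints False, the usual remedy is to delete the model folder so the extension re-downloads the checkpoint on the next start.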

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

I followed all the steps from the link below, using the one-click installer on Windows 11:
https://www.reddit.com/r/Oobabooga/comments/1807tsl/new_builtin_extension_coqui_tts_runs_the_new/

Screenshot

No response

Logs

[XTTS] Loading XTTS...
 > tts_models/multilingual/multi-dataset/xtts_v2 is already downloaded.
 > Using model: xtts
2023-11-23 23:55:16 ERROR:Failed to load the extension "coqui_tts".
Traceback (most recent call last):
  File "B:\AI\text-generation-webui\modules\extensions.py", line 41, in load_extensions
    extension.setup()
  File "B:\AI\text-generation-webui\extensions\coqui_tts\script.py", line 180, in setup
    model = load_model()
            ^^^^^^^^^^^^
  File "B:\AI\text-generation-webui\extensions\coqui_tts\script.py", line 76, in load_model
    model = TTS(params["model_name"]).to(params["device"])
            ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "B:\AI\text-generation-webui\installer_files\env\Lib\site-packages\TTS\api.py", line 81, in __init__
    self.load_tts_model_by_name(model_name, gpu)
  File "B:\AI\text-generation-webui\installer_files\env\Lib\site-packages\TTS\api.py", line 185, in load_tts_model_by_name
    self.synthesizer = Synthesizer(
                       ^^^^^^^^^^^^
  File "B:\AI\text-generation-webui\installer_files\env\Lib\site-packages\TTS\utils\synthesizer.py", line 109, in __init__
    self._load_tts_from_dir(model_dir, use_cuda)
  File "B:\AI\text-generation-webui\installer_files\env\Lib\site-packages\TTS\utils\synthesizer.py", line 164, in _load_tts_from_dir
    self.tts_model.load_checkpoint(config, checkpoint_dir=model_dir, eval=True)
  File "B:\AI\text-generation-webui\installer_files\env\Lib\site-packages\TTS\tts\models\xtts.py", line 755, in load_checkpoint
    checkpoint = self.get_compatible_checkpoint_state_dict(model_path)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "B:\AI\text-generation-webui\installer_files\env\Lib\site-packages\TTS\tts\models\xtts.py", line 705, in get_compatible_checkpoint_state_dict
    checkpoint = load_fsspec(model_path, map_location=torch.device("cpu"))["model"]
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "B:\AI\text-generation-webui\installer_files\env\Lib\site-packages\TTS\utils\io.py", line 86, in load_fsspec
    return torch.load(f, map_location=map_location, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "B:\AI\text-generation-webui\installer_files\env\Lib\site-packages\torch\serialization.py", line 993, in load
    with _open_zipfile_reader(opened_file) as opened_zipfile:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "B:\AI\text-generation-webui\installer_files\env\Lib\site-packages\torch\serialization.py", line 447, in __init__
    super().__init__(torch._C.PyTorchFileReader(name_or_buffer))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: PytorchStreamReader failed reading zip archive: failed finding central directory
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.

System Info

Windows 11
NVIDIA GeForce RTX 3060 12GB
TerryZWZ added the bug label Nov 24, 2023
erew123 (Contributor) commented Nov 24, 2023

I notice you have installed on the B: drive, a drive letter that DOS historically reserved for the second floppy drive back in the 1980s.

I'm assuming, therefore, that B: is a virtual drive of some kind rather than a physical disk partition?

If so, that may well be causing the issue. Try putting at least the install on your C:, D:, E:, etc. drive.

You can still keep your models folder on the B: drive by editing the CMD_FLAGS.txt file and adding the --model-dir flag,
e.g. --model-dir B:\AI\models (or wherever your models folder is)

https://github.com/oobabooga/text-generation-webui#basic-settings
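Assuming text-generation-webui's documented --model-dir flag and an illustrative path, CMD_FLAGS.txt in the webui folder would then contain a single line:

```text
--model-dir B:\AI\models
```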
