
After downloading 2.4 GB of data and restarting the UI, the language selection did not appear. #4

Closed
Torbat opened this issue Mar 29, 2023 · 4 comments

Comments


Torbat commented Mar 29, 2023

Hi,

[Screenshot: 015 - Stable Diffusion - 127.0.0.1 - 2023-03-29_18-08-16]

```
Building translator
Loading generator
Loading tokenizer
Traceback (most recent call last):
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\gradio\routes.py", line 394, in run_predict
    output = await app.get_blocks().process_api(
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\gradio\blocks.py", line 1075, in process_api
    result = await self.call_function(
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\gradio\blocks.py", line 884, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\anyio\to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "E:\Program Files\Stable_Diffusion\stable-diffusion-webui\extensions\prompt_translator\scripts\main.py", line 206, in set_active
    self.translator = MBartTranslator()
  File "E:\Program Files\Stable_Diffusion\stable-diffusion-webui\extensions\prompt_translator\scripts\main.py", line 99, in __init__
    self.tokenizer = MBart50TokenizerFast.from_pretrained(model_name, src_lang=src_lang, tgt_lang=tgt_lang)
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\transformers\tokenization_utils_base.py", line 1804, in from_pretrained
    return cls._from_pretrained(
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\transformers\tokenization_utils_base.py", line 1959, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\transformers\models\mbart50\tokenization_mbart50_fast.py", line 135, in __init__
    super().__init__(
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\transformers\tokenization_utils_fast.py", line 114, in __init__
    fast_tokenizer = convert_slow_tokenizer(slow_tokenizer)
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\transformers\convert_slow_tokenizer.py", line 1162, in convert_slow_tokenizer
    return converter_class(transformer_tokenizer).converted()
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\transformers\convert_slow_tokenizer.py", line 438, in __init__
    from .utils import sentencepiece_model_pb2 as model_pb2
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\transformers\utils\sentencepiece_model_pb2.py", line 92, in <module>
    _descriptor.EnumValueDescriptor(
  File "e:\Program Files\Stable_Diffusion\stable-diffusion-webui\venv\lib\site-packages\google\protobuf\descriptor.py", line 796, in __new__
    _message.Message._CheckCalledFromGeneratedFile()
TypeError: Descriptors cannot not be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
 1. Downgrade the protobuf package to 3.20.x or lower.
 2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).
```
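
For what it's worth, a minimal sketch of the second workaround, i.e. forcing the pure-Python protobuf implementation before `transformers` is imported (where exactly this would go in the webui startup is an assumption; the other option is downgrading protobuf inside the webui venv, e.g. `pip install "protobuf==3.20.3"`):

```python
import os

# Workaround 2 from the error message above: force the pure-Python protobuf
# implementation. It only takes effect if set before protobuf/transformers
# are imported, so placing it at the top of the entry script is an assumption.
os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"

from transformers import MBart50TokenizerFast  # imported only after the variable is set

# The call that failed in the traceback above; the checkpoint name here is an assumption.
tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50-many-to-many-mmt")
print("tokenizer loaded without the descriptor error")
```
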
ParisNeo (Owner) commented

Thank you for your message.
It seems to be a problem with your protobuf version. How did you install the webui app? Are you using a custom Python interpreter that you installed yourself, or your own virtual environment or conda environment?

If you use the installer provided with the tool, it should work. I have tested it on two different PCs and I didn't have any trouble with that.
Normally, after downloading the model and the tokenizer, you should see:
Translator ready
On the UI, you should see the list of languages.

The process takes some time to download at first.

My tool uses Facebook's MBart translator model, which lets you translate offline once it is downloaded. I didn't want to use Google because it requires a connection and I would have to pay to use it. Since I'm doing this for free, I chose an offline model.
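
For reference, offline translation with an MBart-50 checkpoint looks roughly like the sketch below in plain `transformers`; the exact checkpoint name and language pair the extension uses are assumptions here.

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

# Assumed checkpoint; the extension's actual model_name may differ.
model_name = "facebook/mbart-large-50-many-to-many-mmt"
model = MBartForConditionalGeneration.from_pretrained(model_name)
tokenizer = MBart50TokenizerFast.from_pretrained(model_name, src_lang="fr_XX")

# Translate a French prompt to English, fully offline once the weights are cached.
encoded = tokenizer("un chat noir assis sur un banc", return_tensors="pt")
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"],  # target language token
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```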

Torbat commented Mar 29, 2023

@ParisNeo
I installed it using [EmpireMediaScience/A1111-Web-UI-Installer: Complete installer for Automatic1111's infamous Stable Diffusion WebUI](https://github.com/EmpireMediaScience/A1111-Web-UI-Installer).

As far as I understand, your code uses an older version of protobuf?

> If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
> If you cannot immediately regenerate your protos, some other possible workarounds are:
>  1. Downgrade the protobuf package to 3.20.x or lower.
>  2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).

More information: https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates

I will try to reinstall webui app.

Torbat commented Mar 29, 2023

After reinstalling the webui app, it now works.

I think the problem was that I had Python 3.10.9.
Now it's Python 3.10.6.

Now I can test this extension.
I think using Facebook's MBart translator model is an interesting idea.

Torbat closed this as completed Mar 29, 2023
ParisNeo (Owner) commented

I hope you like it. This evening, I am going to do some upgrades to solve some issues. Thank you for testing this extension :)
