
PGPT doesn't see Torch, TensorFlow, or Flax on startup, will not ingest files correctly #1959

Closed
maximumquacks opened this issue Jun 4, 2024 · 3 comments
Labels
needs confirmation A potential bug that needs to be confirmed

Comments

@maximumquacks

Reposting/moving this from pgpt-python


- Using WSL.
- Running vanilla Ollama with the default config; no issues with Ollama itself.
- pyenv Python 3.11.9 installed and running, with PyTorch, TensorFlow, and Flax added.
- All install steps followed without error.

Here is what results:

~/private-gpt$ PGPT_PROFILES=ollama make run
poetry run python -m private_gpt
15:23:33.547 [INFO ] private_gpt.settings.settings_loader - Starting application with profiles=['default', 'ollama']
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
--- Logging error ---
Traceback (most recent call last):
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 798, in get
return self._context[key]

KeyError: <class 'private_gpt.ui.ui.PrivateGptUi'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 798, in get
return self._context[key]
~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.server.ingest.ingest_service.IngestService'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 798, in get
return self._context[key]
~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.components.llm.llm_component.LLMComponent'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 270, in hf_raise_for_status
response.raise_for_status()
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/transformers/utils/hub.py", line 398, in cached_file
resolved_file = hf_hub_download(
^^^^^^^^^^^^^^^^
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1374, in hf_hub_download
raise head_call_error
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1247, in hf_hub_download
metadata = get_hf_file_metadata(
^^^^^^^^^^^^^^^^^^^^^
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1624, in get_hf_file_metadata
r = _request_wrapper(
^^^^^^^^^^^^^^^^^
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 402, in _request_wrapper
response = _request_wrapper(
^^^^^^^^^^^^^^^^^
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 426, in _request_wrapper
hf_raise_for_status(response)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 286, in hf_raise_for_status
raise GatedRepoError(message, response) from e
huggingface_hub.utils._errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-665e1838-1fd0ea9e164d5e2406f3086b;e0342b1a-f0b1-482f-a2ab-c1336bdbdf04)

Cannot access gated repo for url https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2/resolve/main/config.json.
Access to model mistralai/Mistral-7B-Instruct-v0.2 is restricted. You must be authenticated to access it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/me/private-gpt/private_gpt/components/llm/llm_component.py", line 30, in __init__
AutoTokenizer.from_pretrained(
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 782, in from_pretrained
config = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 1111, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/transformers/configuration_utils.py", line 633, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/transformers/configuration_utils.py", line 688, in _get_config_dict
resolved_config_file = cached_file(
^^^^^^^^^^^^
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/transformers/utils/hub.py", line 416, in cached_file
raise EnvironmentError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2.
401 Client Error. (Request ID: Root=1-665e1838-1fd0ea9e164d5e2406f3086b;e0342b1a-f0b1-482f-a2ab-c1336bdbdf04)

Cannot access gated repo for url https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2/resolve/main/config.json.
Access to model mistralai/Mistral-7B-Instruct-v0.2 is restricted. You must be authenticated to access it.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/lib/python3.11/logging/__init__.py", line 1110, in emit
msg = self.format(record)
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/logging/__init__.py", line 953, in format
return fmt.format(record)
^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/logging/__init__.py", line 687, in format
record.message = record.getMessage()
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/logging/__init__.py", line 377, in getMessage
msg = msg % self.args
~~~~^~~~~~~~~~~
TypeError: not all arguments converted during string formatting
Call stack:
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home/me/private-gpt/private_gpt/__main__.py", line 5, in <module>
from private_gpt.main import app
File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
File "<frozen importlib._bootstrap>", line 1149, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 940, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/home/me/private-gpt/private_gpt/main.py", line 6, in <module>
app = create_app(global_injector)
File "/home/me/private-gpt/private_gpt/launcher.py", line 63, in create_app
ui = root_injector.get(PrivateGptUi)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 974, in get
provider_instance = scope_instance.get(interface, binding.provider)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 800, in get
instance = self._get_instance(key, provider, self.injector)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 811, in _get_instance
return provider.get(injector)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 264, in get
return injector.create_object(self.cls)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 998, in create_object
self.call_with_injection(init, self=instance, kwargs=additional_kwargs)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 1031, in call_with_injection
dependencies = self.args_to_inject(
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 1079, in args_to_inject
instance: Any = self.get(interface)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 974, in get
provider_instance = scope_instance.get(interface, binding.provider)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 800, in get
instance = self._get_instance(key, provider, self.injector)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 811, in _get_instance
return provider.get(injector)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 264, in get
return injector.create_object(self.cls)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 998, in create_object
self.call_with_injection(init, self=instance, kwargs=additional_kwargs)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 1031, in call_with_injection
dependencies = self.args_to_inject(
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 1079, in args_to_inject
instance: Any = self.get(interface)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 974, in get
provider_instance = scope_instance.get(interface, binding.provider)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 800, in get
instance = self._get_instance(key, provider, self.injector)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 811, in _get_instance
return provider.get(injector)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 264, in get
return injector.create_object(self.cls)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 998, in create_object
self.call_with_injection(init, self=instance, kwargs=additional_kwargs)
File "/home/me/.cache/pypoetry/virtualenvs/private-gpt-RKtlENRP-py3.11/lib/python3.11/site-packages/injector/__init__.py", line 1040, in call_with_injection
return callable(*full_args, **dependencies)
File "/home/me/private-gpt/private_gpt/components/llm/llm_component.py", line 37, in __init__
logger.warning(
Message: 'Failed to download tokenizer %s. Falling back to default tokenizer.'
Arguments: ('mistralai/Mistral-7B-Instruct-v0.2', OSError('You are trying to access a gated repo.\nMake sure to have access to it at https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2.\n401 Client Error. (Request ID: Root=1-665e1838-1fd0ea9e164d5e2406f3086b;e0342b1a-f0b1-482f-a2ab-c1336bdbdf04)\n\nCannot access gated repo for url https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2/resolve/main/config.json.\nAccess to model mistralai/Mistral-7B-Instruct-v0.2 is restricted. You must be authenticated to access it.'))
15:23:35.428 [INFO ] private_gpt.components.llm.llm_component - Initializing the LLM in mode=ollama
15:23:35.860 [INFO ] private_gpt.components.embedding.embedding_component - Initializing the embedding model in mode=ollama
15:23:35.861 [INFO ] llama_index.core.indices.loading - Loading all indices.
15:23:36.101 [INFO ] private_gpt.ui.ui - Mounting the gradio UI, at path=/
15:23:36.125 [INFO ] uvicorn.error - Started server process [19799]
15:23:36.125 [INFO ] uvicorn.error - Waiting for application startup.
15:23:36.126 [INFO ] uvicorn.error - Application startup complete.
15:23:36.126 [INFO ] uvicorn.error - Uvicorn running on http://0.0.0.0:8001/ (Press CTRL+C to quit)
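As an aside, the secondary "--- Logging error ---" block above is unrelated to the gated repo itself: the warning call in llm_component.py passes two arguments to a %-style template that has only one %s placeholder, which makes the logging module raise a TypeError while formatting the record. A minimal standalone sketch of that failure mode (not PrivateGPT's actual code):

```python
# Minimal reproduction of the "--- Logging error ---" TypeError above:
# a %-style format string with one placeholder but two arguments.
template = "Failed to download tokenizer %s. Falling back to default tokenizer."

try:
    template % ("mistralai/Mistral-7B-Instruct-v0.2", OSError("gated repo"))
except TypeError as exc:
    print(exc)  # not all arguments converted during string formatting
```

The logging-side fix would be a second %s in the template (or passing exc_info instead of the exception object). Separately, the 401 itself only goes away once you have requested access to the gated model on huggingface.co and authenticated locally, e.g. with huggingface-cli login.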
@Rubiel1

Rubiel1 commented Jun 18, 2024

I have the same problem on Fedora Linux 40.

@Pierrelouis2

Hello, I ran into the same problem (None of PyTorch, TensorFlow >= 2.0, or Flax have been found).
I activated the project's virtual env on Ubuntu with source .cache/pypoetry/virtualenvs/private-gpt/bin/activate and then ran pip install torch.
That worked for me, but I don't think it's the best solution.
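Whether that workaround actually landed in the environment PrivateGPT uses can be checked directly: at import time, transformers probes for the three frameworks and prints the "None of PyTorch, TensorFlow >= 2.0, or Flax have been found" warning if none is importable. A rough importlib-based approximation of that probe (a sketch, not transformers' exact code):

```python
import importlib.util


def ml_framework_available() -> bool:
    """Return True if PyTorch, TensorFlow, or Flax is importable here.

    Approximates the check transformers runs at import time; when all
    three probes fail, it emits the startup warning seen above.
    """
    return any(
        importlib.util.find_spec(name) is not None
        for name in ("torch", "tensorflow", "flax")
    )


print(ml_framework_available())
```

Running this inside the poetry virtualenv (e.g. with poetry run python) shows whether torch is visible to the interpreter PrivateGPT starts with, rather than to some other Python on the system.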

@jaluma jaluma added the needs confirmation A potential bug that needs to be confirmed label Jul 8, 2024
@jaluma
Collaborator

jaluma commented Jul 8, 2024

Can you try again? I just tried the latest version and it worked fine :) If it was a version problem, it should be fixed by #1987. If not, can you give us more details so we can try to reproduce it?

@jaluma jaluma closed this as completed Jul 10, 2024
4 participants