Describe the bug
Hi,
I'm new to the LLM application space, getting familiar with the tools in this domain, and I'm trying out Haystack's Hugging Face integration.
Running the example code published at https://docs.haystack.deepset.ai/docs/huggingfacetgigenerator#in-a-pipeline in a Poetry (.venv) environment with haystack-ai 2.0.0, transformers 4.39.3, and torch 2.2.2 installed fails with an ImportError.
Error message
Traceback (most recent call last):
File "C:\pyprojects\AIprjs\LLM\Haystack\pipelines\.venv\lib\site-packages\haystack\components\generators\hugging_face_tgi.py", line 13, in <module>
from huggingface_hub.inference._text_generation import TextGenerationResponse, TextGenerationStreamResponse, Token
ModuleNotFoundError: No module named 'huggingface_hub.inference._text_generation'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "c:/pyprojects/AIprjs/LLM/Haystack/pipelines/rag.py", line 31, in <module>
pipe.add_component("llm", HuggingFaceTGIGenerator(model="mistralai/Mistral-7B-v0.1", token=Secret.from_token("hf_***")))
File "C:\pyprojects\AIprjs\LLM\Haystack\pipelines\.venv\lib\site-packages\haystack\core\component\component.py", line 132, in __call__
instance = super().__call__(*args, **kwargs)
File "C:\pyprojects\AIprjs\LLM\Haystack\pipelines\.venv\lib\site-packages\haystack\components\generators\hugging_face_tgi.py", line 99, in __init__
transformers_import.check()
File "C:\pyprojects\AIprjs\LLM\Haystack\pipelines\.venv\lib\site-packages\lazy_imports\try_import.py", line 107, in check
raise ImportError(message) from exc_value
ImportError: Failed to import 'huggingface_hub.inference._text_generation'. Run 'pip install transformers'. Original error: No module named 'huggingface_hub.inference._text_generation'
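For context, huggingface_hub 0.22.0 removed the private huggingface_hub.inference._text_generation module that haystack-ai 2.0.0 imports lazily, which is why the failure only surfaces when the component is instantiated. You can probe for the module yourself before building a pipeline; this is a stdlib-only sketch (the helper name module_available is my own, not part of Haystack):

```python
import importlib.util


def module_available(dotted_name: str) -> bool:
    """Return True if a (possibly nested) module can be located."""
    try:
        return importlib.util.find_spec(dotted_name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package in the dotted path is missing entirely.
        return False


# Probe the module that HuggingFaceTGIGenerator needs. If this prints False,
# your huggingface_hub no longer ships it and the component will raise the
# ImportError shown above.
print(module_available("huggingface_hub.inference._text_generation"))
```

Note that the misleading "Run 'pip install transformers'" hint comes from the lazy-import wrapper, not from the actual missing package.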
Expected behavior
The pipeline runs seamlessly and produces the result.
Thanks for reporting the issue.
Simply put, the latest version of huggingface_hub broke several things, including our components. 🙂
We are tracking this in #7417 and #7418, and fixed the problems in #7425.
In the next few days we will release haystack 2.0.1, which includes the bugfix.
In the meantime, you can pin the previous version of huggingface_hub in your environment: pip install "huggingface_hub<0.22.0".
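Since the reporter uses Poetry, the equivalent pin would presumably be poetry add "huggingface_hub<0.22.0". To sanity-check that the pin took effect, you can compare the installed version against the bound; here is a minimal sketch with a hand-rolled comparison (it handles only plain dotted numeric versions, not pre-releases; packaging.version is the robust choice if available):

```python
from importlib.metadata import version


def below(installed: str, upper: str) -> bool:
    """True if a plain dotted numeric version is strictly below the bound."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) < as_tuple(upper)


# Verify the pin in a live environment (assumes huggingface_hub is installed):
# hub_version = version("huggingface_hub")
# assert below(hub_version, "0.22.0"), f"still on {hub_version}, pin not applied"

print(below("0.21.4", "0.22.0"))  # → True, a version the pin allows
print(below("0.24.1", "0.22.0"))  # → False, the version tried later in this thread
```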
Thanks for the quick and kind response.
I've tried huggingface_hub (0.24.1) in a Colab notebook, and it worked like a charm :)
Looking forward to the next/fixed release.
All the best
To Reproduce
Just run the script as is!