Issue: HuggingFaceEmbeddings can not take trust_remote_code argument #6080
Comments
I also hit the same issue. |
I fixed it with this code:

```python
from langchain.llms import HuggingFacePipeline
from langchain.memory.buffer import ConversationBufferMemory
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline, AutoModelForSeq2SeqLM

local_llm = HuggingFacePipeline.from_model_id(
    model_id="chatglm-6b-int4",
    task="text-generation",
    model_kwargs={
        "temperature": 0,
        "max_length": 64,
        "trust_remote_code": True,
    },
)
print(local_llm("What is the capital of France? "))
```
 |
The above solution doesn't work for `from langchain.embeddings import HuggingFaceEmbeddings`:

```python
from langchain.embeddings import HuggingFaceEmbeddings

model_name = "jinaai/jina-embeddings-v2-small-en"
model_kwargs = {"device": "cpu", "trust_remote_code": True}
encode_kwargs = {
    "normalize_embeddings": False,
}

hf = HuggingFaceEmbeddings(
    model_name=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
)

text = "This is a test document."
res = hf.embed_query(text)
print(len(res))
```

This results in the following error:
|
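The failure above can be illustrated with a pure-Python stub (hypothetical, simplified signatures; the real class is `sentence_transformers.SentenceTransformer`): LangChain forwards `model_kwargs` straight into the underlying constructor, so any version whose `__init__` has no `trust_remote_code` parameter raises a `TypeError` for an unexpected keyword argument.

```python
# Stub of the old constructor (no trust_remote_code parameter, as in
# sentence-transformers before the parameter was added):
def old_init(model_name, device=None):
    return {"model_name": model_name, "device": device}

# Stub of the newer constructor, which accepts the extra keyword:
def new_init(model_name, device=None, trust_remote_code=False):
    return {"model_name": model_name, "device": device,
            "trust_remote_code": trust_remote_code}

# LangChain forwards model_kwargs unchanged into the constructor:
kwargs = {"device": "cpu", "trust_remote_code": True}

try:
    old_init("jinaai/jina-embeddings-v2-small-en", **kwargs)
except TypeError as e:
    print("old version fails:", e)

print("new version works:", new_init("jinaai/jina-embeddings-v2-small-en", **kwargs))
```

This is why upgrading the underlying library (rather than changing the LangChain call) makes the error disappear.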
Facing the same issue! Any solution for this? |
Has anyone found a solution to this? Seems like we should be able to pass `trust_remote_code` into the kwargs of HuggingFaceEmbeddings. |
Found the following workaround for the problem: load the embedding model via the transformers classes instead. |
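A sketch of that kind of workaround (assumptions: the model name from earlier in the thread, mean pooling as the aggregation step; the `mean_pool` helper is mine, not part of any library):

```python
def mean_pool(token_vectors, mask):
    """Average the token vectors whose mask entry is 1 (pure-Python mean pooling)."""
    dim = len(token_vectors[0])
    summed = [0.0] * dim
    count = 0
    for vec, m in zip(token_vectors, mask):
        if m:
            count += 1
            for i, v in enumerate(vec):
                summed[i] += v
    return [s / count for s in summed]

if __name__ == "__main__":
    # Requires `pip install transformers torch` and network access; commented
    # out here so the pooling helper above stays runnable on its own:
    # from transformers import AutoModel, AutoTokenizer
    # name = "jinaai/jina-embeddings-v2-small-en"
    # tok = AutoTokenizer.from_pretrained(name)
    # model = AutoModel.from_pretrained(name, trust_remote_code=True)
    # ...tokenize, run the model, then mean-pool the token embeddings...
    print(mean_pool([[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]], [1, 1, 0]))
```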
The workaround is good, but a fix for the underlying issue would be welcome. |
is this fixed? |
Same issue here, still waiting for the fix :) |
Same issue! Need a fix for this too. |
Performance is the same when loading the embeddings model with `from transformers import AutoModel` instead of via `model_name = "PATH_TO_LOCAL_EMBEDDING_MODEL_FOLDER"`, but I noticed that some embeddings come out with slightly different values, so being able to enable `trust_remote_code=True` would be appreciated! |
Updating sentence-transformers to >=2.3.1 works for me. |
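If you want to guard on that at runtime, a minimal sketch (assuming, per the comment above, that versions from 2.3.1 onward accept the argument; the helper name is hypothetical):

```python
def supports_trust_remote_code(version: str) -> bool:
    """Return True if a dotted sentence-transformers version string is >= 2.3.1.

    Simplified: handles plain numeric versions like "2.3.1", not suffixes
    such as "2.3.1.post1".
    """
    parts = tuple(int(p) for p in version.split("."))
    return parts >= (2, 3, 1)

print(supports_trust_remote_code("2.2.2"))  # False
print(supports_trust_remote_code("2.3.1"))  # True
print(supports_trust_remote_code("2.4.0"))  # True
```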
This worked for me...
|
Are you using the LangChain HuggingFace embeddings here? |
Yes |
Worked fine for me. As mentioned earlier, be sure to update the versions being used; I've used 2.4 models with no issues. |
|
I am still facing another error which seems similar. |