Checked other resources
I searched the LangChain documentation with the integrated search.
I used the GitHub search to find a similar question and didn't find it.
I am sure that this is a bug in LangChain rather than my code.
The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
Example Code
```python
# This code works fine with a locally deployed embedding model served by
# LM Studio's OpenAI-compatible API.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8999/v1", api_key="lm-studio")

def get_embeddings(texts, model="nomic-ai/nomic-embed-text-v1.5-GGUF"):
    texts = [text.replace("\n", " ") for text in texts]
    return client.embeddings.create(input=texts, model=model).data

print(get_embeddings(["how to find out how LLM applications are performing in real-world scenarios?"]))
```
This is what the server logs, and it returns embedding data to the client:
[2024-05-06 13:51:34.227] [INFO] Received POST request to /v1/embeddings with body:
```json
{
  "input": [
    "how to find out how LLM applications are performing in real-world scenarios?"
  ],
  "model": "nomic-ai/nomic-embed-text-v1.5-GGUF",
  "encoding_format": "base64"
}
```
```python
# However, if I switch to OpenAIEmbeddings, this code does not work.
from langchain_openai import OpenAIEmbeddings

embeddings = OpenAIEmbeddings(
    openai_api_key="sk-1234",
    base_url="http://localhost:8999/v1",
    model="nomic-ai/nomic-embed-text-v1.5-GGUF",
)
test = embeddings.embed_query("how to find out how LLM applications are performing in real-world scenarios?")
```
# this is what I see on the server side:
[2024-05-06 13:52:08.629] [INFO] Received POST request to /v1/embeddings with body:
```json
{
  "input": [
    [5269, 311, 1505, 704, 1268, 445, 11237, 8522, 527, 16785, 304, 1972, 31184, 26350, 30]
  ],
  "model": "nomic-ai/nomic-embed-text-v1.5-GGUF",
  "encoding_format": "base64"
}
```
Error Message and Stack Trace (if applicable)
Error on the server side:
[ERROR] 'input' field must be a string or an array of strings
Description
I encountered an issue with the langchain_openai library: using OpenAIEmbeddings to embed a text query produces a malformed POST payload for this endpoint. The comparison above shows the difference: the raw openai client sends `"input"` as an array of strings, while OpenAIEmbeddings pre-tokenizes the text and sends an array of token-id arrays, which LM Studio rejects.
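The token-id payload above comes from OpenAIEmbeddings pre-tokenizing the input: with the default check_embedding_ctx_length=True it encodes each text with tiktoken before sending, so the server sees lists of token ids. A minimal sketch of the two payload shapes, using a hypothetical build_payload helper and a stub tokenizer in place of tiktoken (this is not LangChain's actual code, just an illustration):

```python
def build_payload(texts, model, check_embedding_ctx_length=True):
    """Sketch of the embeddings request body; stub tokenizer, not tiktoken."""
    stub_encode = lambda s: [ord(c) for c in s]  # stand-in for a real tokenizer
    if check_embedding_ctx_length:
        # default path: each text is pre-tokenized into a list of token ids
        inp = [stub_encode(t) for t in texts]
    else:
        # workaround path: raw strings are sent, as LM Studio expects
        inp = texts
    return {"input": inp, "model": model}

payload = build_payload(["hello"], "nomic-embed", check_embedding_ctx_length=False)
print(payload["input"])  # ['hello'] — an array of strings, accepted by the server
```

With check_embedding_ctx_length=False set on the OpenAIEmbeddings constructor, LangChain sends the raw strings, which is the shape LM Studio's /v1/embeddings endpoint accepts.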
System Info
System Information
OS: Windows
OS Version: 10.0.22631
Python Version: 3.12.3 (tags/v3.12.3:f6650f9, Apr 9 2024, 14:05:25) [MSC v.1938 64 bit (AMD64)]
Packages not installed (Not Necessarily a Problem)
The following packages were not found:
langgraph
langserve
I have the same issue, and it does not recognize the suggested parameter:
`Embeddings.create() got an unexpected keyword argument 'check_embedding_ctx_length'`
hey @ticoAg, thanks for the suggestion!
The check_embedding_ctx_length=False workaround worked for me, but is this going to be fixed? It looks like a compatibility issue.