DOC: Following Quick Start and I'm facing a lot of issues #1271
Comments
To reproduce the error, I'm using:
Thanks for reporting the issue @VIGNESHinZONE. This seems to be happening due to a mismatch between the dimensions of the open-source and OpenAI embedding models. You can resolve this by changing the way you create the OpenAI-based app, like this:

from embedchain import App

config = {
    "app": {
        "collection_name": "openai-model",
        "id": "my-app-id"
    }
}
app = App.from_config(config=config)
app.add("https://www.forbes.com/profile/elon-musk")
app.add("https://en.wikipedia.org/wiki/Elon_Musk")
app.query("What is the net worth of Elon Musk today?")
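The dimension mismatch described above can be sketched in plain Python. This is only an illustration, not embedchain code: the 1536 and 384 dimensions are assumptions corresponding to OpenAI's text-embedding-ada-002 and a typical open-source sentence-transformer such as all-MiniLM-L6-v2, and `add_to_collection` is a hypothetical stand-in for a vector store's insert check.

```python
def add_to_collection(collection, vector):
    """Reject vectors whose dimension differs from the collection's."""
    if collection["dim"] is None:
        # First insert fixes the collection's dimension.
        collection["dim"] = len(vector)
    if len(vector) != collection["dim"]:
        raise ValueError(
            f"expected dimension {collection['dim']}, got {len(vector)}"
        )
    collection["vectors"].append(vector)

collection = {"dim": None, "vectors": []}
add_to_collection(collection, [0.0] * 384)       # open-source embedder: ok
try:
    add_to_collection(collection, [0.0] * 1536)  # OpenAI embedder: rejected
except ValueError as e:
    print(e)  # expected dimension 384, got 1536
```

This is why switching embedders while reusing the same collection fails; pointing the OpenAI app at its own collection (as in the config above) avoids the clash.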
@deshraj thanks for the quick response, but I'm getting a new error -
Hope I'm not missing anything. Also, would you know why Mistral is echoing the input query string back instead of an answer?
Ah my bad. Here is the correct code:

from embedchain import App

config = {
    "app": {
        "config": {
            "collection_name": "openai-model",
            "id": "my-app-id"
        }
    }
}
app = App.from_config(config=config)
app.add("https://www.forbes.com/profile/elon-musk")
app.add("https://en.wikipedia.org/wiki/Elon_Musk")
app.query("What is the net worth of Elon Musk today?")
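The difference between the two snippets is only where the app settings live: the first put collection_name and id directly under "app", while the corrected version nests them under app["config"]. A minimal sketch of why the first shape fails, assuming a loader that only reads config["app"]["config"] (the `app_settings` helper here is hypothetical, not embedchain's actual loader):

```python
# The two config shapes from the thread.
wrong = {"app": {"collection_name": "openai-model", "id": "my-app-id"}}
right = {"app": {"config": {"collection_name": "openai-model", "id": "my-app-id"}}}

def app_settings(config):
    # Hypothetical loader: settings outside app["config"] are never looked at.
    return config.get("app", {}).get("config", {})

print(app_settings(wrong))                      # {}  -- settings silently dropped
print(app_settings(right)["collection_name"])   # openai-model
```

With the wrong nesting the settings are silently ignored, so the app falls back to its defaults instead of raising an error.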
Can you try to remove the
But I fixed the error with a new config -
code -
Thanks. You probably want to delete the OpenAI key that you posted accidentally (if it's valid).
@deshraj thanks for the help. Also, I would like to know your take on the performance of Mistral.
Can you please reach out on our Slack for these kinds of queries? Thanks!
Issue with current documentation:
Having a very buggy experience.
Tried following Mistral-
Response -
This is basically the input to the LLM; why am I getting this as a response?
I tried using the OpenAI version and got these results -
I got this error -