Best way to use an OpenAI-compatible embedding API #11809

Here's the non-LangChain method for using OpenAI-like APIs:

```shell
pip install llama-index-llms-openai-like llama-index-embeddings-openai
```

```python
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai_like import OpenAILike

embed_model = OpenAIEmbedding(
    model="some model",  # use `model` instead of `model_name` -- janky I know
    api_base="...",
    api_key="fake",
    embed_batch_size=10,
)

llm = OpenAILike(
    model="my model",
    api_key="fake",
    api_base="...",
    # context window should match whatever llm you are using
    context_window=32000,
    # specifies whether or not to use the chat completions endpoint
    is_chat_model=True,
)
```
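Under the hood, `OpenAIEmbedding` just POSTs batches of text to the endpoint's `/v1/embeddings` route, so any server that speaks the OpenAI embeddings wire format works. A minimal sketch of the request payload such an endpoint expects (field names per the OpenAI embeddings API; the model name and texts are placeholders):

```python
import json

# Request body an OpenAI-compatible /v1/embeddings endpoint expects.
# "model" names the embedding model; "input" is a batch of texts,
# up to embed_batch_size items per request.
payload = {
    "model": "some model",
    "input": ["first chunk of text", "second chunk of text"],
}
body = json.dumps(payload)
print(body)
```

If your server rejects requests, comparing its logs against this payload shape is a quick way to tell whether the problem is the route, the model name, or the batch size.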

Answer selected by logan-markewich