So, I've been exploring ways to use the OpenAI.jl package's provider interface with TogetherAI, as mentioned in the README.md:
provider = OpenAI.OpenAIProvider(
    api_key = ENV["TOGETHER_API_KEY"],
    base_url = "https://api.together.xyz/v1"  # Must override the OpenAI base URL
)
I can use the create_chat function easily to chat with this model:
response = create_chat(
    provider,
    "meta-llama/Llama-3.3-70B-Instruct-Turbo-Free",  # Use a Together-compatible model name
    [Dict("role" => "user", "content" => "Write some ancient Mongolian poetry")]
)
println(response.response["choices"][begin]["message"]["content"])
However, there is no provider interface to the create_embeddings function, so I created my own:
import OpenAI: create_embeddings

# DEFAULT_TOGETHERAI_EMBEDDING_MODEL_ID is a constant I define elsewhere in my code.
function create_embeddings(
        provider::OpenAI.AbstractOpenAIProvider,
        input,
        model_id::String = DEFAULT_TOGETHERAI_EMBEDDING_MODEL_ID;
        http_kwargs::NamedTuple = NamedTuple(),
        kwargs...)
    return OpenAI.openai_request("embeddings",
        provider;
        method = "POST",
        http_kwargs = http_kwargs,
        model = model_id,
        input,
        kwargs...)
end
resp = create_embeddings(
    provider,
    # ENV["TOGETHER_API_KEY"],
    horror_movies_tidier.overview[1];
    # model_id = DEFAULT_TOGETHERAI_EMBEDDING_MODEL_ID
)
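For context, a sketch of how I then read the result, assuming Together's endpoint returns the standard OpenAI embeddings response shape (a "data" array whose entries carry an "embedding" vector) — I haven't verified this against every Together model:

```julia
# Assumes resp is the return value of the create_embeddings call above and
# that the response body follows the standard OpenAI embeddings schema.
embedding = resp.response["data"][1]["embedding"]
println(length(embedding))  # dimensionality of the returned embedding vector
```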
Maybe we should add this function to the library itself?
roryl23