CocoIndex 🥥 currently supports OpenAI and Ollama ✨ for using LLMs as part of the data pipeline: https://cocoindex.io/docs/ai/llm
Here is an example of how CocoIndex uses Ollama to extract structured information from PDFs:
https://cocoindex.io/blogs/cocoindex-ollama-structured-extraction-from-pdf/
We would like to add support for the Anthropic API: https://docs.anthropic.com/en/api/getting-started
The existing OpenAI support code is a good reference; the relevant pieces are linked in the steps below.
Steps:
- Update Rust code:
  - Add an `Anthropic` variant to the `LlmApiType` enum in Rust (lines 9 to 12 at 801ae8f; a sketch with the new variant appears after this list):

    ```rust
    pub enum LlmApiType {
        Ollama,
        OpenAi,
    }
    ```
  - Create an Anthropic client similar to the OpenAI client (see the skeleton after this list): https://github.com/cocoindex-io/cocoindex/blob/main/src/llm/openai.rs
  - Update the router to connect to the Anthropic client (lines 53 to 60 at 801ae8f; see the router sketch after this list):

    ```rust
    let client = match spec.api_type {
        LlmApiType::Ollama => {
            Box::new(ollama::Client::new(spec).await?) as Box<dyn LlmGenerationClient>
        }
        LlmApiType::OpenAi => {
            Box::new(openai::Client::new(spec).await?) as Box<dyn LlmGenerationClient>
        }
    };
    ```
- Add an `ANTHROPIC` value to the `LlmApiType` enum in Python (cocoindex/python/cocoindex/llm.py, lines 4 to 7 at 0fbfd3f):

  ```python
  class LlmApiType(Enum):
      """The type of LLM API to use."""

      OPENAI = "OpenAi"
      OLLAMA = "Ollama"
  ```
- Test with the existing manuals_llm_extraction example. You can add a few lines similar to the OpenAI ones in cocoindex/examples/manuals_llm_extraction/main.py (lines 93 to 95 at 555d328):

  ```python
  # Replace by this spec below, to use OpenAI API model instead of ollama
  # llm_spec=cocoindex.LlmSpec(
  #     api_type=cocoindex.LlmApiType.OPENAI, model="gpt-4o"),
  ```
- Update documentation: https://cocoindex.io/docs/ai/llm
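For the Rust enum step, the change might be as small as adding one variant. A minimal sketch, assuming the variant is simply named `Anthropic`:

```rust
// Sketch only: LlmApiType extended with the new variant.
pub enum LlmApiType {
    Ollama,
    OpenAi,
    Anthropic,
}
```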
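For the new client, a rough skeleton is sketched below, mirroring the shape implied by the router (`Client::new(spec).await?` returning something that implements `LlmGenerationClient`). The trait's real methods are not shown in this issue, so the `generate` signature, the `LlmSpec` fields, and the module imports here are assumptions; only the HTTP details follow Anthropic's documented Messages API (POST /v1/messages with `x-api-key` and `anthropic-version` headers).

```rust
// src/llm/anthropic.rs -- hypothetical skeleton, modeled on src/llm/openai.rs.
use anyhow::Result;
use async_trait::async_trait;

use super::{LlmGenerationClient, LlmSpec}; // assumed module layout

pub struct Client {
    api_key: String,
    model: String,
    http: reqwest::Client,
}

impl Client {
    // Mirrors the `Client::new(spec).await?` call sites in the router.
    pub async fn new(spec: LlmSpec) -> Result<Self> {
        Ok(Self {
            // Assumption: the key is read from an environment variable.
            api_key: std::env::var("ANTHROPIC_API_KEY")?,
            model: spec.model,
            http: reqwest::Client::new(),
        })
    }
}

// Assumption: the trait exposes a single async `generate` method;
// check src/llm/mod.rs for the real definition before implementing.
#[async_trait]
impl LlmGenerationClient for Client {
    async fn generate(&self, prompt: &str) -> Result<String> {
        // Anthropic Messages API call.
        let resp: serde_json::Value = self
            .http
            .post("https://api.anthropic.com/v1/messages")
            .header("x-api-key", &self.api_key)
            .header("anthropic-version", "2023-06-01")
            .json(&serde_json::json!({
                "model": self.model,
                "max_tokens": 1024,
                "messages": [{ "role": "user", "content": prompt }],
            }))
            .send()
            .await?
            .error_for_status()?
            .json()
            .await?;
        // The generated text comes back under content[0].text.
        Ok(resp["content"][0]["text"].as_str().unwrap_or_default().to_owned())
    }
}
```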
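Finally, the router gains one arm. Again a sketch, assuming the new module is named `anthropic`:

```rust
let client = match spec.api_type {
    LlmApiType::Ollama => {
        Box::new(ollama::Client::new(spec).await?) as Box<dyn LlmGenerationClient>
    }
    LlmApiType::OpenAi => {
        Box::new(openai::Client::new(spec).await?) as Box<dyn LlmGenerationClient>
    }
    LlmApiType::Anthropic => {
        Box::new(anthropic::Client::new(spec).await?) as Box<dyn LlmGenerationClient>
    }
};
```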