
[FEATURE] Integrate with Anthropic API #176

@badmonster0

Description


CocoIndex 🥥 currently supports OpenAI and Ollama ✨ for using LLMs as part of the data pipeline: https://cocoindex.io/docs/ai/llm

Here is an example of how CocoIndex uses Ollama to extract structured information from PDFs:
https://cocoindex.io/blogs/cocoindex-ollama-structured-extraction-from-pdf/

We would like to add support for the Anthropic API: https://docs.anthropic.com/en/api/getting-started

The related code that supports OpenAI can be used as a reference.

Steps:

  1. Update the Rust code to add an Anthropic client, mirroring the existing OpenAI support.
  2. Add an enum value to LlmApiType in Python:
    from enum import Enum

    class LlmApiType(Enum):
        """The type of LLM API to use."""
        OPENAI = "OpenAi"
        OLLAMA = "Ollama"
        ANTHROPIC = "Anthropic"  # new entry; the exact string should match the API type name expected by the Rust side
  3. Test with the existing manuals_llm_extraction example. You can add a few lines similar to the OpenAI ones (see the sketch after this list):
    # Replace with the spec below to use an OpenAI model instead of Ollama:
    # llm_spec=cocoindex.LlmSpec(
    #     api_type=cocoindex.LlmApiType.OPENAI, model="gpt-4o"),
  4. Update documentation: https://cocoindex.io/docs/ai/llm
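
For step 3, here is a minimal sketch of the corresponding Anthropic spec, assuming the ANTHROPIC enum value from step 2 and a placeholder model name (check Anthropic's model list for current IDs):

    import cocoindex

    # Hypothetical spec for the manuals_llm_extraction example once Anthropic
    # support lands. ANTHROPIC is the enum value added in step 2; the model
    # name "claude-3-5-sonnet-latest" is an assumption.
    llm_spec = cocoindex.LlmSpec(
        api_type=cocoindex.LlmApiType.ANTHROPIC,
        model="claude-3-5-sonnet-latest",
    )

Like the OpenAI path, the Anthropic client will also need credentials, e.g. an ANTHROPIC_API_KEY environment variable, depending on how the Rust client is configured.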


Labels

help wanted
