
# simple_ai_client

Unified AI provider library for Eiffel applications.

## Features

- **Multi-provider support**: Ollama (local), Claude, OpenAI

- **Vector embeddings**: Semantic similarity search with local computation

- **SQLite storage**: Persistent embedding store for error resolution patterns

## Dependencies

- `simple_json` - JSON parsing

- `simple_sql` - SQLite database access

- base - EiffelBase library

- time - Time library (for tests)

## Quick Start

### Chat Completion (Ollama)

```eiffel
local
  client: OLLAMA_CLIENT
  response: AI_RESPONSE
do
  create client.make
  response := client.chat ("Explain recursion in one sentence")
  if response.is_success then
    print (response.content)
  end
end
```

### Vector Embeddings

```eiffel
local
  client: OLLAMA_EMBEDDING_CLIENT
  response: AI_EMBEDDING_RESPONSE
  similarity: REAL_64
do
  create client.make

  -- Generate embeddings
  response := client.embed ("The cat sat on the mat")
  if response.is_success and then attached response.embedding as emb1 then
    response := client.embed ("A feline rested on the rug")
    if response.is_success and then attached response.embedding as emb2 then
      -- Compare (pure local math, no AI call)
      similarity := emb1.cosine_similarity (emb2)
      print ("Similarity: " + similarity.out) -- typically 0.85+
    end
  end
end
```
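`cosine_similarity` above is ordinary vector arithmetic, which is why comparing embeddings needs no AI call. A minimal Python sketch of the same computation, for reference (the function name here is illustrative, not this library's API):

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of magnitudes; result is in [-1, 1].
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Vectors pointing the same direction score ~1.0; orthogonal vectors score ~0.0.
print(cosine_similarity([1.0, 2.0], [2.0, 4.0]))  # ~1.0 (same direction)
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0 (orthogonal)
```

Because the comparison is local, only the two `embed` calls above touch Ollama.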

### Embedding Store (Error Resolution)

```eiffel
local
  db: SIMPLE_SQL_DATABASE
  client: OLLAMA_EMBEDDING_CLIENT
  store: AI_EMBEDDING_STORE
  matches: LIST [TUPLE [id: INTEGER; error_text: STRING_32; resolution_code: STRING_32; similarity: REAL_64]]
do
  create db.make ("eifmate.db")
  create client.make
  create store.make (db, client)

  -- Store a resolved error (one Ollama call)
  store.store_error_resolution (
    "VEVI: Feature `make' not found in class FOO",
    "Add creation procedure `make' to class FOO")

  -- Later: find similar errors (one Ollama call + local search)
  matches := store.find_similar_errors (
    "VEVI: Feature `default_create' not found in class BAR",
    0.7, -- threshold
    5) -- max results

  across matches as m loop
    print ("Similar error (similarity " + m.similarity.out + "):%N")
    print ("Resolution: " + m.resolution_code + "%N")
  end
end
```
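Conceptually, `find_similar_errors` embeds the query once, then scans the stored vectors locally, keeping matches at or above the threshold. A rough Python sketch of that scan, assuming a simple in-memory list in place of the SQLite table (names and row layout here are illustrative, not the library's actual schema):

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def find_similar(query_vec, stored, threshold, max_results):
    # stored: list of (id, error_text, resolution_code, embedding)
    scored = [(sid, text, code, cosine(query_vec, emb))
              for sid, text, code, emb in stored]
    hits = [row for row in scored if row[3] >= threshold]
    hits.sort(key=lambda row: row[3], reverse=True)  # best match first
    return hits[:max_results]

stored = [
    (1, "VEVI: Feature `make' not found", "Add creation procedure `make'", [1.0, 0.0]),
    (2, "Unrelated error", "Unrelated fix", [0.0, 1.0]),
]
print(find_similar([0.9, 0.1], stored, 0.7, 5))  # only the first row clears 0.7
```

The scan is linear over all stored rows, which is why search cost grows with the number of stored items but never triggers another model call.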

## Classes

| Class | Purpose |
|-------|---------|
| `AI_EMBEDDING` | Vector with similarity operations (cosine, Euclidean) |
| `AI_EMBEDDING_RESPONSE` | Response wrapper for embedding operations |
| `AI_EMBEDDING_STORE` | SQLite-backed semantic search storage |
| `OLLAMA_CLIENT` | Chat completions via Ollama |
| `OLLAMA_EMBEDDING_CLIENT` | Embeddings via the Ollama `/api/embeddings` endpoint |
| `AI_RESPONSE` | Response wrapper for chat operations |

## Embedding Models

Run `ollama pull <model>` to install one:

| Model | Dimensions | Notes |
|-------|------------|-------|
| nomic-embed-text | 768 | Recommended, good balance |
| mxbai-embed-large | 1024 | Highest quality |
| all-minilm | 384 | Fastest, smallest |

## Performance

- **Embedding generation**: ~100-500ms per text (Ollama API call)

- **Similarity search**: ~1ms per 1000 stored items (pure Eiffel math)

- **Storage**: ~6KB per embedding (768 dims as JSON TEXT)
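The storage figure is easy to sanity-check: 768 floats serialized as JSON text at modest precision land in roughly that range. A back-of-envelope check in Python (the 4-decimal rounding is an assumption about storage precision, not something this library guarantees):

```python
import json
import random

# A 768-dimensional vector with values rounded to 4 decimal places.
vec = [round(random.uniform(-1.0, 1.0), 4) for _ in range(768)]
size_bytes = len(json.dumps(vec))
print(size_bytes)  # roughly 6000-7000 bytes, in line with the ~6 KB figure
```

Higher serialization precision (more decimal places per value) would push the per-embedding footprint up proportionally.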

## License

MIT License - Copyright (c) 2025, Larry Rix
