
(embeddings-writing-plugins)=

# Writing plugins to add new embedding models

Read the {ref}`plugin tutorial <tutorial-model-plugin>` for details on how to develop and package a plugin.

This page shows an example plugin that implements and registers a new embedding model.

There are two components to an embedding model plugin:

1. An implementation of the `register_embedding_models()` hook, which takes a `register` callback function and calls it to register the new model with the LLM plugin system.
2. A class that extends the `llm.EmbeddingModel` abstract base class.

    The only required method on this class is `embed(text)`, which takes a string and returns a list of floating point numbers.

The following example uses the `sentence-transformers` package to provide access to the MiniLM-L6 embedding model.

```python
import llm
from sentence_transformers import SentenceTransformer


@llm.hookimpl
def register_embedding_models(register):
    model_id = "sentence-transformers/all-MiniLM-L6-v2"
    register(
        SentenceTransformerModel(model_id, model_id, 384), aliases=("all-MiniLM-L6-v2",)
    )


class SentenceTransformerModel(llm.EmbeddingModel):
    def __init__(self, model_id, model_name, embedding_size):
        self.model_id = model_id
        self.model_name = model_name
        self.embedding_size = embedding_size
        self._model = None

    def embed(self, text):
        if self._model is None:
            self._model = SentenceTransformer(self.model_name)
        return list(map(float, self._model.encode([text])[0]))
```
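
Before wiring this up as an installable plugin, it can be useful to sanity-check the class on its own. The following is a minimal sketch that instantiates `SentenceTransformerModel` directly, outside the plugin system; it assumes `sentence-transformers` is installed and will download the model weights on first use.

```python
# Minimal local check of SentenceTransformerModel, bypassing the plugin system.
# Assumes sentence-transformers is installed; weights download on first use.
model_id = "sentence-transformers/all-MiniLM-L6-v2"
model = SentenceTransformerModel(model_id, model_id, 384)

vector = model.embed("hello world")
print(len(vector))  # 384 dimensions for all-MiniLM-L6-v2
print(all(isinstance(v, float) for v in vector))  # True - plain Python floats
```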

Once installed, the model provided by this plugin can be used with the {ref}`llm embed <embeddings-llm-embed>` command like this:

```bash
cat file.txt | llm embed -m sentence-transformers/all-MiniLM-L6-v2
```

Or via its registered alias like this:

```bash
cat file.txt | llm embed -m all-MiniLM-L6-v2
```
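
The model can also be used from Python code. The sketch below uses the `llm.get_embedding_model()` helper from the Python API and assumes the plugin is installed in the same environment as `llm`:

```python
import llm

# Look up the plugin's model by its registered alias (or the full model ID)
# and embed a single string. Assumes the plugin is installed alongside llm.
model = llm.get_embedding_model("all-MiniLM-L6-v2")
vector = model.embed("hello world")
print(len(vector))  # 384 floating point numbers
```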