
Collection of examples of using the python llamacpp library


bs7280/py-llama-cpp-examples


Py Llamacpp examples

A repo storing examples of running local LLMs.

Setup

Setup will depend on your machine; everything here runs on an M1 Max with 64 GB of RAM.
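The repo doesn't list install steps, but a typical install of the llama-cpp-python package on Apple Silicon looks like the following (the Metal CMake flag name has varied between releases, so treat it as an assumption and check the project's install docs):

```shell
# Install llama-cpp-python with Metal acceleration on an M-series Mac.
# The CMAKE_ARGS flag is an assumption based on llama.cpp's build options.
CMAKE_ARGS="-DGGML_METAL=on" pip install llama-cpp-python
```

You will also need a GGUF model file downloaded locally (e.g. a quantized Mistral 7B) to run any of the examples.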

Useful links:

Models:

Usage:

Mistral JSON Schema - Basic example

  • mistral_json_schema.py - Demonstrates using a 7B model to generate valid output for a given JSON schema. A couple of things to note:
    • the output should always match the given JSON schema
    • the given JSON schema does NOT count towards the context limit
    • it uses a 7B model, which is pretty fast on my machine
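The script itself isn't reproduced here, but the core call can be sketched with llama-cpp-python's schema-constrained chat completion, which compiles the JSON schema into a grammar that constrains sampling. The model path, context size, and function name below are placeholders, not the repo's actual code:

```python
import json

# The JSON schema used in the example output below.
SCHEMA = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "Name and Age API",
    "type": "object",
    "properties": {
        "name": {"type": "string", "description": "The name of the person."},
        "age": {"type": "integer", "description": "The age of the person."},
    },
    "required": ["name", "age"],
}


def generate_json(prompt: str, schema: dict, model_path: str) -> dict:
    """Generate schema-conforming JSON with llama-cpp-python.

    response_format with a "schema" key tells the library to build a
    grammar from the schema and constrain token sampling to it.
    """
    from llama_cpp import Llama  # imported lazily; needs a local GGUF model

    llm = Llama(model_path=model_path, n_ctx=2048, verbose=False)
    result = llm.create_chat_completion(
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object", "schema": schema},
    )
    return json.loads(result["choices"][0]["message"]["content"])
```

Because the constraint is applied during sampling rather than prepended to the prompt, the schema itself does not consume context-window tokens.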

Output:

Prompt: Give me an API spec for a 28 year old named Ben

Json Schema:

{
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "Name and Age API",
    "description": "An API that accepts a name and age as input parameters.",
    "type": "object",
    "properties": {
        "name": {
            "type": "string",
            "description": "The name of the person."
        },
        "age": {
            "type": "integer",
            "description": "The age of the person."
        }
    },
    "required": [
        "name",
        "age"
    ]
}

Response:

{
    "name": "Ben",
    "age": 28
}
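To sanity-check that a response really conforms, the output can be validated against the schema. A minimal hand-rolled check is sketched below (it only handles top-level required keys and primitive string/integer types, which is all this schema uses; for anything richer, a real JSON Schema validator library is the better choice):

```python
import json


def check_against_schema(response: str, schema: dict) -> bool:
    """Minimal validation: top-level required keys exist and primitive
    'string'/'integer' property types match. Not a full JSON Schema
    validator."""
    data = json.loads(response)
    type_map = {"string": str, "integer": int}
    for key in schema.get("required", []):
        if key not in data:
            return False
    for key, spec in schema.get("properties", {}).items():
        if key in data:
            expected = type_map.get(spec.get("type"))
            if expected and not isinstance(data[key], expected):
                return False
    return True


schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
}

print(check_against_schema('{"name": "Ben", "age": 28}', schema))  # True
```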
