
Add LiteLLM or llama.cpp support to Generate skills #23

@elcronos

Description


So far, only models from proprietary providers like Anthropic or OpenAI can be used to generate skills. I noticed that Eval has a way to load custom models and use them for evaluation, but it doesn't seem possible to use custom models for skill generation. I'd like support for open-source or custom models in skill generation as well, as opposed to proprietary models only.

# Evaluate on a local model (llama.cpp server)
upskill generate "parse YAML" \
    --eval-model "unsloth/GLM-4.7-Flash-GGUF:Q4_0" \
    --eval-base-url http://localhost:8080/v1
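For context, a llama.cpp server exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so skill generation could in principle reuse the same client path Eval already uses for custom models. Here is a minimal sketch of the request such a backend would accept; the base URL and model name are just the ones from the example above, and `build_chat_request` is a hypothetical helper, not part of upskill:

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat-completions request,
    the format a llama.cpp server accepts under /v1."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Target the local llama.cpp server from the example above.
req = build_chat_request(
    "http://localhost:8080/v1",
    "unsloth/GLM-4.7-Flash-GGUF:Q4_0",
    "Generate a skill that parses YAML",
)
print(req.full_url)  # http://localhost:8080/v1/chat/completions
```

Because the wire format is the same as OpenAI's, any backend that speaks it (llama.cpp, or a LiteLLM proxy in front of another provider) would work once generation accepts a configurable base URL and model name.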
