
## Description

## Goal

## Tasklist

## Design

### API / CLI
```sh
# CLI
> cortex engines update llama.cpp

# API
POST /engines/{engine}/update
```
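For illustration, a minimal sketch of calling the update endpoint over Cortex's local HTTP API; the host and port below are assumptions and should be replaced with your server's actual address:

```sh
# Sketch: trigger a llama.cpp engine update via the local API.
# 127.0.0.1:39281 is an assumed default address, not confirmed here.
curl -X POST http://127.0.0.1:39281/engines/llama.cpp/update
```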
Open question: should we allow users to run different versions of llama.cpp?
```sh
> cortex engines llama.cpp versions
1. b3919
2. b3909

> cortex engines
```
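If multiple versions end up being supported, the CLI listing above would likely want an API counterpart. This is a hypothetical sketch only; the endpoint path and response shape are assumptions, not an existing part of the API:

```sh
# Hypothetical: list available llama.cpp releases over the API (path is assumed).
curl http://127.0.0.1:39281/engines/llama.cpp/versions
# Illustrative response shape, for discussion only:
# { "data": [ { "version": "b3919" }, { "version": "b3909" } ] }
```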
## Release Management

- Cortex Stable and Nightly each define a llama.cpp version that they support.
- `cortex update` will update llama.cpp to the supported version (see the sketch after this list).
- Cortex Nightly automatically pulls the latest llama.cpp, which would force us to fix any breakages immediately?
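A minimal sketch of how this could look from the user's side, assuming `cortex update` refreshes Cortex itself and the engine update command then pulls the llama.cpp build pinned by that release:

```sh
# Sketch of the assumed flow; exact behaviour depends on the release channel.
cortex update                     # update Cortex to the latest Stable/Nightly build
cortex engines update llama.cpp   # pull the llama.cpp version pinned by that build
```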