Wingman is the fastest and easiest way to run Llama models on your PC or Mac.
Updated Jun 2, 2024 - TypeScript
A knowledge-graph-based forward-chaining inference engine in TypeScript/Node.
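Forward chaining repeatedly applies rules to known facts until no new facts can be derived (a fixpoint). A minimal sketch in TypeScript, under assumed types (`Rule`, `forwardChain` are illustrative names, not the repo's actual API):

```typescript
// A rule fires when all of its premises are already known facts.
type Rule = { premises: string[]; conclusion: string };

// Repeatedly apply rules until no rule adds a new fact (fixpoint).
function forwardChain(facts: Set<string>, rules: Rule[]): Set<string> {
  const known = new Set(facts);
  let changed = true;
  while (changed) {
    changed = false;
    for (const rule of rules) {
      if (
        !known.has(rule.conclusion) &&
        rule.premises.every((p) => known.has(p))
      ) {
        known.add(rule.conclusion);
        changed = true;
      }
    }
  }
  return known;
}

// Example: chaining two rules derives "canFly" from the single fact "bird".
const rules: Rule[] = [
  { premises: ["bird"], conclusion: "hasWings" },
  { premises: ["hasWings"], conclusion: "canFly" },
];
const derived = forwardChain(new Set(["bird"]), rules);
console.log([...derived]); // includes "bird", "hasWings", "canFly"
```

A real knowledge-graph engine would store triples and index rules by premise rather than scanning all rules each pass, but the fixpoint loop above is the core of the technique.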
(WIP) 🚀 A high-performance neural network inference engine running on the Web.
Fills in missing gaps in LangChain and provides object-oriented wrappers to simplify common workloads.