This repo uses llamafile.exe to run GGUF models and accesses them from JavaScript code.
- git clone the repo
- cd node_llama
- npm install
- Run the command:
  node index.js
This downloads the model and starts the llama.cpp server.
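
For orientation, here is a minimal sketch of what a script like index.js could do: download a GGUF model if it is not present, then launch the llamafile server. The model URL, filename, port, and llamafile arguments below are illustrative assumptions, not the repo's actual values.

```js
// index.js (sketch): fetch a GGUF model if missing, then start the llamafile server.
// MODEL_URL, MODEL_FILE, and the port are placeholders, not the repo's real settings.
const { existsSync, createWriteStream } = require("fs");
const { spawn } = require("child_process");
const https = require("https");

const MODEL_URL = "https://example.com/model.gguf"; // placeholder URL
const MODEL_FILE = "model.gguf";

function download(url, dest) {
  return new Promise((resolve, reject) => {
    const file = createWriteStream(dest);
    https
      .get(url, (res) => {
        res.pipe(file);
        file.on("finish", () => file.close(resolve));
      })
      .on("error", reject);
  });
}

async function main() {
  if (!existsSync(MODEL_FILE)) {
    console.log("downloading model...");
    await download(MODEL_URL, MODEL_FILE);
  }
  // Launch llamafile, which embeds the llama.cpp server; -m and --port are
  // standard llama.cpp server flags, but the exact invocation is an assumption.
  const server = spawn("./llamafile.exe", ["-m", MODEL_FILE, "--port", "8080"], {
    stdio: "inherit",
  });
  server.on("exit", (code) => console.log(`llama.cpp server exited with code ${code}`));
}

main();
```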
To ask general questions, use the generalLlama.js file:
node generalLlama.js
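
A script like generalLlama.js would typically send the prompt to the running llama.cpp server over HTTP. The sketch below uses the server's /completion endpoint; the port, prompt, and generation parameters are assumptions, and the global fetch requires Node 18 or newer.

```js
// generalLlama.js (sketch): POST a prompt to the llama.cpp server's /completion
// endpoint and print the generated text. Port and parameters are illustrative.
async function ask(prompt) {
  const res = await fetch("http://127.0.0.1:8080/completion", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, n_predict: 256, temperature: 0.7 }),
  });
  const data = await res.json();
  return data.content; // llama.cpp's /completion returns the text in `content`
}

ask("Explain what a GGUF file is in one paragraph.")
  .then((answer) => console.log(answer))
  .catch((err) => console.error(err));
```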
To ask coding-related questions, use the codeLlama.js file:
node codeLlama.js
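
codeLlama.js presumably follows the same pattern, just with a prompt geared toward code. A short sketch, where the prompt template and parameters are illustrative assumptions:

```js
// codeLlama.js (sketch): same /completion call as above, with a code-focused prompt.
async function askCode(question) {
  const res = await fetch("http://127.0.0.1:8080/completion", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prompt: `You are a coding assistant. Answer with code where helpful.\n\n${question}`,
      n_predict: 512,
    }),
  });
  return (await res.json()).content;
}

askCode("Write a JavaScript function that reverses a string.").then(console.log);
```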