RESTful API Bridge for Ollama.
$ go build .
$ ./llamash
Before starting the bridge server, you need a running Ollama server, which listens at
http://127.0.0.1:11434 by default.
$ podman run --network host ollama/ollama serve
$ ./llamash -p 11444 -i 'http://127.0.0.1:11434'
$ curl 'http://127.0.0.1:11444/generate?model=codellama&prompt=sayhi'
GET form:
  /generate
    Generates content. Responds in plain text.
    Parameters:
      model   The LLaMA model to use.
      prompt  The content to send to the model.