How can I start a new session when using the API? #84

Open
kleine2 opened this issue Apr 11, 2024 · 3 comments
Labels: enhancement (New feature or request)

Comments

kleine2 commented Apr 11, 2024

Hello, how can I start a new session when using the API, please?
Thank you!

kleine2 added the enhancement (New feature or request) label on Apr 11, 2024
@jmanhype

Please, can someone answer this? I want to use the backend as an API.

I want to use the stream endpoint.

These currently do not work:
curl -X GET "http://localhost:8080/modellist"

curl -X POST "http://localhost:8001/stream?prompt=Hello%20world&session=12345&modelname=knoopx%2Fhermes-2-pro-mistral%3A7b-q8_0"

@jmanhype

I run the backend Docker image like this:

docker run -d -p 8080:8080 \
  -e OLLAMA_HOST=http://192.168.1.163:11435 \
  -e MAX_ITERATIONS=30 \
  -e CHROMA_DB_URL=http://chromadb:8000 \
  -e SEARXNG_DOMAIN=http://searxng:8080 \
  -e SEARXNG_HOSTNAME=localhost \
  --network llocalsearch_llm_network \
  --name llocalsearch-backend \
  nilsherzig/llocalsearch-backend:latest

nilsherzig (Owner) commented Apr 13, 2024

Hi, I just rewrote the API to use a JSON string for its arguments.

{"maxIterations":30,"contextSize":8192,"temperature":0,"modelName":"adrienbrault/nous-hermes2pro:Q8_0","prompt":"how much does a llama weigh?","toolNames":[],"webSearchCategories":[],"session":"6b901452-4230-427a-a3c1-c89de7de6039","amountOfResults":4,"minResultScore":0.5,"amountOfWebsites":10,"chunkSize":300,"chunkOverlap":100}
http://localhost:3000/api/stream?settings=%7B%22maxIterations%22%3A30%2C%22contextSize%22%3A8192%2C%22temperature%22%3A0%2C%22modelName%22%3A%22adrienbrault%2Fnous-hermes2pro%3AQ8_0%22%2C%22prompt%22%3A%22how%20much%20does%20a%20llama%20weigh%3F%22%2C%22toolNames%22%3A%5B%5D%2C%22webSearchCategories%22%3A%5B%5D%2C%22session%22%3A%226b901452-4230-427a-a3c1-c89de7de6039%22%2C%22amountOfResults%22%3A4%2C%22minResultScore%22%3A0.5%2C%22amountOfWebsites%22%3A10%2C%22chunkSize%22%3A300%2C%22chunkOverlap%22%3A100%7D
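
To make that concrete, here is a minimal sketch (not from the repo) of building the settings object above and calling the stream endpoint from TypeScript, assuming a runtime with a global fetch (Node 18+ or a browser). Only the field names and the /api/stream?settings= URL come from the example above; everything else is illustrative.

// Rough sketch: build the settings object, URL-encode it into the
// "settings" query parameter, and read the streamed response.
const settings = {
  maxIterations: 30,
  contextSize: 8192,
  temperature: 0,
  modelName: "adrienbrault/nous-hermes2pro:Q8_0",
  prompt: "how much does a llama weigh?",
  toolNames: [] as string[],
  webSearchCategories: [] as string[],
  session: "default", // "default" asks the backend for a new session (see the note below)
  amountOfResults: 4,
  minResultScore: 0.5,
  amountOfWebsites: 10,
  chunkSize: 300,
  chunkOverlap: 100,
};

const url =
  "http://localhost:3000/api/stream?settings=" +
  encodeURIComponent(JSON.stringify(settings));

async function readStream(): Promise<void> {
  const response = await fetch(url);
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  // Print each chunk of the stream as it arrives.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    console.log(decoder.decode(value, { stream: true }));
  }
}

readStream();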

The whole session thing is pretty cursed atm. You can start a new session by using "default" as the session string in your first message. The backend will respond with a "HandleNewSession" event which contains the UUID string that you need to supply as the session for future questions. You can see my code to handle this here:

// When the backend announces a new session, store its UUID for later requests.
if (log.stepType == StepType.HandleNewSession) {
  console.log('new session', log.session);
  clientValues.session = log.session;
  return;
}
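
For completeness, a hypothetical end-to-end sketch of that handshake from outside the frontend: send the first request with session "default", pick the UUID out of the HandleNewSession event, and reuse it for later questions. Only the "default" session value and the HandleNewSession step type come from the comment above; the assumption that the stream is newline-delimited JSON events with "stepType" and "session" fields is mine and may not match the actual wire format.

// Hypothetical session handling; the event format is assumed, not confirmed.
let session = "default"; // "default" tells the backend to create a new session

async function ask(prompt: string): Promise<void> {
  const settings = {
    prompt,
    session,
    modelName: "adrienbrault/nous-hermes2pro:Q8_0",
    // ...plus the remaining fields from the JSON example above
  };
  const res = await fetch(
    "http://localhost:3000/api/stream?settings=" +
      encodeURIComponent(JSON.stringify(settings)),
  );
  // Simplified: wait for the whole response instead of reading incrementally.
  const body = await res.text();
  for (const line of body.split("\n")) {
    if (!line.trim()) continue;
    const log = JSON.parse(line);
    if (log.stepType === "HandleNewSession") {
      session = log.session; // reuse this UUID for every later question
    }
  }
}

async function main(): Promise<void> {
  await ask("how much does a llama weigh?"); // first call: backend creates the session
  await ask("and how tall do they get?"); // later calls continue the same session
}

main();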
