Feat/unload model api #97
Conversation
tikikun left a comment
Correct approach. Please do some manual tests and merge, @vuonghoainam. Thanks.
Hi @thunhu99. However, the server is not working as expected: it stops working after I send the DELETE request to unload the model.
After the 3rd step, the server stops working and I cannot run the 1st step again (the effect is similar to killing the process). Could you please check and make some changes? Thanks.
```cpp
{
  Json::Value jsonResp;
  if (model_loaded) {
    llama.unloadModel();
```
As I tested, the server stops working after this line; the lines below never execute, so no result is returned.
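For context, a minimal self-contained sketch of the intended handler flow. The `LlamaEngine` stand-in and the JSON strings here are hypothetical (the real code uses Drogon's `HttpResponse` and `Json::Value`); the point is that the response must be built and returned after `unloadModel()` completes, which is exactly the part that was never reached in the buggy build:

```cpp
#include <cassert>
#include <string>

// Hypothetical stand-in for the real llama.cpp wrapper class,
// used only to illustrate the control flow of the handler.
struct LlamaEngine {
    bool loaded = true;
    // Unloading must release the model only, not tear down the server.
    void unloadModel() { loaded = false; }
};

static LlamaEngine llama;
static bool model_loaded = true;

// Sketch of the unload handler: unload first, then build and return
// the response. The reported bug was that execution never reached the
// response-building lines once unloadModel() ran.
std::string handleUnloadModel() {
    std::string jsonResp;  // stands in for Json::Value jsonResp;
    if (model_loaded) {
        llama.unloadModel();
        model_loaded = false;
        jsonResp = "{\"message\":\"Model unloaded successfully\"}";
    } else {
        jsonResp = "{\"message\":\"No model loaded\"}";
    }
    return jsonResp;  // must be reached for the client to get a reply
}
```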
Oh, sorry, my mistake. The API endpoint maps to the wrong handler:
it should be `METHOD_ADD(llamaCPP::unloadModel, "unloadmodel", Delete);`
instead of `METHOD_ADD(llamaCPP::loadModel, "unloadmodel", Delete);`
I have fixed the issue and tested it locally; the changes should work now. @vuonghoainam @tikikun
Hi @thunhuanh, things have been quite intense. I need to refactor this PR into another PR a bit before merging; I will credit back to this issue.
Hi @thunhuanh, I have added your change #122 to this PR with a small modification. Thank you very much for taking the time to contribute to the project.
Add API to unload model.
Resolve #86