Discussed in #235
Originally posted by zaynpatel December 4, 2023
I'm currently running the following command, copied from the documentation with the exception of a new localhost address:
I'm getting a 404 error indicating that `inferences/llamacpp/loadmodel` is not an available route. I'm curious how to proceed, and how I can test which other load-model routes might be correct.
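One generic way to narrow this down (a sketch, not specific to this project's API: the port `3928` and the candidate path below are placeholders taken from the question, so substitute your own localhost address and any routes you want to try) is to probe each candidate with a POST and compare status codes. A 404 usually means the route is not registered at all, while a 405 means the path exists but the method is wrong, which helps distinguish a bad URL from a bad request.

```python
# Hedged sketch: probe candidate routes on a local server and report
# which ones are registered. The port and paths are placeholders.
import urllib.error
import urllib.request


def probe(url: str) -> int:
    """POST an empty JSON body to `url` and return the HTTP status code.

    Returns -1 if the server is not reachable at all.
    """
    req = urllib.request.Request(
        url,
        data=b"{}",
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 404: route not registered; 405: route exists, wrong method.
        return e.code
    except urllib.error.URLError:
        return -1  # nothing listening at that address


if __name__ == "__main__":
    candidates = [
        # Placeholder: the route from the question, on a placeholder port.
        "http://localhost:3928/inferences/llamacpp/loadmodel",
    ]
    for url in candidates:
        print(url, "->", probe(url))
```

Comparing the codes across a few spelling variants of the path quickly shows whether the server is reachable but the route name is wrong, or whether the request is hitting the wrong address entirely.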