Local Model deployment is not working. #11163
Abhilekhnathdas asked this question in Q&A (Unanswered · 0 replies)
Code:

```python
from langchain_community.llms import CTransformers
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
import mlflow

# Load the local GGUF model with CTransformers
llm = CTransformers(
    model="C:/Users/PycharmProjects/pythonProject5/TheBloke/Mistral-7B-Instruct-v0.1-GGUF/mistral-7b-instruct-v0.1.Q3_K_S.gguf",
    model_type="mistral",
)

template = """Answer question in more than 50 words
Question: {question}
Answer:"""
prompt = PromptTemplate.from_template(template)
llm_chain = LLMChain(prompt=prompt, llm=llm)

# response = llm_chain.invoke("What is AI?")
# print(response)

# Log the chain as an MLflow model, then load it back and run a prediction
with mlflow.start_run():
    model_info = mlflow.langchain.log_model(llm_chain, "model_artifact")
    model_uri = model_info.model_uri

model = mlflow.pyfunc.load_model(model_uri)
print(model.predict([{"question": "What is good about newtons law of motion"}]))
```
I deployed it to a local server with the command below:

```shell
mlflow models serve -m runs:/754eb2b143133418cytf0b7d675d0542c/model_artifact -p 8001 --no-conda
```
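As an aside for anyone reproducing this setup: recent MLflow releases deprecate the `--no-conda` flag in favor of `--env-manager`. Assuming an MLflow 2.x install (the question does not state the version), an equivalent invocation would be:

```shell
mlflow models serve -m runs:/754eb2b143133418cytf0b7d675d0542c/model_artifact -p 8001 --env-manager local
```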
Now I am trying to get a result with the command below, but I get an error:

```shell
curl http://localhost:8001/invocations -H 'Content-Type: application/json' -d '{"question":"What is AI?"}'
```
Error:

```
This predictor only supports the following content types: Types: ['text/csv', 'application/json']. Got 'application/x-www-form-urlencoded'.
curl: (6) Could not resolve host: application
```
Please point out what I am doing wrong when fetching the result. How can I fix this?
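A note for readers hitting the same error: the trailing `Could not resolve host: application` suggests the shell split the single-quoted arguments apart, which happens in Windows `cmd.exe`, where single quotes are not quoting characters (that also explains the `application/x-www-form-urlencoded` content type, since the `-H` header never made it through intact). Separately, MLflow 2.x scoring servers expect the JSON body wrapped in a key such as `dataframe_records` or `inputs` rather than a bare dict. A sketch of a quoting-safe request from Python, assuming the server from the question is running on `localhost:8001` (the `dataframe_records` key is an assumption about the MLflow version):

```python
import json

# MLflow 2.x scoring servers expect the rows wrapped under a key such as
# "dataframe_records" (a list of row dicts), not a bare {"question": ...} dict.
payload = json.dumps({"dataframe_records": [{"question": "What is AI?"}]})
print(payload)

# Sending the request from Python sidesteps cmd.exe quoting entirely.
# Uncomment once the server is running on localhost:8001:
# import requests
# resp = requests.post(
#     "http://localhost:8001/invocations",
#     headers={"Content-Type": "application/json"},
#     data=payload,
# )
# print(resp.json())
```

If you stay with curl on Windows, double quotes with escaped inner quotes (`-d "{\"dataframe_records\": ...}"`) should avoid the same splitting.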