MODEL_NAME = "huggingface/TheBloke/leo-hessianai-70B-chat-GPTQ"
messages = [{"content": "C", "role": "user"}]  # LiteLLM follows the OpenAI format
api_base = "http://127.0.0.1:8080"

# Calling the endpoint
response = completion(model=MODEL_NAME, messages=messages, api_base=api_base, stream=True)
for part in response:
    print(part.choices[0].delta.content or "")
Hi,
the error is thrown at line 61 of litellm.py:

results.append(result["choices"][0]["message"]["content"])

'CustomStreamWrapper' object is not subscriptable
The call is:

from txtai.pipeline import LLM

MODEL_NAME = "huggingface/TheBloke/leo-hessianai-70B-chat-GPTQ"
llm = LLM(path=MODEL_NAME, method="litellm", api_base=api_base, stream=True)
This works fine:

import litellm
from litellm import completion

MODEL_NAME = "huggingface/TheBloke/leo-hessianai-70B-chat-GPTQ"
messages = [{"content": "C", "role": "user"}]  # LiteLLM follows the OpenAI format
api_base = "http://127.0.0.1:8080"

# Calling the endpoint
response = completion(model=MODEL_NAME, messages=messages, api_base=api_base, stream=True)
for part in response:
    print(part.choices[0].delta.content or "")