Closed as not planned
What happened?
I can't get DeepSeek, deployed on Vertex AI Model Garden, to work with streaming.
This code:
from litellm import completion

response = completion(
    model="vertex_ai/<MY-MODEL-ID>",
    messages=[{"role": "user", "content": "Tell me a joke."}],
    vertex_credentials=vertex_credentials_json,
    vertex_project="<MY-PROJECT-ID>",
    vertex_location="<MY-LOCATION>",
    stream=True,
)
for chunk in response:
    print(chunk)
print("Response:", response)
Relevant log output
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: 400 The output data is not valid json. Original output: {"predictions": ["?\n\n"]}{"predictions": ["</think>"]}{"predictions": ["<think>"]}{"predictions": ["\n\n"]}{"predictions": ["</think>"]}{"predictions": ["\n\n"]}{"predictions": ["Sure"]}{"predictions": [","]}{"predictions": [" here"]}{"predictions": ["'s"]}{"predictions": [" a"]}{"predictions": [" light"]}{"predictions": ["-hearted"]}{"predictions": [" joke"]}{"predictions": [" for"]}{"predictions": [" you"]}.
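The log suggests the raw stream is a run of back-to-back JSON objects ({"predictions": [...]}{"predictions": [...]}) with no separators, which a single json.loads call rejects. A minimal sketch of splitting such a payload client-side with json.JSONDecoder.raw_decode, assuming the payload format shown in the log (split_concatenated_json is a hypothetical helper, not part of LiteLLM):

```python
import json

def split_concatenated_json(payload: str):
    """Split back-to-back JSON objects into a list of parsed objects.

    Hypothetical helper: handles payloads like
    {"predictions": ["a"]}{"predictions": ["b"]}, which json.loads rejects.
    """
    decoder = json.JSONDecoder()
    objects = []
    idx = 0
    while idx < len(payload):
        # Skip any whitespace between adjacent objects.
        while idx < len(payload) and payload[idx].isspace():
            idx += 1
        if idx >= len(payload):
            break
        # raw_decode parses one object and returns where it ended.
        obj, idx = decoder.raw_decode(payload, idx)
        objects.append(obj)
    return objects

chunks = split_concatenated_json(
    '{"predictions": ["Sure"]}{"predictions": [","]}{"predictions": [" here"]}'
)
text = "".join(c["predictions"][0] for c in chunks)
print(text)  # Sure, here
```

This only illustrates why the payload in the log is not valid JSON as a whole; the actual fix would belong in LiteLLM's Vertex AI streaming handler.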
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1739618750.912123 1044321 init.cc:232] grpc_wait_for_shutdown_with_timeout() timed out.
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.61.3
Twitter / LinkedIn details
No response