
[Bug]: Watsonx.ai does not allow space ID using the 'deployment/' endpoint #10941

@fisser001

Description


What happened?

We are trying to call the Watsonx.ai custom deployment API for a deployed LLM. We are running into the issue that the LiteLLM implementation for the /deployment endpoints requires the "space id" to be set as an environment variable. If we set this variable, the implementation also adds the space id to the request body of the API call. However, the API then returns an error saying that "space id" cannot be set in the request body.
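To make the mismatch concrete, here is a minimal, hypothetical sketch of the filtering we would expect for deployment routes (`strip_deployment_keys` is our own illustrative name, not LiteLLM code; the body dict is taken from the debug log below):

```python
# Hypothetical helper (our naming, not LiteLLM's): drop the keys that the
# watsonx.ai /deployments/... endpoints reject from the request body. A
# deployment already carries its space association, so 'space_id' and
# 'project_id' must not be sent in the body.
def strip_deployment_keys(payload: dict, is_deployment: bool) -> dict:
    if not is_deployment:
        return payload
    return {k: v for k, v in payload.items() if k not in ("space_id", "project_id")}

# The body LiteLLM currently builds (copied from the debug log below):
body = {
    "input": "what is your favorite colour?",
    "moderations": {},
    "parameters": {},
    "space_id": "743ff63d-2f26-4261-9d11-ca963864bf96",
}
print(strip_deployment_keys(body, is_deployment=True))
# → {'input': 'what is your favorite colour?', 'moderations': {}, 'parameters': {}}
```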

We are using the following code:

```python
import os

from litellm import completion

os.environ["WATSONX_URL"] = ""
os.environ["WATSONX_DEPLOYMENT_SPACE_ID"] = "743ff63d-2f26-4261-9d11-ca963864bf96"
os.environ["WATSONX_ZENAPIKEY"] = ""

response = completion(
    model="watsonx_text/deployment/d634c401-6dc5-4acb-9c41-3395b535ffa4",
    messages=[{"content": "what is your favorite colour?", "role": "user"}],
)
print(response)
```

We have attached the error message produced when executing the code above.
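For reference, this is the request shape we believe the API expects, reconstructed from the curl trace in the log output. Everything here is illustrative only: the base URL is the redacted host from our logs, the Authorization value is a placeholder, and the request is constructed but never sent.

```python
# Illustrative request construction only (nothing is sent).
base_url = "https://abc.net"  # value of WATSONX_URL (redacted)
deployment_id = "d634c401-6dc5-4acb-9c41-3395b535ffa4"
version = "2024-03-13"

url = f"{base_url}/ml/v1/deployments/{deployment_id}/text/generation?version={version}"
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json",
    "Authorization": "ZenApiKey <redacted>",  # placeholder token
}
# What we expect to work: no 'space_id' (or 'project_id') key in the body.
body = {"input": "what is your favorite colour?", "moderations": {}, "parameters": {}}
print(url)
```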

@ongkhaiwei: mentioning you here because I have seen that you are actively developing this implementation.

Relevant log output

```
13:48:51 - LiteLLM:DEBUG: utils.py:334 -

13:48:51 - LiteLLM:DEBUG: utils.py:334 - Request to litellm:
13:48:51 - LiteLLM:DEBUG: utils.py:334 - litellm.completion(model='watsonx_text/deployment/d634c401-6dc5-4acb-9c41-3395b535ffa4', messages=[{'content': 'what is your favorite colour?', 'role': 'user'}], api_version='2024-04-18')
13:48:51 - LiteLLM:DEBUG: utils.py:334 -

13:48:51 - LiteLLM:DEBUG: litellm_logging.py:455 - self.optional_params: {}
13:48:51 - LiteLLM:DEBUG: utils.py:334 - SYNC kwargs[caching]: False; litellm.cache: None; kwargs.get('cache')['no-cache']: False
13:48:51 - LiteLLM:INFO: utils.py:2900 -
LiteLLM completion() model= deployment/d634c401-6dc5-4acb-9c41-3395b535ffa4; provider = watsonx_text
13:48:51 - LiteLLM:DEBUG: utils.py:2903 -
LiteLLM: Params passed to completion() {'model': 'deployment/d634c401-6dc5-4acb-9c41-3395b535ffa4', 'functions': None, 'function_call': None, 'temperature': None, 'top_p': None, 'n': None, 'stream': None, 'stream_options': None, 'stop': None, 'max_tokens': None, 'max_completion_tokens': None, 'modalities': None, 'prediction': None, 'audio': None, 'presence_penalty': None, 'frequency_penalty': None, 'logit_bias': None, 'user': None, 'custom_llm_provider': 'watsonx_text', 'response_format': None, 'seed': None, 'tools': None, 'tool_choice': None, 'max_retries': None, 'logprobs': None, 'top_logprobs': None, 'extra_headers': None, 'api_version': '2024-04-18', 'parallel_tool_calls': None, 'drop_params': None, 'allowed_openai_params': None, 'reasoning_effort': None, 'additional_drop_params': None, 'messages': [{'content': 'what is your favorite colour?', 'role': 'user'}], 'thinking': None}
13:48:51 - LiteLLM:DEBUG: utils.py:2906 -
LiteLLM: Non-Default params passed to completion() {}
13:48:51 - LiteLLM:DEBUG: utils.py:334 - Final returned optional params: {}
13:48:51 - LiteLLM:DEBUG: litellm_logging.py:455 - self.optional_params: {}
13:48:51 - LiteLLM:DEBUG: litellm_logging.py:898 - POST Request Sent from LiteLLM:
curl -X POST \
https://abc.net/ml/v1/deployments/d634c401-6dc5-4acb-9c41-3395b535ffa4/text/generation?version=2024-03-13 \
-H 'Content-Type: ap****on' -H 'Accept: ap****on' -H 'Authorization: Ze****o=' \
-d '{'input': 'what is your favorite colour?', 'moderations': {}, 'parameters': {}, 'space_id': '743ff63d-2f26-4261-9d11-ca963864bf96'}'

13:48:51 - LiteLLM:DEBUG: get_api_base.py:62 - Error occurred in getting api base - litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=deployment/d634c401-6dc5-4acb-9c41-3395b535ffa4
 Pass model as E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/starcoder',..)` Learn more: https://docs.litellm.ai/docs/providers
13:48:51 - LiteLLM:DEBUG: exception_mapping_utils.py:2261 - Logging Details: logger_fn - None | callable(logger_fn) - False
13:48:51 - LiteLLM:DEBUG: litellm_logging.py:2156 - Logging Details LiteLLM-Failure Call: []

APIConnectionError: litellm.APIConnectionError: Watsonx_textException - {"errors":[{"code":"json_validation_error","message":"Json document validation error: 'project_id' or 'space_id' cannot be specified in the request body","more_info":"https://cloud.ibm.com/apidocs/watsonx-ai-cp"}],"trace":"471392fb-75ca-408a-bd4c-a81a8336e174","status_code":400}
```
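Incidentally, the "LLM Provider NOT provided" debug line in the trace suggests `get_api_base` is handed the model string with the provider prefix already stripped. For reference, here is a hypothetical sketch of the split the routing appears to perform (our own code, not LiteLLM internals):

```python
# Hypothetical sketch of the model-string split (not LiteLLM's actual code):
# 'watsonx_text/deployment/<id>' -> (provider, deployment flag, remainder).
def parse_watsonx_model(model: str):
    provider, _, rest = model.partition("/")
    if rest.startswith("deployment/"):
        return provider, True, rest.split("/", 1)[1]
    return provider, False, rest

print(parse_watsonx_model("watsonx_text/deployment/d634c401-6dc5-4acb-9c41-3395b535ffa4"))
# → ('watsonx_text', True, 'd634c401-6dc5-4acb-9c41-3395b535ffa4')
```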

Are you a ML Ops Team?

Yes

What LiteLLM version are you on ?

v1.69.3

Twitter / LinkedIn details

No response
