🔧 add credentials parameter to completion #2463
base: main
Conversation
This adds a custom `credentials` parameter that users can construct from Google auth libs themselves. This is useful when loading a service account from JSON or through other by-hand methods.
litellm/llms/vertex_ai.py (Outdated)
```diff
@@ -261,6 +261,7 @@ def completion(
     litellm_params=None,
     logger_fn=None,
     acompletion: bool = False,
+    credentials = None,
```
you don't need to explicitly add this - this would then require an update in `main.py` too.
If you pass in credentials, it should be accessible inside `vertex_ai` as part of `optional_params`
something like `optional_params["credentials"]` would exist if you pass it in as a param
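If I'm reading that right, the caller-side usage would look something like the sketch below. This assumes extra kwargs to `litellm.completion` are forwarded into `optional_params`; the service-account file path is purely hypothetical.

```python
import litellm
from google.oauth2 import service_account

# Hypothetical service-account file path, for illustration only.
credentials = service_account.Credentials.from_service_account_file("sa.json")

# Pass credentials as a plain kwarg; per the suggestion it would then surface
# inside the vertex_ai module as optional_params["credentials"].
response = litellm.completion(
    model="gemini-pro",
    messages=[{"role": "user", "content": "Hello"}],
    credentials=credentials,
)
```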
@krrishdholakia This isn't working; code example below, along with what I think is going on:
vertex_ai.py
```python
vertexai.init(
    project=vertex_project, location=vertex_location, credentials=creds
)
```
Vertex AI has to be initialized like this, and `creds` is currently set from a single value. Before this PR, that value only comes from on-disk application credentials.
Here's some failing demo code using your proposed solution (which is much more succinct and elegant, IMO).
failing code
```python
import base64
import json
import os

import litellm
from google.oauth2 import service_account

litellm.vertex_project = "blah-blah"  # This is my project in the test code
litellm.vertex_location = "us-central1"

# ignore - this simply grabs the service account that is stored as a base64 string
service_account_json_string_b64: str = (
    os.getenv("GCP_APPLICATION_DEFAULT_CREDENTIALS") or ""
)
service_account_json_string = base64.b64decode(
    service_account_json_string_b64
).decode("utf-8")

# relevant code below
service_account_json = json.loads(service_account_json_string)
credentials = service_account.Credentials.from_service_account_info(
    service_account_json
)

response = litellm.completion(
    model="gemini-pro",
    messages=[
        {
            "role": "user",
            "content": "Tell me a joke about Douglas Adams in 42 characters.",
        }
    ],
    optional_params={"credentials": credentials},
)
print(response.choices[0].message.content)
```
I'll paste the debug output and stacktrace in the following comment.
For this (`optional_params["credentials"]`) to work correctly, we could check the value of the `credentials` key in `optional_params` inside the `completion` function of the `vertex_ai` module (and remove the previous code I modified), where Vertex AI is initialized.
If this isn't what you're looking for, I'm happy to help implement the functionality however you envision it. I anticipate leveraging LiteLLM heavily in my businesses, so I'm more than happy to get my hands dirty here.
```python
creds, _ = google.auth.default(quota_project_id=vertex_project) if credentials is None else (credentials, None)
```
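For readability, that one-line conditional unrolls to the following sketch (same behavior, just expanded; `credentials` and `vertex_project` are the variables from the diff above):

```python
import google.auth

# Expanded equivalent of the one-line conditional above.
if credentials is None:
    # No user-supplied credentials: fall back to Google's application-default lookup.
    creds, _ = google.auth.default(quota_project_id=vertex_project)
else:
    # Use the caller's google-auth Credentials object as-is.
    creds = credentials
```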
Here's the stacktrace and debug output (using `litellm.set_verbose = True`):
debug
```text
Request to litellm:
litellm.completion(model='gemini-pro', messages=[{'role': 'user', 'content': 'Tell me a joke about Douglas Adams in 42 characters.'}], optional_params={'credentials': <google.oauth2.service_account.Credentials object at 0x720249bbf250>})

self.optional_params: {}
kwargs[caching]: False; litellm.cache: None
(start) INSIDE THE VERTEX AI OPTIONAL PARAM BLOCK
(end) INSIDE THE VERTEX AI OPTIONAL PARAM BLOCK - optional_params: {}
Final returned optional params: {'optional_params': {'credentials': <google.oauth2.service_account.Credentials object at 0x720249bbf250>}}
self.optional_params: {'optional_params': {'credentials': <google.oauth2.service_account.Credentials object at 0x720249bbf250>}}
VERTEX AI: vertex_project=<blah-replaced>; vertex_location=us-central1
VERTEX AI: creds=<google.oauth2.credentials.Credentials object at 0x72021ffb3d10>; google application credentials: None
Making VertexAI Gemini Pro Vision Call
Processing input messages = [{'role': 'user', 'content': 'Tell me a joke about Douglas Adams in 42 characters.'}]

Request Sent from LiteLLM:
llm_model = GenerativeModel(gemini-pro)
response = llm_model.generate_content(['Tell me a joke about Douglas Adams in 42 characters.'])

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
Logging Details: logger_fn - None | callable(logger_fn) - False
Logging Details LiteLLM-Failure Call
self.failure_callback: []
```
stacktrace
```text
Traceback (most recent call last):
  File "/home/jason/.local/share/virtualenvs/content_pipeline-ZmGPsggM/lib/python3.11/site-packages/litellm/llms/vertex_ai.py", line 461, in completion
    generation_config=GenerationConfig(**optional_params),
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: GenerationConfig.__init__() got an unexpected keyword argument 'optional_params'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jason/.local/share/virtualenvs/content_pipeline-ZmGPsggM/lib/python3.11/site-packages/litellm/main.py", line 1625, in completion
    model_response = vertex_ai.completion(
                     ^^^^^^^^^^^^^^^^^^^^^
  File "/home/jason/.local/share/virtualenvs/content_pipeline-ZmGPsggM/lib/python3.11/site-packages/litellm/llms/vertex_ai.py", line 676, in completion
    raise VertexAIError(status_code=500, message=str(e))
litellm.llms.vertex_ai.VertexAIError: GenerationConfig.__init__() got an unexpected keyword argument 'optional_params'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jason/repos/litellm/content_pipeline/test.py", line 23, in <module>
    response = litellm.completion(
               ^^^^^^^^^^^^^^^^^^^
  File "/home/jason/.local/share/virtualenvs/content_pipeline-ZmGPsggM/lib/python3.11/site-packages/litellm/utils.py", line 2727, in wrapper
    raise e
  File "/home/jason/.local/share/virtualenvs/content_pipeline-ZmGPsggM/lib/python3.11/site-packages/litellm/utils.py", line 2628, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jason/.local/share/virtualenvs/content_pipeline-ZmGPsggM/lib/python3.11/site-packages/litellm/main.py", line 2052, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/home/jason/.local/share/virtualenvs/content_pipeline-ZmGPsggM/lib/python3.11/site-packages/litellm/utils.py", line 8107, in exception_type
    raise e
  File "/home/jason/.local/share/virtualenvs/content_pipeline-ZmGPsggM/lib/python3.11/site-packages/litellm/utils.py", line 7324, in exception_type
    raise RateLimitError(
litellm.exceptions.RateLimitError: VertexAIException - GenerationConfig.__init__() got an unexpected keyword argument 'optional_params'
```
@thinkjrs did my suggested fix solve your problem? If so, can I close this PR?
@krrishdholakia That didn't work unfortunately. That said, I submitted another commit removing my previous implementation and now using `optional_params`.
```python
creds, _ = google.auth.default(quota_project_id=vertex_project) if optional_params.get("credentials") is None else (optional_params.get("credentials"), None)
if optional_params.get("credentials") is None:
```
why print verbose only on None?
Refactoring principle of not touching anything that's currently working, i.e. avoiding any API behavior change at all.
@krrishdholakia Want me to remove this?
@thinkjrs I think that's totally fine - just bump me once you have a working PR. I can take a look and merge it in then!
@krrishdholakia This should be good to go!
First of all, thanks for such an awesome library and OSS project. It's a life saver. Secondly, rather than post an issue and spend your time on that for such a simple change, I'm just submitting this PR -- so if this is something you don't want or need, feel free to close it; zero hard feelings.
High-level changes
This merge adds a `credentials` input parameter to the `vertex_ai` module's `completion` method and uses the user-provided `credentials` in the `vertexai.init(...)` call, if `credentials` is not `None`. It should have no behavior changes for current users.
Motivation
Google allows the use of the `Credentials` class directly, which I use to pass service accounts in JSON format directly to client libraries. With `vertexai`, here's what that looks like using Google's libraries, where `GCP_APPLICATION_DEFAULT_CREDENTIALS` is a JSON string:
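(A minimal sketch of that flow, assuming the env var holds base64-encoded service-account JSON as in the failing example earlier in this thread; the project ID and location are placeholders:)

```python
import base64
import json
import os

import vertexai
from google.oauth2 import service_account

# Decode the base64-encoded service-account JSON from the environment and
# build a Credentials object directly (no on-disk credentials file needed).
service_account_info = json.loads(
    base64.b64decode(os.getenv("GCP_APPLICATION_DEFAULT_CREDENTIALS") or "").decode("utf-8")
)
credentials = service_account.Credentials.from_service_account_info(service_account_info)

# Hand the credentials straight to the Vertex AI client initialization.
vertexai.init(project="my-project", location="us-central1", credentials=credentials)
```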
Changes herein
Since LiteLLM's `vertex_ai` module uses `vertexai` from Google under the hood, very little needs to change. What I've implemented should be non-breaking for current users.

Add parameter `credentials`
First, I've added the `credentials` parameter, which defaults to `None`.

No change to existing behavior
The previous code called the `vertexai.init` method using the default application credentials; this addition does not change that behavior by default. If downstream users opt to create their own `credentials`, they can now pass them in and they will be used internally, as sketched below.

If there's something different you'd like to see, let me know and I'll get it fixed up.
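For illustration, a minimal sketch of the resulting credential selection (simplified from the diff above, not the exact merged code; the helper name is hypothetical, as the real logic lives inside `litellm.llms.vertex_ai.completion`):

```python
import google.auth
import vertexai

def init_vertex_ai(vertex_project, vertex_location, credentials=None):
    # Hypothetical helper for illustration.
    if credentials is None:
        # Default path: application-default credentials, exactly as before this PR.
        creds, _ = google.auth.default(quota_project_id=vertex_project)
    else:
        # New path: honor the caller-supplied google-auth Credentials object.
        creds = credentials
    vertexai.init(project=vertex_project, location=vertex_location, credentials=creds)
```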