Including Tools prevents Gemini from providing a natural language (generalized) response #3775
Update on this, including a ToolConfig: setting the mode to NONE does provide natural language responses, but then no functions work at all.

# ...
tool_config = ToolConfig(
    function_calling_config=ToolConfig.FunctionCallingConfig(
        mode=ToolConfig.FunctionCallingConfig.Mode.AUTO,
        allowed_function_names=[],
    )
)
gemini_model = GenerativeModel(MODEL_ID, tools=[tool], tool_config=tool_config)
# ...

Other things I've tried:
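For reference, my understanding of the three `FunctionCallingConfig` modes, as config fragments (a sketch against the vertexai SDK around v1.51.0; the import path may differ on older versions, and the `allowed_function_names` value is just an illustration):

```python
from vertexai.generative_models import ToolConfig

# AUTO (the documented default): the model decides between predicting a
# function call or a natural language response.
auto_config = ToolConfig(
    function_calling_config=ToolConfig.FunctionCallingConfig(
        mode=ToolConfig.FunctionCallingConfig.Mode.AUTO,
    )
)

# ANY: the model is constrained to predict a function call, optionally
# restricted to the names listed in allowed_function_names.
any_config = ToolConfig(
    function_calling_config=ToolConfig.FunctionCallingConfig(
        mode=ToolConfig.FunctionCallingConfig.Mode.ANY,
        allowed_function_names=["MFGetCurrentWeather"],
    )
)

# NONE: function calling disabled; only natural language responses.
none_config = ToolConfig(
    function_calling_config=ToolConfig.FunctionCallingConfig(
        mode=ToolConfig.FunctionCallingConfig.Mode.NONE,
    )
)
```

The bug reported here is that AUTO (explicit or defaulted) behaves like ANY: once a tool is attached, natural language responses stop coming back.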
In my experience, the models can respond with text when tools are present. In fact, sometimes the models ignore the provided tool even when it could be useful for the answer.
We need both tools and natural language responses. I tested once again this morning with the full code example below on 1.51.0 (the latest available via pip3) and gemini-1.5-pro-preview-0409, and regrettably the behaviour is the same. As you can see, it is a fairly simple use case:
The calls look to match the documentation... Unless I'm doing something very wrong here, it looks like function calling with natural language generation is completely broken.

'''
Created on May 10, 2024

@author: Kevin
'''
from pprint import pprint

from django.core.management.base import BaseCommand

import vertexai
from vertexai.generative_models._generative_models import GenerativeModel, Tool, ToolConfig
from vertexai.preview import generative_models as preview_generative_models
from vertexai.generative_models import (
    Content,
    FunctionDeclaration,
    Part,
)

PROJECT_ID = ""
LOCATION_ID = "us-east1"
AGENT_ID = ""
MODEL_ID = 'gemini-1.5-pro-preview-0409'
SYSTEM_PROMPT_NOFUNCTION = ''
SYSTEM_PROMPT_NOREALTIME = ''
SYSTEM_PROMPT_GROUNDING = ''


class ModelFunction():
    def __init__(self):
        self.function_name = self.__class__.__name__
        self.data = {}


class MFGetCurrentWeather(ModelFunction):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.description = "Get the current weather in a given location"
        self.parameters = [("location", "string", "Location")]

    def set_parameter_values(self, location=None):
        self.data['location'] = location

    def get_functions_gemini(self):
        properties = {}
        for name, type_, description in self.parameters:
            properties[name] = {"type": type_, "description": description}
        return [
            FunctionDeclaration(
                name=self.function_name,
                description=self.description,
                parameters={
                    "type": "object",
                    "properties": properties,
                },
            )
        ]

    def get_response_gemini(self):
        return """{ "location": "Boston, MA", "temperature": 38, "description": "Partly Cloudy", "icon": "partly-cloudy", "humidity": 65, "wind": { "speed": 10, "direction": "NW" } }"""


class MFGetCurrentTimezone(ModelFunction):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.description = "Get the timezone in a given location"
        self.parameters = [("location", "string", "Location")]

    def set_parameter_values(self, location=None):
        self.data['location'] = location

    def get_functions_gemini(self):
        properties = {}
        for name, type_, description in self.parameters:
            properties[name] = {"type": type_, "description": description}
        return [
            FunctionDeclaration(
                name=self.function_name,
                description=self.description,
                parameters={
                    "type": "object",
                    "properties": properties,
                },
            )
        ]

    def get_response_gemini(self):
        return """{ "location": "Boston, MA", "timezone": {"value": "US/Eastern", "valueText": "Eastern Time Zone"}} """


class Command(BaseCommand):
    help = 'Test VCP functionality'

    def handle(self, *args, **options):
        print('--> Test...')
        generation_config = {
            'candidate_count': 1,
            'temperature': 0,
            'top_p': 1.0,
            'max_output_tokens': 1024,
        }
        messages = []
        # prompt = 'What is the weather like in Boston today?'
        prompt = 'What is life like on Mars?'
        messages.append({'role': 'user', 'parts': [{'text': prompt}]})
        mfgcw = MFGetCurrentWeather()
        mfgct = MFGetCurrentTimezone()
        tool = Tool(
            function_declarations=mfgcw.get_functions_gemini() + mfgct.get_functions_gemini()
        )
        # system_instructions = [
        #     'You can answer any and all questions using the MFFallback Tool'
        # ]
        # google_search_tool = Tool.from_google_search_retrieval(
        #     google_search_retrieval=preview_generative_models.grounding.GoogleSearchRetrieval(
        #         disable_attribution=True))
        tool_config = ToolConfig(
            function_calling_config=ToolConfig.FunctionCallingConfig(
                mode=ToolConfig.FunctionCallingConfig.Mode.AUTO,
                allowed_function_names=[],
            )
        )
        vertexai.init(project=PROJECT_ID, location=LOCATION_ID)
        gemini_model = GenerativeModel(MODEL_ID, tools=[tool], tool_config=tool_config)
        model_response = gemini_model.generate_content(messages, generation_config=generation_config)
        pprint(model_response)
        function_calls = model_response.candidates[0].function_calls
        function_call = function_calls[0] if function_calls else None
        if function_call:
            print(f'--> Model requests function call: \n{function_call}')
            # import ipdb; ipdb.set_trace()
            if function_call.name == mfgcw.function_name:
                mfgcw.set_parameter_values(location=function_call.args['location'])
                function_response = mfgcw.get_response_gemini()
            elif function_call.name == mfgct.function_name:
                mfgct.set_parameter_values(location=function_call.args['location'])
                function_response = mfgct.get_response_gemini()
            # Return the API response to Gemini so it can generate a model
            # response or request another function call
            response = gemini_model.generate_content(
                [
                    Content(role="user", parts=[Part.from_text(prompt)]),
                    model_response.candidates[0].content,  # Function call response
                    Content(
                        parts=[
                            Part.from_function_response(
                                name=function_call.name,
                                response={"content": function_response},
                            ),
                        ],
                    ),
                ],
                tools=[tool],
            )
            pprint(response)
            pprint(response.candidates[0].content.parts[0].text)

The response I get back is: "This is a question about the planet Mars. I can't answer that, as I can only access information about the weather and time."
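As an aside, extracting the first function call is easy to get wrong (a list never equals 0, so a `== 0` check silently passes). A plain-Python sketch of a defensive helper — the `function_calls` attribute name mirrors the SDK's `Candidate`, but the dummy objects below are stand-ins, not real SDK types:

```python
from types import SimpleNamespace

def first_function_call(candidate):
    # `function_calls` mirrors the SDK's Candidate.function_calls list;
    # guard on emptiness rather than comparing the list itself to 0.
    calls = getattr(candidate, "function_calls", None) or []
    return calls[0] if calls else None

# Stand-in objects for demonstration, not real SDK types.
text_only = SimpleNamespace(function_calls=[])
call = SimpleNamespace(name="MFGetCurrentWeather", args={"location": "Boston"})
with_call = SimpleNamespace(function_calls=[call])

print(first_function_call(text_only))       # None
print(first_function_call(with_call).name)  # MFGetCurrentWeather
```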
Hello, I can confirm here, on the latest Gemini 1.5 Pro model and the Python API available from pip3 (v.1.5.3), that this basic and probably very common use case is still an issue... Add a function via tools, and the API can't return natural language queries anymore. Three weeks and counting. Anyone from Google?
It seems there is a bug where the specified model is not actually used when the tools argument is provided. When trying the following example code, the returned result states this is only for …

tool = …  # Include your tool here
tool_config = ToolConfig(
    function_calling_config=ToolConfig.FunctionCallingConfig(
        mode=ToolConfig.FunctionCallingConfig.Mode.ANY,
        allowed_function_names=[],
    )
)
model = GenerativeModel("gemini-pro-1.5")
print(model.generate_content(
    "What is the weather like in Boston?",
    tools=[tool],
    tool_config=tool_config,
))

Returns: …
Uhhh, Gemini seems very broken compared to ChatGPT...
Apologies, please use the …
Hi @matthew29tang, thanks for the reply... I should say that even using the preview version of the model, it will only:

For example, with one Tool included and no ToolConfig or system instructions, the following simple query — "Why is the sky blue?" — results in nonsense...
Please forgive my venting: I really love working with GCP, Gemini is a fantastic model, and you are all doing awesome work, but trying to build a product around Gemini (using the Python SDK) has been really challenging. Function calling in particular has been tough:

tool_config=ToolConfig(
    function_calling_config=ToolConfig.FunctionCallingConfig(
        mode=ToolConfig.FunctionCallingConfig.Mode.AUTO,
    ),
)

vs. tool_choice: "required"

Anyway, perhaps this is all unrelated, so I apologize, but fixing this issue in particular would mean a ton to me. Thanks so much!
Sorry to hear that you have been experiencing trouble with GCP/Gemini. I filed an internal ticket with the backend team; they will run some evals regarding the performance degradation when a tool is provided but not used, and I will also pass on your feedback.
Thank you so much! And I want to emphasize: I have compared 4-o vision and Gemini, and for our use case Gemini performs so much better. So great work, I appreciate everything.
Hey @matthew29tang, do you know if preview 1.5 is getting deprecated 06/24 per the lifecycle document? And do you know if this issue will get fixed before then? https://cloud.google.com/vertex-ai/generative-ai/docs/learn/model-versioning
Environment details
Steps to reproduce
I can't answer that question. I can get the current weather, time zone, and realtime information about stocks.
The documentation says the default should apply, yet it does not. When FunctionCallingConfig is unspecified, AUTO should be set internally, and the following should hold: "Default model behavior, model decides to predict either a function call or a natural language response."
https://cloud.google.com/vertex-ai/docs/reference/rpc/google.cloud.aiplatform.v1beta1#mode
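A minimal repro of that claim (a sketch only: it needs a GCP project and the vertexai SDK; the project ID is a placeholder, and the model name and declaration mirror the full code above):

```python
import vertexai
from vertexai.generative_models import FunctionDeclaration, GenerativeModel, Tool

vertexai.init(project="my-project", location="us-east1")  # placeholder project

weather_fn = FunctionDeclaration(
    name="MFGetCurrentWeather",
    description="Get the current weather in a given location",
    parameters={
        "type": "object",
        "properties": {"location": {"type": "string", "description": "Location"}},
    },
)

# No tool_config at all: FunctionCallingConfig is unspecified, so per the
# docs the model should default to AUTO and remain free to answer in
# natural language when no declared function applies.
model = GenerativeModel(
    "gemini-1.5-pro-preview-0409",
    tools=[Tool(function_declarations=[weather_fn])],
)
response = model.generate_content("What is life like on Mars?")

# Expected: a natural language answer about Mars.
# Observed in this thread: a refusal citing the weather/time tools.
print(response.candidates[0].content.parts[0].text)
```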