
Disable caching for individual calls #1231

Closed

tom-doerr opened this issue Jul 1, 2024 · 3 comments

Comments

tom-doerr (Contributor) commented Jul 1, 2024

I'm trying to disable the cache for an individual inference call.
I tried setting different temperature values, but DSPy still serves those calls from the cache.

    # Attempt 1: pass a randomized temperature directly to the predictor call.
    temperature = 2.7 + (1 * random.random())
    result = self.predictor_cot(input_text=input_text, temperature=temperature)

    # Attempt 2: set the temperature via the settings context manager.
    temperature = 2.7 + (1 * random.random())
    with dspy.settings.context(lm=model, trace=[], temperature=temperature):
        result = self.predictor_cot(input_text=input_text)
okhat (Collaborator) commented Jul 4, 2024

@tom-doerr you need to pass temperature to the LM, not the context or the predictor. That said, I think it would be nice if one could set the LM kwargs in easier ways.
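
A minimal sketch of that suggestion, assuming the 1.x-era dspy.OpenAI client (the model name and the size of the temperature jitter are illustrative, not from the thread):

    import random

    import dspy

    # Build the LM with the sampling kwargs baked in. The temperature is part
    # of the request the LM sends, so a slightly different value should produce
    # a cache miss and force a fresh call to the provider.
    lm = dspy.OpenAI(model="gpt-3.5-turbo", temperature=0.7 + 0.001 * random.random())
    dspy.settings.configure(lm=lm)

    predict = dspy.Predict("question -> answer")
    result = predict(question="What is the capital of France?")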

vection commented Jul 30, 2024

> @tom-doerr you need to pass temperature to the LM, not the context or the predictor. That said, I think it would be nice if one could set the LM kwargs in easier ways.

@okhat can you please elaborate on how to do it?
Did you mean something like this:

llm = dspy.OpenAI(model=model_type, max_tokens=self.MAX_TOKENS, temperature=temperature+0.001)
dspy.settings.configure(lm=llm)

whenever I want to bypass the cache for a specific response? Because it's not working for me.

In my opinion, we need a solution for run-time cases: sometimes I need to send the same prompt to different LLMs, but DSPy still returns the cached response without noticing that the LM has changed.
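
For reference, the usual run-time pattern for routing the same prompt to two LMs is the settings context manager. A sketch under that assumption (model names are illustrative, and per vection's report the cache may still collide across LMs):

    import dspy

    gpt4 = dspy.OpenAI(model="gpt-4")
    gpt35 = dspy.OpenAI(model="gpt-3.5-turbo")

    predict = dspy.Predict("question -> answer")
    question = "What is the capital of France?"

    # Each context temporarily swaps the LM the predictor uses.
    with dspy.settings.context(lm=gpt4):
        answer_a = predict(question=question)
    with dspy.settings.context(lm=gpt35):
        answer_b = predict(question=question)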

okhat (Collaborator) commented Sep 25, 2024

okhat closed this as completed on Sep 25, 2024.