
Conversation

@tom-doerr
Contributor

This is part of #879.
Is this way of adding support for logprobs acceptable?

lm = dspy.OpenAI(model='gpt-3.5-turbo-instruct', max_tokens=6, api_key=config['openai']['secret_key'])
test_text = "This is a test article."
test_output = lm(test_text, logprobs=1)
print("test_output:", test_output)
test_output: [{'text': '\n\nLorem ipsum dolor sit amet', 'logprobs': {'text_offset': [23, 25, 30, 36, 42, 46], 'token_logprobs': [-0.26951057, -0.28366262, -0.0010195904, -5.4312077e-05, -0.0001788689, -0.00023739056], 'tokens': ['\n\n', 'Lorem', ' ipsum', ' dolor', ' sit', ' amet'], 'top_logprobs': [{'\n\n': -0.26951057, '\n': -2.4457762}, {'Lorem': -0.28366262, 'Test': -2.9440362}, {' ipsum': -0.0010195904, ' Ipsum': -6.9318686}, {' dolor': -5.4312077e-05, ' d': -11.2004175}, {' sit': -0.0001788689, ' amet': -9.340998}, {' amet': -0.00023739056, ' am': -8.778377}]}}]
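For reference, a minimal sketch of consuming the returned structure — e.g. computing the completion's total log probability and per-token perplexity. The dict below copies the `tokens` and `token_logprobs` values verbatim from the sample output above; the variable names are just for illustration:

```python
import math

# Logprobs structure as returned in the sample output above
# (top_logprobs and text_offset omitted for brevity).
response = {
    'text': '\n\nLorem ipsum dolor sit amet',
    'logprobs': {
        'tokens': ['\n\n', 'Lorem', ' ipsum', ' dolor', ' sit', ' amet'],
        'token_logprobs': [-0.26951057, -0.28366262, -0.0010195904,
                           -5.4312077e-05, -0.0001788689, -0.00023739056],
    },
}

token_logprobs = response['logprobs']['token_logprobs']

# Total log probability of the completion, and per-token perplexity.
total_logprob = sum(token_logprobs)
perplexity = math.exp(-total_logprob / len(token_logprobs))

print(f"total logprob: {total_logprob:.4f}")
print(f"perplexity: {perplexity:.4f}")
```

A near-zero total logprob (perplexity close to 1) means the model was very confident in the sampled tokens, which is what the sample output shows for the "Lorem ipsum" continuation.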

If this is okay, I would look into adding it to dspy.Predict and the vLLM backend.

@arnavsinghvi11 arnavsinghvi11 merged commit 5bc17d8 into stanfordnlp:main May 11, 2024
@arnavsinghvi11
Collaborator

Thanks @tom-doerr !

