
Update generate_prompt in Task subclasses to always return Prompt #199

Merged
merged 7 commits into main from task-are-llm-agnostic on Dec 27, 2023

Conversation

alvarobartt
Member

Description

This PR ensures that every generate_prompt method defined in the subclasses of the abstract class Task returns an unformatted Prompt, since formatting is handled internally by Prompt via its format_as method, which the user controls through the prompt_format argument of each LLM. This way, the Task implementations stay LLM-agnostic, meaning that any task can be used with any LLM.
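The split described above can be sketched as follows. This is a minimal illustration, not the actual distilabel implementation: the Prompt fields, the exact format_as signature, the "llama2" template string, and the SummarizationTask class are all assumptions for the sake of the example.

```python
from dataclasses import dataclass


@dataclass
class Prompt:
    """Sketch of a prompt holding an unformatted system/user pair."""

    system_prompt: str
    formatted_prompt: str

    def format_as(self, format: str) -> str:
        # Each supported format renders the same pair differently,
        # which is what keeps Task outputs usable with any LLM.
        if format == "default":
            return f"{self.system_prompt}\n{self.formatted_prompt}"
        if format == "llama2":
            return (
                f"<s>[INST] <<SYS>>\n{self.system_prompt}\n<</SYS>>\n\n"
                f"{self.formatted_prompt} [/INST]"
            )
        raise ValueError(f"Unsupported format: {format}")


class SummarizationTask:
    """Hypothetical Task subclass: generate_prompt returns a Prompt, not a str."""

    system_prompt = "You are a helpful summarizer."

    def generate_prompt(self, input: str) -> Prompt:
        # No formatting here: the Prompt is returned as-is, and the LLM
        # applies format_as later according to its prompt_format argument.
        return Prompt(
            system_prompt=self.system_prompt,
            formatted_prompt=f"Summarize: {input}",
        )
```

Under this sketch, an LLM configured with prompt_format="llama2" would call prompt.format_as("llama2") on the returned Prompt, so the same task works unchanged against any backend.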

Consequently, both OpenAITextGenerationTask and Llama2TextGenerationTask have been removed, as they were tied to the prompt format of a specific LLM (OpenAI and Llama 2, respectively).

Closes #186

…e_prompt`

Since the default prompt format is applied within `Prompt` when no format is specified, and it joins the `system_prompt` and the `formatted_prompt` with a line break, the leading line break in the `formatted_prompt` is no longer needed.
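Concretely, the default join can be shown in a couple of lines; the string values here are made up for illustration:

```python
system_prompt = "You are a helpful assistant."
# No leading "\n" here: the default format already inserts one line break
# between the two parts, so a leading one would produce a blank line.
formatted_prompt = "Summarize: the quick brown fox."

default = f"{system_prompt}\n{formatted_prompt}"
assert "\n\n" not in default
```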
@alvarobartt alvarobartt merged commit 0407bd0 into main Dec 27, 2023
1 check passed
@alvarobartt alvarobartt deleted the task-are-llm-agnostic branch December 27, 2023 10:51
Development

Successfully merging this pull request may close these issues.

[BUG] UltraCMTask fails when using it with OpenAILLM