Feature/prompt blueprint #48
Merged
Conversation
- Create `explanations.py` with a detailed CO-STAR framework explanation
- Add `questions.py` with a CO-STAR framework question
- Implement string variables using `Final[Text]` typing
- Use `dedent` for multiline strings in both files
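The `Final[Text]` plus `dedent` pattern described in this commit can be sketched as follows. This is a minimal illustration of the pattern only; the constant name and its contents are hypothetical, not the repository's actual `explanations.py`:

```python
from textwrap import dedent
from typing import Final, Text

# Hypothetical CO-STAR prompt constant following the commit's pattern:
# `Final[Text]` marks it as a non-reassignable string, and `dedent`
# strips the common indentation from the triple-quoted literal.
EXPLAIN_COSTAR: Final[Text] = dedent(
    """\
    The CO-STAR framework structures a prompt into six parts:
    Context, Objective, Style, Tone, Audience, and Response format.
    """
)
```

Using `dedent` keeps the literal readable at its natural indentation in the source file while the stored string itself starts flush left.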
- Move assistant explanations to a new `assistant.py` file
- Rename `questions.py` to `user.py` for user-related prompts
- Add a `request_to_rewrite_as_costar` function in `user.py`
- Remove the redundant `explanations.py` file

This refactoring improves code organization by separating assistant and user-related prompt functions, enhancing the maintainability and readability of the CO-STAR framework implementation.
- Introduce a `from_description` class method in `PromptTemplate`
- Implement OpenAI-based prompt generation with example handling
- Add support for verbose output and error handling
- Enhance type hinting and import the necessary utilities
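The alternate-constructor shape of a `from_description` class method can be sketched as below. This is a simplified stand-in, not the repository's implementation: the real method calls an OpenAI model to turn the description into a structured prompt, which is omitted here so the sketch stays self-contained:

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class PromptTemplate:
    # Simplified stand-in for the real class; only the fields needed
    # to show the alternate-constructor pattern are included.
    prompt: str
    messages: List[Dict[str, str]] = field(default_factory=list)

    @classmethod
    def from_description(cls, description: str, *, verbose: bool = False) -> "PromptTemplate":
        # The real method generates a prompt from `description` via an
        # OpenAI chat completion; here we just wrap the description so
        # the classmethod-as-factory pattern is visible.
        if verbose:
            print(f"Generating prompt from description: {description!r}")
        return cls(prompt=description)


tpl = PromptTemplate.from_description("Summarize articles in a friendly tone")
```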
…ocumentation
- Rename `languru/prompts/base.py` to `languru/prompts/prompt_template.py`
- Update import statements in `__init__.py` files to reflect the new module name
- Add a comprehensive docstring to the `PromptTemplate` class method in `prompt_template.py`
- Refine instructions for the CO-STAR framework in `user.py`

This change improves code organization and enhances documentation for better maintainability and user understanding.
- Create a new test file `tests/prompts/test_prompt_template.py`
- Implement a `test_prompt_template_from_description` function
- Verify `PromptTemplate` creation with a `PerplexityOpenAI` client
- Use the "llama-3-sonar-small-32k-chat" model for testing
- Assert the existence of the `prompt` attribute on the template

This test ensures the correct functionality of the `PromptTemplate` class when creating instances from descriptions, improving the overall test coverage of the prompts module.
- Move OpenAI-specific utilities to a dedicated module
- Add MD5 hash properties to `PromptTemplate` for message fingerprinting
- Update imports across affected modules and tests
- Remove the redundant prompt utility module
- Move the MD5 hash generation logic to a `messages_to_md5` function in `openai_utils.py`
- Update `PromptTemplate` to use the new function for hash generation
- Add comprehensive test cases for `PromptTemplate` operations
- Remove unused imports and perform minor code cleanup
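A `messages_to_md5`-style fingerprint function might look like the sketch below. This is an assumption about the approach, not the repository's actual code; the key point is that serialization must be deterministic so identical message lists always hash identically:

```python
import hashlib
import json
from typing import Dict, List


def messages_to_md5(messages: List[Dict[str, str]]) -> str:
    # Serialize deterministically (sorted keys, compact separators) so
    # the same messages always produce the same 32-character digest.
    canonical = json.dumps(messages, sort_keys=True, separators=(",", ":"))
    return hashlib.md5(canonical.encode("utf-8")).hexdigest()


msgs = [{"role": "system", "content": "You are a helpful assistant."}]
fingerprint = messages_to_md5(msgs)
```

Such a fingerprint is useful for caching or deduplicating prompts, since any change to a role or content field yields a different digest.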
- Implement a `chat_completion_once` function in the new file `languru/utils/chat.py`
- Support multiple input types: a message list, a query string, and a system message
- Handle prompt variable replacement and message conversion
- Integrate with OpenAI's latest Python client
- Add a verbose logging option for debugging
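The input normalization this commit describes (accepting either a message list or a plain query, with an optional system message) can be sketched as a small helper. The function name and exact behavior here are illustrative assumptions; the network call to the OpenAI client is deliberately left out:

```python
from typing import Dict, List, Optional, Union


def build_messages(
    messages_or_query: Union[str, List[Dict[str, str]]],
    system_message: Optional[str] = None,
) -> List[Dict[str, str]]:
    # Accept either a plain query string or a ready-made message list.
    if isinstance(messages_or_query, str):
        messages: List[Dict[str, str]] = [
            {"role": "user", "content": messages_or_query}
        ]
    else:
        # Copy the list so the caller's input is never mutated.
        messages = list(messages_or_query)
    # Optionally prepend a system message, as the commit describes.
    if system_message is not None:
        messages.insert(0, {"role": "system", "content": system_message})
    return messages
```

The resulting list is in the shape the OpenAI chat completions API expects, so a real `chat_completion_once` could pass it straight to the client.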
- Import `chat_completion_once` and remove unused OpenAI utility functions
- Update `PromptTemplate` initialization: increase the default temperature from 0.3 to 0.7
- Simplify the chat completion process using `chat_completion_once`
- Improve example generation: streamline message creation and formatting, and enhance prompt construction with examples
- Adjust prompt formatting in the user repository
- Optimize `multiple_replace` function usage in chat completion
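A `multiple_replace` helper, as the name suggests, presumably substitutes several prompt variables at once. A common way to do this safely is a single regex pass, so that one substitution's output can never be re-matched by a later pattern. A sketch under that assumption:

```python
import re
from typing import Dict


def multiple_replace(text: str, replacements: Dict[str, str]) -> str:
    # Build one alternation pattern from all keys and substitute them
    # in a single pass, so earlier replacements cannot corrupt later ones.
    if not replacements:
        return text
    pattern = re.compile("|".join(re.escape(key) for key in replacements))
    return pattern.sub(lambda match: replacements[match.group(0)], text)


prompt = multiple_replace(
    "Write a {tone} summary for {audience}.",
    {"{tone}": "friendly", "{audience}": "developers"},
)
```

Compared with chained `str.replace` calls, the single-pass approach also avoids the ordering bugs that appear when one placeholder is a substring of another.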
Codecov Report

Attention: Patch coverage is
Additional details and impacted files

@@ Coverage Diff @@
##           master      #48   +/-   ##
==========================================
+ Coverage   79.98%   81.12%   +1.14%
==========================================
  Files          62       64       +2
  Lines        2143     2215      +72
==========================================
+ Hits         1714     1797      +83
+ Misses        429      418      -11
Summary of Changes
This PR introduces several modifications to the `languru` package, primarily focusing on restructuring and enhancing the prompt template functionality. The changes can be categorized as follows:

Prompt Template Enhancements
- Renamed `base.py` to `prompt_template.py` and updated related imports.
- Extended the `PromptTemplate` class with new methods and properties:
  - a `from_description` class method for creating instances from prompt descriptions;
  - `md5` and `md5_formatted` properties for generating hash representations.
- Added `chat.py` for handling chat completions.

Code Organization
- Moved `ensure_openai_chat_completion_content` and `ensure_openai_chat_completion_message_params` from `common.py` to `openai_utils.py`.
- Added `assistant.py` and `user.py` in the `prompts/repositories/` directory to store prompt-related constants.

Dependency Updates
- Updated dependencies in `poetry.lock`, including `anthropic`, `google-ai-generativelanguage`, `google-api-python-client`, `google-auth`, `google-generativeai`, `mkdocs-material`, `openai`, `orjson`, and `pillow`.

Minor Adjustments