baml-language: implement specialize_prompt #3052
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 81e1c42103
```rust
if !seen_first_system {
    seen_first_system = true;
    PromptAst::Message {
        role,
        content,
```
Only allow system messages in position 0
In consolidate_system_prompts, the “first” system message is kept even if it appears after a user message because the code tracks only the first system encountered, not the message index. For providers with max_one_system_prompt (e.g., Anthropic/Vertex), the runtime logic only permits a system message in slot 0 and converts any later system messages to user. With the current implementation, a prompt like [user, system, user] will preserve the system message in the middle, which violates those providers’ ordering requirements and changes behavior vs. the existing runtime semantics.
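The fix the review is asking for can be sketched as an index-aware pass. This is a hypothetical illustration, not the actual BAML code: `Role` and the `(Role, String)` message shape are assumptions, and the real `consolidate_system_prompts` operates on `PromptAst` nodes.

```rust
// Hypothetical sketch: only the message at index 0 may remain `system`;
// any later system message is demoted to `user`, matching the runtime
// behavior for providers with max_one_system_prompt.
#[derive(Debug, Clone, PartialEq)]
enum Role {
    System,
    User,
    Assistant,
}

fn consolidate_system_prompts(messages: Vec<(Role, String)>) -> Vec<(Role, String)> {
    messages
        .into_iter()
        .enumerate()
        .map(|(idx, (role, content))| {
            // Key point from the review: check the message index, not
            // just "have we seen a system message yet".
            if role == Role::System && idx != 0 {
                (Role::User, content)
            } else {
                (role, content)
            }
        })
        .collect()
}

fn main() {
    let prompt = vec![
        (Role::User, "hi".to_string()),
        (Role::System, "be terse".to_string()),
        (Role::User, "ok".to_string()),
    ];
    let out = consolidate_system_prompts(prompt);
    // The mid-conversation system message is demoted to user.
    assert_eq!(out[1].0, Role::User);
}
```

With the `seen_first_system` flag alone, the `[user, system, user]` case above would keep the middle message as `system`; keying on the index makes slot 0 the only permitted position.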
CodSpeed Performance Report: merging this PR will not alter performance.
🤖 AI Code Review
Assessment ⚠️
Follow-up Review: Previous Issues Not Addressed
The update from 2ef6f36 to b092e5c contains only cosmetic changes (formatting, import ordering, comment backticks). Both critical issues from the previous review remain unaddressed:
- High severity: metadata loss in `merge_adjacent_messages` silently discards metadata from merged messages (line 56 in transformations.rs).
- Medium severity: missing bounds check in `execute_specialize_prompt` can panic on invalid args (line 1410 in lib.rs).
The implementation is otherwise well-structured with comprehensive tests and clear documentation. However, the metadata loss issue is a correctness bug that will lose user data without warning, and should be fixed before merging.
Recommendation: Address the metadata handling in message merging before proceeding. The bounds check is also recommended for defensive programming.
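Both fixes the review asks for can be sketched together. Everything here is a hypothetical illustration under assumed names: the `Message` struct, the metadata conflict rule, and the two-argument signature of `execute_specialize_prompt` are guesses, not the actual transformations.rs / lib.rs code.

```rust
use std::collections::HashMap;

// Hypothetical message shape; the real BAML types differ.
#[derive(Debug, Clone)]
struct Message {
    role: String,
    content: String,
    metadata: HashMap<String, String>,
}

// Merge adjacent same-role messages, carrying metadata from both sides
// forward instead of silently dropping the later message's entries.
fn merge_adjacent_messages(messages: Vec<Message>) -> Vec<Message> {
    let mut out: Vec<Message> = Vec::new();
    for msg in messages {
        let same_role = out.last().map_or(false, |prev| prev.role == msg.role);
        if same_role {
            let prev = out.last_mut().unwrap();
            prev.content.push('\n');
            prev.content.push_str(&msg.content);
            // Earlier keys win on conflict; nothing is discarded silently.
            for (k, v) in msg.metadata {
                prev.metadata.entry(k).or_insert(v);
            }
        } else {
            out.push(msg);
        }
    }
    out
}

// Validate the argument count up front instead of panicking on an
// out-of-bounds index.
fn execute_specialize_prompt(args: &[String]) -> Result<(String, String), String> {
    if args.len() < 2 {
        return Err(format!(
            "specialize_prompt expects 2 arguments, got {}",
            args.len()
        ));
    }
    Ok((args[0].clone(), args[1].clone()))
}

fn main() {
    let mut meta = HashMap::new();
    meta.insert("cache".to_string(), "ephemeral".to_string());
    let merged = merge_adjacent_messages(vec![
        Message { role: "user".into(), content: "a".into(), metadata: HashMap::new() },
        Message { role: "user".into(), content: "b".into(), metadata: meta },
    ]);
    assert_eq!(merged.len(), 1);
    assert_eq!(merged[0].metadata.get("cache").map(String::as_str), Some("ephemeral"));
    assert!(execute_specialize_prompt(&["only-one".to_string()]).is_err());
}
```

The "earlier keys win" merge policy is one choice among several; the important property is that the second message's metadata is combined rather than dropped, so no user data is lost without warning.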
Note: Implements the `specialize_prompt` operation for LLM clients, applying provider-specific transformations (message merging, system prompt consolidation, metadata filtering) based on model capabilities.

Written by Mendral for commit b092e5c.
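The capability-driven design described above can be sketched as a small pipeline where each transformation runs only when the provider's capabilities call for it. All names here (`ModelCapabilities`, its fields, the string roles) are assumptions for illustration, not the BAML API.

```rust
// Hypothetical capability flags; the real set is larger (e.g. metadata
// filtering) and the real types are not string-keyed.
struct ModelCapabilities {
    max_one_system_prompt: bool,
    merge_adjacent: bool,
}

#[derive(Debug, Clone, PartialEq)]
struct Msg {
    role: String,
    content: String,
}

fn specialize_prompt(mut messages: Vec<Msg>, caps: &ModelCapabilities) -> Vec<Msg> {
    if caps.max_one_system_prompt {
        // Demote any system message that is not in slot 0 to user.
        for (idx, m) in messages.iter_mut().enumerate() {
            if m.role == "system" && idx != 0 {
                m.role = "user".to_string();
            }
        }
    }
    if caps.merge_adjacent {
        // Collapse runs of same-role messages into one.
        let mut merged: Vec<Msg> = Vec::new();
        for m in messages {
            let same = merged.last().map_or(false, |p| p.role == m.role);
            if same {
                let prev = merged.last_mut().unwrap();
                prev.content.push('\n');
                prev.content.push_str(&m.content);
            } else {
                merged.push(m);
            }
        }
        messages = merged;
    }
    messages
}

fn main() {
    let caps = ModelCapabilities { max_one_system_prompt: true, merge_adjacent: true };
    let out = specialize_prompt(
        vec![
            Msg { role: "user".into(), content: "a".into() },
            Msg { role: "system".into(), content: "s".into() },
            Msg { role: "user".into(), content: "b".into() },
        ],
        &caps,
    );
    // The mid-prompt system message is demoted, then merged with its neighbors.
    assert_eq!(out.len(), 1);
    assert_eq!(out[0].role, "user");
}
```

Running demotion before merging matters: once the stray system message becomes `user`, the merge pass can collapse the whole run, which is why transformation order is part of the contract.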