Simple script to ask LLM #431
Conversation
Walkthrough: Adds a new CLI Python script at scripts/query_llm.py.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    actor User
    participant Script as query_llm.py
    participant API as Local LLM API (DEFAULT_URL)
    Note over Script: Build JSON payload {query, system_prompt}<br/>start timer
    User->>Script: Run script (CLI opts)
    Script->>API: POST /v1/query/ with JSON + timeout
    alt HTTP error (non-2xx)
        API-->>Script: HTTP error
        Script-->>User: stderr failure msg (elapsed) and exit 1
    else 200 OK
        API-->>Script: 200 OK + body (text)
        Script->>Script: parse JSON
        alt invalid JSON
            Script-->>User: stderr JSON parse error + response snippet and exit 2
        else JSON parsed
            alt missing "response"
                Script-->>User: stderr missing-field + print JSON and exit 3
            else has "response"
                Script-->>User: stdout response + "Response time ..." and exit 0
            end
        end
    end
```
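The flow above can be sketched as a minimal script. Everything here is inferred from the diagram: the payload fields and exit codes follow the sequence above, while the default URL and prompts are assumed placeholders, not the PR's actual values.

```python
#!/usr/bin/env python3
"""Minimal sketch of a query_llm.py-style helper (illustrative, not the PR's code)."""
import json
import sys
import time

DEFAULT_URL = "http://localhost:8080/v1/query/"  # assumed placeholder endpoint


def parse_response(text: str) -> tuple[int, str]:
    """Map a raw response body to (exit_code, message) per the diagram's branches."""
    try:
        data = json.loads(text)
    except json.JSONDecodeError:
        return 2, f"JSON parse error, response snippet: {text[:200]}"
    if "response" not in data:
        return 3, f"Missing 'response' field: {json.dumps(data)}"
    return 0, data["response"]


def main() -> int:
    import requests  # deferred so the parse helper stays stdlib-only

    payload = {"query": "Say hello", "system_prompt": "You are a helpful assistant"}
    start = time.perf_counter()
    try:
        resp = requests.post(DEFAULT_URL, json=payload, timeout=60)
        resp.raise_for_status()
    except requests.RequestException as exc:
        print(f"Request failed after {time.perf_counter() - start:.2f}s: {exc}",
              file=sys.stderr)
        return 1
    code, message = parse_response(resp.text)
    if code != 0:
        print(message, file=sys.stderr)
        return code
    print(message)
    print(f"Response time: {time.perf_counter() - start:.2f}s")
    return 0


if __name__ == "__main__" and "--run" in sys.argv:  # guarded: needs a live server
    sys.exit(main())
```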
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
Actionable comments posted: 1
🧹 Nitpick comments (4)
scripts/query_llm.py (4)
1-1: Use env-based shebang for portability. Switching to /usr/bin/env python3 makes the script work across environments where Python may not live at /usr/bin/python3. Covered in the comprehensive diff below.
3-3: Tighten the docstring (what/why/how). A short “what” plus a hint on usage helps future readers; consider mentioning the default URL and that it’s a quick CLI helper. Covered in the comprehensive diff below.
5-5: Prefer perf_counter() over time() for elapsed timing. perf_counter() is monotonic and higher resolution; better for measuring request latency. Covered in the comprehensive diff below.
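As a standalone illustration of the suggestion (the sleep stands in for the HTTP round trip):

```python
import time

start = time.perf_counter()  # monotonic, high-resolution clock
time.sleep(0.05)             # stand-in for the HTTP request
elapsed = time.perf_counter() - start
print(f"Response time: {elapsed:.3f}s")
```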
9-9: Minor grammar and configurability.
- Grammar nit: “You are helpful assistant” → “You are a helpful assistant”.
- Hardcoding the payload makes the script less useful; add argparse to pass query/system prompt, with sensible defaults.
Covered in the comprehensive diff below.
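A sketch of the suggested argparse wiring; the flag names and defaults below are illustrative, not taken from the PR:

```python
import argparse

DEFAULT_URL = "http://localhost:8080/v1/query/"  # assumed placeholder


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Quick CLI helper to query a local LLM.")
    parser.add_argument("-q", "--query", default="Say hello",
                        help="user query to send")
    parser.add_argument("-s", "--system-prompt", dest="system_prompt",
                        default="You are a helpful assistant",  # grammar fix included
                        help="system prompt for the model")
    parser.add_argument("-u", "--url", default=DEFAULT_URL,
                        help="service endpoint")
    return parser


args = build_parser().parse_args(["-q", "What is 2+2?"])
print(args.query, "|", args.system_prompt, "|", args.url)
```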
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
scripts/query_llm.py (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: build-pr
- GitHub Check: e2e_tests
🔇 Additional comments (1)
scripts/query_llm.py (1)
7-7: All set: dependency and executable bit are correctly configured. The requests library is already declared in pyproject.toml (requests>=2.32.4), and scripts/query_llm.py has the executable bit set (-rwxr-xr-x). No further changes needed.
Force-pushed ecd03b1 to b9f7ad5
@CodeRabbit fullreview
@tisnik I'll conduct a full review of all the changes in this PR. ✅ Actions performed: Full review triggered.
Force-pushed b9f7ad5 to ae8f979
Description
Simple script to ask LLM. Simpler to type than a curl command with all its quoting and escaping on the CLI.
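For comparison, the curl invocation the description alludes to might look like the commented command below; the endpoint and payload fields are assumed from the review thread, and the command itself is left commented out since it needs a running server. The echo line just shows the JSON that has to survive shell quoting:

```shell
payload='{"query": "Say hello", "system_prompt": "You are a helpful assistant"}'
# curl -s -X POST "http://localhost:8080/v1/query/" \
#   -H "Content-Type: application/json" -d "$payload"
echo "$payload"
```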