Support with_llm_proxy and with_store in algorithms
#398
Conversation
Pull request overview
This PR introduces two new decorators, with_store and with_llm_proxy, to simplify dependency injection in algorithm classes. These decorators automatically inject the LightningStore and LLMProxy dependencies into algorithm methods, eliminating manual get_store() and get_llm_proxy() calls and providing automatic lifecycle management for the LLM proxy.
Key changes:
- Added with_store decorator to inject LightningStore into coroutine methods
- Added with_llm_proxy decorator with optional/required modes and auto-start/stop capabilities
- Updated FastAlgorithm.run() and APO.run() to use the new decorators
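For illustration, a hedged usage sketch of the two decorators. The decorator names, the injected parameter types, and the module paths are taken from this PR; the required keyword, the decorator stacking order, and the MyAlgorithm class are assumptions, not the code under review.

```python
from typing import Optional

from agentlightning.algorithm.utils import with_llm_proxy, with_store
from agentlightning.llm_proxy import LLMProxy
from agentlightning.store.base import LightningStore


class MyAlgorithm:  # assumed to provide get_store()/get_llm_proxy() like the real algorithm base class
    @with_store
    @with_llm_proxy(required=False)  # keyword name assumed; the PR describes optional/required modes
    async def run(
        self,
        store: LightningStore,          # injected by with_store; callers do not pass it
        llm_proxy: Optional[LLMProxy],  # injected by with_llm_proxy; may be None in optional mode
    ) -> None:
        # The body uses the injected dependencies directly instead of calling
        # self.get_store() / self.get_llm_proxy() manually.
        ...
```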
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| agentlightning/algorithm/utils.py | Implements the with_store and with_llm_proxy decorators with type-safe overloads for optional vs required LLM proxy injection |
| tests/algorithm/test_utils.py | Adds comprehensive test coverage for both decorators, including optional/required modes, auto-start/stop behavior, and error handling |
| agentlightning/algorithm/fast.py | Applies decorators to the run() method and removes manual get_store() call |
| agentlightning/algorithm/apo/apo.py | Applies decorators to both run() and get_rollout_results() methods, removing manual get_store() calls |
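The "type-safe overloads for optional vs required LLM proxy injection" mentioned above can be illustrated with a self-contained sketch. Everything below is a stand-in that assumes a boolean required flag and simplified method signatures; it is not the code under review.

```python
import functools
from typing import Any, Callable, Coroutine, Literal, Optional, TypeVar, overload


class LLMProxy:  # stand-in for agentlightning.llm_proxy.LLMProxy
    pass


R = TypeVar("R")

# Required mode: the wrapped method receives a guaranteed LLMProxy.
RequiredMethod = Callable[[Any, LLMProxy], Coroutine[Any, Any, R]]
# Optional mode: the wrapped method receives Optional[LLMProxy].
OptionalMethod = Callable[[Any, Optional[LLMProxy]], Coroutine[Any, Any, R]]
# After decoration, callers invoke the method without the proxy argument.
PlainMethod = Callable[[Any], Coroutine[Any, Any, R]]


@overload
def with_llm_proxy(*, required: Literal[True]) -> Callable[[RequiredMethod[R]], PlainMethod[R]]: ...
@overload
def with_llm_proxy(*, required: Literal[False] = ...) -> Callable[[OptionalMethod[R]], PlainMethod[R]]: ...
def with_llm_proxy(*, required: bool = False):
    def decorator(func):
        @functools.wraps(func)
        async def wrapper(self: Any):
            proxy = self.get_llm_proxy()  # resolver named in this PR
            if required and proxy is None:
                raise ValueError("LLM proxy is required but not configured.")
            return await func(self, proxy)
        return wrapper
    return decorator
```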
agentlightning/algorithm/utils.py (Outdated)

import logging
import random
from typing import Iterator, List, Sequence, TypeVar
from types import CoroutineType
Copilot AI (Dec 11, 2025):
Using CoroutineType from the types module for type annotations is incorrect. CoroutineType is a runtime type object, not intended for static type hints. Use Coroutine[Any, Any, R] from typing or collections.abc.Coroutine instead. This applies to all the decorator type signatures in this file.
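For reference, a minimal sketch of the annotation style the comment recommends. The wrapper body is illustrative; get_store() is the resolver this PR already uses, but the real with_store signature is not reproduced here.

```python
import functools
from typing import Any, Callable, Coroutine, TypeVar

R = TypeVar("R")


def with_store(func: Callable[..., Coroutine[Any, Any, R]]) -> Callable[..., Coroutine[Any, Any, R]]:
    # Coroutine[Any, Any, R] (from typing or collections.abc) is the static type for
    # "a coroutine that eventually returns R"; types.CoroutineType is a runtime
    # class object and is not intended for annotations.
    @functools.wraps(func)
    def wrapper(self: Any, *args: Any, **kwargs: Any) -> Coroutine[Any, Any, R]:
        # Illustrative: resolve the store from the instance and forward it.
        return func(self, self.get_store(), *args, **kwargs)

    return wrapper
```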
        # then `func` expects a non-optional LLMProxy.
        return await func(self, llm_proxy, *args, **kwargs)
    finally:
        if auto_started and llm_proxy is not None:
Copilot AI (Dec 11, 2025):
The condition if auto_started and llm_proxy is not None is redundant. Since auto_started is only set to True when llm_proxy is not None (line 156), the second condition is unnecessary. Simplify to if auto_started: for cleaner code.
Suggested change:
-        if auto_started and llm_proxy is not None:
+        if auto_started:
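A simplified sketch of the auto-start/stop flow this comment is about, showing why the extra None check is redundant. get_llm_proxy() appears in the PR; the start()/stop() calls and the decorator name below are hypothetical placeholders for whatever lifecycle hooks the real implementation uses.

```python
import functools
from typing import Any


def with_llm_proxy_sketch(func):  # illustrative name, not the PR's decorator
    @functools.wraps(func)
    async def wrapper(self: Any, *args: Any, **kwargs: Any):
        llm_proxy = self.get_llm_proxy()
        auto_started = False
        if llm_proxy is not None:
            llm_proxy.start()    # hypothetical auto-start hook
            auto_started = True  # only ever set to True when llm_proxy is not None
        try:
            return await func(self, llm_proxy, *args, **kwargs)
        finally:
            if auto_started:      # llm_proxy cannot be None on this path
                llm_proxy.stop()  # hypothetical auto-stop hook

    return wrapper
```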
agentlightning/algorithm/apo/apo.py (Outdated)

from agentlightning.store.base import LightningStore
from agentlightning.llm_proxy import LLMProxy

from agentlightning.algorithm.utils import with_llm_proxy, with_store
Copilot AI (Dec 11, 2025):
The import statement is placed after the TYPE_CHECKING block, which violates PEP 8 import ordering conventions. Standard imports should be grouped before TYPE_CHECKING conditional imports. Move this import to line 40, right after the other imports from agentlightning.algorithm.utils.
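A short sketch of the ordering this comment asks for, assuming the two typing-only imports in apo.py live under TYPE_CHECKING (an assumption based on this comment, not verified against the file):

```python
from typing import TYPE_CHECKING

from agentlightning.algorithm.utils import with_llm_proxy, with_store  # runtime import, grouped first

if TYPE_CHECKING:
    # Imports used only for static type checking come after the runtime imports.
    from agentlightning.llm_proxy import LLMProxy
    from agentlightning.store.base import LightningStore
```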
agentlightning/algorithm/utils.py (Outdated)

llm_proxy = self.get_llm_proxy()

if required and llm_proxy is None:
    raise ValueError("LLM proxy not found")
Copilot AI (Dec 11, 2025):
The error message "LLM proxy not found" is vague and doesn't provide actionable guidance. Consider changing it to something more descriptive like "LLM proxy is required but not configured. Call set_llm_proxy() before using this method." to help users understand what action they need to take.
Suggested change:
-    raise ValueError("LLM proxy not found")
+    raise ValueError("LLM proxy is required but not configured. Call set_llm_proxy() before using this method.")
agentlightning/algorithm/fast.py (Outdated)

    store: LightningStore,  # This param will be stripped by the decorator
    llm_proxy: Optional[LLMProxy],  # This param will be stripped by the decorator
Copilot AI (Dec 11, 2025):
The comment states "This param will be stripped by the decorator" but this is misleading. The decorator doesn't "strip" the parameter - rather, it injects it automatically, so callers don't need to provide it. Consider rephrasing to "Injected by decorator - callers should not provide this parameter" for clarity.
Suggested change:
-    store: LightningStore,  # This param will be stripped by the decorator
-    llm_proxy: Optional[LLMProxy],  # This param will be stripped by the decorator
+    store: LightningStore,  # Injected by decorator - callers should not provide this parameter
+    llm_proxy: Optional[LLMProxy],  # Injected by decorator - callers should not provide this parameter
agentlightning/algorithm/apo/apo.py (Outdated)

    store: LightningStore,  # This param will be stripped by the decorator
    llm_proxy: Optional[LLMProxy],  # This param will be stripped by the decorator
Copilot AI (Dec 11, 2025):
The comment states "This param will be stripped by the decorator" but this is misleading. The decorator doesn't "strip" the parameter - rather, it injects it automatically, so callers don't need to provide it. Consider rephrasing to "Injected by decorator - callers should not provide this parameter" for clarity.
Suggested change:
-    store: LightningStore,  # This param will be stripped by the decorator
-    llm_proxy: Optional[LLMProxy],  # This param will be stripped by the decorator
+    store: LightningStore,  # Injected by decorator - callers should not provide this parameter
+    llm_proxy: Optional[LLMProxy],  # Injected by decorator - callers should not provide this parameter
/ci

🚀 CI Watcher for correlation id-3640594509-mj142g11 triggered by comment 3640594509
✅ All runs completed.
No description provided.