
Conversation

@ultmaster
Contributor

No description provided.

Copilot AI review requested due to automatic review settings December 11, 2025 02:29
Contributor

Copilot AI left a comment


Pull request overview

This PR introduces two new decorators, with_store and with_llm_proxy, to simplify dependency injection in algorithm classes. These decorators automatically inject the LightningStore and LLMProxy dependencies into algorithm methods, eliminating manual get_store() and get_llm_proxy() calls and providing automatic lifecycle management for the LLM proxy.

Key changes:

  • Added with_store decorator to inject LightningStore into coroutine methods
  • Added with_llm_proxy decorator with optional/required modes and auto-start/stop capabilities
  • Updated FastAlgorithm.run() and APO.run() to use the new decorators
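
The injection pattern described above can be sketched in miniature. Note this is an illustrative stand-in under stated assumptions, not the PR's actual implementation: `DummyAlgorithm`, its dict-based store, and the decorator body here are hypothetical; only the names `with_store` and `get_store()` come from the PR.

```python
import asyncio
import functools
from typing import Any, Callable, Coroutine, TypeVar

R = TypeVar("R")

def with_store(
    func: Callable[..., Coroutine[Any, Any, R]]
) -> Callable[..., Coroutine[Any, Any, R]]:
    """Inject the algorithm's store as the first argument after self."""
    @functools.wraps(func)
    async def wrapper(self: Any, *args: Any, **kwargs: Any) -> R:
        store = self.get_store()  # resolved at call time, not at decoration time
        return await func(self, store, *args, **kwargs)
    return wrapper

class DummyAlgorithm:
    def get_store(self) -> dict:
        return {"tasks": ["t1", "t2"]}

    @with_store
    async def run(self, store: dict) -> int:
        # `store` is injected by the decorator; callers invoke run() with no arguments.
        return len(store["tasks"])

print(asyncio.run(DummyAlgorithm().run()))  # prints 2
```

The key design point is that the dependency is looked up when the method is called, so the decorated method never needs a manual `get_store()` in its body.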

Reviewed changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated 6 comments.

File / Description

  • agentlightning/algorithm/utils.py — Implements the with_store and with_llm_proxy decorators with type-safe overloads for optional vs required LLM proxy injection
  • tests/algorithm/test_utils.py — Adds comprehensive test coverage for both decorators, including optional/required modes, auto-start/stop behavior, and error handling
  • agentlightning/algorithm/fast.py — Applies decorators to the run() method and removes the manual get_store() call
  • agentlightning/algorithm/apo/apo.py — Applies decorators to both run() and get_rollout_results() methods, removing manual get_store() calls


import logging
import random
from typing import Iterator, List, Sequence, TypeVar
from types import CoroutineType

Copilot AI Dec 11, 2025


Using CoroutineType from the types module for type annotations is incorrect. CoroutineType is a runtime type object, not intended for static type hints. Use Coroutine[Any, Any, R] from typing or collections.abc.Coroutine instead. This applies to all the decorator type signatures in this file.
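
A minimal sketch of the annotation style the comment suggests, using `typing.Coroutine` in an async-method decorator signature. The `logged` decorator and `answer` coroutine are hypothetical examples, not code from this PR:

```python
import asyncio
import functools
from typing import Any, Callable, Coroutine, TypeVar

R = TypeVar("R")

# Annotate with Coroutine[Any, Any, R] from typing (or collections.abc.Coroutine),
# rather than types.CoroutineType, which is a concrete runtime class rather than
# the generic alias static type checkers expect here.
def logged(
    func: Callable[..., Coroutine[Any, Any, R]]
) -> Callable[..., Coroutine[Any, Any, R]]:
    @functools.wraps(func)
    async def wrapper(*args: Any, **kwargs: Any) -> R:
        return await func(*args, **kwargs)
    return wrapper

@logged
async def answer() -> int:
    return 42

print(asyncio.run(answer()))  # prints 42
```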

# then `func` expects a non-optional LLMProxy.
return await func(self, llm_proxy, *args, **kwargs)
finally:
if auto_started and llm_proxy is not None:

Copilot AI Dec 11, 2025


The condition if auto_started and llm_proxy is not None is redundant. Since auto_started is only set to True when llm_proxy is not None (line 156), the second condition is unnecessary. Simplify to if auto_started: for cleaner code.

Suggested change
if auto_started and llm_proxy is not None:
if auto_started:
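
The auto-start/stop lifecycle this `finally` block belongs to can be sketched in isolation. `FakeProxy`, `with_proxy`, and `Algo` below are hypothetical stand-ins for the PR's LLMProxy machinery, written only to show why `auto_started` alone suffices as the guard:

```python
import asyncio
import functools
from typing import Any, Callable, Coroutine, Optional, TypeVar

R = TypeVar("R")

class FakeProxy:
    """Hypothetical stand-in for an LLM proxy with a start/stop lifecycle."""
    def __init__(self) -> None:
        self.running = False
    async def start(self) -> None:
        self.running = True
    async def stop(self) -> None:
        self.running = False

def with_proxy(
    func: Callable[..., Coroutine[Any, Any, R]]
) -> Callable[..., Coroutine[Any, Any, R]]:
    @functools.wraps(func)
    async def wrapper(self: Any, *args: Any, **kwargs: Any) -> R:
        proxy: Optional[FakeProxy] = self.get_llm_proxy()
        auto_started = False
        if proxy is not None and not proxy.running:
            await proxy.start()
            auto_started = True  # only ever set when proxy is not None
        try:
            return await func(self, proxy, *args, **kwargs)
        finally:
            if auto_started:  # implies proxy is not None, so no second check needed
                await proxy.stop()
    return wrapper

class Algo:
    def __init__(self) -> None:
        self.proxy = FakeProxy()
    def get_llm_proxy(self) -> Optional[FakeProxy]:
        return self.proxy

    @with_proxy
    async def run(self, proxy: Optional[FakeProxy]) -> bool:
        return proxy.running if proxy is not None else False

algo = Algo()
print(asyncio.run(algo.run()))  # prints True: proxy running inside the method
print(algo.proxy.running)       # prints False: decorator stopped it afterwards
```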

from agentlightning.store.base import LightningStore
from agentlightning.llm_proxy import LLMProxy

from agentlightning.algorithm.utils import with_llm_proxy, with_store

Copilot AI Dec 11, 2025


The import statement is placed after the TYPE_CHECKING block, which violates PEP 8 import ordering conventions. Standard imports should be grouped before TYPE_CHECKING conditional imports. Move this import to line 40, right after the other imports from agentlightning.algorithm.utils.
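
A sketch of the PEP 8 grouping the comment asks for, using standard-library modules as stand-ins for the project's own imports:

```python
# 1. Standard-library imports first.
import logging
from typing import TYPE_CHECKING

# 2. Third-party and first-party runtime imports go here, before any
#    TYPE_CHECKING block (in this PR: the with_store / with_llm_proxy imports).

# 3. Type-checking-only imports come last, inside the guard.
if TYPE_CHECKING:
    from decimal import Decimal  # stand-in for a type-only import

logger = logging.getLogger(__name__)
```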

llm_proxy = self.get_llm_proxy()

if required and llm_proxy is None:
raise ValueError("LLM proxy not found")

Copilot AI Dec 11, 2025


The error message "LLM proxy not found" is vague and doesn't provide actionable guidance. Consider changing it to something more descriptive like "LLM proxy is required but not configured. Call set_llm_proxy() before using this method." to help users understand what action they need to take.

Suggested change
raise ValueError("LLM proxy not found")
raise ValueError("LLM proxy is required but not configured. Call set_llm_proxy() before using this method.")

Comment on lines 192 to 193
store: LightningStore, # This param will be stripped by the decorator
llm_proxy: Optional[LLMProxy], # This param will be stripped by the decorator

Copilot AI Dec 11, 2025


The comment states "This param will be stripped by the decorator" but this is misleading. The decorator doesn't "strip" the parameter - rather, it injects it automatically, so callers don't need to provide it. Consider rephrasing to "Injected by decorator - callers should not provide this parameter" for clarity.

Suggested change
store: LightningStore, # This param will be stripped by the decorator
llm_proxy: Optional[LLMProxy], # This param will be stripped by the decorator
store: LightningStore, # Injected by decorator - callers should not provide this parameter
llm_proxy: Optional[LLMProxy], # Injected by decorator - callers should not provide this parameter

Comment on lines 807 to 808
store: LightningStore, # This param will be stripped by the decorator
llm_proxy: Optional[LLMProxy], # This param will be stripped by the decorator

Copilot AI Dec 11, 2025


The comment states "This param will be stripped by the decorator" but this is misleading. The decorator doesn't "strip" the parameter - rather, it injects it automatically, so callers don't need to provide it. Consider rephrasing to "Injected by decorator - callers should not provide this parameter" for clarity.

Suggested change
store: LightningStore, # This param will be stripped by the decorator
llm_proxy: Optional[LLMProxy], # This param will be stripped by the decorator
store: LightningStore, # Injected by decorator - callers should not provide this parameter
llm_proxy: Optional[LLMProxy], # Injected by decorator - callers should not provide this parameter

Copilot uses AI. Check for mistakes.
@ultmaster
Contributor Author

/ci

@github-actions

github-actions bot commented Dec 11, 2025

🚀 CI Watcher for correlation id-3640594509-mj142g11 triggered by comment 3640594509
🏃‍♀️ Tracking 3 workflow run(s):

✅ All runs completed.

@ultmaster ultmaster merged commit 1e36e66 into main Dec 11, 2025
14 checks passed
