Refine chat cleanup, streaming suspend, and fallback handling#60
Merged
JohnRichard4096 merged 3 commits into main (Apr 26, 2026)
Conversation
Contributor
Reviewer's Guide

Adjusts chat session cleanup semantics, improves fallback event triggering and streaming suspend/resume robustness, and bumps the package patch version.

Sequence diagram for the updated streaming suspend and resume flow:

sequenceDiagram
    actor Client
    participant StreamingController
    participant SuspendSignal as Suspend_Future
    participant ResumeSignal as Resume_Future
    Client->>StreamingController: call _wait_for_continue(tag)
    alt first_suspend
        StreamingController->>SuspendSignal: await __suspend_signal
        SuspendSignal-->>StreamingController: result True
        StreamingController-->>Client: return True
    else already_suspended
        StreamingController->>StreamingController: check __suspend_signal.done()
        alt not done
            StreamingController->>SuspendSignal: set_result(True)
        end
        StreamingController->>ResumeSignal: create new __resume_signal
        StreamingController->>ResumeSignal: await __resume_signal
        ResumeSignal-->>StreamingController: result True
        StreamingController-->>Client: return True
    end
    Client->>StreamingController: call resume()
    StreamingController->>StreamingController: check __resume_signal and not done
    alt resume_signal_pending
        StreamingController->>ResumeSignal: set_result(True)
    end
    StreamingController->>StreamingController: _suspend_tags = None
    StreamingController->>StreamingController: __resume_signal = None
    StreamingController->>StreamingController: __suspend_signal = None
Class diagram for the updated ChatManager and streaming control:

classDiagram
    class ChatManager {
        +dict~str, ChatObjectMeta~ running_chat_object
        +dict~str, ChatObjectMeta~ running_chat_object_id2map
        +bool clean_obj(k: str, maxitems: int)
        +list~ChatObjectMeta~ get_all_objs()
    }
    class ChatObjectMeta {
        +str stream_id
        +datetime start_at
        +datetime end_at
        +bool is_done
    }
    class StreamingController {
        -Future __suspend_signal
        -Future __resume_signal
        -list~str~ _suspend_tags
        +async bool _wait_for_continue(tag: str)
        +void resume()
        +bool queue_closed()
    }
    ChatManager "1" o-- "many" ChatObjectMeta : manages
    StreamingController "1" o-- "many" ChatObjectMeta : streams
Member
Author
@sourcery-ai title
Contributor
Hey - I've found 1 issue, and left some high level feedback:

- In `ChatManager.clean_obj`, accessing `self.running_chat_object[k]` directly will raise a `KeyError` if the key is missing; consider using `get` (and returning `False` early) to make this safer for callers that may invoke `clean_obj` defensively.
- When logging the warning about "too many chat objects" in `_entry`, it may be more actionable to include the actual object count (and limit) in the log message to help diagnose memory leak patterns.
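The first point above can be sketched as follows. This is a hypothetical reconstruction, not the actual `amrita_core` implementation: the `ChatObjectMeta` fields are taken from the class diagram in the Reviewer's Guide, and the eviction body is simplified to a plain delete.

```python
# Hypothetical sketch of a KeyError-safe clean_obj, assuming the
# ChatObjectMeta shape shown in the class diagram above.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class ChatObjectMeta:
    stream_id: str
    start_at: datetime
    end_at: Optional[datetime] = None
    is_done: bool = False


class ChatManager:
    def __init__(self) -> None:
        self.running_chat_object: dict[str, ChatObjectMeta] = {}

    def clean_obj(self, k: str, maxitems: int = 100) -> bool:
        # .get() returns None for a missing key instead of raising
        # KeyError, so defensive callers can invoke clean_obj safely.
        meta = self.running_chat_object.get(k)
        if meta is None:
            return False
        del self.running_chat_object[k]
        return True
```

The early `return False` also gives callers a cheap way to distinguish "already cleaned" from "cleaned now" without a try/except.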
Prompt for AI Agents
Please address the comments from this code review:
## Overall Comments
- In `ChatManager.clean_obj`, accessing `self.running_chat_object[k]` directly will raise a `KeyError` if the key is missing; consider using `get` (and returning `False` early) to make this safer for callers that may invoke `clean_obj` defensively.
- When logging the warning about "too many chat objects" in `_entry`, it may be more actionable to include the actual object count (and limit) in the log message to help diagnose memory leak patterns.
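A minimal sketch of the more actionable warning suggested above. The logger name, helper function, and threshold values are assumptions for illustration; only the idea of logging count and limit comes from the review.

```python
# Hypothetical helper: include the live object count and the limit in
# the "too many chat objects" warning so memory-leak patterns can be
# diagnosed from the log alone.
import logging

logger = logging.getLogger("amrita_core.chat")


def warn_too_many_chat_objects(count: int, limit: int) -> None:
    if count > limit:
        logger.warning(
            "Too many chat objects: %d running (limit %d); "
            "possible leak of unfinished sessions.",
            count,
            limit,
        )
```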
## Individual Comments
### Comment 1
<location path="src/amrita_core/streaming.py" line_range="153-156" />
<code_context>
        """Resume to run when suspend."""
        if self.__resume_signal and not self.__resume_signal.done():
            self.__resume_signal.set_result(True)
-       self._suspend_tags = None
+       self._suspend_tags = None
+       self.__resume_signal = None
+       self.__suspend_signal = None  # Just for cleanup
    def queue_closed(self) -> bool:
</code_context>
<issue_to_address>
**issue (bug_risk):** Resetting `__suspend_signal` to `None` increases the risk of `AttributeError` in `_wait_for_continue` if the lifecycle ever gets out of sync.
`_wait_for_continue` calls `self.__suspend_signal.set_result(True)` assuming it’s always a `Future`, but `resume()` now assigns `None` to `__suspend_signal`. Any path that triggers a new suspend without recreating the `Future` will raise `AttributeError`. To decouple these lifecycles, either guard the call (e.g. `if self.__suspend_signal and not self.__suspend_signal.done():`) or avoid setting `__suspend_signal` to `None` and simply overwrite it on the next suspend cycle.
</issue_to_address>
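The guarded lifecycle the reviewer suggests can be sketched as below. This is an illustrative reconstruction, not the real `amrita_core.streaming` code: single-underscore attribute names replace the mangled double-underscore ones, and the suspend/resume protocol is inferred from the sequence diagram in the Reviewer's Guide.

```python
# Hypothetical sketch: suspend/resume futures with the reviewer's guard
# applied, so resume() setting the signals to None cannot later cause an
# AttributeError in _wait_for_continue.
import asyncio
from typing import Optional


class StreamingController:
    def __init__(self) -> None:
        self._suspend_signal: Optional[asyncio.Future] = None
        self._resume_signal: Optional[asyncio.Future] = None
        self._suspend_tags: Optional[list[str]] = None

    async def _wait_for_continue(self, tag: str) -> bool:
        loop = asyncio.get_running_loop()
        if self._suspend_tags is None:
            # First suspend: recreate the Future instead of assuming one
            # survived the previous cycle.
            self._suspend_tags = [tag]
            self._suspend_signal = loop.create_future()
            return await self._suspend_signal
        # Already suspended: release the pending suspend future, guarded
        # so a None or already-done future cannot raise.
        self._suspend_tags.append(tag)
        if self._suspend_signal is not None and not self._suspend_signal.done():
            self._suspend_signal.set_result(True)
        self._resume_signal = loop.create_future()
        return await self._resume_signal

    def resume(self) -> None:
        """Resume to run when suspended."""
        if self._resume_signal is not None and not self._resume_signal.done():
            self._resume_signal.set_result(True)
        self._suspend_tags = None
        self._resume_signal = None
        self._suspend_signal = None  # cleanup, as in the patch
```

Because every suspend cycle recreates its Future and every `set_result` call is guarded, `resume()` is free to null out both signals without desynchronizing the lifecycle.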
close #59
Summary by Sourcery
Address stability and cleanup issues in chat management and streaming while bumping the package patch version.