Conversation

Teagan42 (Contributor) commented Oct 5, 2025

Summary

  • wire the loader pipeline to the persistence stage so the stage manages batching, queue emission, and concurrent upsert workers
  • expand the persistence stage to chunk point batches, run Qdrant writes with retry callbacks, and notify the pipeline for progress accounting
  • add unit tests covering successful upserts, retry queue population, semaphore release on errors, and bump the project version to 0.26.69
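The batching, queue-emission, and concurrent-upsert flow described above can be sketched roughly as follows. This is a minimal illustration, not the actual `mcp_plex.loader.pipeline.persistence` code: `chunk_points`, `upsert_worker`, and the fake upsert callable are hypothetical stand-ins, and a real implementation would call the Qdrant client and report progress back to the pipeline.

```python
import asyncio
from typing import Any, Awaitable, Callable, Sequence

def chunk_points(points: Sequence[Any], size: int) -> list[list[Any]]:
    """Split a batch of points into fixed-size chunks for upserting."""
    return [list(points[i:i + size]) for i in range(0, len(points), size)]

async def upsert_worker(
    queue: asyncio.Queue,
    semaphore: asyncio.Semaphore,
    upsert: Callable[[list[Any]], Awaitable[None]],
    retry_queue: asyncio.Queue,
) -> None:
    """Drain chunks from the queue and upsert them.

    Failed chunks are pushed onto the retry queue, and the concurrency
    semaphore is released even when the upsert raises.
    """
    while True:
        chunk = await queue.get()
        if chunk is None:  # sentinel: shut the worker down
            queue.task_done()
            break
        await semaphore.acquire()
        try:
            await upsert(chunk)
        except Exception:
            await retry_queue.put(chunk)  # feed the retry path
        finally:
            semaphore.release()
            queue.task_done()

async def demo() -> tuple[list[Any], list[list[Any]]]:
    written: list[Any] = []
    queue: asyncio.Queue = asyncio.Queue()
    retry_queue: asyncio.Queue = asyncio.Queue()
    semaphore = asyncio.Semaphore(2)

    async def fake_upsert(chunk: list[Any]) -> None:
        # Stand-in for a Qdrant write; fails on one chunk to exercise retries.
        if chunk == [3, 4]:
            raise RuntimeError("Qdrant unavailable")
        written.extend(chunk)

    for chunk in chunk_points([1, 2, 3, 4, 5], 2):
        queue.put_nowait(chunk)
    queue.put_nowait(None)

    await upsert_worker(queue, semaphore, fake_upsert, retry_queue)
    retried: list[list[Any]] = []
    while not retry_queue.empty():
        retried.append(retry_queue.get_nowait())
    return written, retried

written, retried = asyncio.run(demo())
# written == [1, 2, 5]; retried == [[3, 4]]
```

Keeping the retry queue separate from the main queue lets the pipeline account for failed chunks without blocking healthy writers.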

Testing

  • uv run ruff check .
  • uv run pytest
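One of the behaviors the new unit tests cover is semaphore release when an upsert raises. A minimal, self-contained sketch of that test pattern (the helper name `guarded_upsert` is assumed here, not the project's real API):

```python
import asyncio
from typing import Any, Awaitable, Callable

async def guarded_upsert(
    semaphore: asyncio.Semaphore,
    upsert: Callable[[list[Any]], Awaitable[None]],
    chunk: list[Any],
) -> None:
    """Run an upsert under the semaphore, releasing it even on failure."""
    await semaphore.acquire()
    try:
        await upsert(chunk)
    finally:
        semaphore.release()

def test_semaphore_released_on_error() -> None:
    async def scenario() -> bool:
        semaphore = asyncio.Semaphore(1)

        async def failing_upsert(chunk: list[Any]) -> None:
            raise RuntimeError("write failed")

        try:
            await guarded_upsert(semaphore, failing_upsert, [1])
        except RuntimeError:
            pass
        # If the semaphore leaked, locked() would still be True here.
        return not semaphore.locked()

    assert asyncio.run(scenario())

test_semaphore_released_on_error()
```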

https://chatgpt.com/codex/tasks/task_e_68e2d3eef50c83289cafe78ad1d1452e

github-actions bot commented Oct 5, 2025

Coverage Report

| File | Stmts | Miss | Cover | Missing |
|---|---|---|---|---|
| mcp_plex/loader/__init__.py | 746 | 129 | 83% | 173–180, 229, 315–320, 922–923, 967, 969, 1034–1035, 1046–1048, 1051, 1054–1055, 1057, 1079, 1089, 1100–1132, 1147, 1149, 1156–1170, 1173–1187, 1193, 1208–1239, 1244–1277, 1280–1284, 1289–1297, 1307, 1375–1377, 1438–1440, 1798 |
| mcp_plex/loader/pipeline/channels.py | 70 | 2 | 97% | 19–20 |
| mcp_plex/loader/pipeline/enrichment.py | 334 | 44 | 87% | 77, 79, 86, 90, 133–136, 169, 190, 210, 218–220, 227–230, 233–235, 243, 300, 356, 379, 383, 385, 411, 429, 464, 470, 473–481, 507, 531, 602, 605–607 |
| mcp_plex/loader/pipeline/ingestion.py | 81 | 8 | 90% | 67, 98–108, 129, 157, 179 |
| mcp_plex/loader/pipeline/persistence.py | 73 | 6 | 92% | 103, 111, 116, 120–122 |
| mcp_plex/server/__init__.py | 614 | 29 | 95% | 43–44, 119–120, 148, 252, 256, 277–280, 297, 362, 365, 402, 420–421, 458, 1109, 1131–1137, 1173, 1191, 1196, 1214, 1338, 1375 |
| mcp_plex/server/__main__.py | 4 | 4 | 0% | 3–8 |
| mcp_plex/server/config.py | 48 | 7 | 85% | 50, 52–55, 65, 76 |
| **TOTAL** | 2180 | 229 | 89% | |

| Tests | Skipped | Failures | Errors | Time |
|---|---|---|---|---|
| 123 | 0 💤 | 0 ❌ | 0 🔥 | 57.045s ⏱️ |

Teagan42 merged commit 4cbe7ed into main on Oct 5, 2025. 4 checks passed.
Teagan42 deleted the codex/relocate-batching-logic-and-upsert-worker branch on October 5, 2025 at 20:46.