Merged
44 changes: 40 additions & 4 deletions AGENTS.md
@@ -1,18 +1,54 @@
# Agent instructions for github_pm workspaces

## Read this file when starting or resuming work

- **Open `AGENTS.md` again** when you begin a task on this repo or return after a long gap, so required checks (below) stay in context until **`tox`** and frontend checks are green.

## Required checks before finishing any task

-- **`tox` must complete successfully** for every change that touches the Python backend (and should be run once you believe backend work is done). Run it from the **`backend`** directory:
+### Python backend (`backend/`)
+
+- **`tox` must complete successfully** for every change that touches the Python backend (and must be run once you believe backend work is done):

```bash
cd backend && tox
```

-This runs the environments defined in `backend/pyproject.toml` (format, import order, lint, tests, coverage). Do **not** consider backend work complete while **`tox`** reports failures.
+This runs the environments defined in `backend/pyproject.toml`: **Black** (`format`), **isort** (`isort`), **flake8** (`lint`), **pytest** (`test`), and **coverage** (`coverage`). Do **not** consider backend work complete while **`tox`** reports failures.

- **While iterating**, you may run a faster subset (still required before hand-off if you only used this shortcut):

```bash
cd backend && tox -e format,isort,lint
```

When that is green, run the full **`tox`** (including **`test`** / **`coverage`**) before stopping.

- **Auto-fixing style** (use only when you are already touching those files; avoid unrelated reformatting): from `backend/` with dev dependencies installed:

```bash
uv sync --extra dev
uv run black src tests
uv run isort src tests
```

Then re-run **`tox`** (or at least **`tox -e format,isort,lint`**) so checks pass without relying on uncommitted formatter drift.

- **flake8** enforces more than imports: for example **E731** forbids assigning a **`lambda`** where a nested **`def`** is clearer. Fix all **flake8** issues, not only import order.
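As a concrete illustration of E731 ("do not assign a lambda expression, use a def"), here is a minimal sketch; the `normalize` helper and its behavior are invented for the example, not taken from this repo:

```python
# flake8's E731 flags binding a lambda to a name:
#     normalize = lambda s: s.strip().lower()   # E731
# The fix is a named function, which gets a real __name__ and useful tracebacks:
def normalize(s: str) -> str:
    """Equivalent of the lambda above: trim surrounding whitespace, lowercase."""
    return s.strip().lower()

print(normalize("  GitHub_PM  "))  # github_pm
```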

- **isort** is configured in `pyproject.toml` (`profile = "black"`, `known_first_party = ["github_pm"]`). First-party imports must match that layout (including ordering among `github_pm.*` imports).

- **Fix all failures** those tools report **before** stopping. A green full **`tox`** run is the acceptance bar for backend changes.

### Frontend (`frontend/`)

When the task changes UI or client code under `frontend/src/`:

-- **Fix all lint failures and unit test failures** reported by those checks (and any other checks you ran) **before** stopping. A green **`tox`** run is the acceptance bar for backend changes.
+```bash
+cd frontend && npm run format && npm run format:check && npm test
+```

-- For **frontend** (`frontend/`) changes, run **`npm test`** (and **`npm run format:check`** if you edited formatted sources) from `frontend/` and fix failures there as well when the task involves the UI or client code.
+- **`npm run format`** applies Prettier; **`npm run format:check`** verifies formatting in CI style; **`npm test`** runs the Vitest suite. Do not skip **`format:check`** after editing formatted sources.

## Notes

Expand Down
25 changes: 25 additions & 0 deletions backend/src/github_pm/api.py
@@ -3,6 +3,7 @@
import re
import time
from typing import Annotated, Any, AsyncGenerator
from urllib.parse import quote_plus

from fastapi import APIRouter, Body, Depends, HTTPException, Path, Query
from pydantic import BaseModel, Field
@@ -71,6 +72,30 @@ def get_paged(self, path: str, headers: dict[str, str] | None = None) -> list[dict]:
break
return results

def search_issue_items(
self, search_query: str, headers: dict[str, str] | None = None
) -> list[dict]:
"""Run ``GET /search/issues`` with pagination; returns the ``items`` array union."""
q_param = quote_plus(search_query)
url: str | None = f"{self.base_url}/search/issues?q={q_param}&per_page=100"
results: list[dict] = []
while url:
response = self.github.get(url, headers=headers)
response.raise_for_status()
data = response.json()
items = data.get("items")
if isinstance(items, list):
results.extend(items)
url = None
link_header = response.headers.get("link")
if link_header:
for link in link_header.split(","):
if 'rel="next"' in link:
url = link.split(";")[0].strip().strip("<>")
logger.debug("search/issues paging to: %s", url)
break
return results

def patch(
self, path: str, data: dict[str, Any], headers: dict[str, str] | None = None
) -> dict:
2 changes: 2 additions & 0 deletions backend/src/github_pm/app.py
@@ -2,6 +2,7 @@

from github_pm.api import api_router
from github_pm.sdlc_api import sdlc_router
from github_pm.status_report_api import status_report_router

router = APIRouter()

@@ -13,6 +14,7 @@ async def health():

router.include_router(api_router, prefix="/api/v1")
router.include_router(sdlc_router, prefix="/api/v1")
router.include_router(status_report_router, prefix="/api/v1")

app = FastAPI(
title="GitHub Project Management API",
99 changes: 97 additions & 2 deletions backend/src/github_pm/sdlc_metrics.py
@@ -3,7 +3,7 @@
from __future__ import annotations

from collections.abc import Callable, Iterable, Mapping, Sequence
-from datetime import datetime, timedelta, UTC
+from datetime import date, datetime, timedelta, UTC
import re
from typing import Any, Literal
from urllib.parse import quote_plus
@@ -340,6 +340,7 @@ def graphql_search_pull_requests(
search_query: str,
*,
page_size: int = 100,
filter_bot_authors: bool = True,
) -> list[dict[str, Any]]:
"""Paginate GitHub GraphQL search (PullRequest nodes)."""
nodes: list[dict[str, Any]] = []
@@ -351,6 +352,8 @@
nodes {
... on PullRequest {
number
title
url
createdAt
mergedAt
additions
@@ -384,7 +387,71 @@
raise RuntimeError(f"GitHub GraphQL error: {errors!r}")
search = data.get("data", {}).get("search") or {}
batch = search.get("nodes") or []
-        nodes.extend(filter_out_bot_pr_nodes(batch))
+        if filter_bot_authors:
+            nodes.extend(filter_out_bot_pr_nodes(batch))
+        else:
+            nodes.extend([n for n in batch if n and n.get("number") is not None])
page = search.get("pageInfo") or {}
if not page.get("hasNextPage"):
break
cursor = page.get("endCursor")
if not cursor:
break
return nodes


def graphql_search_timeline_nodes(
post_graphql: Callable[[dict[str, Any]], dict[str, Any]],
search_query: str,
*,
page_size: int = 100,
) -> list[dict[str, Any]]:
"""Paginate GraphQL ``search(type: ISSUE)`` returning Issue and PullRequest nodes.

Used when REST ``GET /search/issues`` rejects the same query string (422). Request only
fields common to both types.
"""
nodes: list[dict[str, Any]] = []
cursor: str | None = None
gql = """
query($q: String!, $first: Int!, $after: String) {
search(query: $q, type: ISSUE, first: $first, after: $after) {
pageInfo { hasNextPage endCursor }
nodes {
__typename
... on PullRequest {
number
title
url
createdAt
}
... on Issue {
number
title
url
createdAt
}
}
}
}
"""
while True:
payload = {
"query": gql,
"variables": {
"q": search_query,
"first": page_size,
"after": cursor,
},
}
data = post_graphql(payload)
errors = data.get("errors")
if errors:
logger.error("GraphQL errors: %s", errors)
raise RuntimeError(f"GitHub GraphQL error: {errors!r}")
search = data.get("data", {}).get("search") or {}
batch = search.get("nodes") or []
nodes.extend([n for n in batch if n and n.get("number") is not None])
page = search.get("pageInfo") or {}
if not page.get("hasNextPage"):
break
@@ -405,6 +472,34 @@ def merged_prs_query(github_repo: str, merged_since: datetime) -> str:
)


def merged_prs_query_between(github_repo: str, start_d: date, end_d: date) -> str:
"""GraphQL issue search: merged PRs with merge date in ``[start_d, end_d]`` (UTC calendar days).

REST ``GET /search/issues`` rejects several merged/closed date combinations (422); GraphQL
``search`` accepts ``merged:`` ranges the same way as the web UI.
"""
a, b = start_d.isoformat(), end_d.isoformat()
if a > b:
a, b = b, a
return f"{repo_search_fragment(github_repo)} is:pr is:merged merged:{a}..{b}"


def opened_prs_between_query(github_repo: str, start_d: date, end_d: date) -> str:
"""PRs with ``created`` in ``[start_d, end_d]`` (UTC calendar days, inclusive)."""
a, b = start_d.isoformat(), end_d.isoformat()
if a > b:
a, b = b, a
return f"{repo_search_fragment(github_repo)} is:pr created:{a}..{b}"


def opened_issues_between_query(github_repo: str, start_d: date, end_d: date) -> str:
"""Issues (not PRs) with ``created`` in ``[start_d, end_d]`` (UTC calendar days, inclusive)."""
a, b = start_d.isoformat(), end_d.isoformat()
if a > b:
a, b = b, a
return f"{repo_search_fragment(github_repo)} is:issue created:{a}..{b}"


def opened_prs_query(github_repo: str, created_since: datetime) -> str:
return (
f"{repo_search_fragment(github_repo)} is:pr "
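The `*_between` builders share one pattern: ISO-format both endpoints, swap them if reversed (ISO-8601 date strings sort chronologically), and emit a search qualifier range. A self-contained sketch; it assumes `repo_search_fragment` returns the usual `repo:owner/name` qualifier, which is not shown in the diff:

```python
from datetime import date


def repo_search_fragment(github_repo: str) -> str:
    # Assumed shape -- the real helper lives elsewhere in sdlc_metrics.py.
    return f"repo:{github_repo}"


def merged_prs_query_between(github_repo: str, start_d: date, end_d: date) -> str:
    """Search query for PRs merged in [start_d, end_d], inclusive UTC calendar days."""
    a, b = start_d.isoformat(), end_d.isoformat()
    if a > b:  # ISO date strings compare chronologically, so this normalizes the range
        a, b = b, a
    return f"{repo_search_fragment(github_repo)} is:pr is:merged merged:{a}..{b}"


# Reversed endpoints are tolerated, matching the swap in the diff:
q = merged_prs_query_between("acme/widgets", date(2024, 6, 7), date(2024, 6, 1))
print(q)  # repo:acme/widgets is:pr is:merged merged:2024-06-01..2024-06-07
```

The `opened_prs_between_query` and `opened_issues_between_query` builders differ only in the qualifiers (`is:pr created:...` and `is:issue created:...`).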
37 changes: 37 additions & 0 deletions backend/src/github_pm/status_report_api.py
@@ -0,0 +1,37 @@
"""REST API for the weekly project status report."""

from __future__ import annotations

from datetime import date, datetime, UTC
from typing import Annotated

from fastapi import APIRouter, Depends, Query

from github_pm.api import connection, Connector
from github_pm.status_report_models import ProjectStatusReportResponse
from github_pm.status_report_service import build_project_status_report

status_report_router = APIRouter(tags=["project-status"])


def _default_end_date() -> date:
return datetime.now(UTC).date()


@status_report_router.get("/project-status", response_model=ProjectStatusReportResponse)
async def get_project_status_report(
gitctx: Annotated[Connector, Depends(connection)],
end_date: Annotated[
date | None,
Query(
description="Last day of the 7-day window (UTC calendar date). Defaults to today in UTC.",
),
] = None,
):
"""
Status for seven **calendar** days inclusive: ``end_date - 6 days`` through ``end_date``.

Sections: merged pull requests (by merge date), pull requests opened, issues opened (PRs excluded).
"""
resolved_end = end_date if end_date is not None else _default_end_date()
return build_project_status_report(gitctx, end_date=resolved_end)
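The endpoint documents a seven-calendar-day inclusive window. `build_project_status_report` is not shown in this diff, but the documented contract implies window arithmetic like the following sketch (the `report_window` helper is invented for illustration):

```python
from datetime import date, timedelta


def report_window(end_date: date) -> tuple[date, date]:
    """Seven calendar days inclusive: end_date - 6 days through end_date.

    Subtracting 6 (not 7) keeps the window at exactly seven days because
    both endpoints are included.
    """
    return end_date - timedelta(days=6), end_date


start, end = report_window(date(2024, 6, 7))
print(start, end)  # 2024-06-01 2024-06-07
```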
34 changes: 34 additions & 0 deletions backend/src/github_pm/status_report_models.py
@@ -0,0 +1,34 @@
"""Pydantic models for the weekly project status report API."""

from __future__ import annotations

from datetime import date

from pydantic import BaseModel, Field


class StatusReportItem(BaseModel):
"""A GitHub issue or pull request row for the status UI."""

number: int = Field(description="Issue or PR number")
title: str = Field(description="Title")
html_url: str = Field(description="GitHub HTML URL for the issue or PR")


class ProjectStatusReportResponse(BaseModel):
"""Seven calendar days inclusive ending on ``end_date`` (UTC calendar dates)."""

start_date: date = Field(description="First calendar day of the window (inclusive)")
end_date: date = Field(description="Last calendar day of the window (inclusive)")
merged_pull_requests: list[StatusReportItem] = Field(
default_factory=list,
description="Pull requests merged in the window (by merge date)",
)
opened_pull_requests: list[StatusReportItem] = Field(
default_factory=list,
description="Pull requests created in the window",
)
opened_issues: list[StatusReportItem] = Field(
default_factory=list,
description="Issues created in the window (pull requests excluded)",
)
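Putting the two models together, a client of `GET /api/v1/project-status` can expect JSON shaped like this sketch (field names come from the models above; the dates, number, title, and URL are made-up example values):

```python
import json

# Shape implied by ProjectStatusReportResponse / StatusReportItem.
example = {
    "start_date": "2024-06-01",
    "end_date": "2024-06-07",
    "merged_pull_requests": [
        {
            "number": 101,
            "title": "Add weekly status report API",
            "html_url": "https://github.com/acme/widgets/pull/101",
        }
    ],
    "opened_pull_requests": [],
    "opened_issues": [],
}
print(json.dumps(example, indent=2))
```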