19 changes: 19 additions & 0 deletions public-api/v1alpha/AGENTS.md
@@ -0,0 +1,19 @@
# Repository Guidelines

## Project Structure & Module Organization
Source lives in `lib/pipelines_api`, which exposes HTTP controllers that call internal gRPC clients. Regenerated gRPC stubs sit in `lib/internal_api`; rerun `make pb.gen` instead of editing them. Environment configuration is under `config/*.exs` (overrides in `config/runtime.exs`). Shared scripts live in `priv/script`, Helm assets in `helm/`, and ExUnit fixtures in `test/` plus `test/support`.

## Build, Test, and Development Commands
Use `make console` to enter the Docker dev container with dependencies mounted. Start the server via `iex -S mix` inside that shell; it binds to port 4004 per `docker-compose.yml`. Run the suite through `make unit-test` or `mix test`. Refresh dependencies with `mix deps.get`, and enforce formatting with `mix format`. Static analysis helpers include `mix credo --strict` and `mix dialyzer` (after `mix deps.get && mix compile`).
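The commands above compose into a typical dev loop. This transcript is a sketch assembled from the targets named in this document; it assumes a working Docker setup and is not verified against the actual Makefile:

```shell
# Enter the Docker dev container (deps/_build are mounted per docker-compose.yml)
make console

# Inside the container: fetch dependencies, then boot the API on port 4004
mix deps.get
iex -S mix

# In another shell inside the container: run tests and linters before pushing
make unit-test              # or: mix test
mix format --check-formatted
mix credo --strict
```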

## Coding Style & Naming Conventions
Stick to idiomatic Elixir: two-space indentation, pipeline-friendly code, and one module per file (e.g. `lib/pipelines_api/web/router.ex` for `PipelinesAPI.Web.Router`). Run `mix format` before opening a PR; `.formatter.exs` defines the inputs. Let Credo guide readability: fix warnings rather than ignoring them. Prefer descriptive atoms, snake_case for functions, PascalCase for modules, and add concise `@moduledoc` notes to complex modules.

## Testing Guidelines
The project uses ExUnit with helpers in `test/support`. Place new suites under `test/`, mirroring the source path, and name files `_test.exs`. Run `mix test --cover` when touching request or authorization logic to watch coverage. For gRPC integrations, lean on `GrpcMock` rather than real services. Keep tests deterministic by stubbing network calls and timestamps with `Faker` or recorded fixtures.
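A hypothetical ExUnit sketch of that pattern follows. The module, RPC name, and response shape are invented for illustration; the `GrpcMock.expect/3` and `GrpcMock.verify!/1` calls assume the grpc_mock API that `test/support` appears to use:

```elixir
defmodule PipelinesAPI.Pipelines.DescribeTest do
  use ExUnit.Case

  test "describe maps a stubbed gRPC reply to an :ok tuple" do
    # PipelineMock is assumed to be registered by Support.FakeServices;
    # :describe and the empty response struct are illustrative only.
    GrpcMock.expect(PipelineMock, :describe, fn _request, _stream ->
      %InternalApi.Plumber.DescribeResponse{}
    end)

    # Hypothetical client call; real clients live under lib/pipelines_api/*_client
    assert {:ok, _payload} = PipelinesAPI.PipelinesClient.describe("ppl-id", %{})

    GrpcMock.verify!(PipelineMock)
  end
end
```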

## Commit & Pull Request Guidelines
Follow the Conventional Commits pattern seen in history (`type(scope): summary`), using scopes such as `secrethub` or `docs` to clarify impact. Keep commits focused and include test or lint updates when relevant. Pull requests must describe the scenario, list impacted gRPC endpoints, and note any Helm or config follow-up. Link tracking issues, call out breaking changes, and include the latest `mix test` or `mix credo` output before requesting review.

## Security & Configuration Tips
Environment defaults live in the Makefile and `docker-compose.yml`; override gRPC endpoints via variables like `LOGHUB_API_URL` or `API_VERSION`. Never hardcode secrets; use the provided env vars or the helpers in `lib/pipelines_api/secrets`. When updating protobuf definitions, ensure regenerated clients do not expose new fields without matching authorization checks in the HTTP layer.
44 changes: 44 additions & 0 deletions public-api/v1alpha/DOCUMENTATION.md
@@ -0,0 +1,44 @@
# Repository Architecture Notes

## Service Purpose & Entry Points
Pipelines API is the public HTTP facade for Semaphore's Pipelines domain. It receives REST requests, validates them, and forwards calls to internal gRPC services such as Pipelines, Workflows, Deployments, RBAC, and SecretHub. The OTP entry point is `lib/pipelines_api/application.ex`, which starts a `Plug.Cowboy` endpoint on port 4004, initialises feature providers, and warms two `Cachex` caches (`:feature_provider_cache`, `:project_api_cache`).

## Request Flow & Runtime Stack
- HTTP traffic lands in `lib/pipelines_api/router.ex`, a `Plug.Router` that wires every route to a domain-specific module (e.g., `PipelinesAPI.Pipelines.Describe`).
- Domain modules follow a consistent pattern: parse params, call into a `*_client` module, and reply via helpers in `PipelinesAPI.Pipelines.Common` or a domain-specific responder. Success and error tuples map to HTTP status automatically.
- Authentication/authorization gates live in the RBAC and Feature clients; there are no custom Plug modules today (`lib/pipelines_api/plugs/` is intentionally empty).
- Long-running gRPC calls run inside `Wormhole.capture` to enforce timeouts defined in `config/config.exs`. Metrics flow through `PipelinesAPI.Util.Metrics` into Watchman/StatsD (`watchman` config block).
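The handler pattern in those bullets can be sketched as follows. The module names and the responder helper are hypothetical; only the parse, client call, and common-responder shape is taken from the description above:

```elixir
defmodule PipelinesAPI.Pipelines.Describe do
  # Illustrative shape only: delegate to a *_client module and let a
  # common helper translate {:ok, _}/{:error, _} tuples into HTTP replies.
  alias PipelinesAPI.Pipelines.Common
  alias PipelinesAPI.PipelinesClient

  def describe(conn, params) do
    params
    |> PipelinesClient.describe()
    # Assumed contract: {:ok, payload} | {:error, {:user | :internal, msg}}
    |> Common.respond(conn)
  end
end
```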

## Directory Map
- `lib/pipelines_api/pipelines`, `workflows`, `deployments`, `schedules`, `self_hosted_agent_types`, etc.: HTTP handlers grouped by resource. Files mirror verbs (`list.ex`, `describe.ex`, `terminate.ex`).
- `lib/pipelines_api/*_client/`: Each client encapsulates gRPC glue with submodules for request formatting, the actual gRPC stub wrapper, and response shaping. Single-file clients (e.g., `jobs_client.ex`) follow the same tuple contract.
- `lib/internal_api/`: Generated gRPC stubs. Regenerate via `make pb.gen`, which clones `renderedtext/internal_api` and runs `scripts/internal_protos.sh`.
- `config/`: Compile-time (`config.exs`) and runtime (`runtime.exs`) settings. `config/test.exs` shortens gRPC and Wormhole timeouts; `config/dev.exs` points metrics to the local StatsD agent.
- `test/support/`: Shared stubs and fake services. `Support.FakeServices` boots `GrpcMock` servers on port 50052 so unit tests never hit production systems.
- `scripts/`: `internal_protos.sh` for protobuf regeneration and `vagrant_sudo` helper for privileged Docker commands.

## External Integrations
- Environment variables in `Makefile` and `docker-compose.yml` provide endpoints (`PPL_GRPC_URL`, `LOGHUB_API_URL`, `FEATURE_GRPC_URL`, etc.). Override them when connecting to non-default clusters.
- Feature flags are served by a remote Feature Hub unless `ON_PREM=true`, in which case a YAML provider is loaded with `FEATURE_YAML_PATH`.
- Response pagination goes through `Scrivener.Headers`, which rewrites paths to `/api/<API_VERSION>/…` before responses leave the service.
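Feature-flag lookups follow the provider pattern mentioned above. This is a sketch; the `FeatureProvider.feature_enabled?/2` call, the flag name, and the option keys are assumptions rather than the verified library contract:

```elixir
# Resolve whichever provider the app was configured with
# (remote Feature Hub, or the YAML provider when ON_PREM=true).
provider = Application.get_env(:pipelines_api, :feature_provider)

# Hypothetical flag check; "wf_editor" and the opts are illustrative.
if FeatureProvider.feature_enabled?("wf_editor", param: org_id, provider: provider) do
  new_behaviour()
else
  default_behaviour()
end
```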

## Development & Diagnostics Workflow
- `make console` launches the Docker dev container with dependencies and shares `_build`/`deps` for fast recompiles.
- From inside the container: start the API via `iex -S mix`. Health probes hit `/` or `/health_check/ping`.
- Format and lint with `mix format` and `mix credo --strict`. Optional type checks come from `mix dialyzer` once PLTs are cached.
- Run suites with `make unit-test` or `mix test --cover`. The custom ExUnit formatter writes JUnit XML reports under `./out/test-reports.xml`.
- To inspect gRPC traffic locally, tail logs produced by `PipelinesAPI.Util.Log` or enable DEBUG by exporting `LOG_LEVEL=debug` before boot.

## Testing Infrastructure
- Tests rely heavily on support factories (`test/support/stubs.ex`, `test/support/factories.ex`) to fabricate workflows, pipelines, and users.
- `GrpcMock` doubles are registered for every dependency (SecretHub, Gofer, Pipeline, ProjectHub, etc.) in `Support.FakeServices.init/0`.
- When adding new gRPC calls, extend the relevant mock to cover the new RPC and update helper stubs so fixtures stay meaningful.

## Common Change Playbooks
1. **Add or adjust an endpoint**: Update `router.ex`, create/modify the domain module under `lib/pipelines_api/<resource>/`, and ensure responses return `{status, payload}` tuples through the common helper.
2. **Add a gRPC call**: Touch the appropriate `*_client` directory: update the RequestFormatter, extend the `GrpcClient` wrapper, and cover ResponseFormatter cases. Regenerate protobuf stubs if the contract changed.
3. **Introduce feature-flagged behaviour**: Depend on the provider returned from `Application.get_env(:pipelines_api, :feature_provider)`, and store expensive lookups in Cachex to match existing patterns.
4. **Regenerate protobufs**: Run `make pb.gen`, commit resulting changes under `lib/internal_api`, and verify no manual edits are lost.
5. **Triage production incidents**: Use `/logs/:job_id` for streaming logs, `/troubleshoot/*` endpoints for aggregated context, inspect Watchman metrics for latency spikes, and confirm caches or feature toggles aren't stale.
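For playbook 1, the route wiring in `router.ex` typically looks like the sketch below. The path and handler module are invented for illustration; the `get` macro and `conn` binding come from `Plug.Router`:

```elixir
# Hypothetical addition to lib/pipelines_api/router.ex
get "/pipelines/:pipeline_id" do
  # Plug.Router makes conn and conn.params available inside the macro;
  # the handler returns the conn with status and body already set.
  PipelinesAPI.Pipelines.Describe.describe(conn, conn.params)
end
```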

Keep this document nearby when picking up new tasks; most flows follow the patterns above, so identifying the right directory or client is usually the quickest path to a fix.
2 changes: 1 addition & 1 deletion public-api/v1alpha/Dockerfile
@@ -12,7 +12,7 @@ ENV MIX_ENV=$BUILD_ENV

RUN echo "Build for $MIX_ENV environment started"

RUN apk update && apk add --no-cache build-base git python3 curl openssh
RUN apk update && apk add --no-cache build-base git python3 curl openssh bash

RUN mkdir -p ~/.ssh
RUN touch ~/.ssh/known_hosts
4 changes: 3 additions & 1 deletion public-api/v1alpha/docker-compose.yml
@@ -45,4 +45,6 @@ services:
ON_PREM: ${ON_PREM:-false}

volumes:
- .:/app
- .:/app
- /app/_build
- /app/deps
@@ -31,7 +31,16 @@ defmodule PipelinesAPI.GoferClient.RequestFormatter do

defp to_int(val, _field) when is_integer(val), do: val

defp to_int(val, field) do
defp to_int(val, field) when is_binary(val) do
case Integer.parse(val) do
{int, ""} -> int
_ -> invalid_integer(field, val)
end
end

defp to_int(val, field), do: invalid_integer(field, val)

defp invalid_integer(field, val) do
"Invalid value of '#{field}' param: #{inspect(val)} - needs to be integer."
|> ToTuple.user_error()
|> throw()
@@ -118,7 +118,16 @@ defmodule PipelinesAPI.PeriodicSchedulerClient.RequestFormatter do

defp to_int(val, _field) when is_integer(val), do: val

defp to_int(val, field) do
defp to_int(val, field) when is_binary(val) do
case Integer.parse(val) do
{int, ""} -> int
_ -> invalid_integer(field, val)
end
end

defp to_int(val, field), do: invalid_integer(field, val)

defp invalid_integer(field, val) do
"Invalid value of '#{field}' param: #{inspect(val)} - needs to be integer."
|> ToTuple.user_error()
|> throw()
@@ -5,7 +5,8 @@ defmodule PipelinesAPI.GoferClient.RequestFormatter.Test do

alias InternalApi.Gofer.{
TriggerRequest,
EnvVariable
EnvVariable,
ListTriggerEventsRequest
}

test "form_trigger_request() returns internal error when it is not called with map as a param" do
@@ -52,4 +53,27 @@
assert {:error, {:user, msg}} = RequestFormatter.form_trigger_request(params)
assert msg == "Invalid value of 'override' param: \"not-a-bool-val\" - needs to be boolean."
end

test "form_list_request() converts numeric params received as strings" do
params = %{
"switch_id" => "sw1",
"name" => "tg1",
"page" => "2",
"page_size" => "15"
}

assert {:ok,
%ListTriggerEventsRequest{
switch_id: "sw1",
target_name: "tg1",
page: 2,
page_size: 15
}} = RequestFormatter.form_list_request(params)
end

test "form_list_request() returns user error when numeric params are invalid" do
assert {:error, {:user, msg}} = RequestFormatter.form_list_request(%{"page_size" => "ten"})

assert msg == "Invalid value of 'page_size' param: \"ten\" - needs to be integer."
end
end
@@ -119,6 +119,19 @@ defmodule PipelinesAPI.PeriodicSchedulerClient.RequestFormatter.Test do
assert request.requester_id == ""
end

test "form_list_request() converts pagination params provided as strings" do
conn = create_conn(:describe)

params = %{
"project_id" => UUID.uuid4(),
"page" => "3",
"page_size" => "25"
}

assert {:ok, %ListRequest{page: 3, page_size: 25}} =
RequestFormatter.form_list_request(params, conn)
end

# Run Now

test "form_run_now_request() returns internal error when it is not called with map as a param" do