feat: support AIP filtering #4117
Conversation

> Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the repository settings.
📝 Walkthrough

Adds a typed deepObject filter system: TypeSpec and OpenAPI updates, a query parser (`filters.Parse`), API→internal conversions, new API/internal filter types and validation, template/middleware wiring to call the parser, handler/adapters updated to use converted Ent predicates, extensive tests, and supporting docs and Makefile changes.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant Chi as Chi middleware
    participant Handler as HTTP Handler
    participant Parser as filters.Parse
    participant Converter as filters.FromAPIFilter*
    participant Validator as pkg/filter.Validate
    participant Ent as Ent ORM
    participant DB as Database
    Client->>Chi: GET /resources?filter[field][contains]=foo
    Chi->>Handler: pass request with params
    Handler->>Parser: filters.Parse(r.URL.Query(), &params.Filter)
    Parser-->>Handler: populated params.Filter (typed)
    Handler->>Converter: FromAPIFilterString(&params.Filter.Field)
    Converter-->>Handler: *pkg/filter.FilterString (or nil, error)
    Handler->>Validator: Validate/ValidateWithComplexity(filter)
    Validator-->>Handler: ok / validation error
    Handler->>Ent: q.Where(filter.Select("field")...)
    Ent->>DB: SQL query with WHERE (ILIKE / predicates)
    DB-->>Ent: rows
    Ent-->>Handler: results
    Handler-->>Client: 200 OK / error
```
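The wire format in the diagram uses deepObject-style query keys. As a rough standalone illustration of how such keys decompose (this is a hypothetical sketch, not the project's actual `filters.Parse` implementation):

```go
package main

import (
	"fmt"
	"regexp"
)

// deepObjectKey matches keys like "filter[name]" or "filter[name][contains]".
// Standalone illustration of the wire format only.
var deepObjectKey = regexp.MustCompile(`^filter\[([^\]]+)\](?:\[([^\]]+)\])?$`)

// splitFilterKey returns the field and operator of a deepObject filter key.
// The bare "filter[field]" form is treated as the shorthand "eq" operator.
func splitFilterKey(key string) (field, op string, ok bool) {
	m := deepObjectKey.FindStringSubmatch(key)
	if m == nil {
		return "", "", false
	}
	field, op = m[1], m[2]
	if op == "" {
		op = "eq" // shorthand form
	}
	return field, op, true
}

func main() {
	for _, k := range []string{"filter[name][contains]", "filter[key]", "page[size]"} {
		f, o, ok := splitFilterKey(k)
		fmt.Println(k, "->", f, o, ok)
	}
}
```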
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)

✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Actionable comments posted: 3
🧹 Nitpick comments (8)
api/spec/packages/aip/src/llmcost/operations.tsp (1)
15-25: LGTM! Good use of `x-go-type` and `x-go-type-import` extensions to bind the TypeSpec model to the Go implementation. This ensures the generated code uses the actual filter type from `api/v3/filters`.

Note: The TODO comment on line 17 mentions this is temporary; it might be worth creating a follow-up issue to track the permanent solution.
Would you like me to open an issue to track resolving the TODO for the temporary filter API solution?
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@api/spec/packages/aip/src/llmcost/operations.tsp` around lines 15 - 25, Add a follow-up issue to track replacing the temporary filter API solution noted in the TODO: document the work to remove or replace the temporary FilterSingleString model (model name: FilterSingleString) and to implement the permanent filtering approach that uses the real filter types in api/v3/filters (extensions: x-go-type and x-go-type-import). In the issue, include the file and TODO context, desired end-state (remove temporary model, wire TypeSpec to the permanent Go filter types), acceptance criteria, and any migration steps for generated code so someone can pick it up later.

api/v3/filters/filter.go (1)
110-125: Consider allowing combined range operators for numeric filters.

The mutual exclusivity validation prevents combining `gt`/`gte` with `lt`/`lte`, which would be useful for range queries like "between 10 and 20". Is this intentional? If users need ranges, they'd have to make two separate requests or use a different approach.

This might be fine if the API design intentionally avoids complex filter combinations, just wanted to flag it.
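A minimal sketch of what range-friendly validation could look like; the field set and error wording here are assumptions for illustration, not the project's exact `FilterNumeric` type:

```go
package main

import (
	"errors"
	"fmt"
)

// FilterNumeric is a stand-in mirroring the shape discussed in the comment.
type FilterNumeric struct {
	Eq, Gt, Gte, Lt, Lte *float64
}

// Validate lets gt/gte combine with lt/lte for "between" queries, keeps eq
// exclusive, and rejects contradictory bounds.
func (f FilterNumeric) Validate() error {
	lower := f.Gt != nil || f.Gte != nil
	upper := f.Lt != nil || f.Lte != nil
	if f.Eq != nil && (lower || upper) {
		return errors.New("eq cannot be combined with range operators")
	}
	if f.Gt != nil && f.Gte != nil {
		return errors.New("gt and gte are mutually exclusive")
	}
	if f.Lt != nil && f.Lte != nil {
		return errors.New("lt and lte are mutually exclusive")
	}
	low, high := f.Gt, f.Lt
	if low == nil {
		low = f.Gte
	}
	if high == nil {
		high = f.Lte
	}
	if low != nil && high != nil && *low > *high {
		return fmt.Errorf("contradictory range: %v > %v", *low, *high)
	}
	return nil
}

func ptr(v float64) *float64 { return &v }

func main() {
	fmt.Println(FilterNumeric{Gte: ptr(10), Lte: ptr(20)}.Validate()) // valid "between" range
	fmt.Println(FilterNumeric{Eq: ptr(5), Lt: ptr(9)}.Validate())    // rejected combination
}
```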
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@api/v3/filters/filter.go` around lines 110 - 125, FilterNumeric.Validate currently calls validateMutuallyExclusiveFilters with all numeric operator flags which disallows combining high-side (Gt/Gte) with low-side (Lt/Lte) operators; to support range queries, update the validation so Gt/Gte may be combined with Lt/Lte while keeping other exclusivity rules. Specifically, modify FilterNumeric.Validate (or the call to validateMutuallyExclusiveFilters) to exclude the mutual-exclusion check between the group (f.Gt != nil || f.Gte != nil) and the group (f.Lt != nil || f.Lte != nil), and instead validate that at most one of Eq/Neq/Oeq is used with any range or single-value operators and that range endpoints are not contradictory when both sides are present.

pkg/filter/filter.go (2)
808-823: Consider adding ESCAPE clause for LIKE predicates.

The `fieldLike` function generates LIKE/ILIKE predicates but doesn't include an `ESCAPE '\'` clause. The `EscapeLikePattern` function uses backslash as the escape character. On Postgres this typically works by default, but explicitly specifying the escape character would make the behavior more portable and explicit.

💡 Suggested improvement
```diff
 func fieldLike(field, value string, negate, insensitive bool) func(*sql.Selector) {
 	return func(s *sql.Selector) {
 		s.Where(sql.P(func(b *sql.Builder) {
 			b.WriteString(s.C(field))
 			if negate {
 				b.WriteString(" NOT")
 			}
 			if insensitive {
 				b.WriteString(" ILIKE ")
 			} else {
 				b.WriteString(" LIKE ")
 			}
 			b.Arg(value)
+			b.WriteString(" ESCAPE '\\'")
 		}))
 	}
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@pkg/filter/filter.go` around lines 808 - 823, The LIKE/ILIKE predicates in function fieldLike do not explicitly include the ESCAPE clause used by EscapeLikePattern; update fieldLike to append " ESCAPE '\\'" (with a single backslash as the escape character) to the constructed predicate so the SQL becomes "... LIKE ? ESCAPE '\\'" or "... ILIKE ? ESCAPE '\\'" and preserve the existing negate handling; modify the code building logic in fieldLike to write the escape clause after choosing LIKE vs ILIKE and before adding the argument.
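For context on why the ESCAPE clause matters: the escaping helper has to neutralize both the LIKE wildcards and the escape character itself so user input matches literally. A standalone sketch, where `escapeLikePattern` is an illustrative stand-in for the project's `EscapeLikePattern`:

```go
package main

import (
	"fmt"
	"strings"
)

// escapeLikePattern escapes the backslash escape character and the LIKE
// wildcards '%' and '_' so a user-supplied value matches literally under
// "... LIKE ? ESCAPE '\'".
func escapeLikePattern(s string) string {
	r := strings.NewReplacer(`\`, `\\`, `%`, `\%`, `_`, `\_`)
	return r.Replace(s)
}

func main() {
	// "50%_off" should match only the literal string, not act as wildcards.
	pattern := "%" + escapeLikePattern("50%_off") + "%"
	fmt.Println(pattern)
}
```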
238-241: Consider using `NotPredicates(sql.FieldContainsFold(...))` for Ncontains.

Right now `Contains` uses the Ent helper `sql.FieldContainsFold`, but `Ncontains` rolls its own thing with `fieldLike`. Since Ent doesn't have a `FieldNotContainsFold`, the cleaner approach is to wrap the contains check with `NotPredicates(sql.FieldContainsFold(...))`. This keeps both methods aligned and leaning on the same Ent helper.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@pkg/filter/filter.go` around lines 238 - 241, Replace the custom negative-like branch for f.Ncontains that calls fieldLike with a NotPredicates wrapping the same Ent helper used by f.Contains: use NotPredicates(sql.FieldContainsFold(field, *f.Ncontains)) so Ncontains and Contains rely on the same FieldContainsFold logic; remove the fieldLike/ContainsPattern usage for the Ncontains case and ensure the pointer value is passed into FieldContainsFold to preserve the same fold/case-insensitive behavior.

api/v3/filters/parsefilters_test.go (1)
41-107: Consider adding error case tests.

The tests cover happy paths well, but `ParseFilters` has error returns for invalid targets (non-pointer, nil pointer). Adding tests for these would improve coverage and document the expected error behavior.

💡 Example test cases to add
```go
func TestParseFilters_Errors(t *testing.T) {
	type testFilter struct {
		Key *FilterString `json:"key,omitempty"`
	}

	t.Run("non-pointer target returns error", func(t *testing.T) {
		qs := url.Values{"filter[key]": {"val"}}
		var f testFilter
		err := ParseFilters(qs, f) // not &f
		require.Error(t, err)
		assert.Contains(t, err.Error(), "non-nil pointer")
	})

	t.Run("nil pointer returns error", func(t *testing.T) {
		qs := url.Values{"filter[key]": {"val"}}
		var f *testFilter
		err := ParseFilters(qs, f) // nil, not &f
		require.Error(t, err)
	})
}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@api/v3/filters/parsefilters_test.go` around lines 41 - 107, Add tests that assert ParseFilters returns errors for invalid targets: create a new TestParseFilters_Errors with two subtests that call ParseFilters(qs, f) where f is (1) a non-pointer value (e.g., var f testFilter) and (2) a nil pointer (e.g., var f *testFilter), and require an error in both cases; reference the ParseFilters function and name the new test TestParseFilters_Errors so it lives alongside TestParseFilters_PointerToPointer and checks the expected error messages/conditions.

api/spec/packages/aip/src/common/parameters.tsp (1)
174-194: `LabelsFieldFilter` duplicates `StringFieldFilter` structure.

These two union types have identical operator sets. Consider if one could extend or alias the other to reduce duplication. Though I understand if they're kept separate for semantic clarity or future divergence.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@api/spec/packages/aip/src/common/parameters.tsp` around lines 174 - 194, LabelsFieldFilter duplicates the exact operator set of StringFieldFilter; replace the duplicate by making LabelsFieldFilter an alias or extension of StringFieldFilter (e.g., define LabelsFieldFilter = StringFieldFilter or extend the union) so you remove the repeated union body, and update any usage sites that reference LabelsFieldFilter to continue to work with the aliased/extended type (ensure comments/annotations from the original LabelsFieldFilter are preserved on the alias for clarity).

api/v3/filters/parsefilters.go (1)
188-199: Order-dependent behavior with mixed shorthand and operator forms.

If a request has both `filter[field]=value` and `filter[field][op]=other`, the shorthand wins due to the early return on line 198. Since Go's map iteration is non-deterministic, the result could vary depending on which key is processed first.

This is probably fine since mixing forms is likely a client error, but you might want to either:
- Document this behavior, or
- Detect and reject conflicting forms explicitly
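The second option could be sketched like this; `mergeShorthand` is a hypothetical helper working on a plain map, not code from the PR:

```go
package main

import "fmt"

// mergeShorthand folds the shorthand form filter[field]=v into the operator
// map as an explicit "eq", and rejects requests that mix both forms for the
// same field instead of letting map iteration order pick a winner.
func mergeShorthand(ops map[string]string, shorthand string) (map[string]string, error) {
	if ops == nil {
		ops = map[string]string{}
	}
	if len(ops) > 0 {
		return nil, fmt.Errorf("conflicting filter forms: shorthand and operator form for the same field")
	}
	ops["eq"] = shorthand
	return ops, nil
}

func main() {
	ops, err := mergeShorthand(nil, "my-key")
	fmt.Println(ops, err)
	_, err = mergeShorthand(map[string]string{"contains": "foo"}, "my-key")
	fmt.Println(err)
}
```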
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@api/v3/filters/parsefilters.go` around lines 188 - 199, Shorthand form currently returns early when rest=="" which causes nondeterministic behavior if the same field also appears with operator form; change this to detect conflicts instead of returning immediately: when rest=="" set the equality into the constructed object (e.g., obj["eq"]=value) and if obj is non-nil and already contains other operator keys, return an explicit error indicating conflicting filter forms for the same field (or choose to prefer one form consistently), using the same variables/rest checks around rest, value and obj so you avoid the nondeterministic early return.

api/spec/Makefile (1)
8-15: Nice post-processing approach for consolidating filter schemas!

This cleanly replaces inline filter definitions with `$ref` entries pointing to the shared definitions file. The defensive `yq -e` check before modification is a good touch.

A couple of small observations:

- `SortQuery` is hardcoded while `FieldFilter` schemas are discovered dynamically. If more non-`FieldFilter` schemas need this treatment later, consider extracting the static list or using a more general pattern.
- If the file path or referenced schema doesn't exist, the loop silently continues. This is probably fine for a build tool, but if you want early failure detection, you could add a simple existence check:
💡 Optional: Add file existence check

```diff
 # Replace inline filter definitions with $ref to common/definitions/aip_filters.yaml
 AIP_REF="../../../../common/definitions/aip_filters.yaml#/components/schemas"; \
 FILE="packages/aip/output/definitions/metering-and-billing/v3/openapi.MeteringAndBilling.yaml"; \
+if [ ! -f "$$FILE" ]; then echo "Error: $$FILE not found"; exit 1; fi; \
 for schema in SortQuery $$(yq '.components.schemas | keys | .[]' "$$FILE" | grep 'FieldFilter'); do \
```
Verify each finding against the current code and only fix it if needed. In `@api/spec/Makefile` around lines 8 - 15, The Makefile loop currently hardcodes SortQuery while dynamically discovering FieldFilter schemas; update the logic around the variables AIP_REF and FILE so non-FieldFilter schemas can be handled either by extracting a static list (e.g., add a variable like STATIC_SCHEMAS="SortQuery OtherSchema") or by broadening the discovery pattern to include those names, and add an existence check for the target FILE and the referenced AIP_REF path before the for-loop so the task fails fast if the YAML file or referenced schema file is missing; ensure the script still uses the defensive yq -e check when setting .components.schemas[strenv(SCHEMA)] = {"$ref": strenv(REF_VAL)} so missing schemas are reported and the build exits with a non-zero status on error.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@api/v3/filters/convert.go`:
- Around line 136-143: The current mapping of f.Eq to out.Gte changes semantics;
instead approximate equality by creating a closed-open interval: set out.Gte =
&t and set out.Lt = &t2 where t2 := t.Add(time.Nanosecond) (or an appropriate
epsilon), ensuring the internal FilterTime has an Lt field to accept this bound;
update the convert logic (the block handling f.Eq in convert.go) to produce both
Gte and Lt rather than only Gte so an exact datetime is approximated correctly.
- Around line 131-178: ConvertFilterDateTime currently ignores
FilterDateTime.Neq and FilterDateTime.Oeq causing silent drops; update
ConvertFilterDateTime to explicitly handle these cases by checking if f.Neq or
f.Oeq are non-nil and returning a clear error (e.g. fmt.Errorf("filter neq:
unsupported for datetime") / fmt.Errorf("filter oeq: unsupported for datetime"))
so callers get notified, placing these checks in ConvertFilterDateTime alongside
the existing Eq/Gt/Gte/Lt/Lte parsing logic.
In `@api/v3/handlers/customers/list.go`:
- Around line 71-79: The handler currently only maps params.Filter.Key.Contains
and .Eq into filterKey (variable filterKey) and silently ignores any other
FilterString operators; update the customers list handler to validate
unsupported operators on params.Filter.Key: inspect all operator fields on
FilterString (e.g., Neq, Oeq, Ocontains, Gt, Gte, Lt, Lte, Exists) and if any
are non-nil return a 400 validation error (with a clear message listing
unsupported operators) instead of silently dropping them, or alternatively
update the mapping logic to handle the additional operators in the service layer
(per the TODO) by passing a richer FilterString object rather than reducing to
*string; reference params.Filter.Key and filterKey to locate where to add this
validation or refactor.
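A rough shape for that validation; the `FilterString` fields here are a reduced stand-in for the generated API type, and only `Eq`/`Contains` are treated as supported, as in the handler:

```go
package main

import (
	"fmt"
	"strings"
)

// FilterString is a reduced stand-in for the API filter type.
type FilterString struct {
	Eq, Neq, Contains, Gt, Lt *string
	Exists                    *bool
}

// unsupportedOps returns the names of operators the handler cannot map, so it
// can reject the request with a 400 instead of silently dropping them.
func unsupportedOps(f FilterString) []string {
	var ops []string
	if f.Neq != nil {
		ops = append(ops, "neq")
	}
	if f.Gt != nil {
		ops = append(ops, "gt")
	}
	if f.Lt != nil {
		ops = append(ops, "lt")
	}
	if f.Exists != nil {
		ops = append(ops, "exists")
	}
	return ops
}

func main() {
	v := "x"
	b := true
	f := FilterString{Contains: &v, Exists: &b}
	if ops := unsupportedOps(f); len(ops) > 0 {
		fmt.Printf("400: unsupported filter operators: %s\n", strings.Join(ops, ", "))
	}
}
```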
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 2f43b78d-b44a-49ec-922d-90370750ce30
⛔ Files ignored due to path filters (1)
`api/v3/openapi.yaml` is excluded by `!**/openapi.yaml`
📒 Files selected for processing (19)
`.gitignore`, `Makefile`, `api/spec/Makefile`, `api/spec/packages/aip/src/common/parameters.tsp`, `api/spec/packages/aip/src/customers/operations.tsp`, `api/spec/packages/aip/src/llmcost/operations.tsp`, `api/v3/api.gen.go`, `api/v3/codegen.yaml`, `api/v3/filters/convert.go`, `api/v3/filters/convert_test.go`, `api/v3/filters/filter.go`, `api/v3/filters/parsefilters.go`, `api/v3/filters/parsefilters_test.go`, `api/v3/filters/queryparse.go`, `api/v3/filters/queryparse_test.go`, `api/v3/handlers/customers/list.go`, `api/v3/templates/chi-middleware.tmpl.patch`, `pkg/filter/filter.go`, `pkg/filter/filter_test.go`
Actionable comments posted: 3
🧹 Nitpick comments (1)
api/v3/filters/convert.go (1)
109-114: `oeq` on numeric filters currently explodes into OR chains.

This works, but it turns one list filter into `col = ? OR col = ? ...` instead of a single `IN` predicate. Since this path feeds DB queries, I'd consider adding numeric `In` support in `pkg/filter` and mapping `oeq` there, the same way string `oeq` already does.

As per coding guidelines, "Performance should be a priority in critical code paths. Anything related to event ingestion, message processing, database operations (regardless of database) should be vetted for potential performance bottlenecks."
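To illustrate the difference: a single `IN` clause needs one placeholder per value rather than repeated equality predicates. This is a plain-SQL sketch; the real change would presumably go through Ent's helpers rather than hand-built strings:

```go
package main

import (
	"fmt"
	"strings"
)

// inPredicate renders one IN clause with a placeholder per value, which is
// what numeric "In" support in pkg/filter would emit instead of an OR chain
// of equality predicates.
func inPredicate(col string, n int) string {
	ph := strings.TrimSuffix(strings.Repeat("?, ", n), ", ")
	return col + " IN (" + ph + ")"
}

func main() {
	fmt.Println(inPredicate("amount", 3))
	fmt.Println("vs OR chain: amount = ? OR amount = ? OR amount = ?")
}
```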
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@api/v3/filters/convert.go` around lines 109 - 114, The numeric path in convert.go currently expands f.Oeq into an OR chain by creating filter.FilterFloat entries and assigning them to out.Or; instead add numeric In support in pkg/filter (e.g., add an In []float64 field to filter.FilterFloat), then map f.Oeq to that In slice (mirroring how string oeq is handled) instead of producing multiple Eq entries and setting out.Or; update the convert logic (the block referencing f.Oeq, filter.FilterFloat, and out.Or) to populate the new In field and adjust any downstream consumers to honor filter.FilterFloat.In for SQL generation so queries use a single IN predicate.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@api/spec/packages/aip/src/common/parameters.tsp`:
- Around line 79-99: The TypeSpec union StringFieldFilter is missing operators
the server already accepts (gt, gte, lt, lte, exists, nexists); update the
public schema to match the implementation by adding optional fields gt?: string,
gte?: string, lt?: string, lte?: string, exists?: boolean, nexists?: boolean to
the object branch of StringFieldFilter (and repeat the same additions in the
other StringFieldFilter occurrence around lines 170-190) so the declared API
matches api/v3/filters/parse.go and api/v3/filters/convert.go; alternatively, if
you prefer to reject those operators, add validation early in the filter parsing
code to return an error for gt/gte/lt/lte/exists/nexists so the server behavior
matches the TypeSpec, but do not leave the discrepancy.
In `@api/v3/filters/parse_test.go`:
- Around line 111-318: Add unit tests for the newly introduced numeric and
boolean parsing paths: create tests exercising ParseFilterNumeric and
ParseFilterBoolean (happy paths: eq/gt/lt for numeric, eq/exists for boolean)
and negative cases (invalid number format, unsupported operators), and add at
least one higher-level test that exercises the generic Parse(...) wiring when
the field type is numeric or boolean to ensure the correct parser is chosen;
reference existing test patterns (TestParseFilterString*,
TestParseFilterDateTime*, TestParseFilterStringSimple) for structure, use
url.Values with keys like "filter[age][eq]" / "filter[active][eq]" and assert
expected FilterNumeric / FilterBoolean (and error messages contain "unsupported
operator" or parse failure) to cover both valid and invalid inputs.
In `@api/v3/filters/parse.go`:
- Around line 25-41: The Parse entrypoint currently ignores unknown or unmatched
filter[...] keys causing silent failures; update Parse and parseFiltersValue to
detect leftover/unmapped filter keys and return an error instead of silently
dropping them: after parsing all struct fields in parseFiltersValue (and when
handling the **Struct path in Parse), collect any qs keys that did not
correspond to a handled field (use hasFilterKeys to detect presence then track
consumed keys), and if any filter[...] keys remain unconsumed return a
descriptive error (e.g., invalid or unknown filter key) so callers can respond
with a 400; ensure the detection works for both pointer-to-struct and
pointer-to-pointer paths by returning the error from Parse when
parseFiltersValue reports leftover keys.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 36e6cbca-8078-49ce-92f4-1b9ba590da3d
📒 Files selected for processing (9)
`api/spec/packages/aip/src/common/parameters.tsp`, `api/v3/api.gen.go`, `api/v3/filters/convert.go`, `api/v3/filters/convert_test.go`, `api/v3/filters/filter.go`, `api/v3/filters/parse.go`, `api/v3/filters/parse_test.go`, `api/v3/templates/chi-middleware.tmpl.patch`, `pkg/filter/filter.go`
✅ Files skipped from review due to trivial changes (1)
- api/v3/filters/convert_test.go
🚧 Files skipped from review as they are similar to previous changes (1)
- api/v3/templates/chi-middleware.tmpl.patch
Actionable comments posted: 8
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
pkg/filter/filter_test.go (1)
33-176: ⚠️ Potential issue | 🟡 Minor

Please add `Validate()` coverage for the new string operators.

This suite exercises `Nlike`, `Nilike`, `Contains`, and `Ncontains` in SQL generation, but it never checks that `FilterString.Validate()` accepts them on their own or rejects them when combined with another operator. Since `ListCustomersInput.Validate()` now gates requests on that method, a one-of regression here would still slip through.

As per coding guidelines, `**/*_test.go`: Make sure the tests are comprehensive and cover the changes.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@pkg/filter/filter_test.go` around lines 33 - 176, Add Validate() test coverage for the new string operators by updating TestFilterString_Validate to include cases where filter.FilterString has only Nlike, Nilike, Contains, and Ncontains (wantErr: false) and cases where each of those is combined with another operator (e.g., Eq) to assert validation fails with "only one filter can be set"; ensure you exercise the FilterString.Validate() method for each operator both alone and in combination so the new operators cannot regress.
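The one-of rule those tests would guard can be sketched as follows; this uses a reduced `FilterString` stand-in with an assumed field list, not the actual `pkg/filter` type:

```go
package main

import "fmt"

// FilterString is a reduced stand-in carrying the operator pointers discussed
// in the review, including the newer Nlike/Nilike/Contains/Ncontains.
type FilterString struct {
	Eq, Like, Nlike, Ilike, Nilike, Contains, Ncontains *string
}

// Validate enforces the one-of rule: at most one operator may be set.
func (f FilterString) Validate() error {
	set := 0
	for _, p := range []*string{f.Eq, f.Like, f.Nlike, f.Ilike, f.Nilike, f.Contains, f.Ncontains} {
		if p != nil {
			set++
		}
	}
	if set > 1 {
		return fmt.Errorf("only one filter can be set, got %d", set)
	}
	return nil
}

func main() {
	v := "x"
	fmt.Println(FilterString{Ncontains: &v}.Validate()) // single operator: valid
	fmt.Println(FilterString{Eq: &v, Contains: &v}.Validate())
}
```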
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@api/v3/api.gen.go`:
- Around line 4671-4676: The generated comment for SortQuery incorrectly claims
comma-separated multi-sort is supported; update the docstring for SortQuery to
remove or soften the statement about "Multiple sort attributes may be provided
via a comma separated list" (or explicitly note it's not yet supported for
customers list) so the generated contract matches the actual handler behavior
(see SortQuery and the customers list handler referenced by list.go), keeping
the remainder of the description about asc/desc and JSONPath intact.
- Around line 4294-4300: The generated API docs now show StringFieldFilter for
Key/Name/PrimaryEmail but still instruct clients to send filter[key]=my-db-id (a
bare value); fix by updating the generator source (not api/v3/api.gen.go):
either make the parser treat a bare filter value as an implicit eq in the
filters.Parse implementation (function Parse in package api/v3/filters) or, more
clearly, update the OpenAPI/spec/template that produces the StringFieldFilter
docs to explicitly state the implicit-eq behavior (or require JSON-style
filter[key][eq]=value). Adjust the parser (filters.Parse) to accept bare values
as eq or change the spec/template that renders StringFieldFilter docs so the
wire format instructions match the actual parsing behavior.
In `@api/v3/filters/parse.go`:
- Around line 68-74: checkUnknownFilterKeys currently accepts dotted keys like
"labels.env" by only checking the base segment, but buildFilterJSON later looks
only for the exact key (e.g., "labels") and drops the suffix; update both
checkUnknownFilterKeys and buildFilterJSON to extract the base (name[:dot]) and
suffix (name[dot+1:]) when a dot is present and treat the filter as belonging to
the base field while carrying the suffix into the resulting JSON payload.
Specifically, in checkUnknownFilterKeys use the existing base variable logic to
validate membership against knownFields and record/return the suffix for later
use; in buildFilterJSON, when iterating filter keys (the places around the shown
block and the other ranges noted: 205-210, 237-278), detect the dot, split into
base and suffix, map the clause into the base field but insert the suffix as the
nested key in the generated JSON structure (e.g., base: { suffix: <clause> } or
merging into an existing base map) so dotted filters like "labels.env" produce a
JSON payload that retains the "env" segment instead of being dropped.
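The base/suffix split described above is small enough to show standalone:

```go
package main

import (
	"fmt"
	"strings"
)

// splitDotted separates "labels.env" into the struct field ("labels") and the
// nested map key ("env") so the suffix can be carried into the JSON payload
// as {"labels": {"env": <clause>}} instead of being dropped.
func splitDotted(name string) (base, suffix string) {
	if i := strings.IndexByte(name, '.'); i >= 0 {
		return name[:i], name[i+1:]
	}
	return name, ""
}

func main() {
	fmt.Println(splitDotted("labels.env"))
	fmt.Println(splitDotted("key"))
}
```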
- Around line 465-467: The code currently treats empty filter values (when rest
== "" && value == "") as a no-op by continuing, which lets malformed
numeric/date filters like filter[amount]= or filter[createdAt]= silently pass;
change this so that when rest=="" && value=="" and the filter key maps to a
numeric or datetime field (the same type-checking used later in the parsing
logic), return a 400 validation error instead of continue. Locate the two
continue sites (the one with variables rest and value around the numeric/date
parsing block and the second occurrence near lines 546-548) and replace the
continue with early validation failure (return an appropriate
BadRequest/validation error) only for numeric/date field types, leaving other
empty-value cases unchanged.
- Around line 26-42: The Parse function currently calls parseFiltersValue on
values that may not be structs, causing panics; add explicit kind guards: when
target is a non-pointer value (the branch where v.Kind() != reflect.Pointer),
check that v.Kind() == reflect.Struct and return a validation error if not
before calling parseFiltersValue; when handling the pointer-to-struct branch,
check that v.Elem().Kind() == reflect.Struct (after the nil-pointer allocation
logic and hasFilterKeys check) and return an error if it’s not a struct; retain
use of hasFilterKeys and only allocate the inner struct when needed.
- Around line 217-226: The code fails to detect pointer-receiver UnmarshalJSON
on value-type fields because iface := target.Interface() uses a value instead of
its address; change the logic to create a pointer targetPtr (use
reflect.New(fieldVal.Type().Elem()) for pointer fields, or fieldVal.Addr() for
value fields), assert iface := targetPtr.Interface() against json.Unmarshaler,
call unmarshaler.UnmarshalJSON(data) on that pointer, and after successful
decode set the field with fieldVal.Set(targetPtr) for pointer fields or
fieldVal.Set(targetPtr.Elem()) for value fields; reference symbols: fieldVal,
target/targetPtr, json.Unmarshaler, UnmarshalJSON.
In `@openmeter/customer/customer.go`:
- Around line 307-309: The Key, Name, and PrimaryEmail filter fields (and the
other filter string fields in the same validator block) currently call
FilterString.Validate(), which allows arbitrarily deep And/Or trees to flow into
adapter.ListCustomers; change those Validate() calls to use
FilterString.ValidateWithComplexity(maxDepth) instead (e.g., a small integer
like 3 or 5) and propagate/return the complexity error if it exceeds the limit
so the request is rejected early; update all occurrences in this validator (Key,
Name, PrimaryEmail and the other FilterString fields in that same validation
block) to use ValidateWithComplexity and handle its error rather than plain
Validate, ensuring adapter.ListCustomers never receives unbounded filter depth.
In `@test/customer/customer.go`:
- Around line 552-557: The test uses ListCustomers with ListCustomersInput.Key =
&filter.FilterString{Contains: lo.ToPtr("customer-1")}, but "customer-1" is the
full stored key so it doesn't verify partial-match behavior; change the Contains
value to a true substring (shorter than the full key, e.g. a fragment like
"customer-1-" or "ustomer-1") so ListCustomers (and the FilterString Contains
path) must perform a partial/contains match rather than behaving like Eq; update
the test case that constructs ListCustomersInput/FilterString used in the test
to use that shorter substring.
---
Outside diff comments:
In `@pkg/filter/filter_test.go`:
- Around line 33-176: Add Validate() test coverage for the new string operators
by updating TestFilterString_Validate to include cases where filter.FilterString
has only Nlike, Nilike, Contains, and Ncontains (wantErr: false) and cases where
each of those is combined with another operator (e.g., Eq) to assert validation
fails with "only one filter can be set"; ensure you exercise the
FilterString.Validate() method for each operator both alone and in combination
so the new operators cannot regress.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 7da88f75-54ab-4a04-8bf3-16c88e74bddf
⛔ Files ignored due to path filters (1)
api/v3/openapi.yaml is excluded by !**/openapi.yaml
📒 Files selected for processing (12)
- Makefile
- api/spec/packages/aip/src/customers/operations.tsp
- api/v3/api.gen.go
- api/v3/filters/parse.go
- api/v3/filters/parse_test.go
- api/v3/handlers/customers/list.go
- openmeter/apiconverter/filter.go
- openmeter/customer/adapter/customer.go
- openmeter/customer/customer.go
- openmeter/customer/httpdriver/customer.go
- pkg/filter/filter_test.go
- test/customer/customer.go
✅ Files skipped from review due to trivial changes (2)
- openmeter/apiconverter/filter.go
- api/v3/filters/parse_test.go
🚧 Files skipped from review as they are similar to previous changes (3)
- Makefile
- api/v3/handlers/customers/list.go
- api/spec/packages/aip/src/customers/operations.tsp
```go
// SortQuery Sort query.
//
// The `asc` suffix is optional as the default sort order is ascending.
// The `desc` suffix is used to specify a descending order.
// Multiple sort attributes may be provided via a comma separated list.
// JSONPath notation may be used to specify a sub-attribute (eg: 'foo.bar desc').
```
At least for customers, multi-sort is still over-promised here.
Line 4675 still says comma-separated sort fields are supported. That keeps the shared generated contract ahead of the current customers list behavior, so I’d move that claim back to the source spec/template until the handler actually supports it.
Based on learnings, in the v3 customers list endpoint (api/v3/handlers/customers/list.go), multi-sort (comma-separated sort fields) is intentionally not supported yet, even though the API spec documentation mentions it.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@api/v3/api.gen.go` around lines 4671 - 4676, The generated comment for
SortQuery incorrectly claims comma-separated multi-sort is supported; update the
docstring for SortQuery to remove or soften the statement about "Multiple sort
attributes may be provided via a comma separated list" (or explicitly note it's
not yet supported for customers list) so the generated contract matches the
actual handler behavior (see SortQuery and the customers list handler referenced
by list.go), keeping the remainder of the description about asc/desc and
JSONPath intact.
```go
Key          *filter.FilterString
Name         *filter.FilterString
PrimaryEmail *filter.FilterString
```
Bound filter depth before this reaches the adapter.
Key, Name, and PrimaryEmail are now full FilterStrings, but this validator only calls Validate(). That lets arbitrarily deep And/Or trees flow straight into adapter.ListCustomers, which can blow up SQL size and DB work on a hot list path. I’d switch these to ValidateWithComplexity(...) with a small max depth here.
💡 Suggested guardrail
```diff
 func (i ListCustomersInput) Validate() error {
 	var errs []error
+	const maxFilterDepth = 3
 	if i.Namespace == "" {
 		errs = append(errs, models.NewGenericValidationError(errors.New("namespace is required")))
 	}
@@
 	if i.Key != nil {
-		if err := i.Key.Validate(); err != nil {
+		if err := i.Key.ValidateWithComplexity(maxFilterDepth); err != nil {
 			errs = append(errs, models.NewGenericValidationError(fmt.Errorf("invalid key filter: %w", err)))
 		}
 	}
 	if i.Name != nil {
-		if err := i.Name.Validate(); err != nil {
+		if err := i.Name.ValidateWithComplexity(maxFilterDepth); err != nil {
 			errs = append(errs, models.NewGenericValidationError(fmt.Errorf("invalid name filter: %w", err)))
 		}
 	}
 	if i.PrimaryEmail != nil {
-		if err := i.PrimaryEmail.Validate(); err != nil {
+		if err := i.PrimaryEmail.ValidateWithComplexity(maxFilterDepth); err != nil {
 			errs = append(errs, models.NewGenericValidationError(fmt.Errorf("invalid primary email filter: %w", err)))
 		}
 	}
```

As per coding guidelines, performance should be a priority in critical code paths, and anything related to database operations should be vetted for potential performance bottlenecks.
Also applies to: 318-347
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@openmeter/customer/customer.go` around lines 307 - 309, The Key, Name, and
PrimaryEmail filter fields (and the other filter string fields in the same
validator block) currently call FilterString.Validate(), which allows
arbitrarily deep And/Or trees to flow into adapter.ListCustomers; change those
Validate() calls to use FilterString.ValidateWithComplexity(maxDepth) instead
(e.g., a small integer like 3 or 5) and propagate/return the complexity error if
it exceeds the limit so the request is rejected early; update all occurrences in
this validator (Key, Name, PrimaryEmail and the other FilterString fields in
that same validation block) to use ValidateWithComplexity and handle its error
rather than plain Validate, ensuring adapter.ListCustomers never receives
unbounded filter depth.
Force-pushed from 7728920 to 309201a.
Actionable comments posted: 7
♻️ Duplicate comments (4)
openmeter/customer/customer.go (1)
329-345: ⚠️ Potential issue | 🟠 Major — Please cap filter depth on this list validator.

These are full `FilterString`s now, but plain `Validate()` still lets arbitrarily deep `And`/`Or` trees reach `adapter.ListCustomers`. `ValidateWithComplexity(...)` would reject pathological inputs earlier and keep the generated SQL bounded. As per coding guidelines, "Performance should be a priority in critical code paths. Anything related to event ingestion, message processing, database operations (regardless of database) should be vetted for potential performance bottlenecks."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@openmeter/customer/customer.go` around lines 329 - 345, The filter validators currently call FilterString.Validate() on i.Key, i.Name, and i.PrimaryEmail which allows arbitrarily deep And/Or trees; replace those Validate() calls with a bounded validator FilterString.ValidateWithComplexity(maxFilterComplexity) and return the same models.NewGenericValidationError(fmt.Errorf(...)) on error; introduce a small constant (e.g., maxFilterComplexity = 5) near the validator or reuse an existing cap constant and use that when calling ValidateWithComplexity to ensure pathological inputs are rejected before reaching adapter.ListCustomers.

api/v3/filters/parse.go (3)
36-53: ⚠️ Potential issue | 🟡 Minor — Add struct-kind guards before calling `parseFiltersValue`.

When `v.Kind() != reflect.Pointer` on line 42, we go straight to `parseFiltersValue` without verifying `v` is actually a struct. If someone passes `Parse(&intValue)` or any non-struct pointer, `parseFiltersValue` will panic at `t.NumField()`.

Suggested fix

```diff
 func Parse(qs url.Values, target any) error {
 	v := reflect.ValueOf(target)
 	if v.Kind() != reflect.Pointer || v.IsNil() {
 		return fmt.Errorf("Parse: target must be a non-nil pointer")
 	}
 	v = v.Elem()
 	if v.Kind() != reflect.Pointer {
+		if v.Kind() != reflect.Struct {
+			return fmt.Errorf("Parse: target must point to a struct or *struct")
+		}
 		return parseFiltersValue(qs, v)
 	}
+	if v.Type().Elem().Kind() != reflect.Struct {
+		return fmt.Errorf("Parse: target must point to a struct or *struct")
+	}
 	// target is **Struct — allocate the inner struct if any filter[...] keys exist
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@api/v3/filters/parse.go` around lines 36 - 53, The Parse function currently calls parseFiltersValue without ensuring the pointed-to value is a struct; add guards to validate types: in the branch where v.Elem() is used before calling parseFiltersValue, check that v.Kind() == reflect.Struct and return a descriptive error (e.g., "Parse: target must point to a struct") if not; likewise, in the **pointer-to-pointer** branch (the path that allocates the inner value when v.Kind() == reflect.Pointer), after allocating (or before calling parseFiltersValue) verify v.Elem().Kind() == reflect.Struct (or the inner type is a struct) and return an error if it isn't; keep use of hasFilterKeys() as-is to avoid unnecessary allocation when no filter keys exist.
476-479: ⚠️ Potential issue | 🟡 Minor — Empty numeric/datetime filter values are silently ignored.

`filter[amount]=` or `filter[createdAt]=` hit the `param.bare` check and become no-ops. This turns a malformed filter into an unfiltered query instead of a 400 error, which could be surprising for API consumers. If the intent is to support bare-key existence checks for these types, that's worth documenting. Otherwise, returning an error would be more explicit.
Also applies to: 545-548
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@api/v3/filters/parse.go` around lines 476 - 479, The parser currently treats parsedFilterParam.bare as a no-op inside the forEachFieldParam handler (in the block starting with err := forEachFieldParam(qs, field, func(param parsedFilterParam) error { if param.bare { return nil } }), which silently ignores empty values for numeric/datetime filters; change this to validate the field type and return a clear error for empty numeric or datetime filter values instead of nil (e.g., when param.bare is true and the target field kind/type is numeric or datetime, return a 400-style parse error), and apply the same check in the other identical handler around the 545-548 area so malformed filters like filter[amount]= or filter[createdAt]= produce a parsing error rather than being treated as unfiltered.
236-246: ⚠️ Potential issue | 🟠 Major — Value-type fields with pointer-receiver `UnmarshalJSON` won't decode.

When `fieldVal` is a struct value (not a pointer), `target.Interface()` returns the value itself. The type assertion against `json.Unmarshaler` will fail silently for any type with a pointer-receiver `UnmarshalJSON`. Using `fieldVal.Addr()` for value fields would fix this since struct fields are always addressable.

Suggested fix

```diff
 target := fieldVal
 if fieldVal.Kind() == reflect.Pointer {
 	target = reflect.New(fieldVal.Type().Elem())
+} else if fieldVal.CanAddr() {
+	target = fieldVal.Addr()
 }
 iface := target.Interface()
 if unmarshaler, ok := iface.(json.Unmarshaler); ok {
 	if err := unmarshaler.UnmarshalJSON(data); err != nil {
 		return fmt.Errorf("filter[%s]: %w", name, err)
 	}
-	fieldVal.Set(target)
+	if fieldVal.Kind() == reflect.Pointer {
+		fieldVal.Set(target)
+	}
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@api/v3/filters/parse.go` around lines 236 - 246, The current decode path uses target := fieldVal and reflect.New(...) for pointer fields, which fails to call pointer-receiver UnmarshalJSON on non-pointer (value) fields; change the target logic to take the address for addressable value fields (use fieldVal.Addr() when fieldVal.Kind() != reflect.Pointer) so iface will be the pointer type that implements json.Unmarshaler, invoke UnmarshalJSON on that iface, and then set the original field appropriately (if the original field was a pointer, set it to target; if it was a value, set it from target.Elem()). Ensure this uses the existing symbols fieldVal, target, iface, and the UnmarshalJSON check.
🧹 Nitpick comments (6)
openmeter/llmcost/service/service_test.go (1)
110-112: Prefer `t.Context()` over `context.Background()` in tests.

Per coding guidelines, when a `testing.T` is available, using `t.Context()` keeps cancellation and lifecycle tied to the test harness. This applies throughout the test file (lines 110, 128, 147, 166, 183, 201).

♻️ Example fix for the first occurrence

```diff
-	result, err := svc.ListPrices(context.Background(), llmcost.ListPricesInput{
+	result, err := svc.ListPrices(t.Context(), llmcost.ListPricesInput{
```

As per coding guidelines: "Use `t.Context()` when a `testing.T` or `testing.TB` is available instead of introducing `context.Background()`"

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@openmeter/llmcost/service/service_test.go` around lines 110 - 112, Replace uses of context.Background() in the test calls with the test's context via t.Context() so the test lifecycle and cancellation propagate correctly; specifically update all invocations like svc.ListPrices(context.Background(), ...) and other similar calls in this test file (e.g., the calls around ListPrices, and the calls at the other noted locations) to use t.Context() instead of context.Background(), ensuring the testing.T instance is used for context.

api/spec/packages/aip/common/definitions/properties.yaml (1)
37-48: `ExpiresAt` and `ExpiresAtNullable` appear identical.

Both schemas have `nullable: true` and the same description. Unless there's a subtle difference I'm missing (like one being used in a context where the `nullable` property gets stripped), you might be able to consolidate these. If they're intentionally separate for different codegen behaviors, a brief comment clarifying the distinction would help future maintainers.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@api/spec/packages/aip/common/definitions/properties.yaml` around lines 37 - 48, The two OpenAPI schema entries ExpiresAt and ExpiresAtNullable are identical (both have type:string, format:date-time, nullable:true and the same description); either consolidate them into a single shared definition (remove one and update references to use the other) or, if both must exist for codegen/runtime differences, add a clarifying comment and adjust one to clearly reflect its unique purpose (e.g., remove duplicated nullable or change name/description to document why both exist). Locate the ExpiresAt and ExpiresAtNullable definitions in properties.yaml and perform the consolidation or add the distinguishing comment and update any references accordingly.

api/v3/handlers/customers/list.go (1)
79-101: Good error handling, but there's some repetition.

The filter conversion pattern is solid: each field gets proper error handling with clear field names in the error response. However, the three blocks are quite similar.
If you're open to it, a small helper could reduce the boilerplate:
♻️ Optional: Extract a helper to reduce duplication
```go
// Helper function (could live in a shared location)
func convertFilter(ctx context.Context, f *api.StringFieldFilter, fieldName string) (*filter.FilterString, error) {
	result, err := filters.FromAPIFilterString(f)
	if err != nil {
		return nil, apierrors.NewBadRequestError(ctx, err, apierrors.InvalidParameters{
			{Field: fieldName, Reason: err.Error(), Source: apierrors.InvalidParamSourceQuery},
		})
	}
	return result, nil
}

// Usage:
if params.Filter != nil {
	if req.Key, err = convertFilter(ctx, params.Filter.Key, "filter[key]"); err != nil {
		return ListCustomersRequest{}, err
	}
	if req.Name, err = convertFilter(ctx, params.Filter.Name, "filter[name]"); err != nil {
		return ListCustomersRequest{}, err
	}
	if req.PrimaryEmail, err = convertFilter(ctx, params.Filter.PrimaryEmail, "filter[primary_email]"); err != nil {
		return ListCustomersRequest{}, err
	}
}
```

This is a nitpick: the current code is perfectly readable and correct!
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@api/v3/handlers/customers/list.go` around lines 79 - 101, The three repeated filter conversion blocks in the ListCustomersRequest construction should be extracted into a small helper to reduce duplication: implement a function (e.g., convertFilter(ctx, f *api.StringFieldFilter, fieldName string) (*filter.FilterString, error)) that calls filters.FromAPIFilterString and, on error, returns an apierrors.NewBadRequestError with the provided fieldName and original error; then replace the repeated logic in the params.Filter != nil block by assigning req.Key, req.Name and req.PrimaryEmail via calls to convertFilter and returning the error directly if non-nil.

api/spec/packages/aip/src/customers/operations.tsp (1)
73-79: Documentation mentions only `filter[key]`, but there are now three filter fields.

The example in the doc comment only shows `filter[key]`, but the model now supports `filter[name]` and `filter[primary_email]` too. Consider expanding the docs:

📝 Suggested documentation update

```diff
 /**
  * Filter customers returned in the response.
  *
- * To filter customers by key add the following query param: filter[key]=my-db-id
+ * Supported filters:
+ * - `filter[key][eq]=my-db-id` - filter by exact key match
+ * - `filter[name][contains]=acme` - filter by name containing text
+ * - `filter[primary_email][eq]=user@example.com` - filter by primary email
  */
```
Verify each finding against the current code and only fix it if needed. In `@api/spec/packages/aip/src/customers/operations.tsp` around lines 73 - 79, Update the doc comment for the query parameter above the `@query`(#{ style: "deepObject", explode: true }) filter?: ListCustomersParamsFilter to list all supported filter keys (key, name, primary_email) and show example query params for each (e.g., filter[key]=..., filter[name]=..., filter[primary_email]=...); ensure the comment mentions that these map to fields on ListCustomersParamsFilter and adjust wording to reflect multiple filterable fields rather than only filter[key].

api/spec/Makefile (1)
8-15: Smart consolidation of filter schemas — good approach to deduplicate them.

The Makefile escaping (`$$ref` → `$ref`) is spot-on. One thing worth noting: if `yq` isn't installed locally, the `if` check on line 12 will silently pass (the `2>&1` redirect hides the error), and the build might not catch it right away. Consider adding a quick dependency check at the top of the `generate` target, or document that `yq` needs to be installed beforehand. Depending on your workflow, you could also pin it in devDependencies or the nix flake.
Verify each finding against the current code and only fix it if needed. In `@api/spec/Makefile` around lines 8 - 15, Add an explicit dependency check for yq at the start of the Makefile generate target so the script fails fast instead of silently continuing when yq is missing: detect yq (e.g., via command -v or which) and print a clear error and non-zero exit if not found before using AIP_REF/FILE and the for loop that invokes yq; alternatively document the requirement or pin yq in devDependencies/nix flake so builds are reproducible.

api/spec/packages/aip/common/definitions/konnect_properties.yaml (1)
97-105: Minor: Redundant `format: string` for Workspace parameter.

Line 104 has `format: string`, which is a bit unusual - `format` is typically used for values like `uuid`, `date-time`, etc. For a plain string type, the format is implied and this is redundant. Could just remove it or leave the example alone.

```diff
 schema:
   type: string
-  format: string
 example: team-payments
```
Verify each finding against the current code and only fix it if needed. In `@api/spec/packages/aip/common/definitions/konnect_properties.yaml` around lines 97 - 105, The Workspace parameter schema includes an unnecessary "format: string"; remove the redundant format property from the Workspace schema (the "Workspace" object that defines the path parameter "workspace") so the schema only declares type: string and example: team-payments, ensuring the parameter remains required and named "workspace".
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.agents/skills/api-filters/SKILL.md:
- Around line 35-50: The fenced example block uses plain backticks with no
language and triggers markdownlint MD040; update the triple-backtick fence for
the example to specify a language such as "text" (i.e. change ``` to ```text) so
markdownlint-cli2 stops warning—apply this to the fenced code block that
contains the filter examples in SKILL.md (the block starting with
filter[field]=value).
- Around line 109-218: The PR inverted the intended filter pipeline: service
inputs (e.g., types in List...Input in openmeter/customer/customer.go and
openmeter/llmcost/service.go) must use the api/v3/filters types (e.g.,
*filters.FilterString / FilterDateTime), handlers (e.g., ListPrices,
ListOverrides) must call filters.Parse(r.URL.Query(), &req) in the HTTP decoder,
and adapters should use the Convert* helpers (filters.ConvertFilterString,
ConvertFilterDateTime, etc.) to produce pkg/filter predicates before selecting
ent fields; revert usages of pkg/filter.FilterString in service input structs,
remove ad-hoc FromAPIFilterString conversions in the handlers, add filters.Parse
in the handler decoder that builds the request struct, and update the adapter
List... methods to call filters.Convert* then .Select(...) (skipping when
Convert* returns nil).
In `@api/spec/packages/aip/common/definitions/errors.yaml`:
- Around line 124-130: ErrorResponse currently omits the 413 PayloadTooLarge
response and the discriminator mapping, and there's no Common.PayloadTooLarge
model in the TypeSpec error models; add the PayloadTooLarge response into
ErrorResponse's oneOf array and include the "413": "PayloadTooLarge" mapping in
its discriminator, and create a PayloadTooLarge model in the common error
TypeSpec (named PayloadTooLarge or Common.PayloadTooLarge) that uses the
response ref (the errors.yaml response) and sets `@statusCode` _: 413 so the new
response is exposed and typed consistently (target symbols: ErrorResponse,
discriminator, PayloadTooLarge model / Common.PayloadTooLarge, and the
errors.yaml#/components/responses/PayloadTooLarge ref).
In `@api/spec/packages/aip/common/definitions/security.yaml`:
- Around line 25-26: Update the awkward auth description strings in
security.yaml to read clearly and grammatically (e.g., replace "The Konnect
access token is meant to be used by the Konnect dashboard and the decK CLI
authenticate with." with a cleaner sentence), and apply the same tightening to
the other description entries referenced (lines 32-33, 37-38, 44-45). Locate the
description fields in the security.yaml (the Konnect access token description
and the similar auth description entries) and rewrite each to a concise,
grammatical phrase such as "The Konnect access token for authenticating the
Konnect dashboard and the decK CLI." or similar consistent phrasing across those
entries.
- Around line 33-40: The security scheme serviceAccessToken is documented as an
internal-only token but is missing the x-internal: true flag (unlike
clientToken), so update the OpenAPI definition to mark serviceAccessToken as
internal-only by adding x-internal: true to the serviceAccessToken object (near
the type/scheme/bearerFormat/description) to match clientToken and prevent
accidental exposure in generated/public artifacts.
In `@api/v3/filters/convert.go`:
- Around line 74-85: FromAPIFilterStringExact currently always assigns In:
&f.Oeq causing a non-nil pointer even for empty slices; change
FromAPIFilterStringExact to mirror FromAPIFilterString by only setting In when
len(f.Oeq) > 0 (i.e., conditionally assign In = &f.Oeq), leaving In nil for
empty Oeq, so downstream nil-check semantics match; update the return
construction in FromAPIFilterStringExact accordingly.
In `@openmeter/llmcost/service.go`:
- Around line 63-75: ListPricesInput.Validate() and
ListOverridesInput.Validate() currently accept full pkg/filter.FilterString
trees but do not vet them, allowing deeply nested or multi-op trees to reach the
DB path; update both Validate() methods to walk the FilterString AST and reject
complex trees by enforcing allowed operators (e.g., only simple comparison
operators), maximum depth and maximum total nodes/branches, and disallow
multi-op nodes (nested And/Or beyond one level) so Select() never receives an
explosion of predicates. Implement a small traversal helper that takes a
*filter.FilterString, returns the depth and node count and verifies each node is
one of the permitted types, use it inside ListPricesInput.Validate() and
ListOverridesInput.Validate() to return a validation error for disallowed trees,
and apply the same check to the other similar validator(s) referenced around the
other snippet (lines 207-210).
---
Duplicate comments:
In `@api/v3/filters/parse.go`:
- Around line 36-53: The Parse function currently calls parseFiltersValue
without ensuring the pointed-to value is a struct; add guards to validate types:
in the branch where v.Elem() is used before calling parseFiltersValue, check
that v.Kind() == reflect.Struct and return a descriptive error (e.g., "Parse:
target must point to a struct") if not; likewise, in the **pointer-to-pointer**
branch (the path that allocates the inner value when v.Kind() ==
reflect.Pointer), after allocating (or before calling parseFiltersValue) verify
v.Elem().Kind() == reflect.Struct (or the inner type is a struct) and return an
error if it isn't; keep use of hasFilterKeys() as-is to avoid unnecessary
allocation when no filter keys exist.
- Around line 476-479: The parser currently treats parsedFilterParam.bare as a
no-op inside the forEachFieldParam handler (in the block starting with err :=
forEachFieldParam(qs, field, func(param parsedFilterParam) error { if param.bare
{ return nil } }), which silently ignores empty values for numeric/datetime
filters; change this to validate the field type and return a clear error for
empty numeric or datetime filter values instead of nil (e.g., when param.bare is
true and the target field kind/type is numeric or datetime, return a 400-style
parse error), and apply the same check in the other identical handler around the
545-548 area so malformed filters like filter[amount]= or filter[createdAt]=
produce a parsing error rather than being treated as unfiltered.
- Around line 236-246: The current decode path uses target := fieldVal and
reflect.New(...) for pointer fields, which fails to call pointer-receiver
UnmarshalJSON on non-pointer (value) fields; change the target logic to take the
address for addressable value fields (use fieldVal.Addr() when fieldVal.Kind()
!= reflect.Pointer) so iface will be the pointer type that implements
json.Unmarshaler, invoke UnmarshalJSON on that iface, and then set the original
field appropriately (if the original field was a pointer, set it to target; if
it was a value, set it from target.Elem()). Ensure this uses the existing
symbols fieldVal, target, iface, and the UnmarshalJSON check.
In `@openmeter/customer/customer.go`:
- Around line 329-345: The filter validators currently call
FilterString.Validate() on i.Key, i.Name, and i.PrimaryEmail which allows
arbitrarily deep And/Or trees; replace those Validate() calls with a bounded
validator FilterString.ValidateWithComplexity(maxFilterComplexity) and return
the same models.NewGenericValidationError(fmt.Errorf(...)) on error; introduce a
small constant (e.g., maxFilterComplexity = 5) near the validator or reuse an
existing cap constant and use that when calling ValidateWithComplexity to ensure
pathological inputs are rejected before reaching adapter.ListCustomers.
---
Nitpick comments:
In `@api/spec/Makefile`:
- Around line 8-15: Add an explicit dependency check for yq at the start of the
Makefile generate target so the script fails fast instead of silently continuing
when yq is missing: detect yq (e.g., via command -v or which) and print a clear
error and non-zero exit if not found before using AIP_REF/FILE and the for loop
that invokes yq; alternatively document the requirement or pin yq in
devDependencies/nix flake so builds are reproducible.
In `@api/spec/packages/aip/common/definitions/konnect_properties.yaml`:
- Around line 97-105: The Workspace parameter schema includes an unnecessary
"format: string"; remove the redundant format property from the Workspace schema
(the "Workspace" object that defines the path parameter "workspace") so the
schema only declares type: string and example: team-payments, ensuring the
parameter remains required and named "workspace".
In `@api/spec/packages/aip/common/definitions/properties.yaml`:
- Around line 37-48: The two OpenAPI schema entries ExpiresAt and
ExpiresAtNullable are identical (both have type:string, format:date-time,
nullable:true and the same description); either consolidate them into a single
shared definition (remove one and update references to use the other) or, if
both must exist for codegen/runtime differences, add a clarifying comment and
adjust one to clearly reflect its unique purpose (e.g., remove duplicated
nullable or change name/description to document why both exist). Locate the
ExpiresAt and ExpiresAtNullable definitions in properties.yaml and perform the
consolidation or add the distinguishing comment and update any references
accordingly.
In `@api/spec/packages/aip/src/customers/operations.tsp`:
- Around line 73-79: Update the doc comment for the query parameter above the
`@query`(#{ style: "deepObject", explode: true }) filter?:
ListCustomersParamsFilter to list all supported filter keys (key, name,
primary_email) and show example query params for each (e.g., filter[key]=...,
filter[name]=..., filter[primary_email]=...); ensure the comment mentions that
these map to fields on ListCustomersParamsFilter and adjust wording to reflect
multiple filterable fields rather than only filter[key].
In `@api/v3/handlers/customers/list.go`:
- Around line 79-101: The three repeated filter conversion blocks in the
ListCustomersRequest construction should be extracted into a small helper to
reduce duplication: implement a function (e.g., convertFilter(ctx, f
*api.StringFieldFilter, fieldName string) (*filter.FilterString, error)) that
calls filters.FromAPIFilterString and, on error, returns an
apierrors.NewBadRequestError with the provided fieldName and original error;
then replace the repeated logic in the params.Filter != nil block by assigning
req.Key, req.Name and req.PrimaryEmail via calls to convertFilter and returning
the error directly if non-nil.
In `@openmeter/llmcost/service/service_test.go`:
- Around line 110-112: Replace uses of context.Background() in the test calls
with the test's context via t.Context() so the test lifecycle and cancellation
propagate correctly; specifically update all invocations like
svc.ListPrices(context.Background(), ...) and other similar calls in this test
file (e.g., the calls around ListPrices, and the calls at the other noted
locations) to use t.Context() instead of context.Background(), ensuring the
testing.T instance is used for context.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 0d9d14a8-e91b-41b7-9298-35433edc701f
⛔ Files ignored due to path filters (1)
`api/v3/openapi.yaml` is excluded by `!**/openapi.yaml`
📒 Files selected for processing (60)
- .agents/skills/api-filters/SKILL.md
- .agents/skills/api/AIP.md
- .agents/skills/api/SKILL.md
- .agents/skills/api/rules/aip-122-naming.md
- .agents/skills/api/rules/aip-126-enums.md
- .agents/skills/api/rules/aip-129-labels.md
- .agents/skills/api/rules/aip-132-list.md
- .agents/skills/api/rules/aip-134-135-crud.md
- .agents/skills/api/rules/aip-137-content-type.md
- .agents/skills/api/rules/aip-142-time.md
- .agents/skills/api/rules/aip-158-pagination.md
- .agents/skills/api/rules/aip-160-filtering.md
- .agents/skills/api/rules/aip-181-stability.md
- .agents/skills/api/rules/aip-193-errors.md
- .agents/skills/api/rules/aip-235-bulk-delete.md
- .agents/skills/api/rules/aip-3101-versioning.md
- .agents/skills/api/rules/aip-3106-empty-fields.md
- .agents/skills/api/rules/aip-composition.md
- .agents/skills/api/rules/aip-docs.md
- .agents/skills/api/rules/aip-visibility.md
- .gitignore
- Makefile
- api/spec/Makefile
- api/spec/README.md
- api/spec/packages/aip/README.md
- api/spec/packages/aip/common/definitions/aip_filters.yaml
- api/spec/packages/aip/common/definitions/errors.yaml
- api/spec/packages/aip/common/definitions/konnect_properties.yaml
- api/spec/packages/aip/common/definitions/metadatas.yaml
- api/spec/packages/aip/common/definitions/properties.yaml
- api/spec/packages/aip/common/definitions/security.yaml
- api/spec/packages/aip/src/common/parameters.tsp
- api/spec/packages/aip/src/customers/operations.tsp
- api/spec/packages/aip/src/llmcost/operations.tsp
- api/v3/api.gen.go
- api/v3/codegen.yaml
- api/v3/filters/convert.go
- api/v3/filters/convert_test.go
- api/v3/filters/filter.go
- api/v3/filters/filter_test.go
- api/v3/filters/parse.go
- api/v3/filters/parse_test.go
- api/v3/handlers/customers/list.go
- api/v3/handlers/llmcost/convert.go
- api/v3/handlers/llmcost/convert_test.go
- api/v3/handlers/llmcost/list_overrides.go
- api/v3/handlers/llmcost/list_prices.go
- api/v3/templates/chi-middleware.tmpl.patch
- openmeter/apiconverter/filter.go
- openmeter/customer/adapter/customer.go
- openmeter/customer/customer.go
- openmeter/customer/httpdriver/customer.go
- openmeter/ent/schema/customer.go
- openmeter/llmcost/adapter/price.go
- openmeter/llmcost/service.go
- openmeter/llmcost/service/service.go
- openmeter/llmcost/service/service_test.go
- pkg/filter/filter.go
- pkg/filter/filter_test.go
- test/customer/customer.go
💤 Files with no reviewable changes (4)
- .agents/skills/api/AIP.md
- api/v3/handlers/llmcost/convert.go
- api/v3/filters/filter_test.go
- api/spec/packages/aip/README.md
✅ Files skipped from review due to trivial changes (22)
- api/v3/codegen.yaml
- api/spec/README.md
- .agents/skills/api/rules/aip-composition.md
- .agents/skills/api/rules/aip-126-enums.md
- .agents/skills/api/rules/aip-142-time.md
- .agents/skills/api/rules/aip-docs.md
- .agents/skills/api/rules/aip-3101-versioning.md
- .agents/skills/api/rules/aip-3106-empty-fields.md
- .agents/skills/api/rules/aip-181-stability.md
- .agents/skills/api/rules/aip-137-content-type.md
- .agents/skills/api/SKILL.md
- .agents/skills/api/rules/aip-122-naming.md
- .agents/skills/api/rules/aip-235-bulk-delete.md
- .agents/skills/api/rules/aip-visibility.md
- .agents/skills/api/rules/aip-132-list.md
- .agents/skills/api/rules/aip-158-pagination.md
- .agents/skills/api/rules/aip-193-errors.md
- .agents/skills/api/rules/aip-134-135-crud.md
- .agents/skills/api/rules/aip-160-filtering.md
- openmeter/ent/schema/customer.go
- .agents/skills/api/rules/aip-129-labels.md
- .gitignore
🚧 Files skipped from review as they are similar to previous changes (9)
- api/v3/templates/chi-middleware.tmpl.patch
- test/customer/customer.go
- Makefile
- api/v3/filters/filter.go
- openmeter/apiconverter/filter.go
- openmeter/customer/adapter/customer.go
- api/spec/packages/aip/src/llmcost/operations.tsp
- openmeter/customer/httpdriver/customer.go
- api/spec/packages/aip/src/common/parameters.tsp
```text
filter[field]=value             # shorthand → eq
filter[field][eq]=value         # exact match
filter[field][neq]=value        # not equal (also returns NULLs)
filter[field][contains]=value   # substring match (case-insensitive on strings)
filter[field][oeq]=a,b,c        # one-of-equal (comma-separated, max 50 items)
filter[field][ocontains]=a,b    # one-of-contains
filter[field][gt]=value         # greater than
filter[field][gte]=value        # greater than or equal
filter[field][lt]=value         # less than
filter[field][lte]=value        # less than or equal
filter[field]                   # bare key → exists=true (presence check)
filter[field][exists]           # explicit existence check
filter[field][nexists]          # absence check (only for additionalProperties maps like labels)
filter[labels.key_1][eq]=val    # dot-notation: only the FIRST dot is a delimiter
```
Add a language to this fenced example.
markdownlint-cli2 is already warning on this block (MD040); `text` would be enough here.
🧰 Tools
🪛 markdownlint-cli2 (0.22.0)
[warning] 35-35: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.agents/skills/api-filters/SKILL.md around lines 35 - 50, The fenced example
block uses plain backticks with no language and triggers markdownlint MD040;
update the triple-backtick fence for the example to specify a language such as
"text" (i.e. change ``` to ```text) so markdownlint-cli2 stops warning—apply
this to the fenced code block that contains the filter examples in SKILL.md (the
block starting with filter[field]=value).
```yaml
description: |
  The Konnect access token is meant to be used by the Konnect dashboard and the decK CLI authenticate with.
```
Tighten wording in auth descriptions for clearer generated docs
A few lines read awkwardly (for example, “authenticate with”). Small wording cleanup will make the docs feel more polished.
✏️ Suggested wording cleanup
```diff
 konnectAccessToken:
   type: http
   scheme: bearer
   bearerFormat: JWT
   description: |
-    The Konnect access token is meant to be used by the Konnect dashboard and the decK CLI authenticate with.
+    The Konnect access token is used by the Konnect dashboard and decK CLI for authentication.
@@
 portalAccessToken:
   type: apiKey
   in: cookie
   name: portalaccesstoken
   description: |
-    The Developer portal cookie is meant to be used by the Developer API to authenticate with.
+    The Developer Portal cookie is used by the Developer API for authentication.
@@
 serviceAccessToken:
   type: http
   scheme: bearer
   bearerFormat: JWT
   description: |
-    The Service access token is meant to be used between internal services.
+    The service access token is used for authentication between internal services.
@@
 clientToken:
   x-internal: true
   type: http
   scheme: bearer
   bearerFormat: Token
   description: |
-    The Client token is meant to be used by internal service clients for the MinK integration.
+    The client token is used by internal service clients for MinK integration.
```

Also applies to: 32-33, 37-38, 44-45
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@api/spec/packages/aip/common/definitions/security.yaml` around lines 25 - 26,
Update the awkward auth description strings in security.yaml to read clearly and
grammatically (e.g., replace "The Konnect access token is meant to be used by
the Konnect dashboard and the decK CLI authenticate with." with a cleaner
sentence), and apply the same tightening to the other description entries
referenced (lines 32-33, 37-38, 44-45). Locate the description fields in the
security.yaml (the Konnect access token description and the similar auth
description entries) and rewrite each to a concise, grammatical phrase such as
"The Konnect access token for authenticating the Konnect dashboard and the decK
CLI." or similar consistent phrasing across those entries.
```yaml
serviceAccessToken:
  type: http
  scheme: bearer
  bearerFormat: JWT
  description: |
    The Service access token is meant to be used between internal services.
clientToken:
  x-internal: true
```
Potential internal-scheme exposure mismatch
Line 38 says serviceAccessToken is for internal services, but unlike clientToken (Line 40), it is not marked x-internal: true. If this is meant to stay internal, this can unintentionally widen visibility in generated/public artifacts.
🔒 Suggested fix (if service token is internal-only)

```diff
 serviceAccessToken:
+  x-internal: true
   type: http
   scheme: bearer
   bearerFormat: JWT
   description: |
     The Service access token is meant to be used between internal services.
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```yaml
serviceAccessToken:
  x-internal: true
  type: http
  scheme: bearer
  bearerFormat: JWT
  description: |
    The Service access token is meant to be used between internal services.
clientToken:
  x-internal: true
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@api/spec/packages/aip/common/definitions/security.yaml` around lines 33 - 40,
The security scheme serviceAccessToken is documented as an internal-only token
but is missing the x-internal: true flag (unlike clientToken), so update the
OpenAPI definition to mark serviceAccessToken as internal-only by adding
x-internal: true to the serviceAccessToken object (near the
type/scheme/bearerFormat/description) to match clientToken and prevent
accidental exposure in generated/public artifacts.
Force-pushed from 309201a to 1fd6897 (Compare)
Actionable comments posted: 3
♻️ Duplicate comments (4)

.agents/skills/api-filters/SKILL.md (1)

35-50: ⚠️ Potential issue | 🟡 Minor

Add a language to this fenced example block.

Tiny lint thing: this fence still has no language, so `markdownlint` will keep raising MD040.

Suggested fix

````diff
-```
+```text
 filter[field]=value             # shorthand → eq
 filter[field][eq]=value         # exact match
 ...
 filter[labels.key_1][eq]=val    # dot-notation: only the FIRST dot is a delimiter
```
````

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed.
In @.agents/skills/api-filters/SKILL.md around lines 35 - 50, Add a language
identifier to the fenced code block that starts with the filter examples (the
triple-backtick block containing lines like "filter[field]=value" and
"filter[labels.key_1][eq]=val") so markdownlint MD040 is satisfied; change the
opening fence from ``` to ```text.

test/customer/customer.go (1)

552-557: ⚠️ Potential issue | 🟡 Minor

Use a real substring in the `Contains` case.

`"customer-1"` is the full stored key, so this still passes if the adapter accidentally treats `Contains` like `Eq`. A shorter fragment would actually lock in the partial-match behavior.

💡 Small tweak

```diff
-  Key: &filter.FilterString{Contains: lo.ToPtr("customer-1")},
+  Key: &filter.FilterString{Contains: lo.ToPtr("ustomer-1")},
```

As per coding guidelines, "**/*_test.go: Make sure the tests are comprehensive and cover the changes. Keep a strong focus on unit tests and in-code integration tests."

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed.
In `@test/customer/customer.go` around lines 552 - 557, The test uses the full
key "customer-1" for the Contains case which doesn't validate partial matching;
update the call to service.ListCustomers (ListCustomersInput,
FilterString.Contains) to use a real substring of the stored key (e.g., a
shorter fragment like "ustomer-1" or "customer-") so the test will fail if the
adapter treats Contains as Eq and thus truly verifies partial-match behavior.

api/spec/packages/aip/src/common/parameters.tsp (1)

79-99: ⚠️ Potential issue | 🟠 Major

These string filter schemas are still under-declaring the supported operators.

`filters.FilterString` and the parser already handle `gt`/`gte`/`lt`/`lte` plus `exists`/`nexists`, but `StringFieldFilter` and `LabelsFieldFilter` only advertise `eq`/`neq`/`contains`/`oeq`/`ocontains`. That leaves the OpenAPI/SDK surface behind the runtime behavior. Please either add the missing operators here or reject them before they reach the parser.

As per coding guidelines, "The declared API should be accurate, in parity with the actual implementation, and easy to understand for the user."

Also applies to: 170-190

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed.
In `@api/spec/packages/aip/src/common/parameters.tsp` around lines 79 - 99, The
OpenAPI schema for StringFieldFilter (and likewise LabelsFieldFilter at
170-190) is missing the operators supported at runtime; update the union object
for StringFieldFilter to declare gt, gte, lt, lte (as optional strings) and
exists, nexists (as optional booleans) in addition to the existing
eq/neq/contains/oeq/ocontains so the declared API matches filters.FilterString
and the parser behavior; ensure the multi-value operators (oeq/ocontains) keep
their `@encode`(ArrayEncoding.commaDelimited) annotation and mirror the same
additions in LabelsFieldFilter so the SDK/OpenAPI surface is accurate.

openmeter/customer/customer.go (1)

329-345: ⚠️ Potential issue | 🟠 Major

Please cap filter depth before adapter query building.

Nice move adding typed filters, but Line 330, Line 336, and Line 342 still call `Validate()`, which allows effectively unbounded `$and`/`$or` depth. On this list path, that can create very expensive predicates/SQL. Switching to `ValidateWithComplexity(maxFilterDepth)` here would reject pathological filters early.

💡 Suggested patch

```diff
 func (i ListCustomersInput) Validate() error {
 	var errs []error
+	const maxFilterDepth = 3
@@
 	if i.Key != nil {
-		if err := i.Key.Validate(); err != nil {
+		if err := i.Key.ValidateWithComplexity(maxFilterDepth); err != nil {
 			errs = append(errs, models.NewGenericValidationError(fmt.Errorf("invalid key filter: %w", err)))
 		}
 	}
@@
 	if i.Name != nil {
-		if err := i.Name.Validate(); err != nil {
+		if err := i.Name.ValidateWithComplexity(maxFilterDepth); err != nil {
 			errs = append(errs, models.NewGenericValidationError(fmt.Errorf("invalid name filter: %w", err)))
 		}
 	}
@@
 	if i.PrimaryEmail != nil {
-		if err := i.PrimaryEmail.Validate(); err != nil {
+		if err := i.PrimaryEmail.ValidateWithComplexity(maxFilterDepth); err != nil {
 			errs = append(errs, models.NewGenericValidationError(fmt.Errorf("invalid primary email filter: %w", err)))
 		}
 	}
```

You can verify quickly with:

```bash
#!/bin/bash
# Verify ListCustomersInput.Validate() uses bounded filter validation.
rg -n -C3 --type=go 'func \(i ListCustomersInput\) Validate\(\)' openmeter/customer/customer.go
rg -n -C2 --type=go 'i\.(Key|Name|PrimaryEmail)\.Validate(WithComplexity)?\(' openmeter/customer/customer.go
```

Expected result: each of `Key`, `Name`, and `PrimaryEmail` uses `ValidateWithComplexity(...)`, not `Validate()`.

As per coding guidelines, performance should be a priority in critical code paths, including database operations.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed.
In `@openmeter/customer/customer.go` around lines 329 - 345, ListCustomersInput
currently calls i.Key.Validate(), i.Name.Validate(), and
i.PrimaryEmail.Validate(), which allows unbounded $and/$or depth; replace those
calls with the bounded variant i.Key.ValidateWithComplexity(maxFilterDepth),
i.Name.ValidateWithComplexity(maxFilterDepth), and
i.PrimaryEmail.ValidateWithComplexity(maxFilterDepth) respectively; ensure a
sensible maxFilterDepth constant is defined (or reuse an existing package-level
constant like MaxFilterDepth/DefaultMaxFilterDepth) and used consistently, and
update any imports/docs if needed so pathological filters are rejected before
adapter query building.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.agents/skills/api-filters/SKILL.md:
- Around line 111-113: The documentation is ambiguous about when parsability is
checked: update the descriptions for IsEmpty() and Validate(ctx context.Context,
field string) error to state that Validate enforces mutual exclusivity, range
rules, and parse-time value parsability (e.g., numeric and enum parsing), and
explicitly note that format-to-type conversions that require convert-time logic
(e.g., datetime normalization) are validated later during Convert/Apply steps;
apply the same clarification to the other occurrence around lines 259–263 so
readers understand parse-time vs convert-time error expectations.
In `@api/spec/packages/aip/src/common/parameters.tsp`:
- Around line 11-16: The shared sort parameter docs currently claim
comma-separated multi-sort but the V3 customers list handler
(api/v3/handlers/customers/list.go) only parses a single sort field; update the
documentation in parameters.tsp to reflect the actual behavior by removing or
restricting the “Multiple sort attributes via a comma separated list” sentence
(or explicitly state “single sort field only”) so it matches the parsing logic
in Customers list, or alternatively add a separate scalar/docs entry for
endpoints that truly support multi-sort and keep parameters.tsp limited to
single-field sort.
- Around line 24-31: The TypeSpec currently defines BooleanFieldFilter as a
plain scalar but the Go runtime expects an object DTO filters.FilterBoolean with
an Eq field; replace the scalar declaration for BooleanFieldFilter with an
object/record (or discriminated union if needed) that models the same shape as
filters.FilterBoolean (e.g., an object with an optional eq/Eq boolean property)
so the published API matches the generated Go surface; keep the existing
`@extension`("x-go-type","filters.FilterBoolean") mapping and ensure the field
name and nullability match the Go DTO (optional pointer -> optional property).
---
Duplicate comments:
In @.agents/skills/api-filters/SKILL.md:
- Around line 35-50: Add a language identifier to the fenced code block that
starts with the filter examples (the triple-backtick block containing lines like
"filter[field]=value" and "filter[labels.key_1][eq]=val") so markdownlint MD040
is satisfied; change the opening fence from ``` to ```text (or another
appropriate language like ```plain) to mark the block as code with a language.
In `@api/spec/packages/aip/src/common/parameters.tsp`:
- Around line 79-99: The OpenAPI schema for StringFieldFilter (and likewise
LabelsFieldFilter at 170-190) is missing the operators supported at runtime;
update the union object for StringFieldFilter to declare gt, gte, lt, lte (as
optional strings) and exists, nexists (as optional booleans) in addition to the
existing eq/neq/contains/oeq/ocontains so the declared API matches
filters.FilterString and the parser behavior; ensure the multi-value operators
(oeq/ocontains) keep their `@encode`(ArrayEncoding.commaDelimited) annotation and
mirror the same additions in LabelsFieldFilter so the SDK/OpenAPI surface is
accurate.
In `@openmeter/customer/customer.go`:
- Around line 329-345: ListCustomersInput currently calls i.Key.Validate(),
i.Name.Validate(), and i.PrimaryEmail.Validate(), which allows unbounded
$and/$or depth; replace those calls with the bounded variant
i.Key.ValidateWithComplexity(maxFilterDepth),
i.Name.ValidateWithComplexity(maxFilterDepth), and
i.PrimaryEmail.ValidateWithComplexity(maxFilterDepth) respectively; ensure a
sensible maxFilterDepth constant is defined (or reuse an existing package-level
constant like MaxFilterDepth/DefaultMaxFilterDepth) and used consistently, and
update any imports/docs if needed so pathological filters are rejected before
adapter query building.
In `@test/customer/customer.go`:
- Around line 552-557: The test uses the full key "customer-1" for the Contains
case which doesn't validate partial matching; update the call to
service.ListCustomers (ListCustomersInput, FilterString.Contains) to use a real
substring of the stored key (e.g., a shorter fragment like "ustomer-1" or
"customer-") so the test will fail if the adapter treats Contains as Eq and thus
truly verifies partial-match behavior.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 495e736f-fc70-424a-a9c7-046c8ad5b011
⛔ Files ignored due to path filters (1)
`api/v3/openapi.yaml` is excluded by `!**/openapi.yaml`
📒 Files selected for processing (52)
- .agents/skills/api-filters/SKILL.md
- .agents/skills/api/AIP.md
- .agents/skills/api/SKILL.md
- .agents/skills/api/rules/aip-122-naming.md
- .agents/skills/api/rules/aip-126-enums.md
- .agents/skills/api/rules/aip-129-labels.md
- .agents/skills/api/rules/aip-132-list.md
- .agents/skills/api/rules/aip-134-135-crud.md
- .agents/skills/api/rules/aip-137-content-type.md
- .agents/skills/api/rules/aip-142-time.md
- .agents/skills/api/rules/aip-158-pagination.md
- .agents/skills/api/rules/aip-160-filtering.md
- .agents/skills/api/rules/aip-181-stability.md
- .agents/skills/api/rules/aip-193-errors.md
- .agents/skills/api/rules/aip-235-bulk-delete.md
- .agents/skills/api/rules/aip-3101-versioning.md
- .agents/skills/api/rules/aip-3106-empty-fields.md
- .agents/skills/api/rules/aip-composition.md
- .agents/skills/api/rules/aip-docs.md
- .agents/skills/api/rules/aip-visibility.md
- .gitignore
- Makefile
- api/spec/Makefile
- api/spec/README.md
- api/spec/packages/aip/README.md
- api/spec/packages/aip/src/common/parameters.tsp
- api/spec/packages/aip/src/customers/operations.tsp
- api/spec/packages/aip/src/llmcost/operations.tsp
- api/v3/api.gen.go
- api/v3/codegen.yaml
- api/v3/filters/convert.go
- api/v3/filters/convert_test.go
- api/v3/filters/filter.go
- api/v3/filters/filter_test.go
- api/v3/filters/parse.go
- api/v3/filters/parse_test.go
- api/v3/handlers/customers/list.go
- api/v3/handlers/llmcost/convert.go
- api/v3/handlers/llmcost/convert_test.go
- api/v3/handlers/llmcost/list_overrides.go
- api/v3/handlers/llmcost/list_prices.go
- api/v3/templates/chi-middleware.tmpl.patch
- openmeter/apiconverter/filter.go
- openmeter/customer/adapter/customer.go
- openmeter/customer/customer.go
- openmeter/customer/httpdriver/customer.go
- openmeter/ent/schema/customer.go
- openmeter/llmcost/adapter/price.go
- openmeter/llmcost/service.go
- openmeter/llmcost/service/service.go
- openmeter/llmcost/service/service_test.go
- test/customer/customer.go
💤 Files with no reviewable changes (4)
- .agents/skills/api/AIP.md
- api/spec/packages/aip/README.md
- api/v3/filters/filter_test.go
- api/v3/handlers/llmcost/convert.go
✅ Files skipped from review due to trivial changes (25)
- .agents/skills/api/rules/aip-126-enums.md
- api/v3/codegen.yaml
- .agents/skills/api/rules/aip-composition.md
- .agents/skills/api/rules/aip-docs.md
- api/spec/README.md
- .agents/skills/api/rules/aip-visibility.md
- openmeter/llmcost/adapter/price.go
- .agents/skills/api/rules/aip-3106-empty-fields.md
- .gitignore
- .agents/skills/api/rules/aip-142-time.md
- api/spec/Makefile
- .agents/skills/api/rules/aip-181-stability.md
- openmeter/ent/schema/customer.go
- .agents/skills/api/rules/aip-3101-versioning.md
- .agents/skills/api/rules/aip-137-content-type.md
- .agents/skills/api/rules/aip-122-naming.md
- .agents/skills/api/rules/aip-129-labels.md
- .agents/skills/api/rules/aip-132-list.md
- .agents/skills/api/rules/aip-235-bulk-delete.md
- .agents/skills/api/rules/aip-134-135-crud.md
- .agents/skills/api/rules/aip-158-pagination.md
- .agents/skills/api/rules/aip-193-errors.md
- .agents/skills/api/SKILL.md
- .agents/skills/api/rules/aip-160-filtering.md
- api/v3/filters/parse.go
🚧 Files skipped from review as they are similar to previous changes (10)
- openmeter/llmcost/service/service_test.go
- api/spec/packages/aip/src/customers/operations.tsp
- api/v3/handlers/customers/list.go
- api/v3/handlers/llmcost/list_prices.go
- api/v3/templates/chi-middleware.tmpl.patch
- Makefile
- openmeter/customer/httpdriver/customer.go
- openmeter/llmcost/service.go
- api/v3/filters/convert_test.go
- api/v3/filters/convert.go
Force-pushed from 1fd6897 to 038419e (Compare)
```diff
 func applyStringFilter(f *filter.FilterString, field string, query **entdb.LLMCostPriceQuery) {
 	if f == nil {
 		return
 	}
 
-	switch {
-	case f.Eq != nil:
-		*query = (*query).Where(eqFold(*f.Eq))
-	case f.Neq != nil:
-		*query = (*query).Where(neq(*f.Neq))
-	case f.Contains != nil:
-		*query = (*query).Where(containsFold(*f.Contains))
-	}
+	if p := f.Select(field); p != nil {
+		*query = (*query).Where(p)
+	}
 }
```
I would just add a small helper for this inside the filter package, something like:

```go
type Filter interface {
	Select(field string) Predicate
}

type EntQuery[T any] interface {
	Where(...Predicate) T
}

func ApplyFilterIfNotEmpty[F Filter, Q EntQuery[Q]](q Q, f F, fieldName string) Q
```
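A runnable version of the helper sketched above, with in-memory stand-ins for Ent's predicate and query types (the real generated Ent queries would need to satisfy the `EntQuery` interface), might look like:

```go
package main

import (
	"fmt"
	"strings"
)

// Predicate is a toy stand-in for an Ent predicate.
type Predicate func(row map[string]string) bool

// Filter is anything that can produce a predicate for a field, or nil when empty.
type Filter interface {
	Select(field string) Predicate
}

// EntQuery abstracts the Where(...) chaining shared by generated Ent queries.
type EntQuery[T any] interface {
	Where(ps ...Predicate) T
}

// ApplyFilterIfNotEmpty applies the filter's predicate only when one exists.
func ApplyFilterIfNotEmpty[F Filter, Q EntQuery[Q]](q Q, f F, field string) Q {
	if p := f.Select(field); p != nil {
		return q.Where(p)
	}
	return q
}

// containsFilter is a toy filter implementation for the demo.
type containsFilter struct{ needle string }

func (c containsFilter) Select(field string) Predicate {
	if c.needle == "" {
		return nil
	}
	return func(row map[string]string) bool { return strings.Contains(row[field], c.needle) }
}

// memQuery collects predicates, mimicking an Ent query builder.
type memQuery struct{ preds []Predicate }

func (q memQuery) Where(ps ...Predicate) memQuery {
	q.preds = append(q.preds, ps...)
	return q
}

func main() {
	q := ApplyFilterIfNotEmpty(memQuery{}, containsFilter{needle: "acme"}, "name")
	fmt.Println(len(q.preds)) // 1
	q2 := ApplyFilterIfNotEmpty(memQuery{}, containsFilter{}, "name")
	fmt.Println(len(q2.preds)) // 0
}
```

The self-referential constraint `Q EntQuery[Q]` is what lets the helper return the concrete query type (e.g. `*entdb.LLMCostPriceQuery`) instead of an interface, so call sites can keep chaining.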
|
|
||
| var parts []filter.FilterFloat | ||
|
|
||
| if f.Eq != nil { |
nit/maybe later: I would consider adding a safeguard around a missing mapping, as we have:
- the AIP yaml spec, which should match
- the TypeSpec spec, which should match
- this conversion, which should match
- the filter spec

So for example here, I would check whether the incoming FilterNumeric is non-empty (via lo.IsEmpty), and if it is non-empty but the conversion returns an empty filter, at least log a warning or throw an error.
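That safeguard could look roughly like the following sketch, with toy DTO/filter types standing in for the generated ones; the deliberately unmapped `Gt` field simulates the kind of spec/code mapping gap the check is meant to catch.

```go
package main

import (
	"fmt"
	"reflect"
)

// Toy stand-ins: the DTO supports eq and gt, but this sketch's conversion
// (intentionally) only maps eq, simulating a spec/code mapping gap.
type NumberFieldFilter struct{ Eq, Gt *float64 }
type FilterFloat struct{ Eq *float64 }

// isEmpty is a lo.IsEmpty-style check: true when the pointee is its zero value.
func isEmpty(v any) bool {
	rv := reflect.ValueOf(v)
	return !rv.IsValid() || rv.IsNil() || rv.Elem().IsZero()
}

// convertNumeric converts the DTO and trips the safeguard when a non-empty
// input produces an empty output, instead of silently dropping operators.
func convertNumeric(in *NumberFieldFilter) (*FilterFloat, error) {
	if in == nil {
		return nil, nil
	}
	out := &FilterFloat{Eq: in.Eq}
	if !isEmpty(in) && isEmpty(out) {
		return nil, fmt.Errorf("filter mapping gap: non-empty input converted to empty filter")
	}
	return out, nil
}

func main() {
	v := 1.5
	if f, err := convertNumeric(&NumberFieldFilter{Eq: &v}); err == nil {
		fmt.Println(*f.Eq)
	}
	if _, err := convertNumeric(&NumberFieldFilter{Gt: &v}); err != nil {
		fmt.Println(err)
	}
}
```

Whether the gap should log a warning or fail hard is a judgment call; failing hard surfaces the spec drift in tests rather than in production query results.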
Summary by CodeRabbit
New Features
Bug Fixes / Behavior
Documentation
Tests