Add TRACE logging level for verbose payload logging across event writers #61
base: master
Conversation
Walkthrough

Introduces a TRACE logging level via a new module and integrates TRACE-level payload logging into the EventBridge, Kafka, and Postgres writers. Adds defensive TRACE initialization in event_gate_lambda. Provides tests validating TRACE payload logs across the three backends. No public function signatures changed.
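For orientation, this is the general shape of such a TRACE registration module (a sketch inferred from the review diffs below, not the PR's verbatim code):

```python
import logging

TRACE_LEVEL = 5  # numerically below DEBUG (10)

# Register the level name once; logging.getLevelName() reflects prior registration.
if logging.getLevelName(TRACE_LEVEL) != "TRACE":
    logging.addLevelName(TRACE_LEVEL, "TRACE")


def trace(self: logging.Logger, message: str, *args, **kws):
    # Delegate to Logger._log so formatting and exc_info behave like debug()/info().
    if self.isEnabledFor(TRACE_LEVEL):
        self._log(TRACE_LEVEL, message, args, **kws)


logging.Logger.trace = trace  # expose logger.trace(...) on every logger
```

With that in place, `logging.getLogger(__name__).trace("payload=%s", data)` emits at level 5 whenever TRACE is enabled.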
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    actor Caller
    participant Writer as Writer (EventBridge/Kafka/Postgres)
    participant Logger as Logger (TRACE)
    participant Backend as Backend Client
    Caller->>Writer: write(topic, message)
    alt TRACE enabled
        Writer->>Logger: isEnabledFor(TRACE_LEVEL)
        Logger-->>Writer: true
        Writer->>Logger: trace("... payload ...")
    else TRACE disabled
        Note over Writer: Skip TRACE logging
    end
    Writer->>Backend: send/produce/execute(topic, message)
    Backend-->>Writer: result
    Writer-->>Caller: (ok, err)
```
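In code terms, each writer's `write()` follows the same shape; a runnable schematic (the `send` callable and the import path are placeholders, not the PR's exact signatures):

```python
import json
import logging
from typing import Callable, Optional, Tuple

from src.logging_levels import TRACE_LEVEL  # hypothetical import path

logger = logging.getLogger(__name__)


def write(topic_name: str, message: dict,
          send: Callable[[str, dict], None]) -> Tuple[bool, Optional[Exception]]:
    """Schematic writer flow: TRACE gate first, then the backend call."""
    if logger.isEnabledFor(TRACE_LEVEL):
        logger.trace(  # type: ignore[attr-defined]
            "payload topic=%s payload=%s", topic_name,
            json.dumps(message, separators=(",", ":")))
    try:
        send(topic_name, message)  # send/produce/execute depending on backend
        return True, None
    except Exception as err:  # each real writer narrows this to its client's errors
        return False, err
```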
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
Pre-merge checks and finishing touches

❌ Failed checks (2 warnings)
✅ Passed checks (3 passed)
Actionable comments posted: 3
🧹 Nitpick comments (8)
src/logging_levels.py (1)
12-19: Fix idempotency guard; expose the constant on `logging` for reuse.

`hasattr(logging, "TRACE")` is always False because the attribute is never set, so the level can be re-registered on repeated imports. Switch to a stable check and set the constant for convenience. Apply this diff:
```diff
-if not hasattr(logging, "TRACE"):
-    logging.addLevelName(TRACE_LEVEL, "TRACE")
+if logging.getLevelName(TRACE_LEVEL) != "TRACE":
+    logging.addLevelName(TRACE_LEVEL, "TRACE")
+    logging.TRACE = TRACE_LEVEL  # convenience constant and future idempotency guard
-
-def trace(self: logging.Logger, message: str, *args, **kws):  # type: ignore[override]
+def trace(self: logging.Logger, message: str, *args, **kws):
     if self.isEnabledFor(TRACE_LEVEL):
         self._log(TRACE_LEVEL, message, args, **kws)  # pylint: disable=protected-access
```
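A standalone snippet (not from the PR) showing why the `hasattr` guard never fires, and why name registration matters for string level lookups:

```python
import logging

TRACE_LEVEL = 5
print(hasattr(logging, "TRACE"))          # False: addLevelName() never sets this attribute
logging.addLevelName(TRACE_LEVEL, "TRACE")
print(hasattr(logging, "TRACE"))          # still False, so the guard re-runs on re-import
print(logging.getLevelName(TRACE_LEVEL))  # "TRACE": registration is observable here
logging.getLogger().setLevel("TRACE")     # string levels work only once the name is registered
```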
src/writer_kafka.py (1)

97-105: Narrow exception; avoid blind catch in logging path.

Catching `Exception` trips BLE001 and can hide programming errors. Limit it to JSON serialization errors. Apply this diff:
```diff
-if logger.isEnabledFor(TRACE_LEVEL):
-    try:
-        logger.trace(  # type: ignore[attr-defined]
-            "Kafka payload topic=%s payload=%s", topic_name, json.dumps(message, separators=(",", ":"))
-        )
-    except Exception:  # pragma: no cover - defensive
-        logger.trace("Kafka payload topic=%s <unserializable>", topic_name)  # type: ignore[attr-defined]
+if logger.isEnabledFor(TRACE_LEVEL):
+    try:
+        logger.trace(  # type: ignore[attr-defined]
+            "Kafka payload topic=%s payload=%s",
+            topic_name,
+            json.dumps(message, separators=(",", ":"), ensure_ascii=False),
+        )
+    except (TypeError, ValueError, OverflowError):  # pragma: no cover - defensive
+        logger.trace("Kafka payload topic=%s <unserializable>", topic_name)  # type: ignore[attr-defined]
```

src/writer_postgres.py (1)
277-285: Narrow exception; add safe JSON options.

Mirror the narrower handling used elsewhere and avoid BLE001. Apply this diff:
```diff
-if _logger.isEnabledFor(TRACE_LEVEL):
-    try:
-        _logger.trace(  # type: ignore[attr-defined]
-            "Postgres payload topic=%s payload=%s", topic_name, json.dumps(message, separators=(",", ":"))
-        )
-    except Exception:  # pragma: no cover - defensive
-        _logger.trace("Postgres payload topic=%s <unserializable>", topic_name)  # type: ignore[attr-defined]
+if _logger.isEnabledFor(TRACE_LEVEL):
+    try:
+        _logger.trace(  # type: ignore[attr-defined]
+            "Postgres payload topic=%s payload=%s",
+            topic_name,
+            json.dumps(message, separators=(",", ":"), ensure_ascii=False),
+        )
+    except (TypeError, ValueError, OverflowError):  # pragma: no cover - defensive
+        _logger.trace("Postgres payload topic=%s <unserializable>", topic_name)  # type: ignore[attr-defined]
```

src/writer_eventbridge.py (2)
29-32: Remove redundant side-effect import.

`from .logging_levels import TRACE_LEVEL` already executes the module and registers the level; the extra `from . import logging_levels  # noqa: F401` is unnecessary and triggers RUF100. Apply this diff:
```diff
-# Ensure TRACE level is registered
-from . import logging_levels  # noqa: F401
-from .logging_levels import TRACE_LEVEL
+# Ensure TRACE level is registered
+from .logging_levels import TRACE_LEVEL
```
75-83: Narrow exception; align JSON options.

Limit the catch to JSON serialization errors and keep the payload readable. Apply this diff:
```diff
-if logger.isEnabledFor(TRACE_LEVEL):
-    try:
-        logger.trace(  # type: ignore[attr-defined]
-            "EventBridge payload topic=%s payload=%s", topic_name, json.dumps(message, separators=(",", ":"))
-        )
-    except Exception:  # pragma: no cover - defensive serialization guard
-        logger.trace("EventBridge payload topic=%s <unserializable>", topic_name)  # type: ignore[attr-defined]
+if logger.isEnabledFor(TRACE_LEVEL):
+    try:
+        logger.trace(  # type: ignore[attr-defined]
+            "EventBridge payload topic=%s payload=%s",
+            topic_name,
+            json.dumps(message, separators=(",", ":"), ensure_ascii=False),
+        )
+    except (TypeError, ValueError, OverflowError):  # pragma: no cover - defensive serialization guard
+        logger.trace("EventBridge payload topic=%s <unserializable>", topic_name)  # type: ignore[attr-defined]
```

src/event_gate_lambda.py (1)
46-51: Tighten import guard; register name in fallback.

Use `ImportError` instead of broad `Exception`, drop the unused `noqa`, and ensure the level name is registered even in the fallback so `LOG_LEVEL=TRACE` works. Apply this diff:

```diff
-# Register custom TRACE level before using LOG_LEVEL env var
-try:
-    from .logging_levels import TRACE_LEVEL  # noqa: F401
-except Exception:  # pragma: no cover - defensive
-    TRACE_LEVEL = 5  # type: ignore
+# Register custom TRACE level before using LOG_LEVEL env var
+try:
+    from .logging_levels import TRACE_LEVEL
+except ImportError:  # pragma: no cover - defensive
+    TRACE_LEVEL = 5  # type: ignore
+    logging.addLevelName(TRACE_LEVEL, "TRACE")
```

tests/test_trace_logging.py (2)
24-42: Tidy test doubles to silence arg/noqa lint nits.

Rename unused args and drop the unused `noqa` on `flush`. Apply this diff:
```diff
-class FakeProducer:
-    def produce(self, *a, **kw):
+class FakeProducer:
+    def produce(self, *_, **kw):
         cb = kw.get("callback")
         if cb:
             cb(None, object())
 
-    def flush(self, *a, **kw):  # noqa: D401
+    def flush(self, *_, **__):
         return 0
```
71-75: Same nit on `DummyPsycopg2.connect`. Apply this diff:
```diff
-class DummyPsycopg2:
-    def connect(self, **kwargs):  # noqa: D401
+class DummyPsycopg2:
+    def connect(self, **_):
         return DummyConnection()
```
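For reference, the minimal shape of a TRACE capture test with pytest's `caplog` fixture (a sketch assuming the `src.logging_levels` import path; the PR's tests additionally mock the backend clients):

```python
import logging

from src.logging_levels import TRACE_LEVEL  # registers TRACE and Logger.trace


def test_trace_record_is_captured(caplog):
    logger = logging.getLogger("demo")
    with caplog.at_level(TRACE_LEVEL, logger="demo"):
        logger.trace("payload=%s", {"id": 1})  # type: ignore[attr-defined]
    assert any(r.levelno == TRACE_LEVEL for r in caplog.records)
```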
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (6)
- src/event_gate_lambda.py (1 hunks)
- src/logging_levels.py (1 hunks)
- src/writer_eventbridge.py (2 hunks)
- src/writer_kafka.py (2 hunks)
- src/writer_postgres.py (2 hunks)
- tests/test_trace_logging.py (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (4)
src/writer_kafka.py (1)
- src/logging_levels.py (1): trace (15-17)

tests/test_trace_logging.py (3)
- src/writer_eventbridge.py (1): write (55-108)
- src/writer_kafka.py (1): write (81-134)
- src/writer_postgres.py (1): write (260-311)

src/writer_eventbridge.py (1)
- src/logging_levels.py (1): trace (15-17)

src/writer_postgres.py (1)
- src/logging_levels.py (1): trace (15-17)
🪛 Ruff (0.12.2)
src/writer_kafka.py
- 103-103: Do not catch blind exception: `Exception` (BLE001)

tests/test_trace_logging.py
- 26-26: Unused method argument: `a` (ARG002)
- 31-31: Unused method argument: `a` (ARG002)
- 31-31: Unused method argument: `kw` (ARG002)
- 31-31: Unused `noqa` directive (non-enabled: `D401`). Remove unused `noqa` directive (RUF100)
- 72-72: Unused method argument: `kwargs` (ARG002)
- 72-72: Unused `noqa` directive (non-enabled: `D401`). Remove unused `noqa` directive (RUF100)

src/writer_eventbridge.py
- 30-30: Unused `noqa` directive (non-enabled: `F401`). Remove unused `noqa` directive (RUF100)
- 81-81: Do not catch blind exception: `Exception` (BLE001)

src/event_gate_lambda.py
- 48-48: Unused `noqa` directive (non-enabled: `F401`). Remove unused `noqa` directive (RUF100)
- 49-49: Do not catch blind exception: `Exception` (BLE001)

src/writer_postgres.py
- 283-283: Do not catch blind exception: `Exception` (BLE001)
🔇 Additional comments (2)
src/writer_kafka.py (1)
29-31: Import looks good.

Side-effect import via `from .logging_levels import TRACE_LEVEL` is sufficient to register the level and method.

src/writer_postgres.py (1)

34-36: TRACE registration import is correct.

Ensures `logger.trace` exists before use.
```python
# TRACE-level payload logging
if logger.isEnabledFor(TRACE_LEVEL):
    try:
        logger.trace(  # type: ignore[attr-defined]
            "EventBridge payload topic=%s payload=%s", topic_name, json.dumps(message, separators=(",", ":"))
        )
    except Exception:  # pragma: no cover - defensive serialization guard
        logger.trace("EventBridge payload topic=%s <unserializable>", topic_name)  # type: ignore[attr-defined]
```
💡 Verification agent
🧩 Analysis chain
Guard against PII/large payloads at TRACE.
Add configurable redaction and a size cap to avoid leaking secrets and unbounded log volume.
🏁 Script executed:
```bash
#!/bin/bash
# Find all TRACE payload logging sites to update consistently.
rg -nP --type=py -C2 'payload topic=%s payload=%s'
```
Length of output: 1215
Implement configurable redaction + size cap for TRACE payload logs
Raw payload JSON is being logged at TRACE (can leak PII and create unbounded log volume). Fix by centralizing safe serialization and applying it consistently.
- Add a centralized safe_serialize_for_log(message, redact_keys, max_bytes) that redacts configured sensitive keys (nested), truncates output to max_bytes (appending "..."), and returns "<unserializable>" on serialization errors (see the sketch below).
- Replace the raw json.dumps(...) calls at src/writer_eventbridge.py:79, src/writer_kafka.py:101, and src/writer_postgres.py:281 with the helper.
- Make redact_keys and max_bytes configurable (env/config) and add unit tests for redaction, truncation, and the unserializable fallback.
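One possible shape for that helper (a sketch of the review's proposal; the name, defaults, and behavior are suggestions, not code from the PR):

```python
import json
from typing import Any, Iterable


def safe_serialize_for_log(message: Any, redact_keys: Iterable[str], max_bytes: int) -> str:
    """Serialize a payload for TRACE logs: redact sensitive keys (nested),
    cap the output size, and never raise."""
    redact = {k.lower() for k in redact_keys}

    def scrub(obj: Any) -> Any:
        if isinstance(obj, dict):
            return {k: "<REDACTED>" if str(k).lower() in redact else scrub(v)
                    for k, v in obj.items()}
        if isinstance(obj, list):
            return [scrub(v) for v in obj]
        return obj

    try:
        out = json.dumps(scrub(message), separators=(",", ":"), ensure_ascii=False)
    except (TypeError, ValueError, OverflowError):
        return "<unserializable>"
    raw = out.encode("utf-8")
    if len(raw) > max_bytes:
        out = raw[:max_bytes].decode("utf-8", "ignore") + "..."
    return out
```

Each writer's TRACE branch would then call this helper in place of the inline `json.dumps`, with `redact_keys` and `max_bytes` read from env/config.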
🧰 Tools
🪛 Ruff (0.12.2)
81-81: Do not catch blind exception: `Exception` (BLE001)
```python
# TRACE-level payload logging prior to produce
if logger.isEnabledFor(TRACE_LEVEL):
    try:
        logger.trace(  # type: ignore[attr-defined]
            "Kafka payload topic=%s payload=%s", topic_name, json.dumps(message, separators=(",", ":"))
        )
    except Exception:  # pragma: no cover - defensive
        logger.trace("Kafka payload topic=%s <unserializable>", topic_name)  # type: ignore[attr-defined]
```
Consider payload size/PII guard at TRACE.
Even at TRACE, raw payloads can contain secrets/PII and blow up log volume. Add a redaction/cap mechanism (e.g., redact keys list + max length).
Would you like a small helper (e.g., `safe_payload_str(payload, redact_keys, max_len=10000)`) used by all writers?
🧰 Tools
🪛 Ruff (0.12.2)
103-103: Do not catch blind exception: `Exception` (BLE001)
🤖 Prompt for AI Agents
In src/writer_kafka.py around lines 97 to 105, the TRACE-level payload log
prints raw message which can expose PII and produce huge logs; replace direct
json.dumps with a call to a shared helper like safe_payload_str(payload,
redact_keys, max_len=10000) that: 1) redacts configured sensitive keys (support
nested keys) by replacing values with "<REDACTED>", 2) truncates the resulting
string to max_len and appends "...(truncated)" if exceeded, 3) gracefully
handles non-serializable objects by falling back to a compact representation
(e.g., repr) and 4) never raises (wrap in try/except and return
"<unserializable>" on failure); call this helper when logging TRACE payloads and
add a TODO/config entry for redact_keys so other writers can reuse it.
```python
# TRACE-level payload logging (only when we intend to write)
if _logger.isEnabledFor(TRACE_LEVEL):
    try:
        _logger.trace(  # type: ignore[attr-defined]
            "Postgres payload topic=%s payload=%s", topic_name, json.dumps(message, separators=(",", ":"))
        )
    except Exception:  # pragma: no cover - defensive
        _logger.trace("Postgres payload topic=%s <unserializable>", topic_name)  # type: ignore[attr-defined]
```
TRACE payloads may leak sensitive fields.
Introduce optional redaction (configurable list of keys) and an upper bound on logged payload length.
I can add a shared `trace_payload(logger, backend, topic, message, redact=("password","token"), max_len=10000)` helper in `logging_levels.py` and use it across writers. Want me to draft it?
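If drafted, such a helper could be as small as the following (hypothetical; it assumes `TRACE_LEVEL` and a serializer like the `safe_serialize_for_log` sketch above are importable from `logging_levels`):

```python
import logging

from logging_levels import TRACE_LEVEL, safe_serialize_for_log  # assumed layout


def trace_payload(logger: logging.Logger, backend: str, topic: str, message,
                  redact=("password", "token"), max_len: int = 10000) -> None:
    # One shared TRACE call site for the EventBridge/Kafka/Postgres writers.
    if logger.isEnabledFor(TRACE_LEVEL):
        logger.trace(  # type: ignore[attr-defined]
            "%s payload topic=%s payload=%s",
            backend, topic, safe_serialize_for_log(message, redact, max_len),
        )
```

Writers would then replace their per-module try/except blocks with e.g. `trace_payload(logger, "Kafka", topic_name, message)`.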
🧰 Tools
🪛 Ruff (0.12.2)
283-283: Do not catch blind exception: `Exception` (BLE001)
This pull request introduces a new custom TRACE logging level (below DEBUG) for highly verbose payload logging, and integrates it into the EventBridge, Kafka, and Postgres writer modules. It also includes comprehensive tests to ensure TRACE-level logging works as intended across all three writers. The main themes are the addition of the TRACE level, its integration into the writers, and the new tests for this functionality.
Custom TRACE logging level:
- Added `TRACE_LEVEL` (level 5) and a corresponding `Logger.trace()` method in `logging_levels.py`, ensuring idempotent registration and compatibility with the standard logging module.
- Updated `event_gate_lambda.py`, `writer_eventbridge.py`, `writer_kafka.py`, and `writer_postgres.py` to ensure the TRACE level is registered before use. [1] [2] [3] [4]

Integration of TRACE logging in writers:
- In `writer_eventbridge.py`, `writer_kafka.py`, and `writer_postgres.py`, added TRACE-level payload logging before sending messages, including defensive handling for unserializable payloads. [1] [2] [3]

Testing:
- Added `test_trace_logging.py` to verify that TRACE-level logging is triggered and correctly logs payloads for the EventBridge, Kafka, and Postgres writers, using mocks and monkeypatching as needed.

Release notes
Related
Summary by CodeRabbit
- New Features
- Tests