improve JSON logger, safe setup, and RequestLoggerAdapter (#31)
Conversation
Walkthrough: Replaces the previous minimal JSON log payload with a featureful `JsonFormatter` emitting UTC ISO-8601 timestamps and renamed top-level fields.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant App as Application Code
    participant Adp as RequestLoggerAdapter
    participant Log as Logger
    participant H as StreamHandler
    participant F as JsonFormatter
    participant Out as Stdout
    rect rgb(235,245,255)
        note over App,Adp: App logs with optional contextual extras
        App->>Adp: .info("message", extra={"user_id":"u1"})
        Adp->>Log: log(msg, merged_extras)
        Log->>H: emit(LogRecord)
    end
    rect rgb(240,255,240)
        note over H,F: Formatting to JSON and output
        H->>F: format(record)
        F-->>H: JSON line {time, level, logger, message, extras, exc_text?}
        H->>Out: write(json_line)
    end
```

```mermaid
sequenceDiagram
    autonumber
    participant App as Application
    participant Cfg as setup_logging
    participant Root as RootLogger
    participant H as StreamHandler
    App->>Cfg: setup_logging(level, force)
    alt No existing stream handler
        Cfg->>Root: addHandler(new StreamHandler with JsonFormatter)
    else existing stream handler & force == true
        Cfg->>Root: remove existing StreamHandler
        Cfg->>Root: addHandler(new StreamHandler with JsonFormatter)
    else existing stream handler & force == false
        Cfg-->>App: keep existing handler (no-op)
    end
```
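The second diagram's branching can be sketched in plain Python. This is a hypothetical reduction, not the module's actual code: the real function installs a `JsonFormatter`, stubbed here with a plain `Formatter` so the three branches stay the focus.

```python
# Hypothetical sketch of the setup flow in the second diagram; the real
# module installs a JsonFormatter, stubbed here with a plain Formatter.
import logging
import sys

def setup_logging(level: int = logging.INFO, force: bool = False) -> None:
    root = logging.getLogger()
    root.setLevel(level)
    stream_handlers = [h for h in root.handlers if isinstance(h, logging.StreamHandler)]
    if stream_handlers and not force:
        return  # "keep existing handler (no-op)" branch
    for h in stream_handlers:  # force == true branch: drop existing stream handlers
        root.removeHandler(h)
    handler = logging.StreamHandler(sys.stdout)
    handler.setFormatter(logging.Formatter("%(message)s"))  # stand-in for JsonFormatter
    root.addHandler(handler)
```

Calling it twice without `force` is a no-op the second time, which is the duplicate-handler protection the PR describes.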
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
Actionable comments posted: 1
🧹 Nitpick comments (7)
services/logging_config.py (7)
Lines 19-23: Use 'Z' suffix for UTC and silence unused-arg warning.

Adopt RFC 3339-style UTC with 'Z' and rename the unused arg to avoid ARG002.

```diff
-    def formatTime(self, record: logging.LogRecord, datefmt: Optional[str] = None) -> str:
+    def formatTime(self, record: logging.LogRecord, _datefmt: Optional[str] = None) -> str:
         # Always use UTC ISO-8601 with milliseconds
         dt = datetime.fromtimestamp(record.created, tz=timezone.utc)
-        return dt.isoformat(timespec="milliseconds")
+        return dt.isoformat(timespec="milliseconds").replace("+00:00", "Z")
```

If your log aggregator expects +00:00 instead of Z, keep the current output.
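The rendering difference is easy to verify in isolation with the standard library; `created = 0.0` is just a fixed POSIX timestamp for the demo.

```python
# Verifying the suggested 'Z'-suffix rendering on a fixed timestamp.
from datetime import datetime, timezone

created = 0.0  # LogRecord.created is a POSIX timestamp; fixed value for the demo
dt = datetime.fromtimestamp(created, tz=timezone.utc)
default = dt.isoformat(timespec="milliseconds")  # 1970-01-01T00:00:00.000+00:00
zulu = default.replace("+00:00", "Z")            # 1970-01-01T00:00:00.000Z
```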
Lines 24-31: Include basic source/process/thread fields for better diagnostics.

These are often needed for triage and don't bloat the payload.

```diff
     def format(self, record: logging.LogRecord) -> str:  # type: ignore[override]
         log_record: Dict[str, Any] = {
             "level": record.levelname,
             "time": self.formatTime(record, self.datefmt),
             "logger": record.name,
             "message": record.getMessage(),
+            "module": record.module,
+            "filename": record.filename,
+            "lineno": record.lineno,
+            "func": record.funcName,
+            "process": record.process,
+            "thread": record.threadName,
         }
```
Lines 32-37: Comment contradicts behavior; optionally add `stack` when present.

You say "don't include it in the machine-readable payload," but `exc_text` is included. Either adjust the comment or add `stack` too when `stack_info` exists.

```diff
         # Include standard exception info if present
         if record.exc_info:
             # Use the built-in formatter to render exception text but don't
-            # include it in the machine-readable payload. Keep a short flag.
+            # include it as nested objects. Keep a concise string field.
             log_record["exc_text"] = self.formatException(record.exc_info)
+        if getattr(record, "stack_info", None):
+            log_record["stack"] = record.stack_info
```
Lines 38-51: Avoid per-field json.dumps and broad except; let json handle defaults.

Per-attribute dumps + `except Exception` is slow and noisy (BLE001). Collect extras verbatim and rely on `default=str` during the final dump.

```diff
         # Attach any extra attributes provided via ``extra={...}``
         for key, value in record.__dict__.items():
             if key in ("name", "msg", "args", "levelname", "levelno", "pathname", "filename",
                        "module", "exc_info", "exc_text", "stack_info", "lineno", "funcName",
                        "created", "msecs", "relativeCreated", "thread", "threadName", "processName",
                        "process"):
                 continue
-            # Only include JSON-serializable extras; fall back to str()
-            try:
-                json.dumps({key: value})
-                log_record[key] = value
-            except Exception:
-                log_record[key] = str(value)
+            # Include extras as-is; non-serializable values are handled by default=str at dump time.
+            log_record[key] = value
```
Lines 52-57: Remove blind catch; use `default=str` to serialize non-JSON types.

Resolves BLE001 and simplifies control flow.

```diff
-        # Ensure deterministic output order for readability in logs
-        try:
-            return json.dumps(log_record, ensure_ascii=False, separators=(",", ":"))
-        except Exception:
-            # Fallback to a safe string representation
-            return json.dumps({"level": record.levelname, "message": record.getMessage()})
+        # Ensure deterministic output (insertion-ordered) and safe serialization
+        return json.dumps(log_record, ensure_ascii=False, separators=(",", ":"), default=str)
```
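A small stdlib-only check of what `default=str` buys: values `json` cannot encode on its own (a `datetime`, a `set`) become strings instead of raising.

```python
# default=str turns non-JSON-serializable values into strings at dump time.
import json
from datetime import datetime, timezone

payload = {
    "level": "INFO",
    "when": datetime(2024, 1, 1, tzinfo=timezone.utc),  # not JSON-serializable by default
    "tags": {"a"},                                      # sets aren't either
}
line = json.dumps(payload, ensure_ascii=False, separators=(",", ":"), default=str)
```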
Lines 88-101: Loosen `msg` type and harden `extra` merge.

`LoggerAdapter.process` accepts any message type. Also guard against non-dict extras.

```diff
-    def process(self, msg: str, kwargs: Dict[str, Any]) -> Tuple[str, Dict[str, Any]]:
-        extra = kwargs.setdefault("extra", {})
-        # Merge adapter's context without overwriting existing keys
-        for k, v in (self.extra or {}).items():
-            extra.setdefault(k, v)
-        return msg, kwargs
+    def process(self, msg: Any, kwargs: Dict[str, Any]) -> Tuple[Any, Dict[str, Any]]:
+        extra_obj = kwargs.get("extra")
+        if not isinstance(extra_obj, dict):
+            extra_obj = {}
+        kwargs["extra"] = extra_obj
+        # Merge adapter's context without overwriting existing keys
+        for k, v in (self.extra or {}).items():
+            extra_obj.setdefault(k, v)
+        return msg, kwargs
```
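A runnable reduction of the suggested `process()` shape; the class and field names follow this review, not necessarily the module's final code.

```python
# Minimal adapter sketch: merge adapter context into per-call extras
# without overwriting keys the caller supplied.
import logging
from typing import Any, Dict, Tuple

class RequestLoggerAdapter(logging.LoggerAdapter):
    def process(self, msg: Any, kwargs: Dict[str, Any]) -> Tuple[Any, Dict[str, Any]]:
        extra = kwargs.get("extra")
        if not isinstance(extra, dict):  # harden against extra=None or wrong type
            extra = {}
        kwargs["extra"] = extra
        for k, v in (self.extra or {}).items():
            extra.setdefault(k, v)  # adapter context never overwrites per-call extras
        return msg, kwargs

adapter = RequestLoggerAdapter(logging.getLogger("demo"), {"request_id": "r-1"})
msg, kwargs = adapter.process("hello", {"extra": {"user_id": "u1"}})
```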
Lines 60-70: Docstring tweak suggestion.

Since setup may reuse an existing stdout handler (setting formatter), consider clarifying: "Ensures JSON logs to stdout; if a stdout handler exists, updates its formatter unless force=True."
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- services/logging_config.py (1 hunks)
🧰 Additional context used
🪛 Ruff (0.13.1)
services/logging_config.py
19-19: Unused method argument: datefmt
(ARG002)
49-49: Do not catch blind exception: Exception
(BLE001)
55-55: Do not catch blind exception: Exception
(BLE001)
…ve handler management
Actionable comments posted: 1
🧹 Nitpick comments (4)
services/logging_config.py (4)
Lines 19-23: Silence unused-arg lint; optionally emit 'Z' for UTC.

- Use the `datefmt` argument to satisfy linters.
- Optional: replace `+00:00` with `Z` for stricter ISO-8601 (UTC) interop.

```diff
-    def formatTime(self, record: logging.LogRecord, datefmt: Optional[str] = None) -> str:
-        # Always use UTC ISO-8601 with milliseconds
-        dt = datetime.fromtimestamp(record.created, tz=timezone.utc)
-        return dt.isoformat(timespec="milliseconds")
+    def formatTime(self, record: logging.LogRecord, datefmt: Optional[str] = None) -> str:
+        # Always use UTC ISO-8601 with milliseconds
+        _ = datefmt  # satisfy linters; API includes this param
+        dt = datetime.fromtimestamp(record.created, tz=timezone.utc)
+        return dt.isoformat(timespec="milliseconds").replace("+00:00", "Z")
```
Lines 53-58: Make output order deterministic and narrow fallback exception.

Match the comment by setting `sort_keys=True`; avoid blanket `except`.

```diff
-        try:
-            return json.dumps(log_record, ensure_ascii=False, separators=(",", ":"))
-        except Exception:
-            # Fallback to a safe string representation
-            return json.dumps({"level": record.levelname, "message": record.getMessage()})
+        try:
+            return json.dumps(
+                log_record,
+                ensure_ascii=False,
+                separators=(",", ":"),
+                sort_keys=True,
+            )
+        except (TypeError, ValueError, OverflowError, RecursionError):
+            # Fallback to a safe string representation
+            return json.dumps({"level": record.levelname, "message": record.getMessage()})
```
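What `sort_keys=True` changes, in isolation:

```python
# sort_keys makes key order independent of insertion order.
import json

unsorted = json.dumps({"b": 1, "a": 2}, separators=(",", ":"))
stable = json.dumps({"b": 1, "a": 2}, separators=(",", ":"), sort_keys=True)
```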
Lines 63-70: Docstring: clarify stdout-specific behavior.

Reflect actual stdout handler semantics in the `force` description.

```diff
-        force: If True, replace existing handlers. Default False (adds a handler only
-            if no stream handler is present).
+        force: If True, replace the existing stdout handler. Default False (configure an
+            existing stdout handler if present; otherwise add one).
```
Lines 104-109: Minor: widen `msg` type and harden `extra` shape.

Align with logging API flexibility and guard against non-dict extras.

```diff
-    def process(self, msg: str, kwargs: Dict[str, Any]) -> Tuple[str, Dict[str, Any]]:
+    def process(self, msg: Any, kwargs: Dict[str, Any]) -> Tuple[str, Dict[str, Any]]:
         extra = kwargs.setdefault("extra", {})
+        if not isinstance(extra, dict):
+            extra = {}
+            kwargs["extra"] = extra
         # Merge adapter's context without overwriting existing keys
         for k, v in (self.extra or {}).items():
             extra.setdefault(k, v)
         return msg, kwargs
```
🔇 Additional comments (3)
services/logging_config.py (3)

Lines 32-37: Good inclusion of structured exception context.

Including `exc_text` when `exc_info` is present is correct and keeps logs machine-friendly.

Lines 96-103: Adapter design LGTM.

Merges per-request context without overwriting explicit extras.
Lines 75-94: Fix stdout handler update and avoid removing stderr handlers.

- Early-return skips installing the JSON formatter on an existing stdout handler.
- `force=True` removes all stream handlers, including stderr, which might be in use.

```diff
-    # Detect existing StreamHandler(s) that write to stdout specifically.
-    stdout_handlers = [h for h in root_logger.handlers
-                       if isinstance(h, logging.StreamHandler) and getattr(h, 'stream', None) is sys.stdout]
-
-    # If a stdout handler already exists and we're not forcing an update, do nothing.
-    if stdout_handlers and not force:
-        return
-
-    # If forcing, remove existing StreamHandler(s) (but leave file/syslog handlers intact).
-    if force:
-        root_logger.handlers = [h for h in root_logger.handlers if not isinstance(h, logging.StreamHandler)]
-
-    # Only add the stdout JSON handler if one isn't already present (avoid duplicates).
-    stdout_already = any(isinstance(h, logging.StreamHandler) and getattr(h, 'stream', None) is sys.stdout
-                         for h in root_logger.handlers)
-    if not stdout_already:
-        handler = logging.StreamHandler(sys.stdout)
-        handler.setFormatter(JsonFormatter())
-        root_logger.addHandler(handler)
+    # Find an existing stdout StreamHandler, if any
+    existing_stdout = next(
+        (
+            h for h in root_logger.handlers
+            if isinstance(h, logging.StreamHandler) and getattr(h, "stream", None) is sys.stdout
+        ),
+        None,
+    )
+    if existing_stdout and not force:
+        # Reuse and ensure JSON formatting; let logger level govern emission
+        existing_stdout.setFormatter(JsonFormatter())
+        existing_stdout.setLevel(logging.NOTSET)
+        return
+
+    if force:
+        # Remove only stdout stream handlers; preserve stderr and non-stream handlers (file/syslog)
+        root_logger.handlers = [
+            h for h in root_logger.handlers
+            if not (isinstance(h, logging.StreamHandler) and getattr(h, "stream", None) is sys.stdout)
+        ]
+
+    # Avoid duplicates; add stdout JSON handler if missing
+    exists_after = any(
+        isinstance(h, logging.StreamHandler) and getattr(h, "stream", None) is sys.stdout
+        for h in root_logger.handlers
+    )
+    if not exists_after:
+        handler = logging.StreamHandler(sys.stdout)
+        handler.setLevel(logging.NOTSET)  # let logger level control emission
+        handler.setFormatter(JsonFormatter())
+        root_logger.addHandler(handler)
```
```python
        # Attach any extra attributes provided via ``extra={...}``
        for key, value in record.__dict__.items():
            if key in ("name", "msg", "args", "levelname", "levelno", "pathname", "filename",
                       "module", "exc_info", "exc_text", "stack_info", "lineno", "funcName",
                       "created", "msecs", "relativeCreated", "thread", "threadName", "processName",
                       "process"):
                continue
            # Only include JSON-serializable extras; fall back to str()
            try:
                json.dumps({key: value})
                log_record[key] = value
            except Exception:
                log_record[key] = str(value)
```
Prevent extras from clobbering core fields; narrow exception catch.

- Extras can overwrite `"level"`, `"time"`, `"logger"`, or `"message"` today.
- Avoid `except Exception` per lint and best practices.
```diff
-        # Attach any extra attributes provided via ``extra={...}``
-        for key, value in record.__dict__.items():
-            if key in ("name", "msg", "args", "levelname", "levelno", "pathname", "filename",
-                       "module", "exc_info", "exc_text", "stack_info", "lineno", "funcName",
-                       "created", "msecs", "relativeCreated", "thread", "threadName", "processName",
-                       "process"):
-                continue
-            # Only include JSON-serializable extras; fall back to str()
-            try:
-                json.dumps({key: value})
-                log_record[key] = value
-            except Exception:
-                log_record[key] = str(value)
+        # Attach any extra attributes provided via ``extra={...}``
+        reserved_keys = {"level", "time", "logger", "message", "exc_text"}
+        for key, value in record.__dict__.items():
+            if (
+                key in reserved_keys
+                or key in (
+                    "name", "msg", "args", "levelname", "levelno", "pathname", "filename",
+                    "module", "exc_info", "exc_text", "stack_info", "lineno", "funcName",
+                    "created", "msecs", "relativeCreated", "thread", "threadName", "processName",
+                    "process"
+                )
+            ):
+                continue
+            # Only include JSON-serializable extras; fall back to str()
+            try:
+                json.dumps({key: value})
+                log_record[key] = value
+            except (TypeError, ValueError, OverflowError, RecursionError):
+                log_record[key] = str(value)
```
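The narrowed tuple still covers the realistic failure: `json.dumps` raises `TypeError` for unsupported types, so the `str()` fallback fires exactly as before.

```python
# json.dumps raises TypeError on a set, so the narrow catch still
# routes the value through the str() fallback.
import json

value = {1, 2}  # sets are not JSON-serializable
try:
    json.dumps({"k": value})
    stored = value
except (TypeError, ValueError, OverflowError, RecursionError):
    stored = str(value)
```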
🧰 Tools
🪛 Ruff (0.13.1)
49-49: Do not catch blind exception: Exception
(BLE001)
🤖 Prompt for AI Agents
In services/logging_config.py around lines 38 to 51, the loop that attaches
extra record attributes currently allows extras to overwrite core JSON fields
(e.g., "level", "time", "logger", "message") and uses a broad except Exception;
update the skip list to also include "level", "time", "logger", and "message" so
those extras are ignored and narrow the exception handler to only the exceptions
json.dumps can raise (e.g., TypeError, ValueError, OverflowError) so
non-serializable values fall back to str() without catching unrelated errors.
…den error handling (#30)

* Add request ID logging and move start time initialization for web search answer
* improve JSON logger, safe setup, and `RequestLoggerAdapter` (#31)
* Enhance structured logging with improved JSON formatting and exception handling
* Refactor logging setup to prevent duplicate stdout handlers and improve handler management
* Enhance logging setup: implement rotating file handler, improve context-aware logging, and update JSON formatter for better structured output.
* Add request ID logging and move start time initialization for web search answer
* Enhance logging setup: implement rotating file handler, improve context-aware logging, and update JSON formatter for better structured output.

Co-authored-by: Payton Zuniga <zunigapayton3@gmail.com>
Streamlines structured logging: a sturdier JSON formatter (ISO-8601 UTC, exception info, safe extras), a safe `setup_logging(level, force=False)` to prevent duplicate handlers, and a `RequestLoggerAdapter` for contextual fields (e.g., `request_id`).

Key changes

- Swap minimal JSON formatter for a robust `JsonFormatter`: serializes `extra` fields with fallback to prevent crashes
- Add `setup_logging(level, force=False)`: prevents duplicate handlers; `force` to reset when needed
- Introduce `RequestLoggerAdapter`: attaches context (e.g., `request_id`, `user_id`) to all log records
- Minor: tighten type hints and serialization guards

Why

Reduce log breakage from non-serializable fields, avoid handler duplication, and ensure consistent, aggregator-friendly structured logs.
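An illustrative end-to-end run of the pieces this PR describes. The formatter below is a deliberately tiny stand-in for the PR's `JsonFormatter`, and the field names (`request_id`, `user_id`) come from this page; everything else is assumption.

```python
# End-to-end sketch: JSON-ish formatter plus contextual extras,
# captured in-memory instead of stdout.
import io
import json
import logging

class TinyJsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        payload = {"level": record.levelname, "logger": record.name,
                   "message": record.getMessage()}
        # Surface the contextual extras this PR cares about
        for key in ("request_id", "user_id"):
            if key in record.__dict__:
                payload[key] = record.__dict__[key]
        return json.dumps(payload, separators=(",", ":"), default=str)

buf = io.StringIO()
handler = logging.StreamHandler(buf)
handler.setFormatter(TinyJsonFormatter())
logger = logging.getLogger("demo.e2e")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.propagate = False

logger.info("answer ready", extra={"request_id": "r-42", "user_id": "u1"})
entry = json.loads(buf.getvalue())
```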