proxywhirl.logging_config

Structured logging configuration for ProxyWhirl.

This module provides MVP Phase 1 functionality for structured logging:

  • JSON and logfmt output formats

  • Log rotation configuration

  • Contextual metadata support

  • Integration with existing loguru usage

Example

>>> from proxywhirl.logging_config import configure_logging, get_logger
>>> configure_logging(format="json", level="INFO", rotation="10 MB")
>>> logger = get_logger()
>>> logger.info("Started proxy rotation")

Classes

LogContext

Context manager for adding contextual metadata to log entries.

LogRotationConfig

Configuration for log file rotation.

LoggingConfig

Main logging configuration model.

Functions

bind_context(**context)

Bind contextual metadata to the logger.

configure_logging([format, level, rotation, ...])

Configure structured logging for ProxyWhirl.

get_logger()

Get the configured loguru logger instance.

Module Contents

class proxywhirl.logging_config.LogContext(**context)[source]

Context manager for adding contextual metadata to log entries.

This class provides a convenient way to bind context variables to the logger for a specific block of code. Context is automatically removed when exiting the context manager.

Example

>>> with LogContext(request_id="req-123", operation="proxy_fetch"):
...     logger.info("Processing request")
# Log entry will include request_id and operation in context
Parameters:

**context (str | int | float | bool | None) – Arbitrary keyword arguments to bind as context metadata

Initialize context with metadata key-value pairs.
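The scoped-binding behavior described above (context applies inside the block, disappears on exit) can be sketched with the standard library's contextvars. The names LogContextSketch and current_context are illustrative only, not part of this module, and the real class delegates to loguru rather than a bare dict.

```python
from contextvars import ContextVar
from typing import Any

# Illustrative store for currently bound metadata (the real class uses loguru).
_log_context: ContextVar[dict[str, Any]] = ContextVar("log_context", default={})


class LogContextSketch:
    """Bind key-value metadata for the duration of a with-block."""

    def __init__(self, **context: Any) -> None:
        self._context = context
        self._token = None

    def __enter__(self) -> "LogContextSketch":
        # Merge new keys over any context bound by an enclosing block.
        self._token = _log_context.set({**_log_context.get(), **self._context})
        return self

    def __exit__(self, *exc: object) -> None:
        # Restore the previous context, removing this block's keys.
        _log_context.reset(self._token)


def current_context() -> dict[str, Any]:
    """Return the metadata currently in scope."""
    return _log_context.get()
```

Nesting works the same way as the documented class: an inner block layers its keys over the outer block's, and each block's keys vanish when it exits.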

class proxywhirl.logging_config.LogRotationConfig(/, **data)[source]

Bases: pydantic.BaseModel

Configuration for log file rotation.

Parameters:

data (Any)

size[source]

Size threshold for rotation (e.g., “10 MB”, “1 GB”)

time[source]

Time threshold for rotation (e.g., “1 day”, “1 week”)

retention[source]

Number of rotated files to keep (e.g., “5 files”, “1 week”)

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.
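To illustrate the value shapes size accepts, here is a hypothetical helper (not part of LogRotationConfig) that maps a threshold string such as "10 MB" to a byte count. Loguru performs the real parsing, and its unit semantics may differ; the binary multipliers below are an assumption of this sketch.

```python
# Unit multipliers are an assumption of this sketch; loguru's own parser
# governs how "10 MB" is actually interpreted.
_UNITS = {"B": 1, "KB": 1024, "MB": 1024**2, "GB": 1024**3}


def size_to_bytes(size: str) -> int:
    """Convert a rotation size string like '10 MB' into a byte count."""
    number, unit = size.split()
    return int(float(number) * _UNITS[unit.upper()])
```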

class proxywhirl.logging_config.LoggingConfig(/, **data)[source]

Bases: pydantic.BaseModel

Main logging configuration model.

Parameters:

data (Any)

format[source]

Output format - ‘default’, ‘json’, or ‘logfmt’

level[source]

Minimum log level (DEBUG, INFO, WARNING, ERROR, CRITICAL)

rotation[source]

Log rotation configuration (size/time-based)

retention[source]

Retention policy for rotated logs

colorize[source]

Enable colored output (only for ‘default’ format)

diagnose[source]

Enable diagnostic mode (adds extra debug info)

backtrace[source]

Enable backtrace on exceptions

enqueue[source]

Enable async logging with queue

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.
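For the 'logfmt' format option, a sketch of the key=value line shape it is expected to produce; the exact field set and escaping rules of the real formatter are assumptions here, not guaranteed behavior.

```python
def to_logfmt(record: dict) -> str:
    """Render a flat record as a logfmt line (sketch of assumed output shape)."""
    parts = []
    for key, value in record.items():
        text = str(value)
        # Quote values containing spaces, '=' or quotes, escaping inner quotes.
        if " " in text or "=" in text or '"' in text:
            text = '"' + text.replace('"', '\\"') + '"'
        parts.append(f"{key}={text}")
    return " ".join(parts)
```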

proxywhirl.logging_config.bind_context(**context)[source]

Bind contextual metadata to the logger.

This is a convenience wrapper around logger.bind() for adding metadata to log entries. The bound context persists for the lifetime of the returned logger instance.

Parameters:

**context (str | int | float | bool | None) – Arbitrary keyword arguments to bind as context metadata

Returns:

Logger instance with bound context

Example

>>> request_logger = bind_context(request_id="req-123", operation="fetch")
>>> request_logger.info("Started operation")
# Log will include request_id and operation in context
proxywhirl.logging_config.configure_logging(format='default', level='INFO', rotation=None, retention=None, sink=sys.stderr, colorize=None, diagnose=False, backtrace=True, enqueue=True)[source]

Configure structured logging for ProxyWhirl.

This function sets up loguru with the specified format, level, and rotation settings. It removes existing handlers and adds a new one with the configured settings.

Parameters:
  • format (Literal['default', 'json', 'logfmt']) – Output format - ‘default’ (colored text), ‘json’, or ‘logfmt’

  • level (str) – Minimum log level (DEBUG, INFO, WARNING, ERROR, CRITICAL)

  • rotation (str | None) – Rotation threshold (e.g., ‘10 MB’, ‘1 day’, ‘00:00’)

  • retention (str | None) – Retention policy (e.g., ‘5 files’, ‘1 week’, ‘7 days’)

  • sink – Output destination (file path, sys.stderr, sys.stdout, etc.)

  • colorize (bool | None) – Enable colored output (auto-detected if None)

  • diagnose (bool) – Enable diagnostic mode (adds code context to logs)

  • backtrace (bool) – Enable backtrace on exceptions

  • enqueue (bool) – Enable async logging with queue

Return type:

None

Example

>>> # JSON logging to file with rotation
>>> configure_logging(
...     format="json",
...     level="INFO",
...     rotation="10 MB",
...     retention="5 files",
...     sink="logs/proxywhirl.log"
... )
>>> # Console logging with colors
>>> configure_logging(format="default", level="DEBUG")
>>> # Logfmt to stderr
>>> configure_logging(format="logfmt", level="WARNING")
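For the format="json" case above, a sketch of what a single emitted line might look like; the actual field names and serialization come from loguru, so the record layout here is an assumption.

```python
import json
from datetime import datetime, timezone


def json_line(level: str, message: str, **extra: object) -> str:
    """Build one JSON log line (assumed field layout, not loguru's exact schema)."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "level": level,
        "message": message,
        "extra": extra,  # bound context metadata lands here
    }
    return json.dumps(record)
```

Each log call produces one self-contained JSON object per line, which is what makes the format convenient for log aggregators.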
proxywhirl.logging_config.get_logger()[source]

Get the configured loguru logger instance.

Returns:

The global loguru logger instance, configured via configure_logging()

Example

>>> logger = get_logger()
>>> logger.info("Processing started")
>>> logger.bind(request_id="req-123").info("Request received")