Metadata-Version: 2.4
Name: pyctxlog
Version: 0.1.0
Summary: Generic contextual logging for Python — a decorator/context-manager that auto-tags log lines with per-call context.
Project-URL: Homepage, https://github.com/YRamzy993/pyctxlog
Project-URL: Issues, https://github.com/YRamzy993/pyctxlog/issues
Project-URL: Changelog, https://github.com/YRamzy993/pyctxlog/blob/main/CHANGELOG.md
Author: Youssef Mohamed Ramzy
License: MIT
License-File: LICENSE
Keywords: context,contextvars,decorator,logging,middleware,observability,structured-logging
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: System :: Logging
Classifier: Typing :: Typed
Requires-Python: >=3.9
Requires-Dist: ulid-py<2.0.0,>=1.1.0
Provides-Extra: dev
Requires-Dist: build; extra == 'dev'
Requires-Dist: mypy>=1.10; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.23; extra == 'dev'
Requires-Dist: pytest>=8; extra == 'dev'
Requires-Dist: ruff>=0.5; extra == 'dev'
Requires-Dist: twine; extra == 'dev'
Description-Content-Type: text/markdown

# pyctxlog

[![PyPI version](https://img.shields.io/pypi/v/pyctxlog.svg)](https://pypi.org/project/pyctxlog/)
[![Python versions](https://img.shields.io/pypi/pyversions/pyctxlog.svg)](https://pypi.org/project/pyctxlog/)
[![License: MIT](https://img.shields.io/badge/license-MIT-blue.svg)](LICENSE)

**Generic contextual logging for Python services.** A small decorator (and
underlying context manager) that wraps any sync or async function and
auto-tags every log line inside with per-call context fields — request id,
job name, tenant, trace id, anything you want.

`pyctxlog` is unopinionated about your framework. It works equally well as a
Django/FastAPI middleware, a Celery task wrapper, an RPC handler context, or
just on plain functions.

## Why

Tracing what a single request, task or job did across many log lines is
painful when each line is just an unstructured string. `pyctxlog` solves
this with two primitives:

1. **`@log_context`** — wraps a function with a context block. Anything that
   logs through a `ContextualLogger` while the wrapped call is on the stack
   sees the context's fields rendered into every log line.
2. **`LogContext`** — the underlying `with`-block primitive. Use it directly
   when you can't put a decorator on a function (e.g. inside framework
   middleware).

Both push their fields onto a `contextvars.ContextVar`, so the context is
correctly isolated across threads, asyncio tasks, and concurrent requests.
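This isolation is a property of the standard library itself, not of pyctxlog. A stdlib-only sketch (no pyctxlog required) of why per-task contexts don't bleed into each other:

```python
# Each asyncio task runs in its own copy of the context, so a set() inside
# one task is invisible to sibling tasks and to the caller.
import asyncio
import contextvars

ctx: contextvars.ContextVar[dict] = contextvars.ContextVar("ctx", default={})

async def worker(job: str, results: list) -> None:
    ctx.set({"job": job})              # task-local write
    await asyncio.sleep(0)             # yield so the tasks interleave
    results.append(ctx.get()["job"])   # still sees its own value

async def main() -> list:
    results: list = []
    await asyncio.gather(worker("ingest", results), worker("export", results))
    return results

print(asyncio.run(main()))   # each task kept its own "job" field
print(ctx.get())             # outer context is untouched: {}
```

pyctxlog's `LogContext` pushes its fields onto a `ContextVar` in the same way, which is what makes concurrent requests safe without any locking.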

## Install

```bash
pip install pyctxlog
```

Requires Python 3.9+. The only runtime dependency is `ulid-py` (used for
auto-generated request ids).

## 30-second quick start

```python
import logging
from pyctxlog import ContextualLogger, log_context

log = ContextualLogger(
    name="orders-api",
    static_fields={                  # constants rendered in every line
        "service": "orders-api",
        "env": "production",
    },
    enable_logging=True,
    log_level=logging.INFO,
    extra_fields=["job", "id"],      # dynamic fields pulled from context
)

@log_context(fields={"job": "ingest_orders"}, logger=log)
def run_ingest(batch_id: str) -> int:
    log.info(f"processing batch {batch_id}")
    return 42

run_ingest("BATCH-001")
```

Output:

```
2026-04-10 12:34:56 | INFO | orders-api | production | job=ingest_orders | id=01HQ... | 🚀 run_ingest() called 🚀
2026-04-10 12:34:56 | INFO | orders-api | production | job=ingest_orders | id=01HQ... | ✅ processing batch BATCH-001 ✅
2026-04-10 12:34:56 | INFO | orders-api | production | job=ingest_orders | id=01HQ... | 🕛 run_ingest() completed in 0.0001s 🕛
```

`static_fields` are baked into the logger at construction time and rendered
verbatim on every line. `extra_fields` are *dynamic*: their values are pulled
fresh from the active context (set by `LogContext`, `BaseLogContext`, or
`@log_context`) each time the logger emits a record.
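The "pulled fresh at emit time" behavior can be approximated with a stdlib `logging.Filter` that reads a `ContextVar` on every record. This is a hedged sketch of the mechanism, not pyctxlog's actual implementation; the `_ctx` variable and `ContextFilter` class are illustrative names:

```python
import contextvars
import io
import logging

_ctx: contextvars.ContextVar[dict] = contextvars.ContextVar("_ctx", default={})

class ContextFilter(logging.Filter):
    """Copy dynamic fields from the active context onto each record."""
    def __init__(self, extra_fields):
        super().__init__()
        self.extra_fields = extra_fields

    def filter(self, record):
        current = _ctx.get()
        for field in self.extra_fields:
            setattr(record, field, current.get(field, "-"))
        return True  # never drop the record, only enrich it

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(levelname)s | job=%(job)s | %(message)s"))

logger = logging.getLogger("ctx-demo")
logger.addHandler(handler)
logger.addFilter(ContextFilter(["job"]))
logger.setLevel(logging.INFO)
logger.propagate = False

_ctx.set({"job": "ingest"})   # roughly what entering a LogContext does
logger.info("processing")
print(stream.getvalue().strip())   # INFO | job=ingest | processing
```

Because the filter reads the `ContextVar` per record, the same logger instance renders different field values depending on which context is active at the call site.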

The library makes no assumption about what your fields *mean* — there's no
hardcoded `service_name`, `environment`, `tenant` or anything else. Pass
whatever flat key→value pairs make sense for your project.

## Async works the same way

```python
@log_context(fields={"job": "ingest_orders"}, logger=log)
async def run_ingest_async(batch_id: str) -> int:
    log.info(f"processing batch {batch_id}")
    return 42
```

The decorator detects coroutine functions via `inspect.iscoroutinefunction`
and returns the right wrapper automatically.
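The dispatch pattern is straightforward to see in isolation. A minimal stdlib sketch (illustrative, not pyctxlog's code) of a decorator that inspects the callable once and returns a matching sync or async wrapper:

```python
import asyncio
import functools
import inspect

def tagged(func):
    """Return an async wrapper for coroutine functions, a sync one otherwise."""
    if inspect.iscoroutinefunction(func):
        @functools.wraps(func)
        async def async_wrapper(*args, **kwargs):
            # a real decorator would push context fields here before awaiting
            return await func(*args, **kwargs)
        return async_wrapper

    @functools.wraps(func)
    def sync_wrapper(*args, **kwargs):
        # ...and here before calling
        return func(*args, **kwargs)
    return sync_wrapper

@tagged
def add(a, b):
    return a + b

@tagged
async def add_async(a, b):
    return a + b

print(add(1, 2))                      # plain call still works
print(asyncio.run(add_async(1, 2)))   # wrapped coroutine is still awaitable
```

The key detail is that the check happens at decoration time, so there is no per-call overhead for deciding which path to take.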

## Decorator options

| Option | Default | What it does |
|---|---|---|
| `fields` | `None` | Static dict of context fields |
| `context_cls` | `None` | A `BaseLogContext` subclass — built from the matching kwargs of the wrapped call |
| `logger` | `None` | Which `ContextualLogger` to log entry/exit/error through |
| `name` | `func.__qualname__` | Display name for entry/exit lines |
| `capture_exceptions` | `True` | Log unhandled exceptions with traceback then re-raise |
| `log_args` | `False` | Log the function arguments on entry |
| `log_result` | `False` | Log the return value on exit |
| `slow_threshold_seconds` | `None` | Emit a warning if the call takes at least this long |
| `id_generator` | ULID | Callable returning the auto-generated context `id` |
| `extra` | `None` | Alias for `fields`, kept for symmetry with stdlib `logging` |

## Global configuration

If you find yourself passing the same `logger=...` (or `capture_exceptions=`,
`log_args=`, `slow_threshold_seconds=`, ...) to every `@log_context` call,
you can set those as module-level defaults once at startup with
`configure()`:

```python
from pyctxlog import ContextualLogger, configure, log_context

log = ContextualLogger(
    name="orders-api",
    static_fields={"service": "orders-api", "env": "prod"},
    extra_fields=["job", "id"],
)

configure(
    logger=log,                      # default logger for every @log_context
    capture_exceptions=True,
    log_args=False,
    slow_threshold_seconds=2.0,      # warn on any call >= 2s
    default_fields={"service_color": "blue"},  # auto-tag every call
)

@log_context(fields={"job": "ingest"})    # no logger= needed now
def run_ingest(batch_id: str) -> int:
    ...
```

Per-call arguments always win over the configured defaults. To override a
configured default back to "nothing" for a single decorated function, pass
`None` explicitly:

```python
@log_context(fields={"job": "quiet"}, logger=None)   # suppress the configured logger
def run_quiet():
    ...
```

The full list of overridable keys: `logger`, `capture_exceptions`, `log_args`,
`log_result`, `slow_threshold_seconds`, `id_generator`, `id_field`,
`default_fields`. Call `reset_config()` to restore the factory defaults
(useful in tests).
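For the "explicit `None` beats the configured default" behavior to work, the decorator has to distinguish "argument not passed" from "argument passed as `None`". A common way to implement that is a module-private sentinel; this is a hedged sketch of the pattern (the names `_UNSET`, `_config`, and `resolve_logger` are illustrative, not pyctxlog's internals):

```python
# Sentinel pattern: _UNSET marks "caller said nothing", so an explicit
# None can carry the meaning "suppress the configured default".
_UNSET = object()

_config = {"logger": "configured-logger"}   # stand-in for the module config

def resolve_logger(logger=_UNSET):
    if logger is _UNSET:        # not passed at all: fall back to config
        return _config["logger"]
    return logger               # passed explicitly, even if it is None

print(resolve_logger())              # falls back to the configured default
print(resolve_logger(logger=None))   # explicit None suppresses it
```

A plain `logger=None` default could not support this, since the decorator would have no way to tell the two cases apart.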

## Typed contexts via `BaseLogContext`

For long-lived shapes (e.g. an HTTP request) you can declare a typed context
once and reuse it everywhere:

```python
from dataclasses import dataclass
from pyctxlog import BaseLogContext, ContextualLogger, log_context

log = ContextualLogger(name="http", extra_fields=["method", "path", "user_id"])

@dataclass
class HttpRequestContext(BaseLogContext):
    method: str = ""
    path: str = ""
    user_id: str = "anonymous"

    def __post_init__(self):
        BaseLogContext.__init__(self)

@log_context(context_cls=HttpRequestContext, logger=log)
def handle_request(method: str, path: str, user_id: str):
    log.info("handling request")
```

When `context_cls` is set, the decorator inspects the wrapped function's
arguments and builds the context from any matching parameter names.
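The matching itself can be done with `inspect.signature`. A self-contained stdlib sketch of the idea (using a plain dataclass stand-in rather than `BaseLogContext`, and a hypothetical `build_context` helper, purely for illustration):

```python
# Bind the call's arguments to the function's signature, then keep only
# the names the context dataclass declares.
import dataclasses
import inspect

@dataclasses.dataclass
class RequestShape:
    method: str = ""
    path: str = ""
    user_id: str = "anonymous"

def build_context(context_cls, func, *args, **kwargs):
    bound = inspect.signature(func).bind(*args, **kwargs)
    bound.apply_defaults()
    known = {f.name for f in dataclasses.fields(context_cls)}
    matching = {k: v for k, v in bound.arguments.items() if k in known}
    return context_cls(**matching)

def handle_request(method: str, path: str, session: object = None):
    ...

ctx = build_context(RequestShape, handle_request, "GET", "/orders")
print(ctx)   # method/path filled from the call; user_id keeps its default
```

Parameters the context doesn't declare (like `session` above) are simply ignored, and declared fields with no matching parameter keep their defaults.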

## Use as a context manager (for middleware)

`@log_context` is a thin wrapper around `LogContext`. When you can't put a
decorator on a function (e.g. inside framework middleware), use the context
manager directly:

```python
from pyctxlog import LogContext

with LogContext({"job": "ingest", "tenant": "acme"}):
    log.info("starting work")  # auto-tagged with job + tenant + id
```

## Recipes

### Django middleware

```python
from pyctxlog import LogContext, ContextualLogger

log = ContextualLogger(
    name="my-django-app",
    static_fields={"service": "my-django-app"},
    extra_fields=["id", "method", "path", "user_id"],  # "id" is auto-generated per context
)

class PyCtxLogMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        with LogContext({
            "method": request.method,
            "path": request.path,
            "user_id": getattr(getattr(request, "user", None), "id", "anonymous"),
        }):
            return self.get_response(request)
```

Then add it to `MIDDLEWARE` in `settings.py`.

### FastAPI middleware

```python
from fastapi import FastAPI, Request
from pyctxlog import LogContext, ContextualLogger

log = ContextualLogger(
    name="my-fastapi-app",
    static_fields={"service": "my-fastapi-app"},
    extra_fields=["id", "method", "path"],  # "id" is auto-generated per context
)

app = FastAPI()

@app.middleware("http")
async def pyctxlog_middleware(request: Request, call_next):
    with LogContext({
        "method": request.method,
        "path": request.url.path,
    }):
        return await call_next(request)
```

### Celery task

```python
from pyctxlog import log_context

@app.task  # assumes an existing Celery `app` and the `log` configured earlier
@log_context(fields={"job": "send_invoice_email"}, logger=log, log_args=True)
def send_invoice_email(invoice_id: str, recipient: str):
    ...
```

### RabbitMQ RPC handler

```python
from pyctxlog import LogContext

def on_rpc_message(channel, method, properties, body):
    with LogContext({
        "rpc_routing_key": method.routing_key,
        "correlation_id": properties.correlation_id,
        "reply_to_queue": properties.reply_to,
    }):
        handle_rpc(body)
```

## API reference

```python
from pyctxlog import (
    ContextualLogger,        # the configured logger
    log_context,             # the decorator
    LogContext,              # context manager primitive
    BaseLogContext,          # base for typed contexts
    app_context,             # the underlying ContextVar
    get_current_context,     # snapshot of currently-active fields
    Config,                  # the config dataclass
    configure,               # set module-level decorator defaults
    get_config,              # read the current config
    reset_config,            # restore factory defaults
    __version__,
)
```

## Development

```bash
git clone https://github.com/youssefmramzy/pyctxlog.git
cd pyctxlog
python -m venv .venv && source .venv/bin/activate
pip install -e .[dev]
pytest -v
ruff check src tests
mypy src/pyctxlog
```

## Building & publishing

```bash
python -m build              # produces dist/pyctxlog-X.Y.Z-py3-none-any.whl + .tar.gz
twine upload --repository testpypi dist/*   # smoke test on TestPyPI first
twine upload dist/*                          # then publish for real
```

## License

MIT — see [LICENSE](LICENSE).
