# azure-functions-logging — Full LLM Reference

> Structured logging for Azure Functions Python — invocation-aware, Application Insights-ready, zero breaking changes.

## Package Info

- PyPI: `pip install azure-functions-logging`
- Version: 0.7.1
- Python: >=3.10, <3.15
- License: MIT
- Docs: https://yeongseon.github.io/azure-functions-logging-python/
- Repository: https://github.com/yeongseon/azure-functions-logging-python

## Installation

```bash
pip install azure-functions-logging
```

## Public API

### Setup Function

```python
from azure_functions_logging import setup_logging
from pathlib import Path
import logging

def setup_logging(
    *,
    level: int = logging.INFO,
    format: str = "color",
    logger_name: str | None = None,
    functions_formatter: logging.Formatter | None = None,
    host_json_path: Path | str | None = None,
) -> None:
    """Configure logging for the current environment.

    Behavior depends on detected environment:
    - **Azure / Core Tools**: Installs ContextFilter on existing root handlers AND on the root logger itself (so direct calls on the root logger carry context). Does NOT add handlers or modify root logger level (respects host.json). If `functions_formatter` is provided, it is applied to host-managed handlers. NOTE: a filter on the root logger does not run for records that propagate from named child loggers to host/third-party handlers attached later — use `install_context_factory()` to guarantee context coverage in that case.
    - **Standalone local**: Sets the target/root logger level. Adds a StreamHandler (ColorFormatter or JsonFormatter) ONLY when no handlers exist; otherwise just attaches filters to existing handlers.

    This function is idempotent per logger_name.

    Args:
        level: Logging level for local dev. Ignored in Azure/Core Tools (default: INFO).
        format: Log format: "color" (default) or "json". In standalone local mode,
                selects ColorFormatter vs JsonFormatter. In Azure/Core Tools,
                handlers are host-managed; pass functions_formatter to set their
                formatter (format="json" alone has no effect there and emits a warning).
        logger_name: Optional logger name to configure. None = root logger.
        functions_formatter: Custom formatter for Azure/Core Tools host-managed handlers.
        host_json_path: Explicit path to host.json. When None, walks up from the
                        current working directory (at most 5 parent levels) to
                        discover host.json automatically.

    Raises:
        ValueError: If format not in {"color", "json"}.
    """
    ...
```

### Logger Creation

```python
from azure_functions_logging import get_logger, FunctionLogger

def get_logger(name: str | None = None) -> FunctionLogger:
    """Create a FunctionLogger wrapping a standard logging.Logger."""
    ...

class FunctionLogger:
    """Wrapper around logging.Logger with context binding.

    Delegates standard logging methods to the underlying logger.
    The bind() method returns a new wrapper with merged context.
    Reserved keys (invocation_id, function_name, trace_id, cold_start) are
    automatically sanitized before being forwarded to stdlib logging.
    """

    def __init__(self, logger: logging.Logger) -> None: ...

    @property
    def name(self) -> str: ...

    def bind(self, **kwargs: Any) -> FunctionLogger:
        """Return new FunctionLogger with merged context."""
        ...

    def clear_context(self) -> None: ...

    def hasHandlers(self) -> bool:
        """Stdlib parity: True if any handler is reachable up the chain."""
        ...

    def log(self, level: int, msg: object, *args: Any, **kwargs: Any) -> None:
        """Stdlib parity: log at arbitrary numeric level. Routes through internal
        sanitization and bound-context merging like the named-level methods."""
        ...

    def debug(self, msg: object, *args: Any, **kwargs: Any) -> None: ...
    def info(self, msg: object, *args: Any, **kwargs: Any) -> None: ...
    def warning(self, msg: object, *args: Any, **kwargs: Any) -> None: ...
    def error(self, msg: object, *args: Any, **kwargs: Any) -> None: ...
    def critical(self, msg: object, *args: Any, **kwargs: Any) -> None: ...
    def exception(self, msg: object, *args: Any, **kwargs: Any) -> None: ...

    # Stdlib parity: level inspection/configuration on the underlying logger.
    def setLevel(self, level: int | str) -> None: ...
    def isEnabledFor(self, level: int) -> bool: ...
    def getEffectiveLevel(self) -> int: ...
```

### JSON Formatter

```python
from azure_functions_logging import JsonFormatter
import logging

class JsonFormatter(logging.Formatter):
    """Structured JSON log formatter (NDJSON).
    
    Output is newline-delimited JSON (NDJSON), one JSON object per line.
    Context fields (invocation_id, function_name, etc.) included when present.
    
    Example output:
        {"timestamp": "2024-01-15T10:30:00+00:00", "level": "INFO", 
         "logger": "my_app", "message": "Request received",
         "invocation_id": "abc-123", "cold_start": true, 
         "extra": {"user_id": "123"}}
    """
    
    def __init__(self) -> None:
        """Initialize JSON formatter (no args)."""
        ...
    
    def format(self, record: logging.LogRecord) -> str:
        """Format log record as NDJSON string."""
        ...
```
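
For illustration, a stripped-down NDJSON formatter along these lines can be built on stdlib logging. This is a sketch only; the library's `JsonFormatter` handles more fields and edge cases:

```python
import json
import logging
from datetime import datetime, timezone

class MiniJsonFormatter(logging.Formatter):
    """Sketch of an NDJSON formatter: one JSON object per record."""
    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "timestamp": datetime.fromtimestamp(record.created, tz=timezone.utc).isoformat(),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        }
        # Surface context fields when a filter/factory has set them
        for field in ("invocation_id", "function_name", "trace_id", "cold_start"):
            if hasattr(record, field):
                payload[field] = getattr(record, field)
        return json.dumps(payload)  # single line: NDJSON-compatible

rec = logging.LogRecord("my_app", logging.INFO, __file__, 1, "Request received", None, None)
rec.invocation_id = "abc-123"  # normally set by ContextFilter / record factory
line = MiniJsonFormatter().format(rec)
print(line)
```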

### Filters

```python
from azure_functions_logging import SamplingFilter, RedactionFilter
import logging

class SamplingFilter(logging.Filter):
    """Rate-limit a logger to at most ``rate`` records per rolling window.

    Useful for high-frequency loggers (HTTP logs, polling loops).
    Records at WARNING and above always pass through, regardless of the cap.
    
    Args:
        rate: Max records per window (must be >= 1, default: 100).
        window: Rolling time window in seconds (default: 1.0).
        name: Logger name filter. Empty string = all (default).
    
    Raises:
        ValueError: If rate < 1 or window <= 0.
    """
    
    def __init__(
        self,
        rate: int = 100,
        window: float = 1.0,
        name: str = "",
    ) -> None:
        ...
    
    def filter(self, record: logging.LogRecord) -> bool:
        """Return True to emit, False to drop (based on rate)."""
        ...


class RedactionFilter(logging.Filter):
    """Mask PII / sensitive values on LogRecord extra attributes.
    
    Mutates record in-place, redacting any non-standard field whose
    lowercase name is in sensitive_keys. Both ColorFormatter and
    JsonFormatter see redacted values.
    
    Default sensitive keys: password, passwd, token, authorization,
    secret, api_key, apikey.
    
    Args:
        sensitive_keys: Iterable of lowercase key names to redact.
                       None = use the default set.
        name: Logger name filter. Empty string = all (default).
    """
    
    def __init__(
        self,
        sensitive_keys: Iterable[str] | None = None,
        name: str = "",
    ) -> None:
        ...
    
    def filter(self, record: logging.LogRecord) -> bool:
        """Redact sensitive fields. Always returns True."""
        ...
```
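
The sampling behavior can be sketched in stdlib terms. `MiniSamplingFilter` is illustrative, not the library's code: it keeps timestamps of recent emissions and drops sub-WARNING records once the window is full.

```python
import logging
import time
from collections import deque

class MiniSamplingFilter(logging.Filter):
    """Sketch: allow at most `rate` records per rolling `window` seconds;
    WARNING and above always pass."""
    def __init__(self, rate: int = 100, window: float = 1.0) -> None:
        super().__init__()
        self.rate, self.window = rate, window
        self._times: deque[float] = deque()

    def filter(self, record: logging.LogRecord) -> bool:
        if record.levelno >= logging.WARNING:
            return True  # never drop warnings/errors
        now = time.monotonic()
        while self._times and now - self._times[0] > self.window:
            self._times.popleft()  # expire timestamps outside the window
        if len(self._times) >= self.rate:
            return False  # window full: drop
        self._times.append(now)
        return True

f = MiniSamplingFilter(rate=3, window=1.0)
mk = lambda lvl: logging.LogRecord("demo", lvl, __file__, 1, "x", None, None)
passed = sum(f.filter(mk(logging.INFO)) for _ in range(10))
warn_passed = f.filter(mk(logging.WARNING))
print(passed, warn_passed)
```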

### Context Injection

```python
import contextvars
from collections.abc import Iterator
from typing import Any

import azure.functions as func

from azure_functions_logging import (
    ContextTokens,
    inject_context,
    logging_context,
    restore_context,
    reset_context,
    install_context_factory,
    get_logging_metadata,
    with_context,
)

# ContextTokens is a type alias (not a class) — opaque mapping returned by
# inject_context(). Pass to restore_context() to undo a single inject_context() call.
ContextTokens = dict[contextvars.ContextVar[Any], contextvars.Token[Any]]

def inject_context(context: Any) -> ContextTokens:
    """Set invocation context from an Azure Functions context object.

    Extracts invocation_id, function_name, trace_id, cold_start and stores
    them in contextvars. Returns a ContextTokens bundle that can be passed
    to restore_context() to undo this specific injection.

    Safe to call with any object; missing attributes are silently ignored
    (context failures never cause application failures).

    Example:
        def handler(req, context):
            tokens = inject_context(context)
            try:
                logger.info("Processing")
            finally:
                restore_context(tokens)
    """
    ...

def restore_context(tokens: ContextTokens) -> None:
    """Restore the previous contextvar values captured in tokens.
    Always pair with the matching inject_context() call."""
    ...

def reset_context() -> None:
    """Unconditionally clear all context vars. Use only in test teardown—
    prefer logging_context() / restore_context() in production code."""
    ...

def logging_context(context: Any) -> Iterator[None]:
    """Recommended primary pattern. Context manager that calls inject_context()
    on enter and **always** calls restore_context() on exit (even when the body
    raises). Prevents stale context from leaking into the next invocation on a
    reused worker.

    Example:
        def handler(req, context):
            with logging_context(context):
                logger.info("handler started")
    """
    ...

def install_context_factory() -> None:
    """Opt-in: install a global logging.setLogRecordFactory() that copies
    contextvars onto every LogRecord at creation time. Ensures coverage even
    on handlers added after setup_logging() and on loggers that bypass the
    filter chain. Once installed, invocation_id/function_name/trace_id/
    cold_start become reserved LogRecord attributes—passing them via stdlib
    extra= raises KeyError. Use FunctionLogger (auto-sanitizes) or rename keys.
    """
    ...

def get_logging_metadata(func: Any) -> dict[str, Any] | None:
    """Return logging metadata attached to a function by ``@with_context``.
    Returns ``None`` if the function was not decorated with ``@with_context``
    (no introspection of install/runtime state — metadata is per-function only)."""
    ...
```

### Context Decorator

```python
from azure_functions_logging import with_context
from typing import TypeVar, Callable, Any

_F = TypeVar("_F", bound=Callable[..., Any])

def with_context(
    func: _F | None = None,
    *,
    param: str = "context",
) -> _F | Callable[[_F], _F]:
    """Decorator that automatically injects invocation context.

    Finds the context parameter, calls inject_context() before the handler,
    and calls restore_context() in finally. Sync and async handlers supported.
    """
    ...
```
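
A stdlib sketch of what such a decorator does under the hood (hypothetical `mini_with_context`, single contextvar for brevity; the real `@with_context` resolves the parameter by name or position and handles async handlers too):

```python
import contextvars
import functools

_invocation_id = contextvars.ContextVar("invocation_id", default=None)

def mini_with_context(func):
    """Sketch: set contextvars from the context argument before the handler,
    reset in finally so nothing leaks into the next invocation."""
    @functools.wraps(func)
    def wrapper(*args, context=None, **kwargs):
        token = _invocation_id.set(getattr(context, "invocation_id", None))
        try:
            return func(*args, context=context, **kwargs)
        finally:
            _invocation_id.reset(token)
    return wrapper

class FakeContext:  # stand-in for azure.functions.Context
    invocation_id = "abc-123"

@mini_with_context
def handler(req, context=None):
    return _invocation_id.get()  # context is visible inside the handler

seen = handler("req", context=FakeContext())
print(seen, _invocation_id.get())
```
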

### Contextvar Helpers (Advanced)

All public helpers are exported from the top-level package. Do **not** import
from `azure_functions_logging._context` (private module — not part of the
supported API surface and may change without notice).

```python
from azure_functions_logging import (
    logging_context,        # primary: context manager
    inject_context,         # low-level: returns ContextTokens
    restore_context,        # low-level: pair with inject_context()
    reset_context,          # test teardown only
    install_context_factory,  # opt-in global LogRecordFactory
    ContextTokens,          # type alias used by inject_context/restore_context
)
```

Context fields (`invocation_id`, `function_name`, `trace_id`, `cold_start`) are
stored in private contextvars and surfaced on every `LogRecord` via the internal
`ContextFilter`, which is installed automatically by `setup_logging()`. The filter
and the underlying contextvars are **implementation details** — use the public
helpers above instead of importing them directly.


## Design Principles

1. **Narrow root logger contract** — In Azure/Core Tools, no handlers are added; `setup_logging()` installs `ContextFilter` on existing root handlers and on the root logger itself (so direct calls on the root logger carry context). To guarantee context on records that propagate from named child loggers to handlers attached later (host- or third-party-installed), call `install_context_factory()`. In standalone mode (`logger_name=None`), the root logger's level is set and a `StreamHandler` is added only when no handlers exist.
2. **Respects host.json** — In Azure/Core Tools, never changes root logger level or replaces existing handlers
3. **No runtime dependency on azure-functions** — Works with any context-like object
4. **Silent on context failures** — Missing attributes don't cause exceptions
5. **Idempotent setup** — Calling setup_logging() multiple times is safe
6. **Standard logging delegates** — FunctionLogger wraps, never replaces stdlib logging
7. **Always-restoring context** — logging_context() / restore_context() prevent stale
   context from leaking into the next invocation on a reused worker
8. **Strict W3C traceparent validation** — trace IDs are extracted only from valid W3C
   `traceparent` headers (version `00` requires exactly 4 parts; later versions may
   include trailing fields, which are ignored after validating the first four:
   2/32/16/2 hex; version `ff` and all-zero trace/span ids are rejected;
   forward-compatible with future versions)
9. **Bounded host.json discovery** — walks up from cwd, max 5 parent levels;
   override with host_json_path= for tests / non-standard layouts

## Common Patterns

### Basic setup with local color output:

```python
from azure_functions_logging import setup_logging, get_logger

setup_logging()  # Color in local, filter in Azure
logger = get_logger(__name__)
```

### JSON output:

Standalone local process (formatter applied directly):

```python
from azure_functions_logging import setup_logging, get_logger

setup_logging(format="json")
logger = get_logger(__name__)
```

Azure Functions / Core Tools (host owns the handlers — `format="json"` alone has no effect beyond a warning; pass `functions_formatter` instead):

```python
from azure_functions_logging import setup_logging, get_logger, JsonFormatter

setup_logging(functions_formatter=JsonFormatter())
logger = get_logger(__name__)
```

### Filtered third-party logging:

```python
from azure_functions_logging import setup_logging, SamplingFilter, RedactionFilter
import logging

setup_logging()

# Rate-limit urllib3 logs
urllib3_logger = logging.getLogger("urllib3")
handler = logging.StreamHandler()
handler.addFilter(SamplingFilter(rate=10, window=1.0))
urllib3_logger.addHandler(handler)

# Redact passwords on a stdlib logger (FunctionLogger does not expose addHandler;
# attach handlers/filters to the underlying stdlib logger instead).
auth_logger = logging.getLogger("myapp.auth")
handler = logging.StreamHandler()
handler.addFilter(RedactionFilter())
auth_logger.addHandler(handler)
```

### Per-request context with bind() (using `logging_context`):

```python
from azure_functions_logging import get_logger, logging_context

logger = get_logger(__name__)

def handler(req, context):
    with logging_context(context):
        # Create request-scoped logger with user_id
        req_logger = logger.bind(user_id=req.params.get("user_id"))
        req_logger.info("Processing request")
        # Log includes: invocation_id, user_id, cold_start, etc.
```

Equivalent low-level pattern (e.g. middleware that cannot use a `with` block):

```python
from azure_functions_logging import get_logger, inject_context, restore_context

logger = get_logger(__name__)

def handler(req, context):
    tokens = inject_context(context)
    try:
        req_logger = logger.bind(user_id=req.params.get("user_id"))
        req_logger.info("Processing request")
    finally:
        restore_context(tokens)  # Always pair to avoid stale context on reused workers
```

### Async handler with decorator:

```python
import azure.functions as func
from azure_functions_logging import with_context, get_logger, setup_logging

setup_logging()
logger = get_logger(__name__)
app = func.FunctionApp()

@app.route(route="async_hello")
@with_context
async def async_hello(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
    logger.info("Async handler with automatic context")
    return func.HttpResponse("OK")
```

## Integration with Application Insights

Configure Application Insights in host.json to capture structured logs:

```json
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "maxTelemetryItemsPerSecond": 20
      }
    }
  }
}
```

Log records emitted through this library carry `invocation_id`, `function_name`,
and any user-supplied `extra=` fields. When the host pipeline forwards them to
Application Insights, structured fields typically appear under `customDimensions`
(not as top-level table columns), so queries should reference them via
`customDimensions.<field>`. Exact ingestion shape depends on the host/exporter
pipeline; some setups may surface fields differently or leave them in the raw
`message`.

```
traces
| where customDimensions.invocation_id == "my-invocation"
| project timestamp, message, customDimensions
```

## Troubleshooting

**Q: Why aren't my logs appearing in Azure?**
- Check host.json logging level. This library respects host.json.
- Ensure setup_logging() is called in function init or first handler.

**Q: How do I get JSON output locally?**
- Use `setup_logging(format="json")` instead of default color format.

**Q: Can I use custom formatters?**
- Yes, in Azure/Core Tools pass `functions_formatter=MyFormatter()` to setup_logging() to format host-managed handlers. In standalone local mode the built-in `format="color"` / `format="json"` selection applies (custom formatters are not yet wired into the standalone branch).

**Q: Does this work without azure-functions?**
- Yes, pass any object with invocation_id, function_name attributes to inject_context().
