Metadata-Version: 2.4
Name: fastapi-pg-logger
Version: 0.2.0
Summary: Drop-in file + Postgres request/DB-call logging for FastAPI.
Author-email: Sergi Maspons <smaspons@bgeo.es>, BGEO <info@bgeo.es>
Maintainer-email: Sergi Maspons <smaspons@bgeo.es>, BGEO <info@bgeo.es>
License-Expression: GPL-3.0-or-later
Keywords: fastapi,logging,postgres
Classifier: Development Status :: 4 - Beta
Classifier: Framework :: FastAPI
Classifier: Topic :: System :: Logging
Classifier: Topic :: Database
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Intended Audience :: Developers
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: fastapi>=0.100
Requires-Dist: psycopg[binary,pool]>=3.1
Provides-Extra: dev
Requires-Dist: pytest>=7; extra == "dev"
Requires-Dist: pytest-asyncio>=0.21; extra == "dev"
Requires-Dist: httpx>=0.24; extra == "dev"
Requires-Dist: ruff>=0.1; extra == "dev"
Dynamic: license-file

# fastapi-pg-logger

Drop-in **file + Postgres** request and DB-call logging for [FastAPI](https://fastapi.tiangolo.com/).

One `setup_logging()` call wires up:

- **HTTP request/response logging**: method, path, status, duration, headers, bodies (configurable truncation & sampling)
- **File logging**: daily-rotated JSON-line log files
- **Postgres logging**: partitioned tables with automatic monthly partitions
- **DB-call logging**: correlate individual SQL executions back to the originating HTTP request via `request_id`
- **Log viewer UI**: optional mountable router with a dark-themed AG Grid dashboard

---

## Installation

```bash
pip install fastapi-pg-logger
```

Or install from source:

```bash
pip install git+https://github.com/bgeo-gis/fastapi-pg-logger.git
```

### Dependencies

- `fastapi >= 0.100`
- `psycopg[binary,pool] >= 3.1`
- Python `>= 3.10`

---

## Quickstart

```python
from contextlib import asynccontextmanager
from fastapi import FastAPI
from fastapi_pg_logger import setup_logging, LogConfig, create_log_router

config = LogConfig(service_name="my-api")

@asynccontextmanager
async def lifespan(app: FastAPI):
    store = await setup_logging(app, config, dsn="postgresql://user:pass@localhost/mydb")
    # Optionally mount the log viewer
    if store:
        app.include_router(create_log_router(store), prefix="/logs")
    yield
    # Close the internally-managed connection pool
    if store:
        await store.close()

app = FastAPI(lifespan=lifespan)
```

That's it. Every request is now logged to file and Postgres.

If you don't have Postgres (or don't want DB logging), omit the `dsn` parameter or set `db_enabled=False` in `LogConfig`. File logging still works.
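For example, a file-only configuration is a one-liner (a sketch using the documented `LogConfig` fields):

```python
config = LogConfig(service_name="my-api", db_enabled=False)  # file logging only, no Postgres
```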

### Advanced: custom `db_manager`

If you need to supply your own connection pool or manager (e.g. shared pool, custom wrappers), use `setup_logging_advanced`:

```python
from fastapi_pg_logger import setup_logging_advanced, LogConfig

store = await setup_logging_advanced(app, config, db_manager=my_db_manager)
```

The `db_manager` must expose an **async context manager** called `get_db()` that yields a [psycopg](https://www.psycopg.org/) async connection:

```python
from contextlib import asynccontextmanager

class MyDatabaseManager:
    def __init__(self, pool):
        self.pool = pool  # e.g. a psycopg_pool.AsyncConnectionPool

    @asynccontextmanager
    async def get_db(self):
        async with self.pool.connection() as conn:
            yield conn
```

> **Deprecation notice:** Passing `db_manager` directly to `setup_logging()` is deprecated and will be removed in 1.0.0. Use the `dsn` parameter for simple setups, or migrate to `setup_logging_advanced()` for custom connection management.

---

## Configuration

All configuration is passed explicitly via `LogConfig`.

```python
from fastapi_pg_logger import LogConfig

config = LogConfig(
    # Service identity (used in file log naming)
    service_name="my-api",

    # File logging
    log_dir="logs",                # root directory
    log_level="INFO",              # DEBUG, INFO, WARNING, ERROR, CRITICAL
    log_rotate_days=14,            # how many days of backup logs to keep
    log_format="[%(asctime)s] %(levelname)s:%(name)s:%(message)s",
    log_date_format="%d/%m/%y %H:%M:%S",

    # Request body / header capture
    skip_body_prefixes=("/logs", "/health", "/docs", "/openapi.json"),
    header_allowlist=None,         # None = built-in default set
    max_body_bytes=0,              # 0 = no truncation
    request_id_header="X-Request-ID",

    # Postgres logging
    db_enabled=True,
    db_sample_rate=1.0,            # 0.0–1.0, fraction of requests logged to DB
    db_schema="log",
    api_logs_table="api_logs",
    db_logs_table="api_db_logs",
)
```

### Default header allowlist

When `header_allowlist` is `None`, these headers are captured:

`accept`, `accept-encoding`, `accept-language`, `cache-control`, `content-length`, `content-type`, `etag`, `user-agent`, `x-device`, `x-lang`, `x-forwarded-for`, `x-real-ip`, `x-request-id`
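To capture a different set, pass your own tuple (a sketch; the header names are illustrative and shown lowercase to mirror the defaults):

```python
from fastapi_pg_logger import LogConfig

# Capture only a minimal, privacy-conscious header set
config = LogConfig(
    service_name="my-api",
    header_allowlist=("content-type", "user-agent", "x-request-id"),
)
```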

---

## DB-call logging

If your API executes SQL queries or stored procedures, you can log each call and correlate it with the parent HTTP request:

```python
import json
import time
from fastapi_pg_logger import log_db_call

async def execute_procedure(store, schema, function_name, sql):
    start = time.monotonic()
    result = await run_sql(sql)
    duration_ms = int((time.monotonic() - start) * 1000)

    log_db_call(
        store,
        # request_id auto-read from context if omitted
        schema_name=schema,
        function_name=function_name,
        sql_text=sql,
        response_json=json.dumps(result),
        duration_ms=duration_ms,
        status="ok" if result else "error",
    )
    return result
```

`log_db_call` is fire-and-forget: it schedules a background task and never raises.

### Reading the request ID manually

```python
from fastapi_pg_logger import get_request_id

rid = get_request_id()  # uuid.UUID | None
```

---

## Log viewer

Mount the optional log viewer router to get a browser-based dashboard:

```python
from fastapi_pg_logger import create_log_router

router = create_log_router(store, auth_dependency=my_auth_dep)
app.include_router(router, prefix="/logs")
```

This adds:

| Endpoint      | Description                              |
|---------------|------------------------------------------|
| `GET /logs`    | Paginated, filterable request logs (JSON) |
| `GET /logs/db` | DB-call logs for a given `request_id`     |
| `GET /logs/ui` | Dark-themed HTML log viewer               |

The `auth_dependency` parameter accepts any FastAPI dependency. Pass `None` for no authentication.

---

## Postgres schema

The package idempotently auto-creates the following structure:

### `{db_schema}.{api_logs_table}` (partitioned by month)

| Column            | Type          |
|-------------------|---------------|
| `ts`              | `timestamptz` |
| `id`              | `bigserial`   |
| `method`          | `text`        |
| `endpoint`        | `text`        |
| `status`          | `integer`     |
| `duration_ms`     | `integer`     |
| `user_name`       | `text`        |
| `request_id`      | `uuid`        |
| `client_ip`       | `inet`        |
| `query_params`    | `jsonb`       |
| `body_size`       | `integer`     |
| `response_size`   | `integer`     |
| `request_headers` | `jsonb`       |
| `request_body`    | `text`        |
| `response_headers`| `jsonb`       |
| `response_body`   | `text`        |

### `{db_schema}.{db_logs_table}` (partitioned by month)

| Column          | Type          |
|-----------------|---------------|
| `ts`            | `timestamptz` |
| `id`            | `bigserial`   |
| `request_id`    | `uuid`        |
| `schema_name`   | `text`        |
| `function_name` | `text`        |
| `sql_text`      | `text`        |
| `response_json` | `text`        |
| `duration_ms`   | `integer`     |
| `status`        | `text`        |
| `error`         | `text`        |

Partitions are created automatically for the current month on startup and on each insert.
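With the default `db_schema` (`log`) and table names, correlating a request with its DB calls is a straightforward join on `request_id` (an illustrative query against the documented columns; the UUID is a placeholder):

```sql
-- All DB calls made while serving one HTTP request
SELECT a.method, a.endpoint, a.status,
       d.function_name, d.duration_ms, d.status AS db_status
FROM log.api_logs a
JOIN log.api_db_logs d USING (request_id)
WHERE a.request_id = '00000000-0000-0000-0000-000000000000';
```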

---

## Production notes

- All DB writes are wrapped in `try/except`; logging failures **never** crash the app.
- Partition creation handles concurrent race conditions (catches `DuplicateTable`).
- File logging works independently of Postgres availability.
- DB inserts run via `asyncio.create_task`, so they add no latency to the response path.
- Body truncation via `max_body_bytes` prevents memory spikes on large payloads.
- `db_sample_rate` lets you log only a fraction of requests on high-traffic APIs.
- No root logger manipulation: only the package's own named logger is used.
- `app.state` keys are prefixed with `_fpgl_` to avoid collisions.
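Sampling and truncation combine naturally for high-traffic deployments (a sketch using the documented `LogConfig` fields; the exact values are illustrative):

```python
from fastapi_pg_logger import LogConfig

config = LogConfig(
    service_name="my-api",
    db_sample_rate=0.05,   # log ~5% of requests to Postgres
    max_body_bytes=4096,   # truncate captured bodies at 4 KiB
)
```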

---

## License

This project is licensed under the [GNU General Public License v3.0](./LICENSE).
