Metadata-Version: 2.4
Name: JSONL-LOGGER
Version: 1.7.10
Summary: Async queue-based structured JSONL logging with thread-safe performance and auto-detected module names
Author-email: rocky <rocky@null.net>
License: MIT
Project-URL: Homepage, https://github.com/rocky/JSONL-LOGGER
Project-URL: Bug Reports, https://github.com/rocky/JSONL-LOGGER/issues
Project-URL: Source, https://github.com/rocky/JSONL-LOGGER
Keywords: logging,jsonl,structured-logging,async-logging,audit-logging
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: System :: Logging
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: python-dotenv>=1.2.2
Provides-Extra: dev
Requires-Dist: pytest>=9.0.2; extra == "dev"
Requires-Dist: pytest-cov>=7.1.0; extra == "dev"
Dynamic: license-file

# JSONL_LOGGER

Async queue-based structured logging with JSONL output — a single file whose only dependency is python-dotenv.

## Installation

```bash
pip install JSONL_LOGGER
```

## Quick Start

### Set environment variables in .env
```bash
PROJECT_DIRECTORY=/path/to/your/project
LOGS_LOCAL_TIMEZONE=Asia/Kolkata
LOGGER_FILE_NAME=orders  # Optional: default log file name (default: "LOGS")
```

### Alternatively, set LOGGER_FILE_NAME in your module
```python
# In your_module.py
import os
os.environ["LOGGER_FILE_NAME"] = "orders"
from JSONL_LOGGER import log_info
log_info("Order placed", order_id=123)
# logfile_name="orders", module_name="your_module" (auto-detected from __name__)
```

### Log File Naming
- **logfile_name**: The logical group for log files. Set via, in order:
  1. `LOGGER_FILE_NAME` env var in .env
  2. `os.environ["LOGGER_FILE_NAME"]` set in your module
  3. `logfile_name=` parameter in the function call
  4. Falls back to `"LOGS"` if none of the above is set

- **module_name**: The Python module name. Auto-detected from caller's `__name__`. Can be overridden via `module_name=` parameter.

- **source_file**: The actual Python filename. Auto-detected from call stack. Can be overridden via `source_file=` parameter.

### Import and use
```python
from JSONL_LOGGER import log_info, log_warn, log_error, log_metric
from JSONL_LOGGER import send_notification, send_notification_async

# Info logging with structured fields
log_info("User logged in", user_id=123, email="user@example.com")

# Warning logging
log_warn("Rate limit approaching", remaining=10, reset_seconds=60)

# Error logging with error codes
log_error("Payment failed", error_code=500, error="insufficient_funds")

# Metrics logging (custom METR level between INFO and WARNING)
log_metric("api_latency_ms", 142.5, unit="ms", endpoint="/checkout", method="POST")
log_metric("request_count", 1000, logfile_name="orders", module_name="order_service", status="success")

# Notifications to main_logger.jsonl
send_notification("Application started", source_file="main.py")
await send_notification_async("Deployment completed", source_file="deploy.py")
```

### All functions support optional parameters
- `logfile_name`: Log file name (auto-detected from LOGGER_FILE_NAME env/globals)
- `module_name`: Source module name (auto-detected from caller's `__name__`)
- `source_file`: Actual source filename (auto-detected from call stack)


### JSONL fields
`timestamp`, `timestamp_local`, `level`, `logfile_name`, `module_name`, `source_file`, `message`, `**extra_fields`

## Configuration

### Environment Variables

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| `PROJECT_DIRECTORY` | Yes | — | Root directory for log storage |
| `LOGS_LOCAL_TIMEZONE` | Yes | — | Local timezone (e.g. `Asia/Kolkata`, `UTC`) |
| `LOGGER_FILE_NAME` | No | `LOGS` | Default log file name |
| `CONSOLE_LOGGING_ENABLED` | No | `false` | Enable colored console output |
| `LOGGER_REGISTER_SIGNALS` | No | `false` | Register SIGINT/SIGTERM handlers for flush |
| `LOGGER_DAEMON_THREAD` | No | `true` | Writer thread daemon status |
| `LOGGER_MAX_BUFFER_SIZE` | No | `200000` | Maximum buffer entries before drop |

### Common Timezones

```
Asia:       Asia/Kolkata, Asia/Dubai, Asia/Singapore, Asia/Tokyo
Europe:     Europe/London, Europe/Paris, Europe/Berlin
Americas:   America/New_York, America/Chicago, America/Los_Angeles
UTC:        UTC
```
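The dual UTC/local timestamps in each log entry can be produced with the standard `zoneinfo` module. A minimal sketch, assuming millisecond precision and a `Z` suffix for UTC as shown in the sample output; `dual_timestamps` is a hypothetical helper name:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def dual_timestamps(tz_name):
    # One wall-clock instant rendered twice: UTC for machines,
    # the configured local zone for humans.
    now = datetime.now(timezone.utc)
    return {
        "timestamp": now.isoformat(timespec="milliseconds").replace("+00:00", "Z"),
        "timestamp_local": now.astimezone(ZoneInfo(tz_name)).isoformat(timespec="milliseconds"),
    }
```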

## Output

### File Structure

```
{PROJECT_DIRECTORY}/_LOGS_DIRECTORY/{YYYY_MM_DD}/LOGS/{logfile_name}.jsonl
```

Example: `/path/to/logs/_LOGS_DIRECTORY/2026_04_03/LOGS/orders.jsonl`
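The path template can be reproduced with `pathlib`. A sketch assuming the dated folder uses the UTC date (which clock the library actually uses is not documented here); `log_path` is a hypothetical name:

```python
from datetime import datetime, timezone
from pathlib import Path

def log_path(project_directory, logfile_name):
    # {PROJECT_DIRECTORY}/_LOGS_DIRECTORY/{YYYY_MM_DD}/LOGS/{logfile_name}.jsonl
    day = datetime.now(timezone.utc).strftime("%Y_%m_%d")
    return Path(project_directory) / "_LOGS_DIRECTORY" / day / "LOGS" / f"{logfile_name}.jsonl"
```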

### JSONL Format

Each log entry is a valid JSON object on a single line:

```json
{"timestamp":"2026-04-03T05:30:00.000Z","timestamp_local":"2026-04-03T11:00:00.000+05:30","level":"INFO","logfile_name":"orders","module_name":"orders","source_file":"order_service.py","message":"User logged in","user_id":123}
```
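Because each entry is a standalone JSON object, the files can be consumed line by line with nothing but the standard library:

```python
import io
import json

def read_jsonl(stream):
    # Parse one JSON object per non-empty line.
    for line in stream:
        line = line.strip()
        if line:
            yield json.loads(line)

# Demo with an in-memory stream; a real log file works the same way.
sample = io.StringIO(
    '{"level":"INFO","message":"User logged in","user_id":123}\n'
    '{"level":"ERROR","message":"Payment failed","error_code":500}\n'
)
entries = list(read_jsonl(sample))
```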

### Dual-Write Behavior

| Function | Writes To |
|----------|------------|
| `log_info()` | `{logfile_name}.jsonl` |
| `log_warn()` | `{logfile_name}.jsonl`, `{logfile_name}.warn.jsonl` |
| `log_error()` | `{logfile_name}.jsonl`, `{logfile_name}.errors.jsonl` |
| `log_metric()` | `{logfile_name}.metrics.jsonl` |
| `send_notification()` | `main_logger.jsonl` |
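The routing in the table can be expressed as a small pure function. A sketch of the mapping only, using the logical file name as the base; `target_files` is a hypothetical name:

```python
def target_files(level, logfile_name="LOGS"):
    # Warnings and errors are written twice: once to the main file and
    # once to a level-suffixed file; metrics get their own file.
    base = f"{logfile_name}.jsonl"
    routes = {
        "WARNING": [base, f"{logfile_name}.warn.jsonl"],
        "ERROR": [base, f"{logfile_name}.errors.jsonl"],
        "METR": [f"{logfile_name}.metrics.jsonl"],
    }
    return routes.get(level, [base])
```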

## Key Features

1. **Queue-Based Async Logging** — Background thread writes to disk; API calls return immediately
2. **Per-Module Log Files** — Each module gets its own JSONL file for independent retention
3. **Dual Timestamps** — UTC for machine parsing, local time for human readability
4. **Retry with Exponential Backoff + Jitter** — Up to 3 write attempts, with ±25% jitter on each delay
5. **Bounded Buffer** — Capped at 200k entries; oldest entries dropped on overflow to bound memory
6. **Signal Handlers** — Optional SIGINT/SIGTERM flush on shutdown
7. **Lazy Config Loading** — No side effects on import; validated on first use
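The producer/consumer split behind feature 1 can be sketched with `queue.Queue` and a writer thread. This standalone demo is illustrative only — it is not the package's code, and it writes to an in-memory buffer instead of a file:

```python
import io
import json
import queue
import sys
import threading

log_q = queue.Queue(maxsize=1000)

def _writer(out):
    # Consumer: drain the queue, one compact JSON line per entry,
    # until a None sentinel arrives.
    while True:
        entry = log_q.get()
        if entry is None:
            break
        out.write(json.dumps(entry, separators=(",", ":")) + "\n")

def log_info(message, **fields):
    # Producer: enqueue and return immediately; drop (with a stderr note)
    # instead of blocking when the queue is full.
    try:
        log_q.put_nowait({"level": "INFO", "message": message, **fields})
    except queue.Full:
        print("log queue full; entry dropped", file=sys.stderr)

buf = io.StringIO()
writer = threading.Thread(target=_writer, args=(buf,), daemon=True)
writer.start()
log_info("started", pid=1)
log_q.put(None)   # sentinel: stop the writer
writer.join()
```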

## Performance

```
┌───────────────────┬──────────┬──────────┬────────────┬─────────────────────────┐
│ Test              │ Logs     │ Time     │ Throughput │ Status                  │
├───────────────────┼──────────┼──────────┼────────────┼─────────────────────────┤
│ Single-thread     │ 10,000   │ 1.04 sec │ 9,643/sec  │ ✅ PASS                 │
│ Multi-thread      │ 100,000  │ 16.87 sec│ 5,902/sec  │ ✅ PASS                 │
└───────────────────┴──────────┴──────────┴────────────┴─────────────────────────┘
System: Ubuntu 24.04.4 LTS | AMD Ryzen 7 5800H (16 cores, 13Gi RAM) | Python 3.12.3
Tested: 2026-04-03 12:38 UTC (10 runs average)
```

## API Reference

### log_info(message, logfile_name=None, module_name=None, **extra_fields)
Log an info message with optional structured fields.
- `message`: Log message
- `logfile_name`: Log file name (auto-detected if omitted)
- `module_name`: Source module name (auto-detected if omitted)
- `**extra_fields`: Additional structured fields

### log_warn(message, logfile_name=None, module_name=None, **extra_fields)
Log a warning message. Writes to both main and `.warn.jsonl` files.
- Same parameters as `log_info`

### log_error(message, logfile_name=None, module_name=None, **extra_fields)
Log an error message. Writes to both main and `.errors.jsonl` files.
- Same parameters as `log_info`

### log_metric(metric_name, value, unit="", logfile_name=None, module_name=None, **tags)
Log a metric with custom METR level (between INFO and WARNING).
- `metric_name`: Metric identifier (e.g. "api_latency_ms")
- `value`: Numeric value (int or float)
- `unit`: Unit label (e.g. "ms", "bytes")
- `logfile_name`: Log file name (auto-detected if omitted)
- `module_name`: Source module name (auto-detected if omitted)
- `**tags`: Additional fields for grouping/filtering

### send_notification(message, logfile_name=None, module_name=None, source_file=None)
Send a notification to the main_logger.jsonl file (blocking).
- `message`: Notification text
- `logfile_name`, `module_name`: Same as `log_info` (auto-detected if omitted)
- `source_file`: Actual source filename (auto-detected if omitted)

### send_notification_async(message, logfile_name=None, module_name=None, source_file=None)
Send a notification to the main_logger.jsonl file (non-blocking).
- Same parameters as `send_notification`

### _flush_logs()
Manually flush all buffered logs to disk.

## Error Handling

The logger will raise `ValueError` if:
- `PROJECT_DIRECTORY` is not set or doesn't exist (on first use, not at import)
- `LOGS_LOCAL_TIMEZONE` is not set
- Directory is not writable

Non-primitive types in extra fields will emit a stderr warning and be stringified.
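The stringification behaviour might look roughly like this — a sketch only, with `sanitize_fields` as a hypothetical name:

```python
import sys

def sanitize_fields(fields):
    # JSON-primitive values pass through unchanged; anything else is
    # stringified, with a warning printed to stderr.
    primitives = (str, int, float, bool, type(None))
    clean = {}
    for key, value in fields.items():
        if isinstance(value, primitives):
            clean[key] = value
        else:
            print(f"non-primitive field {key!r} stringified", file=sys.stderr)
            clean[key] = str(value)
    return clean
```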

## Retry Policy

Disk writes use `_with_file_retry()`:
- **Max attempts**: 3
- **Delay**: Exponential backoff (0.1s, 0.2s, 0.4s) with ±25% jitter
- **Retryable**: All exceptions during file write
- **On exhaustion**: Lines re-buffered; oldest dropped if buffer exceeds cap
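The policy corresponds to a loop like the following — an approximation of `_with_file_retry()`, not its actual source; the `base_delay` parameter is an assumption:

```python
import random
import time

def with_file_retry(write_fn, attempts=3, base_delay=0.1):
    # Try the write; on any exception, sleep with an exponentially
    # growing delay and ±25% jitter before the next attempt.
    for attempt in range(attempts):
        try:
            return write_fn()
        except Exception:
            if attempt == attempts - 1:
                return None  # exhaustion: caller re-buffers the lines
            delay = base_delay * (2 ** attempt)
            time.sleep(delay * random.uniform(0.75, 1.25))
```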

## Test Coverage

```bash
python3 -m pytest JSONL_LOGGER.py -v
```

| Function | Tier | Tests | What is tested |
|----------|------|-------|----------------|
| `log_info()` | 1 | 6 | level, message, auto-detect, empty, special chars, 10k msg, dual-source |
| `log_warn()` | 1 | 6 | level, message, auto-detect, empty, special chars, 10k msg, dual-source |
| `log_error()` | 1 | 8 | level, message, auto-detect, empty, special chars, 10k msg, dual-write handlers, info/warn isolation, dual-source |
| `log_metric()` | 1 | 10 | METR level, float/int value, tags, auto-detect, unit default, zero, negative, metric_name field, audit isolation, dual-source |
| `send_notification()` | 1 | 8 | INFO level, message, module_name routing, source_file, empty msg, emoji/unicode, audit isolation, async variant |
| `_get_logfile_name()` | 2 | 3 | LOGGER_FILE_NAME priority, filename fallback, exception safety |
| `_get_actual_source_file()` | 2 | 2 | bypasses LOGGER_FILE_NAME, exception safety |
| `_get_log_path()` | 2 | 8 | suffix routing, ValueError on missing dirs, path construction, .py stripped, LOGS subdir |
| `_with_file_retry()` | 2 | 5 | success on attempt 1, success on attempt 3, exhaustion sentinel, all exception types, exponential backoff |
| `_flush_buffer()` | 2 | 5 | no-op on empty, cleared after write, re-buffered on exhaustion, buffer cap drop, stderr warning |
| `QueueHandler.emit()` | 2 | 3 | queue.put_nowait called, .errors suffix, full queue prints to stderr |
| `_flush_logs()` | 2 | 2 | sets _shutdown flag, drains queue when thread alive |
| `_get_timestamp()` | 3 | 4 | returns dict, UTC ends with Z, ISO-8601 ms precision, real clock |
| `_debug_print()` | 3 | 2 | stderr when True, silent when False |
| `_warn_non_primitive_fields()` | 3 | 5 | list warning, dict warning, datetime warning, primitives silent, caller module name |
| `ColoredFormatter.format()` | 3 | 6 | INFO green, WARNING yellow, ERROR red, METR emoji, message preserved, RESET present |
| `UniformLevelFormatter.format()` | 3 | 9 | valid JSON, required keys, compact separators, WARN level, METR level, extra fields, no reserved leaks, source absent/present, non-serialisable stringified |
