Metadata-Version: 2.4
Name: project-toolkit
Version: 1.0.8
Summary: A modular toolkit for managing project paths, cloud services, and configuration in Python projects
Author: guocity
License: MIT
Project-URL: Homepage, https://github.com/guocity/project-toolkit
Project-URL: Repository, https://github.com/guocity/project-toolkit
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: pydantic>=2.0
Requires-Dist: pydantic-settings>=2.0
Provides-Extra: google
Requires-Dist: gspread; extra == "google"
Requires-Dist: gspread-dataframe; extra == "google"
Requires-Dist: ratelimit; extra == "google"
Requires-Dist: google-api-python-client; extra == "google"
Requires-Dist: google-auth; extra == "google"
Requires-Dist: google-cloud-bigquery; extra == "google"
Requires-Dist: pandas; extra == "google"
Requires-Dist: pandas_gbq; extra == "google"
Provides-Extra: cloudflare
Requires-Dist: cloudflare; extra == "cloudflare"
Requires-Dist: boto3; extra == "cloudflare"
Requires-Dist: requests; extra == "cloudflare"
Provides-Extra: vault
Requires-Dist: hvac; extra == "vault"
Provides-Extra: wan-ip
Provides-Extra: notifications
Requires-Dist: requests; extra == "notifications"
Provides-Extra: dev
Requires-Dist: pytest>=7.0; extra == "dev"
Requires-Dist: pytest-cov; extra == "dev"
Requires-Dist: requests; extra == "dev"
Provides-Extra: all
Requires-Dist: project-toolkit[cloudflare,dev,google,notifications,vault,wan-ip]; extra == "all"
Dynamic: license-file

# project-toolkit

A modular Python toolkit for managing project paths, cloud services (Google Cloud, Cloudflare R2), HashiCorp Vault, and configuration — powered by **pydantic-settings**.

## Features

- ⚙️ **Auto-Initialized Settings** — Just import and use, no setup needed
- 🔑 **Dynamic Env Access** — Read any env var or `.env` value with `settings.env("KEY")`
- 🔐 **Vault Integration** — Read secrets like `vault kv get` / `vault kv get -field=`
- 📁 **Project Path Management** — Create and navigate project directory structures
- ☁️ **Cloudflare** — Official `cloudflare-python` SDK wrapper, plus R2 `boto3` and `requests` clients
- 📊 **Google Services** — BigQuery, Drive, and Sheets integrations
- 🌐 **WAN IP Tool** — Fetch, track, and log public IPv4/IPv6 address changes via HTTP/2 and HTTP/3
- 🔔 **Notifications** — Framework-agnostic Discord alerts for Prefect and Airflow

## Installation

```bash
# Core only
pip install project-toolkit

# With specific extras
pip install "project-toolkit[google]"
pip install "project-toolkit[cloudflare]"
pip install "project-toolkit[vault]"
pip install "project-toolkit[wan-ip]"
pip install "project-toolkit[notifications]"

# Everything
pip install "project-toolkit[all]"
```

---

## API Usage Guide

### 1. Auto-Initialized Settings (No Setup Needed)

Settings auto-initialize on first use — no boilerplate required:

```python
from project_toolkit import settings

# Access any module setting directly
print(settings.cloudflare.r2_access_token)
print(settings.google.google_service_account_json)
print(settings.vault.is_configured)
```

**Environment detection** is automatic:
- If `AIRFLOW_HOME` is set → loads `config/.env.airflow`
- Otherwise → loads `config/.env.dev`

The `config/` directory is located by searching from your **current working directory** first,
then walking up the directory tree. This means it works whether you run from the project root,
a subdirectory, or even a Jupyter notebook inside the project.

---

### 2. Dynamic Environment Variable Access

Read **any** env var or `.env` value — no need to pre-define it in code:

```python
from project_toolkit import settings

# Read API keys (from env vars or .env file)
openai_key = settings.env("OPENAI_API_KEY")
gemini_key = settings.env("GEMINI_API_KEY")
claude_key = settings.env("CLAUDE_API_KEY")

# With a default fallback
db_host = settings.env("DATABASE_HOST", "localhost")
debug = settings.env("DEBUG_MODE", "false")
```

**Priority**: Environment variable > `.env` file value > default
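
Conceptually, the lookup resembles the sketch below — a simplified model of the precedence rule, not the library's actual implementation (`dotenv_values` here stands in for the parsed `.env` mapping):

```python
import os

def env_lookup(key: str, dotenv_values: dict, default=None):
    # 1. A real environment variable always wins
    value = os.environ.get(key)
    if value is not None:
        return value
    # 2. Fall back to the parsed .env file, then to the caller's default
    return dotenv_values.get(key, default)
```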

Your `config/.env.dev` can contain any key:

```env
OPENAI_API_KEY=sk-xxxxx
GEMINI_API_KEY=AIza-xxxxx
CLAUDE_API_KEY=sk-ant-xxxxx
DATABASE_HOST=db.example.com
```

---

### 3. HashiCorp Vault

Read secrets from Vault — same env vars as the `vault` CLI.

**Setup** — Add to `config/.env.dev`:

```env
VAULT_ADDR=http://192.168.12.2:8200
VAULT_TOKEN=hvs.your-vault-token
```

**Read all fields** (like `vault kv get secret/my-credentials`):

```python
from project_toolkit import settings

secret = settings.vault.read_secret("secret/my-credentials")
print(secret)
# {'username': 'admin', 'password': 'secret-password-123'}
```

**Read a single field** (like `vault kv get -field=OPENAI_API_KEY secret/api`):

```python
from project_toolkit import settings

# Single field reads
api_key = settings.vault.read_field("secret/api", "OPENAI_API_KEY")
password = settings.vault.read_field("secret/my-credentials", "password")

# With fallback default
db_pass = settings.vault.read_field("secret/db", "password", "default-pw")
```

**Use the hvac client directly** for advanced operations:

```python
from project_toolkit import settings

client = settings.vault.get_vault_client()
if client:
    # Write a secret
    client.secrets.kv.v2.create_or_update_secret(
        path="my-new-secret",
        secret={"api_key": "abc123"},
    )

    # List secrets
    secrets = client.secrets.kv.v2.list_secrets(path="", mount_point="secret")
    print(secrets["data"]["keys"])
```

---

### 4. Explicit `.env` File Path (Fallback)

If auto-detection cannot find your `config/.env.dev` — for example, when
running from a **Jupyter notebook**, a different working directory, or a
remote environment — you can pass the path explicitly:

```python
from project_toolkit import settings

# Point to your .env file explicitly
settings.configure(env_file="/path/to/your/config/.env.dev")

# Now use settings as normal
print(settings.cloudflare.r2_access_token)
print(settings.env("OPENAI_API_KEY"))
```

Or via `get_settings()` directly:

```python
from project_toolkit import get_settings

get_settings.cache_clear()
settings = get_settings(env_file="/path/to/your/config/.env.dev")
```

---

### 5. Manual Settings Override

For testing or custom configurations, construct `AppSettings` directly:

```python
from project_toolkit.settings.base import AppSettings
from project_toolkit.settings.cloudflare import CloudflareSettings
from project_toolkit.settings.google import GoogleSettings
from project_toolkit.settings.path import PathSettings
from project_toolkit.settings.vault import VaultSettings
from project_toolkit.settings.wan_ip import WanIpSettings
from project_toolkit.ip_tool import WanIP, IPChangeDetector

custom_settings = AppSettings(
    path_config=PathSettings(
        data_dir="/custom/data",
        project_name="my-project",
    ),
    cloudflare=CloudflareSettings(
        r2_access_token="custom-token",
        r2_account_id="custom-id",
    ),
    google=GoogleSettings(),
    vault=VaultSettings(),
    wan_ip=WanIpSettings(),
)

# Use the custom settings
print(custom_settings.cloudflare.r2_access_token)  # "custom-token"
print(custom_settings.env("R2_ACCESS_TOKEN"))       # reads from env/dotenv
```

To reset the cached auto-initialized singleton:

```python
from project_toolkit import settings, get_settings

# Reset cached settings (e.g., after changing env vars)
get_settings.cache_clear()
settings._reset()

# Next access will re-initialize
print(settings.cloudflare.r2_access_token)
```

---

### 6. Project Path Management

Manage project directory structures with automatic creation:

```python
from project_toolkit import DataPathConfig

# Direct usage
dpc = DataPathConfig(
    project_name="my-project",
    subproject="etl",
    data_dir="/data",
)
print(dpc.data_dir())        # /data
print(dpc.project_dir())     # /data/my-project
print(dpc.sub_project_dir()) # /data/my-project/etl

# Get a timestamped file name and full file path
filename = dpc.get_project_today_file_name("output", "csv")
filepath = dpc.sub_project_dir() / filename  # /data/my-project/etl/output_20260209.csv
```
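
The timestamped name above presumably follows a `{stem}_{YYYYMMDD}.{ext}` pattern; if you ever need the same shape outside the helper, it is easy to reproduce (the exact format used by `get_project_today_file_name` is an assumption here):

```python
from datetime import date

def today_file_name(stem: str, ext: str) -> str:
    # e.g. "output" + "csv" -> "output_20260209.csv" (date-dependent)
    return f"{stem}_{date.today():%Y%m%d}.{ext}"
```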

**From auto-initialized settings** (recommended):

```python
from project_toolkit import settings, DataPathConfig

dpc = DataPathConfig.from_settings(
    settings.path_config,
    project_name="my-project",
)
```

---

### 7. Cloudflare

```python
from project_toolkit.cloudflare import CloudflareSdkClient

# Zero-config (reads CLOUDFLARE_API_TOKEN & CLOUDFLARE_ACCOUNT_ID from env/dotenv)
cf = CloudflareSdkClient()

# DNS
records = cf.list_dns_records(zone_id="...")
cf.create_dns_record(zone_id="...", name="example.com", type="A", content="1.2.3.4")

# R2 Buckets
buckets = cf.list_r2_buckets()

# Workers KV
cf.kv_put(namespace_id="...", key_name="my_key", value="my_value")

# D1 Databases
results = cf.d1_query(database_id="...", sql="SELECT * FROM users")
```

For R2 file uploads via `boto3` or `requests`:

```python
from project_toolkit import settings
from project_toolkit.cloudflare.boto3_client import CloudflareBoto3Client

# Zero-config or from settings
client = CloudflareBoto3Client()
with open("/path/to/file.csv", "rb") as f:
    client.upload_file("my-bucket", "file.csv", f.read())
```

---

### 8. Google Sheets

```python
from project_toolkit import settings
from project_toolkit.google.sheet import GoogleSheetsManager

sheets = GoogleSheetsManager.from_settings(settings.google)

# Read a sheet as a DataFrame
df = sheets.sheet_df(sheet_id="your-sheet-id")
print(df.head())
```

---

### 9. Google BigQuery

```python
from project_toolkit import settings
from project_toolkit.google.bigquery import BigQueryManager

bq = BigQueryManager.from_settings(settings.google)

# Run a query
df = bq.query_table("SELECT * FROM my_dataset.my_table LIMIT 10")
```

---

### 10. Google Drive

```python
from project_toolkit import settings
from project_toolkit.google.drive import GoogleDriveManager

drive = GoogleDriveManager.from_settings(settings.google)

# List files in a folder
files = drive.list_files(folder_id="your-folder-id")
```

---

### 11. WAN IP Tool (Fetch & Track)

The `ip_tool` module provides a ready-to-use `wan_ip` object for quick lookups, plus classes for programmatic change detection.

#### Quick Access (Shorthand)

```python
from project_toolkit.ip_tool import wan_ip

# Property-style (attribute) access
print(wan_ip.ipv4)     # "104.28.236.177"
print(wan_ip.ipv6)     # "2a09:bac5..."
print(wan_ip.h3ipv4)   # HTTP/3 result

# Tab-separated report string
print(wan_ip.report()) # "H2v4: 1.1.1.1  H3v4: 1.1.1.2  v6: 2a09..."
```

#### Programmatic Change Detection

Use the `IPChangeDetector` to maintain a persistent log (`history.txt`) and a current snapshot (`current.json`).

```python
from project_toolkit.ip_tool import IPChangeDetector, WanIP

# Robust class-based usage
wan_ip_inst = WanIP(timeout=10)
detector = IPChangeDetector(
    project_name="my-app",
    subproject="office",
    wan_ip=wan_ip_inst
)

# Compare current IPs vs last saved and record if changed
changed, results = detector.check_and_record()

# Get the formatted line that was appended to history
print(detector.get_history_line())
```

---

### 12. Notifications (Discord)

Rich Discord notifications with built-in adapters for Prefect and Airflow.

Each adapter supports **zero-config** instantiation — just set the matching
env var (`DISCORD_PREFECT_WEBHOOK_URL` or `DISCORD_AIRFLOW_WEBHOOK_URL`) in
your environment or `config/.env.dev` and the adapter creates a
`DiscordNotifier` automatically.
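
Under the hood, zero-config instantiation amounts to reading the matching env var. A rough sketch (the adapters' real resolution logic may differ):

```python
import os

def resolve_webhook(env_var: str = "DISCORD_PREFECT_WEBHOOK_URL") -> str:
    # Look up the webhook URL from the environment; fail loudly if missing
    url = os.environ.get(env_var)
    if not url:
        raise ValueError(f"Set {env_var} or pass a DiscordNotifier explicitly")
    return url
```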

#### Prefect Integration

```python
from prefect import flow
from project_toolkit.notifications.adapters.prefect import PrefectAdapter

# Zero-config — reads DISCORD_PREFECT_WEBHOOK_URL from env / .env
hooks = PrefectAdapter()

@flow(
    on_failure=[hooks.on_failure],
    on_completion=[hooks.on_success],
    on_crashed=[hooks.on_crashed],
)
def my_flow():
    ...
```

Or pass an explicit notifier:

```python
from project_toolkit.notifications.discord import DiscordNotifier
from project_toolkit.notifications.adapters.prefect import PrefectAdapter

notifier = DiscordNotifier(webhook_url="https://discord.com/api/webhooks/...")
hooks = PrefectAdapter(notifier)
```

#### Airflow Integration

```python
from airflow import DAG
from project_toolkit.notifications.adapters.airflow import AirflowAdapter

# Zero-config — reads DISCORD_AIRFLOW_WEBHOOK_URL from env / .env
hooks = AirflowAdapter()

with DAG(
    dag_id="my_dag",
    on_failure_callback=hooks.on_failure,
    on_success_callback=hooks.on_success,
) as dag:
    ...
```

Or pass an explicit notifier:

```python
from project_toolkit.notifications.discord import DiscordNotifier
from project_toolkit.notifications.adapters.airflow import AirflowAdapter

notifier = DiscordNotifier(webhook_url="https://discord.com/api/webhooks/...")
hooks = AirflowAdapter(notifier)
```

#### Manual Context

```python
from project_toolkit.notifications.discord import DiscordNotifier
from project_toolkit.notifications.context import NotificationContext

notifier = DiscordNotifier(webhook_url="https://discord.com/api/webhooks/...")
notifier.notify(NotificationContext(
    status="failure",
    flow_name="custom_task",
    run_name="manual_run_1",
    error_message="Something went wrong",
))
```

---

## Configuration

### Environment Variables

| Variable | Module | Description |
|---|---|---|
| `CLOUDFLARE_API_TOKEN` | Cloudflare | Cloudflare API token (official) |
| `CLOUDFLARE_ACCOUNT_ID` | Cloudflare | Cloudflare account ID (official) |
| `CLOUDFLARE_ZONE_ID` | Cloudflare | Default zone ID for DNS operations |
| `{KEY}_ZONE_ID` | Cloudflare | Domain-specific zone IDs (e.g. `LGNAT_ZONE_ID`) |
| `R2_ACCESS_TOKEN` | Cloudflare | Cloudflare R2 API token (legacy) |
| `R2_ACCOUNT_ID` | Cloudflare | Cloudflare account ID (legacy) |
| `R2_DOMAIN` | Cloudflare | R2 custom domain URL |
| `GOOGLE_SERVICE_ACCOUNT_JSON` | Google | Path to service account JSON |
| `TEST_GOOGLE_SHEET_ID` | Google | Test sheet ID (dev only) |
| `VAULT_ADDR` | Vault | Vault server address (e.g., `http://192.168.12.2:8200`) |
| `VAULT_TOKEN` | Vault | Vault authentication token |
| `WAN_IP_LOG_DIR` | WAN IP | Directory for IP change logs |
| `DISCORD_PREFECT_WEBHOOK_URL` | Notifications | Discord webhook for Prefect flows |
| `DISCORD_AIRFLOW_WEBHOOK_URL` | Notifications | Discord webhook for Airflow DAGs |
| `AIRFLOW_HOME` | Core | Auto-detected for env switching |

> **Any key** in your `.env` file or environment can be read dynamically
> via `settings.env("KEY")` — no need to define it in Python code.

### `.env` Files

Place config files in the `config/` directory:

```
config/
├── .env.dev        # Local development (loaded when no AIRFLOW_HOME)
└── .env.airflow    # Airflow production (loaded when AIRFLOW_HOME is set)
```
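
The choice between the two files hinges on `AIRFLOW_HOME`; the decision is roughly:

```python
import os

def env_file_name() -> str:
    # AIRFLOW_HOME present -> Airflow production config, otherwise local dev
    return ".env.airflow" if os.environ.get("AIRFLOW_HOME") else ".env.dev"
```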

### `.env` File Detection Order

The toolkit searches for `config/` in this order:

1. `<cwd>/config/` — current working directory
2. Walk up from `cwd` (up to 5 parent levels)
3. Walk up from the package install location (works for editable installs)

If auto-detection fails, use `settings.configure(env_file=...)` to specify
the path explicitly.

### Priority Order

Environment variables > `.env` files > Vault secrets > defaults

### Troubleshooting: `.env` File Not Found

If settings load with `None` values, the `config/.env.dev` file was not
found. This commonly happens when:

- Running from a **Jupyter notebook** whose working directory differs from
  the project root.
- Running a script from a **different directory** than the project root.
- The package was installed via `pip install` (not editable mode) and
  there's no `config/` folder in the current working directory.

**Fix**: Call `configure()` before accessing settings:

```python
from project_toolkit import settings

settings.configure(env_file="/absolute/path/to/config/.env.dev")
print(settings.env("OPENAI_API_KEY"))  # ✓ works
```

---

## Development

```bash
# Install with all dependencies
pip install -e ".[all]"

# Run tests
pytest

# Run specific tests
pytest tests/test_settings.py -s -vv
pytest tests/test_vault.py -s -vv
pytest tests/test_auto_settings.py -s -vv
```

## Project Structure

```
project_toolkit/
├── __init__.py           # Package exports + auto-initialized settings singleton
├── project_path.py       # Project path management (DataPathConfig)
├── settings/
│   ├── __init__.py       # Settings exports
│   ├── base.py           # AppSettings + get_settings() + env()
│   ├── path.py           # PathSettings
│   ├── cloudflare.py     # CloudflareSettings
│   ├── google.py         # GoogleSettings
│   ├── vault.py          # VaultSettings + read_secret() + read_field()
│   └── wan_ip.py         # WanIpSettings
├── cloudflare/
│   ├── boto3_client.py   # R2 via boto3
│   └── requests_client.py # R2 via requests
├── google/
│   ├── bigquery.py       # BigQuery operations
│   ├── drive.py          # Google Drive operations
│   └── sheet.py          # Google Sheets operations
├── ip_tool/
│   ├── __init__.py       # Singleton 'wan_ip' proxy exports
│   ├── wan_ip.py         # WanIP core fetcher (H2, H3, v4, v6)
│   └── detector.py       # IP change detection + persistence log logic
└── notifications/
    ├── context.py        # Shared NotificationContext dataclass
    ├── discord.py        # Core Discord notifier
    └── adapters/
        ├── prefect.py    # Prefect (flow, flow_run, state) adapter
        └── airflow.py    # Airflow (context dict) adapter
```

## License

MIT
