Metadata-Version: 2.4
Name: wren-engine
Version: 0.1.0
Summary: Wren Engine CLI and Python SDK — semantic SQL layer for 20+ data sources
Project-URL: Homepage, https://getwren.ai
Project-URL: Repository, https://github.com/Canner/wren-engine
Project-URL: Issues, https://github.com/Canner/wren-engine/issues
Author-email: Wren AI <contact@getwren.ai>
License: Apache-2.0
Keywords: analytics,cli,data-modeling,database,datafusion,mdl,python,sdk,semantic,semantic-layer,sql,wren,wrenai
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Requires-Python: >=3.11
Requires-Dist: boto3>=1.26
Requires-Dist: duckdb>=1.0
Requires-Dist: ibis-framework>=10
Requires-Dist: loguru>=0.7
Requires-Dist: opendal>=0.45
Requires-Dist: pandas>=2
Requires-Dist: pyarrow-hotfix>=0.6
Requires-Dist: pyarrow>=14
Requires-Dist: pyasn1>=0.6.3
Requires-Dist: pydantic>=2
Requires-Dist: pyopenssl>=26.0.0
Requires-Dist: requests>=2.33.0
Requires-Dist: sqlglot>=27
Requires-Dist: typer>=0.12
Requires-Dist: wren-core-py>=0.1
Provides-Extra: all
Requires-Dist: databricks-sdk; extra == 'all'
Requires-Dist: databricks-sql-connector; extra == 'all'
Requires-Dist: google-auth; extra == 'all'
Requires-Dist: ibis-framework[athena]; extra == 'all'
Requires-Dist: ibis-framework[bigquery]; extra == 'all'
Requires-Dist: ibis-framework[clickhouse]; extra == 'all'
Requires-Dist: ibis-framework[mssql]; extra == 'all'
Requires-Dist: ibis-framework[mysql]; extra == 'all'
Requires-Dist: ibis-framework[postgres]; extra == 'all'
Requires-Dist: ibis-framework[snowflake]; extra == 'all'
Requires-Dist: ibis-framework[trino]; extra == 'all'
Requires-Dist: lancedb>=0.6; extra == 'all'
Requires-Dist: mysqlclient>=2.2; extra == 'all'
Requires-Dist: oracledb>=2; extra == 'all'
Requires-Dist: psycopg>=3; extra == 'all'
Requires-Dist: pyspark>=3.5; extra == 'all'
Requires-Dist: redshift-connector; extra == 'all'
Requires-Dist: sentence-transformers>=2.2; extra == 'all'
Requires-Dist: trino>=0.321; extra == 'all'
Provides-Extra: athena
Requires-Dist: ibis-framework[athena]; extra == 'athena'
Provides-Extra: bigquery
Requires-Dist: google-auth; extra == 'bigquery'
Requires-Dist: ibis-framework[bigquery]; extra == 'bigquery'
Provides-Extra: clickhouse
Requires-Dist: ibis-framework[clickhouse]; extra == 'clickhouse'
Provides-Extra: databricks
Requires-Dist: databricks-sdk; extra == 'databricks'
Requires-Dist: databricks-sql-connector; extra == 'databricks'
Provides-Extra: dev
Requires-Dist: orjson>=3; extra == 'dev'
Requires-Dist: pytest>=8; extra == 'dev'
Requires-Dist: ruff>=0.4; extra == 'dev'
Requires-Dist: testcontainers[mysql,postgres]>=4; extra == 'dev'
Provides-Extra: memory
Requires-Dist: lancedb>=0.6; extra == 'memory'
Requires-Dist: sentence-transformers>=2.2; extra == 'memory'
Provides-Extra: mssql
Requires-Dist: ibis-framework[mssql]; extra == 'mssql'
Provides-Extra: mysql
Requires-Dist: ibis-framework[mysql]; extra == 'mysql'
Requires-Dist: mysqlclient>=2.2; extra == 'mysql'
Provides-Extra: oracle
Requires-Dist: oracledb>=2; extra == 'oracle'
Provides-Extra: postgres
Requires-Dist: ibis-framework[postgres]; extra == 'postgres'
Requires-Dist: psycopg>=3; extra == 'postgres'
Provides-Extra: redshift
Requires-Dist: redshift-connector; extra == 'redshift'
Provides-Extra: snowflake
Requires-Dist: ibis-framework[snowflake]; extra == 'snowflake'
Provides-Extra: spark
Requires-Dist: pyspark>=3.5; extra == 'spark'
Provides-Extra: trino
Requires-Dist: ibis-framework[trino]; extra == 'trino'
Requires-Dist: trino>=0.321; extra == 'trino'
Description-Content-Type: text/markdown

# wren-engine

[![PyPI version](https://img.shields.io/pypi/v/wren-engine.svg)](https://pypi.org/project/wren-engine/)
[![Python](https://img.shields.io/pypi/pyversions/wren-engine.svg)](https://pypi.org/project/wren-engine/)
[![License](https://img.shields.io/pypi/l/wren-engine.svg)](https://github.com/Canner/wren-engine/blob/main/LICENSE)

Wren Engine CLI and Python SDK — semantic SQL layer for 20+ data sources.

Write standard SQL against an [MDL (Modeling Definition Language)](https://docs.getwren.ai/) semantic layer; Wren Engine translates the query and executes it against your database. Powered by [Apache DataFusion](https://datafusion.apache.org/) and [Ibis](https://ibis-project.org/).

## Installation

```bash
pip install wren-engine                # Core (DuckDB included)
pip install 'wren-engine[postgres]'    # PostgreSQL
pip install 'wren-engine[mysql]'       # MySQL
pip install 'wren-engine[bigquery]'    # BigQuery
pip install 'wren-engine[snowflake]'   # Snowflake
pip install 'wren-engine[clickhouse]'  # ClickHouse
pip install 'wren-engine[trino]'       # Trino
pip install 'wren-engine[mssql]'       # SQL Server
pip install 'wren-engine[databricks]'  # Databricks
pip install 'wren-engine[redshift]'    # Redshift
pip install 'wren-engine[spark]'       # Spark
pip install 'wren-engine[athena]'      # Athena
pip install 'wren-engine[oracle]'      # Oracle
pip install 'wren-engine[memory]'      # Schema & query memory (LanceDB)
pip install 'wren-engine[all]'         # All connectors + memory
```

Requires Python 3.11+.

## Quick start

**1. Create `~/.wren/mdl.json`** — your semantic model:

```json
{
  "catalog": "wren",
  "schema": "public",
  "models": [
    {
      "name": "orders",
      "tableReference": { "schema": "mydb", "table": "orders" },
      "columns": [
        { "name": "order_id",    "type": "integer" },
        { "name": "customer_id", "type": "integer" },
        { "name": "total",       "type": "double" },
        { "name": "status",      "type": "varchar" }
      ],
      "primaryKey": "order_id"
    }
  ]
}
```
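Before pointing the CLI at the file, a quick stdlib-only sanity check can catch structural typos. This is a sketch: it verifies the shape shown above, not Wren's full MDL validation.

```python
# The quick-start mdl.json inlined as a Python dict, plus basic shape checks.
mdl = {
    "catalog": "wren",
    "schema": "public",
    "models": [
        {
            "name": "orders",
            "tableReference": {"schema": "mydb", "table": "orders"},
            "columns": [
                {"name": "order_id", "type": "integer"},
                {"name": "customer_id", "type": "integer"},
                {"name": "total", "type": "double"},
                {"name": "status", "type": "varchar"},
            ],
            "primaryKey": "order_id",
        }
    ],
}

for model in mdl["models"]:
    names = {c["name"] for c in model["columns"]}
    # primaryKey must name one of the model's own columns.
    assert model["primaryKey"] in names, f"{model['name']}: bad primaryKey"
```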

**2. Create `~/.wren/connection_info.json`** — your connection:

```json
{
  "datasource": "mysql",
  "host": "localhost",
  "port": 3306,
  "database": "mydb",
  "user": "root",
  "password": "secret"
}
```

**3. Run queries** — `wren` auto-discovers both files from `~/.wren`:

```bash
wren --sql 'SELECT order_id FROM "orders" LIMIT 10'
```

For the full CLI reference and per-datasource `connection_info.json` formats, see [`docs/cli.md`](docs/cli.md) and [`docs/connections.md`](docs/connections.md).

**4. Index schema for semantic search** (optional, requires `wren-engine[memory]`):

```bash
wren memory index                              # index MDL schema
wren memory fetch -q "customer order price"    # fetch relevant schema context
wren memory store --nl "top customers" --sql "SELECT ..."  # store NL→SQL pair
wren memory recall -q "best customers"         # retrieve similar past queries
```

---

## Python SDK

```python
import base64, orjson
from wren import WrenEngine, DataSource

manifest = { ... }  # your MDL dict
manifest_str = base64.b64encode(orjson.dumps(manifest)).decode()

with WrenEngine(manifest_str, DataSource.mysql, {"host": "...", ...}) as engine:
    result = engine.query('SELECT * FROM "orders" LIMIT 10')
    print(result.to_pandas())
```
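The snippet above uses `orjson`, but any JSON serializer works: the engine receives base64-encoded JSON text. A stdlib-only sketch of the same encoding step, using a minimal manifest modeled on the quick-start `mdl.json`:

```python
import base64
import json

# Minimal manifest in the quick-start shape (illustrative, not a full model).
manifest = {
    "catalog": "wren",
    "schema": "public",
    "models": [
        {
            "name": "orders",
            "tableReference": {"schema": "mydb", "table": "orders"},
            "columns": [{"name": "order_id", "type": "integer"}],
            "primaryKey": "order_id",
        }
    ],
}

# Serialize to JSON, then base64-encode to the string WrenEngine expects.
manifest_str = base64.b64encode(json.dumps(manifest).encode()).decode()

# The encoding round-trips losslessly.
assert json.loads(base64.b64decode(manifest_str)) == manifest
```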

---

## Development

```bash
just install-dev    # Install with dev dependencies
just lint           # Ruff format check + lint
just format         # Auto-fix
```

| Command | What it runs | Docker needed |
|---------|-------------|---------------|
| `just test-unit` | Unit tests | No |
| `just test-duckdb` | DuckDB connector tests | No |
| `just test-postgres` | PostgreSQL connector tests | Yes |
| `just test-mysql` | MySQL connector tests | Yes |
| `just test` | All tests | Yes |

## Publishing

```bash
./scripts/publish.sh            # Build + publish to PyPI
./scripts/publish.sh --test     # Build + publish to TestPyPI
./scripts/publish.sh --build    # Build only
```

## License

Apache-2.0
