Metadata-Version: 2.4
Name: hera2-sdk
Version: 1.11.8.8
Summary: Hera2 Python SDK for metadata ingestion and management
Author: TMDC
License: Apache-2.0
Project-URL: Homepage, https://dataos.info/
Project-URL: Source, https://github.com/tmdc-io/hera2
Requires-Python: >=3.9
Description-Content-Type: text/markdown
Requires-Dist: hera2-ingestion~=1.11.8.1
Provides-Extra: airflow
Requires-Dist: hera2-ingestion[airflow]~=1.11.8.1; extra == "airflow"
Provides-Extra: amundsen
Requires-Dist: hera2-ingestion[amundsen]~=1.11.8.1; extra == "amundsen"
Provides-Extra: athena
Requires-Dist: hera2-ingestion[athena]~=1.11.8.1; extra == "athena"
Provides-Extra: atlas
Requires-Dist: hera2-ingestion[atlas]~=1.11.8.1; extra == "atlas"
Provides-Extra: azuresql
Requires-Dist: hera2-ingestion[azuresql]~=1.11.8.1; extra == "azuresql"
Provides-Extra: azure-sso
Requires-Dist: hera2-ingestion[azure-sso]~=1.11.8.1; extra == "azure-sso"
Provides-Extra: backup
Requires-Dist: hera2-ingestion[backup]~=1.11.8.1; extra == "backup"
Provides-Extra: bigquery
Requires-Dist: hera2-ingestion[bigquery]~=1.11.8.1; extra == "bigquery"
Provides-Extra: bigtable
Requires-Dist: hera2-ingestion[bigtable]~=1.11.8.1; extra == "bigtable"
Provides-Extra: clickhouse
Requires-Dist: hera2-ingestion[clickhouse]~=1.11.8.1; extra == "clickhouse"
Provides-Extra: dagster
Requires-Dist: hera2-ingestion[dagster]~=1.11.8.1; extra == "dagster"
Provides-Extra: dbt
Requires-Dist: hera2-ingestion[dbt]~=1.11.8.1; extra == "dbt"
Provides-Extra: db2
Requires-Dist: hera2-ingestion[db2]~=1.11.8.1; extra == "db2"
Provides-Extra: db2-ibmi
Requires-Dist: hera2-ingestion[db2-ibmi]~=1.11.8.1; extra == "db2-ibmi"
Provides-Extra: databricks
Requires-Dist: hera2-ingestion[databricks]~=1.11.8.1; extra == "databricks"
Provides-Extra: datalake-azure
Requires-Dist: hera2-ingestion[datalake-azure]~=1.11.8.1; extra == "datalake-azure"
Provides-Extra: datalake-gcs
Requires-Dist: hera2-ingestion[datalake-gcs]~=1.11.8.1; extra == "datalake-gcs"
Provides-Extra: datalake-s3
Requires-Dist: hera2-ingestion[datalake-s3]~=1.11.8.1; extra == "datalake-s3"
Provides-Extra: deltalake
Requires-Dist: hera2-ingestion[deltalake]~=1.11.8.1; extra == "deltalake"
Provides-Extra: deltalake-storage
Requires-Dist: hera2-ingestion[deltalake-storage]~=1.11.8.1; extra == "deltalake-storage"
Provides-Extra: deltalake-spark
Requires-Dist: hera2-ingestion[deltalake-spark]~=1.11.8.1; extra == "deltalake-spark"
Provides-Extra: domo
Requires-Dist: hera2-ingestion[domo]~=1.11.8.1; extra == "domo"
Provides-Extra: doris
Requires-Dist: hera2-ingestion[doris]~=1.11.8.1; extra == "doris"
Provides-Extra: druid
Requires-Dist: hera2-ingestion[druid]~=1.11.8.1; extra == "druid"
Provides-Extra: dynamodb
Requires-Dist: hera2-ingestion[dynamodb]~=1.11.8.1; extra == "dynamodb"
Provides-Extra: elasticsearch
Requires-Dist: hera2-ingestion[elasticsearch]~=1.11.8.1; extra == "elasticsearch"
Provides-Extra: opensearch
Requires-Dist: hera2-ingestion[opensearch]~=1.11.8.1; extra == "opensearch"
Provides-Extra: exasol
Requires-Dist: hera2-ingestion[exasol]~=1.11.8.1; extra == "exasol"
Provides-Extra: glue
Requires-Dist: hera2-ingestion[glue]~=1.11.8.1; extra == "glue"
Provides-Extra: great-expectations
Requires-Dist: hera2-ingestion[great-expectations]~=1.11.8.1; extra == "great-expectations"
Provides-Extra: great-expectations-1xx
Requires-Dist: hera2-ingestion[great-expectations-1xx]~=1.11.8.1; extra == "great-expectations-1xx"
Provides-Extra: greenplum
Requires-Dist: hera2-ingestion[greenplum]~=1.11.8.1; extra == "greenplum"
Provides-Extra: cockroach
Requires-Dist: hera2-ingestion[cockroach]~=1.11.8.1; extra == "cockroach"
Provides-Extra: hive
Requires-Dist: hera2-ingestion[hive]~=1.11.8.1; extra == "hive"
Provides-Extra: iceberg
Requires-Dist: hera2-ingestion[iceberg]~=1.11.8.1; extra == "iceberg"
Provides-Extra: impala
Requires-Dist: hera2-ingestion[impala]~=1.11.8.1; extra == "impala"
Provides-Extra: kafka
Requires-Dist: hera2-ingestion[kafka]~=1.11.8.1; extra == "kafka"
Provides-Extra: kafkaconnect
Requires-Dist: hera2-ingestion[kafkaconnect]~=1.11.8.1; extra == "kafkaconnect"
Provides-Extra: kinesis
Requires-Dist: hera2-ingestion[kinesis]~=1.11.8.1; extra == "kinesis"
Provides-Extra: looker
Requires-Dist: hera2-ingestion[looker]~=1.11.8.1; extra == "looker"
Provides-Extra: mlflow
Requires-Dist: hera2-ingestion[mlflow]~=1.11.8.1; extra == "mlflow"
Provides-Extra: mongo
Requires-Dist: hera2-ingestion[mongo]~=1.11.8.1; extra == "mongo"
Provides-Extra: cassandra
Requires-Dist: hera2-ingestion[cassandra]~=1.11.8.1; extra == "cassandra"
Provides-Extra: couchbase
Requires-Dist: hera2-ingestion[couchbase]~=1.11.8.1; extra == "couchbase"
Provides-Extra: mssql
Requires-Dist: hera2-ingestion[mssql]~=1.11.8.1; extra == "mssql"
Provides-Extra: mssql-odbc
Requires-Dist: hera2-ingestion[mssql-odbc]~=1.11.8.1; extra == "mssql-odbc"
Provides-Extra: mysql
Requires-Dist: hera2-ingestion[mysql]~=1.11.8.1; extra == "mysql"
Provides-Extra: nifi
Requires-Dist: hera2-ingestion[nifi]~=1.11.8.1; extra == "nifi"
Provides-Extra: openlineage
Requires-Dist: hera2-ingestion[openlineage]~=1.11.8.1; extra == "openlineage"
Provides-Extra: oracle
Requires-Dist: hera2-ingestion[oracle]~=1.11.8.1; extra == "oracle"
Provides-Extra: pgspider
Requires-Dist: hera2-ingestion[pgspider]~=1.11.8.1; extra == "pgspider"
Provides-Extra: pinotdb
Requires-Dist: hera2-ingestion[pinotdb]~=1.11.8.1; extra == "pinotdb"
Provides-Extra: postgres
Requires-Dist: hera2-ingestion[postgres]~=1.11.8.1; extra == "postgres"
Provides-Extra: powerbi
Requires-Dist: hera2-ingestion[powerbi]~=1.11.8.1; extra == "powerbi"
Provides-Extra: qliksense
Requires-Dist: hera2-ingestion[qliksense]~=1.11.8.1; extra == "qliksense"
Provides-Extra: presto
Requires-Dist: hera2-ingestion[presto]~=1.11.8.1; extra == "presto"
Provides-Extra: pymssql
Requires-Dist: hera2-ingestion[pymssql]~=1.11.8.1; extra == "pymssql"
Provides-Extra: quicksight
Requires-Dist: hera2-ingestion[quicksight]~=1.11.8.1; extra == "quicksight"
Provides-Extra: redash
Requires-Dist: hera2-ingestion[redash]~=1.11.8.1; extra == "redash"
Provides-Extra: redpanda
Requires-Dist: hera2-ingestion[redpanda]~=1.11.8.1; extra == "redpanda"
Provides-Extra: redshift
Requires-Dist: hera2-ingestion[redshift]~=1.11.8.1; extra == "redshift"
Provides-Extra: sagemaker
Requires-Dist: hera2-ingestion[sagemaker]~=1.11.8.1; extra == "sagemaker"
Provides-Extra: salesforce
Requires-Dist: hera2-ingestion[salesforce]~=1.11.8.1; extra == "salesforce"
Provides-Extra: sample-data
Requires-Dist: hera2-ingestion[sample-data]~=1.11.8.1; extra == "sample-data"
Provides-Extra: sap-hana
Requires-Dist: hera2-ingestion[sap-hana]~=1.11.8.1; extra == "sap-hana"
Provides-Extra: sas
Requires-Dist: hera2-ingestion[sas]~=1.11.8.1; extra == "sas"
Provides-Extra: singlestore
Requires-Dist: hera2-ingestion[singlestore]~=1.11.8.1; extra == "singlestore"
Provides-Extra: sklearn
Requires-Dist: hera2-ingestion[sklearn]~=1.11.8.1; extra == "sklearn"
Provides-Extra: snowflake
Requires-Dist: hera2-ingestion[snowflake]~=1.11.8.1; extra == "snowflake"
Provides-Extra: superset
Requires-Dist: hera2-ingestion[superset]~=1.11.8.1; extra == "superset"
Provides-Extra: tableau
Requires-Dist: hera2-ingestion[tableau]~=1.11.8.1; extra == "tableau"
Provides-Extra: teradata
Requires-Dist: hera2-ingestion[teradata]~=1.11.8.1; extra == "teradata"
Provides-Extra: trino
Requires-Dist: hera2-ingestion[trino]~=1.11.8.1; extra == "trino"
Provides-Extra: vertica
Requires-Dist: hera2-ingestion[vertica]~=1.11.8.1; extra == "vertica"
Provides-Extra: pandas
Requires-Dist: hera2-ingestion[pandas]~=1.11.8.1; extra == "pandas"
Provides-Extra: pyarrow
Requires-Dist: hera2-ingestion[pyarrow]~=1.11.8.1; extra == "pyarrow"
Provides-Extra: pii-processor
Requires-Dist: hera2-ingestion[pii-processor]~=1.11.8.1; extra == "pii-processor"
Provides-Extra: presidio-analyzer
Requires-Dist: hera2-ingestion[presidio-analyzer]~=1.11.8.1; extra == "presidio-analyzer"
Provides-Extra: all
Requires-Dist: hera2-ingestion[all]~=1.11.8.1; extra == "all"
Provides-Extra: slim
Requires-Dist: hera2-ingestion[slim]~=1.11.8.1; extra == "slim"
Provides-Extra: all-dev-env
Requires-Dist: hera2-ingestion[all-dev-env]~=1.11.8.1; extra == "all-dev-env"
Provides-Extra: dev
Requires-Dist: hera2-ingestion[dev]~=1.11.8.1; extra == "dev"
Provides-Extra: test
Requires-Dist: hera2-ingestion[test]~=1.11.8.1; extra == "test"
Provides-Extra: test-unit
Requires-Dist: hera2-ingestion[test-unit]~=1.11.8.1; extra == "test-unit"
Provides-Extra: e2e-test
Requires-Dist: hera2-ingestion[e2e_test]~=1.11.8.1; extra == "e2e-test"
Provides-Extra: playwright
Requires-Dist: hera2-ingestion[playwright]~=1.11.8.1; extra == "playwright"
Provides-Extra: docs
Requires-Dist: hera2-ingestion[docs]~=1.11.8.1; extra == "docs"
Provides-Extra: data-insight
Requires-Dist: hera2-ingestion[data-insight]~=1.11.8.1; extra == "data-insight"
Dynamic: provides-extra
Dynamic: requires-dist

# Hera2 Python SDK

A modern, fluent Python SDK for OpenMetadata that provides an intuitive API for entity, search, lineage, and bulk operations. Authentication is routed through the Heimdall authorization service for DataOS integration.

Installing `hera2-sdk` automatically pulls in `hera2-ingestion` (a HERA-modified fork of `openmetadata-ingestion`) and shares the `metadata` namespace. You therefore get both `metadata.sdk` (Heimdall auth, fluent API) and the full ingestion stack: `metadata.ingestion`, `metadata.generated`, `metadata.clients`, `metadata.profiler`, `metadata.utils`, `metadata.workflow`, etc.

## Installation

**Python 3.9+ required.** We recommend Python 3.10 or 3.11 for best compatibility.
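If a script must fail fast on an unsupported interpreter, a small guard at the top makes the requirement explicit (illustrative snippet, not part of the SDK):

```python
import sys

# hera2-sdk requires Python 3.9+; 3.10/3.11 are recommended.
MINIMUM = (3, 9)

def meets_minimum(version_info=sys.version_info, minimum=MINIMUM):
    """Return True when the interpreter satisfies the version requirement."""
    return tuple(version_info[:2]) >= minimum

if not meets_minimum():
    raise SystemExit(
        f"Python {MINIMUM[0]}.{MINIMUM[1]}+ required, "
        f"found {sys.version_info.major}.{sys.version_info.minor}"
    )
```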

**macOS note:** On macOS, `python3.10` or `python3.11` in your shell may be a symlink to the system Python 3.9. Always use the **full path** to the Homebrew-installed interpreter when creating the venv:

```bash
# Install Python 3.11 via Homebrew if not already available
brew install python@3.11

# Verify the Homebrew interpreter (M-series Mac uses /opt/homebrew, Intel uses /usr/local)
/opt/homebrew/bin/python3.11 --version   # should print 3.11.x

# Create the venv using the full path
/opt/homebrew/bin/python3.11 -m venv .venv
source .venv/bin/activate

# Always use python3 -m pip (not bare pip) to avoid shell alias interference
python3 -m pip install hera2-sdk
```

After activation, confirm the interpreter is correct before installing:

```bash
python3 --version   # must show 3.11.x (or 3.10.x)
python3 -m pip --version   # must show python 3.11 (or 3.10), not 3.9
```

For an editable install from the repo:

```bash
pip install -e /path/to/hera2/ingestion/
pip install -e /path/to/hera2/hera2-sdk/
```

### Data Quality SDK Installation

For running data quality tests, additional dependencies may be required:

**DataFrame Validation:**
```bash
pip install 'hera2-sdk[pandas]'
```

**Table-Based Testing:**
```bash
pip install 'hera2-sdk[mysql]'        # For MySQL
pip install 'hera2-sdk[postgres]'     # For PostgreSQL
pip install 'hera2-sdk[snowflake]'    # For Snowflake
pip install 'hera2-sdk[clickhouse]'   # For ClickHouse
```

### Troubleshooting

- **`python --version` shows 3.9 inside the venv** — Your shell has an alias (`alias python=...`) that overrides venv activation. Check with:
  ```bash
  type python   # shows "python is an alias for ..."
  grep -n "python" ~/.zshrc ~/.zprofile 2>/dev/null
  ```
  Remove the alias from `~/.zshrc` / `~/.zprofile` and reload (`source ~/.zshrc`). Until then, use `python3` instead of `python` — the venv’s `python3` is not affected by the alias.

- **`pip --version` shows python 3.9** — Similarly, `pip` may be aliased to the system pip (`alias pip=/usr/bin/pip3`). Always use `python3 -m pip` instead of bare `pip` to drive pip through the venv interpreter:
  ```bash
  python3 -m pip install hera2-sdk==<version>
  ```

- **"Defaulting to user installation because normal site-packages is not writeable"** — Pip is installing to your user directory instead of the active venv.
  1. Use the venv’s Python explicitly: `python3 -m pip install hera2-sdk`.
  2. If it still happens, make the venv writable and reinstall: `chmod -R u+w .venv` (or your venv dir), then `python3 -m pip install --force-reinstall hera2-sdk`.
  3. Confirm the venv is the one being used: `python3 -c "import sys; print(sys.prefix)"` should print a path inside your venv. If it prints a system path, the venv wasn’t activated or `python3` in your shell isn’t from the venv.

- **`TypeError: unsupported operand type(s) for |: 'type' and 'NoneType'`** when importing `metadata.ingestion` — The interpreter is loading the **local** `metadata` package from the hera2 ingestion source (`.../hera2/ingestion/src/metadata/`) instead of the installed `hera2-ingestion`. Something is adding that path to `sys.path`.
  1. Unset `PYTHONPATH` before running Python: `unset PYTHONPATH`, then run your script again.
  2. See what adds the repo to the path: `python3 -c "import sys; print([p for p in sys.path if 'hera2' in p or 'ingestion' in p])"`. If you see a path like `.../hera2/ingestion/src`, it was added by `PYTHONPATH` or a `.pth` file in site-packages.
  3. Run Python from a directory that is **outside** the hera2 repo.

- **`UserWarning: pkg_resources is deprecated as an API`** on import — Harmless deprecation warning. Suppress with:
  ```python
  import warnings
  warnings.filterwarnings("ignore", category=UserWarning, module="pkg_resources")
  ```
  Or via environment variable: `PYTHONWARNINGS="ignore::UserWarning:pkg_resources" python3 your_script.py`
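The interpreter checks from the bullets above can be bundled into one diagnostic snippet (`environment_report` is an illustrative helper, not part of the SDK):

```python
import os
import sys

def environment_report():
    """Collect the interpreter facts the troubleshooting steps above inspect."""
    return {
        "version": f"{sys.version_info.major}.{sys.version_info.minor}",
        "prefix": sys.prefix,                      # the active environment
        "in_venv": sys.prefix != sys.base_prefix,  # True inside a venv
        "pythonpath": os.environ.get("PYTHONPATH", ""),
        # Paths that could shadow the installed hera2-ingestion package:
        "suspect_paths": [p for p in sys.path if "hera2" in p or "ingestion" in p],
    }

for key, value in environment_report().items():
    print(f"{key}: {value}")
```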

## Quick Start

### Configure the SDK (Heimdall Auth — Recommended)

Use **heimdallConfiguration** (the same structure as `authenticationConfiguration.heimdallConfiguration` in `hera/config/config.yaml`):

```python
from metadata.sdk import configure

configure(
    host="http://localhost:8585/api",
    api_key="your-dataos-api-key",
    heimdall_configuration={
        "enabled": True,
        "baseUrl": "https://your-instance.dataos.cloud/heimdall",
        "timeout": 10,
        "fallbackOnBasic": True,
    },
)
```

Or use the legacy `heimdall_url`:

```python
configure(
    host="http://localhost:8585/api",
    api_key="your-dataos-api-key",
    heimdall_url="https://your-instance.dataos.cloud/heimdall",
)
```

Or set environment variables and call `configure()` with no arguments:

```bash
export OPENMETADATA_HOST="http://localhost:8585/api"
export OPENMETADATA_API_KEY="your-dataos-api-key"
export HEIMDALL_BASE_URL="https://your-instance.dataos.cloud/heimdall"
```

```python
from metadata.sdk import configure
configure()
```
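All three variants boil down to one decision: exchange the DataOS API key via Heimdall when it is enabled, and fall back when `fallbackOnBasic` allows it. A conceptual sketch of that resolution order (hypothetical helper; the real SDK performs the exchange internally over HTTP):

```python
# Conceptual sketch only: resolve_token is NOT the SDK's implementation.
def resolve_token(api_key, heimdall_configuration, exchange):
    """Return a bearer token, trying Heimdall first when it is enabled.

    `exchange` is a callable (base_url, api_key) -> token standing in for
    the real Heimdall HTTP call; it raises on failure.
    """
    cfg = heimdall_configuration or {}
    if cfg.get("enabled"):
        try:
            return exchange(cfg["baseUrl"], api_key)
        except Exception:
            if not cfg.get("fallbackOnBasic"):
                raise
    # Fallback: use the API key directly as the bearer token.
    return api_key

# Stub exchange that always succeeds:
token = resolve_token(
    "my-key",
    {"enabled": True, "baseUrl": "https://heimdall.example", "fallbackOnBasic": True},
    exchange=lambda url, key: f"heimdall-token-for-{key}",
)
```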

### Configure Parameters

The `configure()` function supports:
- **`host`** or **`server_url`**: OpenMetadata server URL
- **`api_key`** or **`jwt_token`**: DataOS API key or JWT token
- **`heimdall_configuration`**: Dict matching `hera/config/config.yaml` `heimdallConfiguration` (enabled, baseUrl, timeout, fallbackOnBasic, trustAll)
- **`heimdall_url`**: Heimdall base URL (legacy; use heimdall_configuration when possible)
- Falls back to environment variables:
  - `OPENMETADATA_HOST` or `OPENMETADATA_SERVER_URL` for the server URL
  - `OPENMETADATA_API_KEY` or `OPENMETADATA_JWT_TOKEN` for authentication
  - `HEIMDALL_BASE_URL`: Heimdall service URL (enables Heimdall auth)
  - `HEIMDALL_TIMEOUT`: Heimdall request timeout in seconds (default: 10)
  - `HEIMDALL_TRUST_ALL`: Trust all SSL certs for Heimdall (default: true)
  - `OPENMETADATA_VERIFY_SSL`: Enable SSL verification (default: false)
  - `OPENMETADATA_CA_BUNDLE`: Path to CA bundle
  - `OPENMETADATA_CLIENT_TIMEOUT`: Client timeout in seconds (default: 30)
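The fallback order above can be sketched as a plain function over `os.environ` (`config_from_env` below is a hypothetical mirror of the documented behavior, not the SDK source):

```python
import os

def config_from_env(environ=os.environ):
    """Resolve SDK settings using the documented environment-variable fallbacks."""
    def first(*names, default=None):
        # Return the first variable that is set and non-empty.
        for name in names:
            if environ.get(name):
                return environ[name]
        return default

    return {
        "host": first("OPENMETADATA_HOST", "OPENMETADATA_SERVER_URL"),
        "api_key": first("OPENMETADATA_API_KEY", "OPENMETADATA_JWT_TOKEN"),
        "heimdall_url": first("HEIMDALL_BASE_URL"),
        "heimdall_timeout": int(first("HEIMDALL_TIMEOUT", default="10")),
        "verify_ssl": first("OPENMETADATA_VERIFY_SSL", default="false") == "true",
        "client_timeout": int(first("OPENMETADATA_CLIENT_TIMEOUT", default="30")),
    }
```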

### Alternative: Builder Pattern

```python
from metadata.sdk.config import OpenMetadataConfig

config = (
    OpenMetadataConfig.builder()
    .server_url("http://localhost:8585/api")
    .api_key("your-dataos-api-key")
    .heimdall_configuration({
        "enabled": True,
        "baseUrl": "https://your-instance.dataos.cloud/heimdall",
        "timeout": 15,
        "fallbackOnBasic": True,
    })
    .build()
)
```

Or with flat params: `.heimdall_url("...").heimdall_timeout(15)`.
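The chained setters follow the classic builder idiom: each setter returns the builder so calls compose, and `build()` validates and produces the final object. A minimal sketch of the pattern (the class below is illustrative, not the SDK's `OpenMetadataConfig` implementation):

```python
# Illustrative builder sketch, not the SDK's actual config class.
class ConfigBuilder:
    def __init__(self):
        self._values = {}

    def server_url(self, url):
        self._values["server_url"] = url
        return self  # returning self enables chaining

    def api_key(self, key):
        self._values["api_key"] = key
        return self

    def heimdall_configuration(self, cfg):
        self._values["heimdall_configuration"] = dict(cfg)
        return self

    def build(self):
        # Validation happens once, at the end of the chain.
        if "server_url" not in self._values:
            raise ValueError("server_url is required")
        return dict(self._values)  # a real builder returns a config object

config = (
    ConfigBuilder()
    .server_url("http://localhost:8585/api")
    .api_key("your-dataos-api-key")
    .build()
)
```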

### Alternative: Direct JWT (Legacy)

If Heimdall is not available, the SDK falls back to direct JWT authentication:

```python
from metadata.sdk import configure
configure(host="http://localhost:8585/api", jwt_token="your-om-jwt-token")
```

### Using it like the [OpenMetadata Python SDK](https://docs.open-metadata.org/v1.11.x/sdk/python#1-initialize-openmetadata)

hera2-sdk depends on `hera2-ingestion`, so you can use the **same low-level API** as in the [official OpenMetadata SDK docs](https://docs.open-metadata.org/v1.11.x/sdk/python#1-initialize-openmetadata): `OpenMetadataConnection` + `OpenMetadata(server_config)`, then `metadata.create_or_update()`, `metadata.get_by_name()`, `metadata.delete()`, etc.

**Option 1 — Standard OpenMetadata style (same as the docs)**

```python
from metadata.ingestion.ometa.ometa_api import OpenMetadata
from metadata.generated.schema.entity.services.connections.metadata.openMetadataConnection import (
    OpenMetadataConnection,
    AuthProvider,
)
from metadata.generated.schema.security.client.openMetadataJWTClientConfig import (
    OpenMetadataJWTClientConfig,
)
from metadata.generated.schema.entity.data.table import Table

server_config = OpenMetadataConnection(
    hostPort="http://localhost:8585/api",
    authProvider=AuthProvider.openmetadata,
    securityConfig=OpenMetadataJWTClientConfig(
        jwtToken="<YOUR-INGESTION-BOT-JWT-TOKEN>",
    ),
)
metadata = OpenMetadata(server_config)

# Same API as in the docs (`create_service` here is a Create*Request built as
# in the walkthrough, e.g. a CreateDatabaseServiceRequest)
metadata.health_check()
service_entity = metadata.create_or_update(data=create_service)
my_table = metadata.get_by_name(entity=Table, fqn="test-service-table.test-db.test-schema.test")
metadata.delete(entity=Table, entity_id=my_table.id)
```

**Option 2 — hera2-sdk wrapper (same API + optional Heimdall)**

Use `configure()` or `OpenMetadataConfig`, then get the underlying client via `.ometa` and call the same methods:

```python
from metadata.sdk import configure, client
from metadata.generated.schema.entity.data.table import Table

configure(
    host="http://localhost:8585/api",
    api_key="your-dataos-api-key",
    heimdall_configuration={
        "enabled": True,
        "baseUrl": "https://your-instance.dataos.cloud/heimdall",
        "timeout": 10,
        "fallbackOnBasic": True,
    },
)

metadata = client().ometa   # same interface as OpenMetadata(server_config)

metadata.health_check()
service_entity = metadata.create_or_update(data=create_service)  # create_service: a Create*Request as in the docs
my_table = metadata.get_by_name(entity=Table, fqn="test-service-table.test-db.test-schema.test")
metadata.delete(entity=Table, entity_id=my_table.id)
```

So you can follow the [OpenMetadata SDK walkthrough](https://docs.open-metadata.org/v1.11.x/sdk/python) (create DatabaseService, Database, Schema, Table, etc.) with either the raw `OpenMetadata` from `metadata.ingestion.ometa.ometa_api` or with `client().ometa` after configuring hera2-sdk.

### Manual Initialization

For more control, you can manually initialize the SDK:

```python
from metadata.sdk import OpenMetadata, OpenMetadataConfig
from metadata.sdk.entities import Table, User
from metadata.sdk.api import Search, Lineage, Bulk

config = OpenMetadataConfig(
    server_url="http://localhost:8585/api",
    api_key="your-dataos-api-key",
    heimdall_configuration={
        "enabled": True,
        "baseUrl": "https://your-instance.dataos.cloud/heimdall",
        "timeout": 10,
        "fallbackOnBasic": True,
    },
)

client = OpenMetadata.initialize(config)

Table.set_default_client(client)
User.set_default_client(client)
Search.set_default_client(client)
Lineage.set_default_client(client)
Bulk.set_default_client(client)
```

### Configuration from Environment Variables Only

```python
from metadata.sdk.config import OpenMetadataConfig

# Reads from OPENMETADATA_HOST, OPENMETADATA_API_KEY, HEIMDALL_BASE_URL, etc.
config = OpenMetadataConfig.from_env()
```

## Entity Operations

### Tables

```python
from metadata.generated.schema.api.data.createTable import CreateTableRequest
from metadata.sdk.entities import Table
from metadata.sdk.entities.table import TableListParams

# Create a table
request = CreateTableRequest(
    name="my_table",
    databaseSchema="my_schema",
    columns=[...]
)
table = Table.create(request)

# Retrieve a table by ID
table = Table.retrieve("table-id")

# Retrieve by fully qualified name with specific fields
table = Table.retrieve_by_name(
    "service.database.schema.table",
    fields=["owners", "tags", "columns"]
)

# List tables with pagination
for table in Table.list().auto_paging_iterable():
    print(table.name)

# List with filters
params = TableListParams.builder() \
    .limit(50) \
    .database("my_database") \
    .fields(["owners", "tags"]) \
    .build()

tables = Table.list(params)

# Update a table
table.description = "Updated description"
updated = Table.update(table.id, table)

# Delete a table
Table.delete("table-id")

# Delete with options
Table.delete("table-id", recursive=True, hard_delete=True)

# Export/Import CSV
csv_data = Table.export_csv("table-name")
Table.import_csv(csv_data, dry_run=False)
```
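`auto_paging_iterable()` hides cursor management: it keeps requesting pages until the server stops returning a cursor. A conceptual sketch of that loop against a stubbed backend (the `fetch_page` callable is hypothetical, not the SDK's internals):

```python
# Conceptual sketch of auto-paging: fetch pages until there is no next cursor.
def auto_paging_iterable(fetch_page):
    """Yield entities across pages; fetch_page(after) -> (items, next_cursor)."""
    after = None
    while True:
        items, after = fetch_page(after)
        yield from items
        if after is None:
            return

# Fake backend with two pages:
pages = {
    None: (["t1", "t2"], "cursor-1"),
    "cursor-1": (["t3"], None),
}
names = list(auto_paging_iterable(lambda after: pages[after]))
# names == ["t1", "t2", "t3"]
```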

## Supported Entity Types

The SDK provides the same fluent API for all OpenMetadata entity types:

- **Data Assets**: Table, Database, DatabaseSchema, Dashboard, Pipeline, Topic, Container, Query, StoredProcedure, DashboardDataModel, SearchIndex, MlModel, Report
- **Services**: DatabaseService, MessagingService, DashboardService, PipelineService, MlModelService, StorageService, SearchService, MetadataService, ApiService
- **Teams & Users**: User, Team, Role, Policy
- **Governance**: Glossary, GlossaryTerm, Classification, Tag, DataProduct, Domain
- **Quality**: TestCase, TestSuite, TestDefinition, DataQualityDashboard
- **Ingestion**: Ingestion, Workflow, Connection
- **Other**: Type, Webhook, Kpi, Application, Persona, DocStore, Page, SearchQuery

## Testing

Run the SDK tests:

```bash
# Run all SDK tests
pytest tests/unit/sdk/

# Run specific test
pytest tests/unit/sdk/test_sdk_entities.py
```

## License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
