Metadata-Version: 2.4
Name: gcp-agentflow
Version: 0.1.0
Summary: Agentic AI orchestration toolkit for Google Cloud Workflows, Pub/Sub, BigQuery, Datastore, and ML pipelines.
Author: Raghava Chellu
License: MIT
Keywords: google-cloud,workflows,pubsub,bigquery,datastore,agentic-ai,ml,automation,orchestration
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: typer>=0.12.0
Requires-Dist: rich>=13.7.0
Requires-Dist: pydantic>=2.6.0
Requires-Dist: google-cloud-pubsub>=2.21.0
Requires-Dist: google-cloud-bigquery>=3.17.0
Requires-Dist: google-cloud-datastore>=2.19.0
Provides-Extra: dev
Requires-Dist: pytest>=8.0.0; extra == "dev"
Requires-Dist: ruff>=0.4.0; extra == "dev"
Requires-Dist: build>=1.2.0; extra == "dev"
Requires-Dist: twine>=5.0.0; extra == "dev"
Dynamic: license-file

# GCP AgentFlow

**Agentic AI orchestration toolkit for Google Cloud Workflows, Pub/Sub, BigQuery, Datastore, and ML pipelines.**

`gcp-agentflow` helps engineers design and operate event-driven Google Cloud automation patterns in which Pub/Sub events trigger workflow decisions, Datastore tracks operational state, BigQuery captures analytics, and ML/agentic AI logic recommends the next action.

## What it is for

- Google Cloud event-driven architectures
- Agentic AI workflow orchestration
- Pub/Sub message routing
- Datastore operational state tracking
- BigQuery analytics event logging
- ML-based routing and decision support
- Workflow-ready payload generation

## Install

```bash
pip install gcp-agentflow
```

## Python Usage

```python
from gcp_agentflow import AgentDecisionInput, decide_next_action

event = AgentDecisionInput(
    event_type="file_arrived",
    source="pubsub",
    risk_score=72,
    payload={"bucket": "incoming", "name": "file.csv"}
)

decision = decide_next_action(event)
print(decision.action)
print(decision.reason)
```
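The decision returned by `decide_next_action` typically feeds a downstream Workflows execution. As a minimal sketch of what assembling such a payload might look like (the `build_workflow_payload` helper below is hypothetical and is not the library's actual API):

```python
import json
from datetime import datetime, timezone


def build_workflow_payload(action: str, reason: str, event: dict) -> str:
    """Package a decision as a JSON body for a Workflows execution.

    Illustrative only; gcp-agentflow's real payload builder may differ.
    """
    payload = {
        "action": action,
        "reason": reason,
        "event": event,
        "decided_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(payload)


body = build_workflow_payload(
    "quarantine",
    "risk score above threshold",
    {"event_type": "file_arrived", "risk_score": 72},
)
print(json.loads(body)["action"])  # quarantine
```

Keeping the payload as plain JSON means the same body can be handed to Google Workflows, a Cloud Run service, or a Pub/Sub topic without translation.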

## CLI Usage

```bash
gcp-agentflow decide --event-type file_arrived --risk-score 72
```

## Key Concepts

`gcp-agentflow` is intentionally lightweight: rather than forcing a specific architecture, it provides reusable building blocks:

- **Decision Engine**: Determines whether to route, retry, quarantine, approve, or alert.
- **Workflow Payload Builder**: Creates clean payloads for Google Workflows or Cloud Run services.
- **Pub/Sub Publisher Wrapper**: Publishes JSON events safely.
- **BigQuery Event Logger**: Inserts structured analytics events.
- **Datastore State Store**: Saves and retrieves operational state by key.
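To make the decision engine concrete, here is an illustrative sketch of threshold-based decision logic. The action names mirror the list above (route, retry, quarantine, approve, alert), but the thresholds and the `decide` function itself are invented for this example, not the library's implementation:

```python
from dataclasses import dataclass


@dataclass
class Decision:
    action: str
    reason: str


def decide(event_type: str, risk_score: int) -> Decision:
    """Map an event and its risk score to a next action.

    Thresholds here are made up for illustration.
    """
    if risk_score >= 90:
        return Decision("alert", "risk score critical; page an operator")
    if risk_score >= 70:
        return Decision("quarantine", f"risk score {risk_score} above review threshold")
    if event_type == "transient_failure":
        return Decision("retry", "transient failures are retried")
    return Decision("route", "low risk; continue normal processing")


print(decide("file_arrived", 72).action)  # quarantine
```

A real decision engine would likely replace the hard-coded thresholds with a trained model or configurable policy, but the contract stays the same: event in, `(action, reason)` out.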

## Build and Publish

```bash
python -m pip install --upgrade build twine
python -m build
twine check dist/*
twine upload dist/*
```

## License

MIT

Author: Raghava Chellu
