Metadata-Version: 2.4
Name: create-agent-app
Version: 0.1.0
Summary: Vite-style scaffolding CLI for Agentic AI Python projects
Author: Saichandra
License: MIT
Project-URL: Homepage, https://github.com/Saichandra2520/create-agent-app
Project-URL: Repository, https://github.com/Saichandra2520/create-agent-app
Project-URL: Issues, https://github.com/Saichandra2520/create-agent-app/issues
Keywords: ai,agentic-ai,langgraph,scaffolding,cli,templates
Classifier: Development Status :: 4 - Beta
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Software Development :: Code Generators
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.10
Description-Content-Type: text/markdown
Requires-Dist: typer>=0.12.0
Requires-Dist: questionary>=2.0.1
Requires-Dist: Jinja2>=3.1.4
Requires-Dist: rich>=13.7.0
Provides-Extra: dev
Requires-Dist: build>=1.2.0; extra == "dev"
Requires-Dist: twine>=6.0.0; extra == "dev"
Requires-Dist: pytest>=8.0.0; extra == "dev"

# create-agent-app

Scaffold production-ready Agentic AI Python projects in seconds.

## Quick Demo

![CLI demo placeholder](https://via.placeholder.com/1200x675?text=create-agent-app+demo+GIF)

Replace this placeholder with your real terminal GIF before publishing.

## Install

```bash
pip install create-agent-app
```

For local development:

```bash
pip install -e .
```

## Usage

```bash
create-agent-app my-agent-project
```

The CLI prompts for:
- a template (`single_agent`, `multi_agent`, or `rag_agent`)
- an LLM provider (Groq, Gemini, Azure OpenAI, or Ollama)
- a model name (provider-specific)

Then it generates a complete project folder and prints next steps.

## Template Comparison

| Template | Best For | Generated Architecture |
|---|---|---|
| `single_agent` | One assistant with tool-calling | LangGraph single-node loop + ToolNode |
| `multi_agent` | Staged workflows (research + writing) | Supervisor + worker agents (researcher, writer) |
| `rag_agent` | Document-grounded answers | ChromaDB + local embeddings + retriever tool + agent |
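
For orientation, the snippet below is a minimal sketch of the kind of wiring the `single_agent` architecture describes: one agent node looping with a `ToolNode`. It is not the exact code the template generates; the tool and model shown here are placeholders, and it assumes `langgraph`, `langchain`, and a provider package such as `langchain-groq` are installed.

```python
from langchain.chat_models import init_chat_model
from langchain_core.tools import tool
from langgraph.graph import StateGraph, MessagesState, START
from langgraph.prebuilt import ToolNode, tools_condition

@tool
def add(a: int, b: int) -> int:
    """Add two numbers (example tool)."""
    return a + b

tools = [add]
llm = init_chat_model("llama-3.1-8b-instant", model_provider="groq")  # any supported provider works

def agent(state: MessagesState):
    # Call the model; if it requests a tool call, tools_condition routes to the ToolNode.
    return {"messages": [llm.bind_tools(tools).invoke(state["messages"])]}

graph = StateGraph(MessagesState)
graph.add_node("agent", agent)
graph.add_node("tools", ToolNode(tools))
graph.add_edge(START, "agent")
graph.add_conditional_edges("agent", tools_condition)  # tool call -> "tools", otherwise END
graph.add_edge("tools", "agent")
app = graph.compile()
```

The `multi_agent` and `rag_agent` templates build on the same loop, adding a supervisor over worker agents or a ChromaDB-backed retriever tool, respectively.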

## LLM Provider Setup

Set the following values in the generated `.env` file:

| Provider | Required Keys |
|---|---|
| Groq | `GROQ_API_KEY`, `LLM_PROVIDER=groq`, `MODEL_NAME=...` |
| Gemini | `GEMINI_API_KEY`, `LLM_PROVIDER=gemini`, `MODEL_NAME=...` |
| Azure OpenAI | `AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_ENDPOINT`, optional `AZURE_OPENAI_DEPLOYMENT`, `LLM_PROVIDER=azure`, `MODEL_NAME=...` |
| Ollama | `OLLAMA_BASE_URL` (default `http://localhost:11434`), `LLM_PROVIDER=ollama`, `MODEL_NAME=...` |
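
To show how these variables fit together, here is a rough sketch of how a generated project might turn them into a chat model. It is illustrative only: the real template code may differ, the helper name is made up, and the imports assume `python-dotenv` plus the matching LangChain provider packages are installed.

```python
import os
from dotenv import load_dotenv

load_dotenv()  # pull LLM_PROVIDER, MODEL_NAME, and the API keys into the environment

def load_llm():
    provider = os.getenv("LLM_PROVIDER", "groq")
    model = os.getenv("MODEL_NAME", "llama-3.1-8b-instant")
    if provider == "groq":
        from langchain_groq import ChatGroq              # reads GROQ_API_KEY
        return ChatGroq(model=model)
    if provider == "gemini":
        from langchain_google_genai import ChatGoogleGenerativeAI
        return ChatGoogleGenerativeAI(model=model,
                                      google_api_key=os.getenv("GEMINI_API_KEY"))
    if provider == "azure":
        from langchain_openai import AzureChatOpenAI     # reads AZURE_OPENAI_API_KEY / _ENDPOINT
        return AzureChatOpenAI(
            azure_deployment=os.getenv("AZURE_OPENAI_DEPLOYMENT", model),
            api_version=os.getenv("OPENAI_API_VERSION", "2024-06-01"),  # assumed default
        )
    from langchain_ollama import ChatOllama
    return ChatOllama(model=model,
                      base_url=os.getenv("OLLAMA_BASE_URL", "http://localhost:11434"))
```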

## Development

```bash
pip install -e ".[dev]"
python -m build
twine check dist/*
```

## Contributing

1. Fork the repository.
2. Create a feature branch.
3. Run local checks and verify generated templates.
4. Open a pull request with a clear summary and sample scaffold output.
