Metadata-Version: 2.4
Name: specverify
Version: 0.0.1
Summary: Verify your AI specs before execution. Tool-agnostic specification verification for SDD workflows.
Author-email: Francesco Marinoni Moretto <francesco.marinoni.moretto@gmail.com>
License: MIT
Project-URL: Homepage, https://github.com/frmoretto/specverify
Project-URL: Repository, https://github.com/frmoretto/specverify
Project-URL: Issues, https://github.com/frmoretto/specverify/issues
Keywords: spec-driven-development,sdd,ai-coding,specification,verification,spec-gate,clarity-gate,stream-coding,coding-agents,context-engineering
Classifier: Development Status :: 1 - Planning
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Software Development :: Quality Assurance
Classifier: Topic :: Software Development :: Documentation
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE
Dynamic: license-file

# specverify

**Verify your AI specs before execution.**

The missing layer between writing specifications and handing them to coding agents.

## The Problem

Every SDD tool writes specs. No SDD tool verifies them.

You use GSD, BMAD, Spec Kit, Superpowers — they all produce specifications. Then you hand those specs to a coding agent and hope for the best. When the agent produces wrong code, you tweak the prompt. But the problem isn't the agent. It's the spec.

## What specverify Does

specverify adds a verification step between your spec-writing tool and your coding agent:

```
Your SDD Tool → Write Spec → specverify → Pass? → Execute with Agent
```

**The core test:** *"Can a different AI session generate functionally equivalent code from this specification alone — without additional context?"*

If yes, the spec passes. If not, specverify tells you what's missing.

## Install

```bash
pip install specverify
```

Or with npm:

```bash
npm install specverify
```

## Features (coming in v1.0)

- **13-item Spec Gate** — Structural completeness verification
- **Adversarial Review** — Cross-model validation (different AI attacks your spec)
- **Clarity Gate integration** — Epistemic quality checks
- **Tool-agnostic** — Works with any SDD workflow
- **Git-verified proof** — Validated against two git-tracked case studies (a 46-endpoint build in 4.5 hours, plus a hackathon battle test)
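
The adversarial-review idea can be sketched as a two-model loop: one model generates code from the spec alone, a second model hunts for ambiguities the generator had to guess at. Both model calls are stubbed below, since specverify's real interface is not yet published; every function and model name here is an assumption for illustration.

```python
def generate_from_spec(spec: str, model: str) -> str:
    # Stub: a real implementation would call an LLM with ONLY the
    # spec as context -- no chat history, no repository access.
    return f"[code generated by {model} from spec]"

def critique(spec: str, code: str, model: str) -> list[str]:
    # Stub: a real reviewer model would return the ambiguities and
    # unstated assumptions it finds when comparing spec and code.
    return []

def adversarial_review(spec: str) -> bool:
    """Pass only if a second model finds nothing to attack."""
    code = generate_from_spec(spec, model="generator-model")
    findings = critique(spec, code, model="reviewer-model")
    return len(findings) == 0
```

The key design point is separation: the generator sees only the spec, mirroring the core test above, while the reviewer is a different model so its blind spots are less likely to overlap with the generator's.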

## Status

🚧 **Under active development.** Full release planned for April 2026.

Follow progress: [github.com/frmoretto/specverify](https://github.com/frmoretto/specverify)

## Background

specverify is the operationalized tool from the [Stream Coding](https://stream-coding.com) methodology's Spec Gate protocol. The concept was first presented at AI Engineer Europe 2026.

## Author

Francesco Marinoni Moretto  
[LinkedIn](https://linkedin.com/in/francesco-moretto) · [GitHub](https://github.com/frmoretto)

## License

MIT
