Metadata-Version: 2.4
Name: noless
Version: 0.2.1a1
Summary: AI-powered ML project generation with local LLMs - lightweight and modular
Home-page: https://github.com/your-org/NoLess
Author: NoLess Team
License: MIT
Project-URL: Source, https://github.com/your-org/NoLess
Project-URL: Issues, https://github.com/your-org/NoLess/issues
Project-URL: Documentation, https://github.com/your-org/NoLess#readme
Keywords: cli ai machine-learning autopilot multi-agent llm
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Requires-Python: >=3.8
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: click>=8.1.0
Requires-Dist: requests>=2.31.0
Requires-Dist: beautifulsoup4>=4.12.0
Requires-Dist: rich>=13.0.0
Requires-Dist: pydantic>=2.0.0
Requires-Dist: aiohttp>=3.8.0
Requires-Dist: aiofiles>=23.1.0
Requires-Dist: pyyaml>=6.0
Requires-Dist: jinja2>=3.1.0
Provides-Extra: ml
Requires-Dist: torch>=2.0.0; extra == "ml"
Requires-Dist: tensorflow>=2.13.0; extra == "ml"
Requires-Dist: scikit-learn>=1.3.0; extra == "ml"
Provides-Extra: data
Requires-Dist: pandas>=2.0.0; extra == "data"
Requires-Dist: numpy>=1.24.0; extra == "data"
Provides-Extra: datasets
Requires-Dist: huggingface-hub>=0.19.0; extra == "datasets"
Requires-Dist: kaggle>=1.5.16; extra == "datasets"
Requires-Dist: openml>=0.14.0; extra == "datasets"
Provides-Extra: apis
Requires-Dist: anthropic>=0.39.0; extra == "apis"
Requires-Dist: openai>=1.0.0; extra == "apis"
Provides-Extra: ui
Requires-Dist: questionary>=2.0.0; extra == "ui"
Requires-Dist: prompt_toolkit>=3.0.0; extra == "ui"
Requires-Dist: colorama>=0.4.6; extra == "ui"
Requires-Dist: pyfiglet>=1.0.0; extra == "ui"
Provides-Extra: analysis
Requires-Dist: bandit>=1.7.4; extra == "analysis"
Requires-Dist: radon>=6.0.0; extra == "analysis"
Provides-Extra: dev
Requires-Dist: pytest>=7.0.0; extra == "dev"
Requires-Dist: pytest-cov>=4.0.0; extra == "dev"
Requires-Dist: black>=23.0.0; extra == "dev"
Requires-Dist: flake8>=6.0.0; extra == "dev"
Requires-Dist: mypy>=1.0.0; extra == "dev"
Provides-Extra: all
Requires-Dist: torch>=2.0.0; extra == "all"
Requires-Dist: tensorflow>=2.13.0; extra == "all"
Requires-Dist: scikit-learn>=1.3.0; extra == "all"
Requires-Dist: pandas>=2.0.0; extra == "all"
Requires-Dist: numpy>=1.24.0; extra == "all"
Requires-Dist: huggingface-hub>=0.19.0; extra == "all"
Requires-Dist: kaggle>=1.5.16; extra == "all"
Requires-Dist: openml>=0.14.0; extra == "all"
Requires-Dist: anthropic>=0.39.0; extra == "all"
Requires-Dist: openai>=1.0.0; extra == "all"
Requires-Dist: questionary>=2.0.0; extra == "all"
Requires-Dist: prompt_toolkit>=3.0.0; extra == "all"
Requires-Dist: colorama>=0.4.6; extra == "all"
Requires-Dist: pyfiglet>=1.0.0; extra == "all"
Requires-Dist: bandit>=1.7.4; extra == "all"
Requires-Dist: radon>=6.0.0; extra == "all"
Requires-Dist: pytest>=7.0.0; extra == "all"
Requires-Dist: pytest-cov>=4.0.0; extra == "all"
Requires-Dist: black>=23.0.0; extra == "all"
Requires-Dist: flake8>=6.0.0; extra == "all"
Requires-Dist: mypy>=1.0.0; extra == "all"
Provides-Extra: full
Requires-Dist: torch>=2.0.0; extra == "full"
Requires-Dist: tensorflow>=2.13.0; extra == "full"
Requires-Dist: scikit-learn>=1.3.0; extra == "full"
Requires-Dist: pandas>=2.0.0; extra == "full"
Requires-Dist: numpy>=1.24.0; extra == "full"
Requires-Dist: huggingface-hub>=0.19.0; extra == "full"
Requires-Dist: kaggle>=1.5.16; extra == "full"
Requires-Dist: openml>=0.14.0; extra == "full"
Requires-Dist: anthropic>=0.39.0; extra == "full"
Requires-Dist: openai>=1.0.0; extra == "full"
Requires-Dist: questionary>=2.0.0; extra == "full"
Requires-Dist: prompt_toolkit>=3.0.0; extra == "full"
Requires-Dist: colorama>=0.4.6; extra == "full"
Requires-Dist: pyfiglet>=1.0.0; extra == "full"
Requires-Dist: bandit>=1.7.4; extra == "full"
Requires-Dist: radon>=6.0.0; extra == "full"
Requires-Dist: pytest>=7.0.0; extra == "full"
Requires-Dist: pytest-cov>=4.0.0; extra == "full"
Requires-Dist: black>=23.0.0; extra == "full"
Requires-Dist: flake8>=6.0.0; extra == "full"
Requires-Dist: mypy>=1.0.0; extra == "full"
Provides-Extra: light
Requires-Dist: anthropic>=0.39.0; extra == "light"
Requires-Dist: openai>=1.0.0; extra == "light"
Dynamic: author
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: keywords
Dynamic: license
Dynamic: license-file
Dynamic: project-url
Dynamic: provides-extra
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# NoLess Library 🚀

NoLess is a Python library for building complete machine-learning projects with the help of cooperative AI agents and local LLMs. It exposes the same generators, planners, and feedback loops that power the original CLI so you can embed them inside notebooks, services, or custom tooling.

## Why NoLess

- 🤖 **Multi-agent automation** – orchestrate dataset discovery, architecture design, and code generation with reusable agents.
- 🧱 **LLM-powered builders** – generate configs, models, training loops, and tests with a single call to `ModelGenerator`.
- 🔍 **Unified dataset search** – query OpenML, Hugging Face, Kaggle, and UCI from the `DatasetSearcher` API.
- 🛠️ **Interactive refinement** – use `InteractiveFeedbackLoop` to iteratively review and improve generated code.
- ✅ **Production-ready outputs** – every project ships with config, model, training, tests, requirements, and README files.

## Installation

```bash
# Clone the repository
git clone https://github.com/your-org/noless.git
cd noless

# Install dependencies
pip install -r requirements.txt

# Install the library in editable mode
pip install -e .

# Optionally pull in extras defined in the package metadata, e.g.:
pip install -e ".[light]"     # Anthropic/OpenAI API clients only
pip install -e ".[datasets]"  # Hugging Face, Kaggle, and OpenML access
```

Optional setup:
- Install [Ollama](https://ollama.com) and pull at least one model (e.g. `ollama pull deepseek-r1:1.5b`).
- Configure Kaggle or Hugging Face credentials if you plan to download datasets automatically.

## Quick Start

```python
from noless.generator import ModelGenerator
from noless.ollama_client import OllamaClient

client = OllamaClient()
generator = ModelGenerator(llm_model="deepseek-r1:1.5b", ollama_client=client)
project = generator.create_project(
    task="image-classification",
    framework="pytorch",
    dataset="mnist",
    output_dir="./mnist_classifier",
)

print(project["files"])
```

The output directory now contains `config.yaml`, `model.py`, `train.py`, `test_model.py`, `requirements.txt`, and a README ready for version control.

## Key Modules

### `noless.generator.ModelGenerator`
Builds end-to-end ML project skeletons. Supports PyTorch, TensorFlow, and scikit-learn targets, optional Ollama-powered authoring, and automatic AI reviews via `CodeValidator`.

### `noless.autopilot.AutopilotPlanner`
Uses an LLM to interpret natural-language goals, ask clarifying questions, and produce structured blueprints (task, framework, dataset hints, architecture recommendations).

### `noless.search.DatasetSearcher`
Aggregates dataset discovery across OpenML, Hugging Face, Kaggle, and UCI. Returns normalized metadata and can download ready-to-use files.

### `noless.feedback_loop.InteractiveFeedbackLoop`
Provides a conversational refinement loop for any generated file. Supply code plus context and iteratively apply user or AI feedback.
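A minimal sketch of driving the loop programmatically. The constructor and `refine` signature below are assumptions inferred from the description above, not a documented API; the import guard lets the snippet degrade gracefully when NoLess (or a running Ollama server) is unavailable:

```python
# Hypothetical usage sketch -- method names are assumptions, not documented API.
try:
    from noless.feedback_loop import InteractiveFeedbackLoop
    from noless.ollama_client import OllamaClient
    HAVE_NOLESS = True
except ImportError:
    HAVE_NOLESS = False  # NoLess not installed; skip the demo below

# A placeholder file such as the generator might emit.
generated_code = "def train(model, data):\n    pass\n"

if HAVE_NOLESS:
    loop = InteractiveFeedbackLoop(OllamaClient(), llm_model="deepseek-r1:1.5b")
    # Supply the generated code plus context, then apply one round of feedback.
    improved = loop.refine(
        code=generated_code,
        context="training loop for an MNIST classifier",
        feedback="add checkpointing after each epoch",
    )
    print(improved)
```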

### `noless.code_validator.CodeValidator`
Runs AI reviews against generated code using a larger reviewer model when available, returning improved code plus issue/suggestion lists.
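A sketch of a standalone review call. The `review` method and its return shape are assumptions based on the description above (improved code plus issue/suggestion lists); only the `reviewer_model` parameter appears in the pipeline example below:

```python
# Sketch only: review() and its result keys are assumed, not documented.
try:
    from noless.code_validator import CodeValidator
    from noless.ollama_client import OllamaClient
    HAVE_NOLESS = True
except ImportError:
    HAVE_NOLESS = False

candidate = "import torch\n\nmodel = torch.nn.Linear(784, 10)\n"

if HAVE_NOLESS:
    validator = CodeValidator(OllamaClient(), reviewer_model="mixtral:8x7b")
    result = validator.review(candidate)
    print(result["issues"])
    print(result["suggestions"])
```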

### `noless.agents.MultiAgentSystem`
Optional cooperative layer that lets you run the six specialized agents (orchestrator, dataset, model, code, training, optimization) inside your own applications.
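One way the cooperative layer might be embedded in an application. The constructor arguments and `run` entry point are hypothetical; only the six agent roles come from the description above:

```python
# Hypothetical embedding of the multi-agent layer; run() is an assumption.
try:
    from noless.agents import MultiAgentSystem
    from noless.ollama_client import OllamaClient
    HAVE_NOLESS = True
except ImportError:
    HAVE_NOLESS = False

if HAVE_NOLESS:
    system = MultiAgentSystem(OllamaClient(), llm_model="deepseek-r1:1.5b")
    # The orchestrator agent coordinates the dataset, model, code, training,
    # and optimization agents toward a single natural-language goal.
    outcome = system.run("classify defects in solar panel images")
    print(outcome)
```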

## End-to-End Pipeline Example

```python
from noless.autopilot import AutopilotPlanner
from noless.generator import ModelGenerator
from noless.search import DatasetSearcher
from noless.ollama_client import OllamaClient

client = OllamaClient()
planner = AutopilotPlanner(client, llm_model="deepseek-r1:1.5b")
analysis = planner.plan_project("detect defects in solar panel images")

searcher = DatasetSearcher()
datasets = searcher.search(analysis.dataset_query, limit=5)

project = ModelGenerator(
    llm_model="deepseek-r1:1.5b",
    reviewer_model="mixtral:8x7b",
    ollama_client=client,
).create_project(
    task=analysis.task,
    framework=analysis.framework,
    dataset=datasets[0].name if datasets else None,
    output_dir="./solar_inspector",
    dataset_metadata=datasets[0].__dict__ if datasets else None,
)
```

## Testing

Run the unit test suite after making changes:

```bash
python -m unittest

# Or, with the dev extras installed:
pytest --cov=noless
```

## License

NoLess is released under the MIT License. See `LICENSE` for details.
