Metadata-Version: 2.4
Name: structuinput
Version: 2025.12.22085003
Summary: structuinput transforms natural language inputs into structured outputs like API templates, config files, and data schemas, streamlining development and integration.
Author-email: structuinput <hi@eugene.plus>
License: MIT
Project-URL: Homepage, https://github.com/chigwell/structuinput
Requires-Python: >=3.9
Description-Content-Type: text/markdown
Requires-Dist: langchain-llm7>=0.0.0
Requires-Dist: llmatch-messages>=0.0.0
Requires-Dist: langchain-core>=0.3.0

# structuinput
[![PyPI version](https://badge.fury.io/py/structuinput.svg)](https://badge.fury.io/py/structuinput)
[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)
[![Downloads](https://static.pepy.tech/badge/structuinput)](https://pepy.tech/project/structuinput)
[![LinkedIn](https://img.shields.io/badge/LinkedIn-blue)](https://www.linkedin.com/in/eugene-evstafev-716669181/)


**structuinput** – a lightweight Python package that converts unstructured user inputs (natural-language descriptions, queries, or specifications) into structured, machine-readable outputs such as API request templates, configuration files, or data schemas. It leverages `llmatch-messages` together with a default LLM (ChatLLM7) to ensure that the generated text matches a predefined regular-expression pattern, so the result is ready for direct integration.

---

## Table of Contents

- [Installation](#installation)
- [Quick Start](#quick-start)
- [Function Signature](#function-signature)
- [Using the Default LLM (ChatLLM7)](#using-the-default-llm-chatllm7)
- [Providing Your Own LLM](#providing-your-own-llm)
- [Environment Variables & API Keys](#environment-variables--api-keys)
- [Rate Limits & Quotas](#rate-limits--quotas)
- [Troubleshooting & Errors](#troubleshooting--errors)
- [Contributing & Issues](#contributing--issues)
- [License](#license)
- [Author](#author)

---

## Installation

```bash
pip install structuinput
```

---

## Quick Start

```python
from structuinput import structuinput

# Example unstructured description
user_input = """
I need an endpoint to upload a user avatar.
It should accept a multipart/form-data body with a field called `image`,
return JSON with `url` and `size`, and require an `Authorization` header.
"""

# Use the default ChatLLM7 (API key taken from env or fallback)
responses = structuinput(user_input)

print(responses)   # → List of strings that match the defined output pattern
```

The function returns a list of strings that already conform to the regular‑expression pattern defined in `structuinput.prompts.pattern`.
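If you want to sanity-check the returned strings yourself, you can re-apply the pattern with Python's built-in `re` module. The pattern and response below are hypothetical stand-ins for illustration; the real pattern is the one exported as `structuinput.prompts.pattern`:

```python
import re

# Hypothetical pattern for illustration only; the real one lives in
# `structuinput.prompts.pattern`.
pattern = r"<output>(.*?)</output>"

# Pretend this list came back from structuinput(...)
responses = ['<output>{"url": "https://example.com/a.png", "size": 12345}</output>']

# Keep only the strings that actually match the pattern.
validated = [r for r in responses if re.search(pattern, r, re.DOTALL)]
print(validated)
```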

---

## Function Signature

```python
def structuinput(
    user_input: str,
    api_key: Optional[str] = None,
    llm: Optional[BaseChatModel] = None,
) -> List[str]:
    """
    Convert free‑form text into a structured output.

    Parameters
    ----------
    user_input: str
        The free‑form description or query to be transformed.
    api_key: Optional[str]
        API key for the LLM7 service. If omitted, the function reads the
        `LLM7_API_KEY` environment variable. When both are missing, a
        placeholder key `"None"` is used (suitable only for testing).
    llm: Optional[BaseChatModel]
        A LangChain chat model instance. If omitted, the package creates a
        `ChatLLM7` instance internally.

    Returns
    -------
    List[str]
        A list of extracted strings that satisfy the output pattern.
    """
```

---

## Using the Default LLM (ChatLLM7)

`structuinput` ships with **ChatLLM7** (from the `langchain_llm7` package) as the built‑in language model.

```python
from structuinput import structuinput

response = structuinput(
    user_input="Create a JSON config for a Redis cache with host, port, and db index."
)
print(response)
```

If an API key is required, set it in your environment:

```bash
export LLM7_API_KEY="your_llm7_api_key"
```

or pass it directly:

```python
response = structuinput(user_input, api_key="your_llm7_api_key")
```

You can obtain a free key by registering at <https://token.llm7.io/>. The free tier’s rate limits are sufficient for most development and prototyping scenarios.

---

## Providing Your Own LLM

You may replace the default model with any LangChain‑compatible chat model, e.g., OpenAI, Anthropic, or Google Gemini.

### OpenAI

```python
from langchain_openai import ChatOpenAI
from structuinput import structuinput

llm = ChatOpenAI(model="gpt-4o-mini")
response = structuinput("Describe a PostgreSQL connection string.", llm=llm)
print(response)
```

### Anthropic

```python
from langchain_anthropic import ChatAnthropic
from structuinput import structuinput

llm = ChatAnthropic(model="claude-3-haiku-20240307")
response = structuinput("Generate a Terraform module for an S3 bucket.", llm=llm)
print(response)
```

### Google Generative AI

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from structuinput import structuinput

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
response = structuinput("Write a Kubernetes Deployment YAML for a Node.js app.", llm=llm)
print(response)
```

All custom LLMs must implement the `BaseChatModel` interface from LangChain.

---

## Environment Variables & API Keys

| Variable          | Description                                            |
|-------------------|--------------------------------------------------------|
| `LLM7_API_KEY`    | API key for the default ChatLLM7 service.              |
| `LLM7_BASE_URL`   | (Optional) Override the base URL for the LLM7 service.|

If you provide `api_key` directly to `structuinput`, it takes precedence over the environment variable.
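The precedence rules can be sketched in a few lines. Note that `resolve_api_key` is a hypothetical helper written here for illustration, not part of the package; the fallback value `"None"` mirrors the behaviour documented in the function signature above:

```python
import os

def resolve_api_key(api_key=None):
    """Hypothetical helper mirroring the documented precedence:
    explicit argument > LLM7_API_KEY env var > placeholder "None"."""
    if api_key:
        return api_key
    return os.environ.get("LLM7_API_KEY") or "None"

os.environ.pop("LLM7_API_KEY", None)   # ensure the env var is unset for the demo
print(resolve_api_key("explicit-key")) # the explicit argument wins
print(resolve_api_key())               # falls back to the placeholder "None"
```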

---

## Rate Limits & Quotas

- **ChatLLM7 free tier**: daily rate limits that are generally sufficient for development and prototyping; check the LLM7 site for current figures.
- For higher throughput, upgrade to a paid LLM7 plan or switch to another LLM (OpenAI, Anthropic, etc.) that matches your quota requirements.

---

## Troubleshooting & Errors

`structuinput` uses `llmatch` to enforce that the LLM output matches a regular expression. If the pattern is not satisfied, a `RuntimeError` is raised:

```python
RuntimeError: LLMS call failed
```

Typical reasons:

1. **Invalid API key** – double‑check `LLM7_API_KEY` or the key passed to the function.
2. **Network issues** – ensure your environment can reach the LLM endpoint.
3. **Prompt/Pattern mismatch** – adjust your input so the model can generate text aligned with the expected format.

For more detailed logs, enable verbose mode by setting `verbose=True` in the package source.
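Transient failures (e.g. network blips) can often be handled by wrapping the call in a retry loop. The sketch below is generic and self-contained; `retry_call` and `flaky` are hypothetical helpers written for illustration, and in real use you would pass `lambda: structuinput(user_input)` as the flaky function:

```python
import time

def retry_call(fn, attempts=3, delay=1.0):
    """Hypothetical retry helper: re-run `fn` on RuntimeError,
    waiting `delay` seconds between attempts."""
    last_error = None
    for _ in range(attempts):
        try:
            return fn()
        except RuntimeError as exc:
            last_error = exc
            time.sleep(delay)
    raise last_error

# Demo with a function that fails once, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("LLM call failed")
    return ["structured output"]

print(retry_call(flaky, delay=0.0))  # → ['structured output']
```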

---

## Contributing & Issues

Feel free to open bug reports, feature requests, or pull requests:

- **GitHub Issues:** <https://github.com/chigwell/structuinput/issues>

Please follow the usual contribution guidelines (tests, documentation, style) when submitting PRs.

---

## License

Distributed under the **MIT License**. See the `LICENSE` file for details.

---

## Author

**Eugene Evstafev**  
✉️ Email: <hi@eugene.plus>  
🐙 GitHub: <https://github.com/chigwell>

---

Happy structuring! 🚀
