Metadata-Version: 2.4
Name: text-summarizer-gi
Version: 0.1.0
Summary: LLM-based, context-aware text summarizer
Author-email: Dhivya J <dhivyashankar27@example.com>
Requires-Python: >=3.8
Description-Content-Type: text/markdown
Requires-Dist: openai
Requires-Dist: tiktoken

# azure-llm-summarizer

A lightweight Python library for summarizing text with **Azure OpenAI**, including built-in **token counting** for both input and output. Its only dependencies are `openai` and `tiktoken`.

## Installation

```bash
pip install text-summarizer-gi
```

## Quick Start

```python
from llm_summarizer import AzureSummarizer

summarizer = AzureSummarizer(
    api_key="<your-azure-api-key>",
    azure_endpoint="https://<your-resource>.openai.azure.com",
    deployment_name="gpt-4o-mini",   # your deployed model name
)

result = summarizer.summarize(
    text="Your long document or passage goes here...",
    summary_type="medium",    # "short" | "medium" | "detailed"
    tone="neutral",           # "neutral" | "formal" | "casual"
    focus_area="general",     # "general" | "technical insights" | "financial" ...
    output_format="text",     # "text" | "bullets" | "json"
)

print(result.summary)          # the summary
print(result.input_tokens)     # token count of the original passage
print(result.output_tokens)    # token count of the summary
print(result)                  # summary + token counts in one print
```

### Example Output

```
Generative AI (GenAI) refers to AI systems that create new content by learning
from large datasets, using architectures like LLMs and diffusion models...

[Tokens — Input: 312 | Summary: 47]
```

## API Reference

### `AzureSummarizer(azure_endpoint, api_key, api_version, deployment_name)`

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `azure_endpoint` | `str` | env `AZURE_OPENAI_ENDPOINT` | Azure resource endpoint |
| `api_key` | `str` | env `AZURE_OPENAI_API_KEY` | Azure API key |
| `api_version` | `str` | `"2024-02-15-preview"` | API version |
| `deployment_name` | `str` | `"gi-local-gpt-5-mini"` | Deployed model name |
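The env-var defaults above suggest a simple fallback: an explicitly passed value wins, otherwise the environment is consulted. A minimal sketch of that assumed behavior (the helper name is illustrative, not the library's internal code):

```python
import os
from typing import Optional

def resolve_credential(explicit: Optional[str], env_var: str) -> str:
    """Prefer an explicitly passed value; otherwise fall back to the environment."""
    value = explicit or os.environ.get(env_var)
    if not value:
        raise ValueError(f"Provide a value or set {env_var}")
    return value

# Example: the explicit argument takes precedence over the environment.
os.environ["AZURE_OPENAI_API_KEY"] = "env-key"
resolve_credential("explicit-key", "AZURE_OPENAI_API_KEY")  # "explicit-key"
resolve_credential(None, "AZURE_OPENAI_API_KEY")            # "env-key"
```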

### `.summarize(text, summary_type, tone, focus_area, output_format) → SummaryResult`

| Parameter | Options | Default |
|-----------|---------|---------|
| `summary_type` | `"short"`, `"medium"`, `"detailed"` | `"medium"` |
| `tone` | `"neutral"`, `"formal"`, `"casual"` | `"neutral"` |
| `focus_area` | any string | `"general"` |
| `output_format` | `"text"`, `"bullets"`, `"json"` | `"text"` |
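These options presumably shape the instruction sent to the model. A hypothetical sketch of how they might combine into a single prompt (the function and the length hints below are illustrative assumptions, not the library's internals):

```python
from typing import Dict

# Assumed mapping from summary_type to a length instruction.
LENGTH_HINTS: Dict[str, str] = {
    "short": "in 1-2 sentences",
    "medium": "in one paragraph",
    "detailed": "in several paragraphs",
}

def build_prompt(text: str, summary_type: str = "medium",
                 tone: str = "neutral", focus_area: str = "general",
                 output_format: str = "text") -> str:
    """Combine the summarize() options into one instruction string."""
    return (
        f"Summarize the following text {LENGTH_HINTS[summary_type]} "
        f"in a {tone} tone, focusing on {focus_area}. "
        f"Return the result as {output_format}.\n\n{text}"
    )

prompt = build_prompt("GenAI creates new content...", summary_type="short")
```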

### `SummaryResult`

| Attribute | Type | Description |
|-----------|------|-------------|
| `summary` | `str` | The generated summary |
| `input_tokens` | `int` | Estimated tokens in the original text |
| `output_tokens` | `int` | Estimated tokens in the summary |

### Standalone token counter

```python
from llm_summarizer import count_tokens

count_tokens("Hello, world!")  # → 4
```

## Environment Variables

You can skip passing credentials directly and use env vars instead:

```bash
export AZURE_OPENAI_ENDPOINT="https://<resource>.openai.azure.com"
export AZURE_OPENAI_API_KEY="<your-key>"
```

## Publishing to PyPI

```bash
pip install build twine
python -m build
twine upload dist/*
```

## License

MIT
