Metadata-Version: 2.4
Name: omni-nli
Version: 0.1.0a2
Summary: A multi-interface (REST and MCP) server for natural language inference (NLI)
Project-URL: Repository, https://github.com/CogitatorTech/omni-nli
Project-URL: Documentation, https://cogitatortech.github.io/omni-nli/
Author-email: Hassan Abedi <hassan.abedi.t+omninli@gmail.com>
Maintainer-email: Hassan Abedi <hassan.abedi.t+omninli@gmail.com>
License: MIT
License-File: LICENSE
Keywords: contradiction,entailment,huggingface,llm,mcp,microservice,natural-language-inference,nli,ollama,openrouter,rest-api
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Classifier: Topic :: Internet :: WWW/HTTP :: WSGI :: Application
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Text Processing :: Linguistic
Classifier: Topic :: Utilities
Requires-Python: <4.0,>=3.10
Requires-Dist: accelerate<2.0.0,>=1.12.0
Requires-Dist: async-lru<3.0.0,>=2.0.4
Requires-Dist: click<9.0.0,>=8.2.1
Requires-Dist: gunicorn<24.0.0,>=23.0.0
Requires-Dist: httpx<0.29.0,>=0.28.1
Requires-Dist: huggingface-hub<1.0.0,>=0.27.0
Requires-Dist: json-repair<1.0.0,>=0.55.1
Requires-Dist: mcp[cli]<2.0.0,>=1.12.3
Requires-Dist: ollama<1.0.0,>=0.4.0
Requires-Dist: openai<2.0.0,>=1.60.0
Requires-Dist: pydantic-settings<3.0.0,>=2.10.1
Requires-Dist: pydantic<3.0.0,>=2.12.0
Requires-Dist: python-dotenv<2.0.0,>=1.1.0
Requires-Dist: python-json-logger<4.0.0,>=3.3.0
Requires-Dist: spectree[starlette]<3.0.0,>=2.0.1
Requires-Dist: tiktoken<1.0.0,>=0.8.0
Requires-Dist: torch<3.0.0,>=2.0.0
Requires-Dist: transformers<5.0.0,>=4.57.6
Provides-Extra: dev
Requires-Dist: asgi-lifespan[dev]<3.0.0,>=2.1.0; extra == 'dev'
Requires-Dist: hypothesis<7.0.0,>=6.150.2; extra == 'dev'
Requires-Dist: mkdocs-material<10.0.0,>=9.6.12; extra == 'dev'
Requires-Dist: mkdocs<2.0.0,>=1.6.1; extra == 'dev'
Requires-Dist: mkdocstrings-python<2.0.0,>=1.16.10; extra == 'dev'
Requires-Dist: mypy<2.0.0,>=1.11.1; extra == 'dev'
Requires-Dist: pre-commit<5.0.0,>=4.2.0; extra == 'dev'
Requires-Dist: pytest-asyncio<2.0.0,>=1.1.0; extra == 'dev'
Requires-Dist: pytest-cov<8.0.0,>=6.0.0; extra == 'dev'
Requires-Dist: pytest-mock<4.0.0,>=3.14.0; extra == 'dev'
Requires-Dist: pytest<10.0.0,>=8.0.1; extra == 'dev'
Requires-Dist: pyyaml<7.0.0,>=6.0.1; extra == 'dev'
Requires-Dist: ruff<1.0.0,>=0.9.3; extra == 'dev'
Requires-Dist: rundoc<0.3.0,>=0.2.3; extra == 'dev'
Description-Content-Type: text/markdown

<div align="center">
  <picture>
    <img alt="Omni-NLI Logo" src="logo.svg" width="200">
  </picture>
<br>

<h2>Omni-NLI</h2>

[![Tests](https://img.shields.io/github/actions/workflow/status/CogitatorTech/omni-nli/tests.yml?label=tests&style=flat&labelColor=333333&logo=github&logoColor=white)](https://github.com/CogitatorTech/omni-nli/actions/workflows/tests.yml)
[![Code Coverage](https://img.shields.io/codecov/c/github/CogitatorTech/omni-nli?style=flat&label=coverage&labelColor=333333&logo=codecov&logoColor=white)](https://codecov.io/gh/CogitatorTech/omni-nli)
[![Python Version](https://img.shields.io/badge/python-%3E=3.10-3776ab?style=flat&labelColor=333333&logo=python&logoColor=white)](https://github.com/CogitatorTech/omni-nli)
[![PyPI](https://img.shields.io/pypi/v/omni-nli?style=flat&labelColor=333333&logo=pypi&logoColor=white)](https://pypi.org/project/omni-nli/)
[![Documentation](https://img.shields.io/badge/docs-read-00acc1?style=flat&labelColor=282c34&logo=readthedocs)](https://CogitatorTech.github.io/omni-nli/)
[![License](https://img.shields.io/badge/license-MIT-00acc1?style=flat&labelColor=333333&logo=open-source-initiative&logoColor=white)](https://github.com/CogitatorTech/omni-nli/blob/main/LICENSE)
<br>
[![Examples](https://img.shields.io/badge/examples-view-green?style=flat&labelColor=382c34)](https://github.com/CogitatorTech/omni-nli/tree/main/examples)
[![Docker Image (CPU)](https://img.shields.io/badge/Docker-CPU-007ec6?style=flat&logo=docker)](https://github.com/CogitatorTech/omni-nli/pkgs/container/omni-nli-cpu)
[![Docker Image (CUDA)](https://img.shields.io/badge/Docker-CUDA-007ec6?style=flat&logo=docker)](https://github.com/CogitatorTech/omni-nli/pkgs/container/omni-nli-cuda)

A multi-interface (REST and MCP) server for natural language inference

</div>

---

Omni-NLI is a self-hostable server that provides [natural language inference (NLI)](https://en.wikipedia.org/wiki/Textual_entailment) capabilities through
REST and Model Context Protocol (MCP) interfaces.
It can run as a scalable, stateless standalone microservice or as an MCP server that lets AI agents add a verification layer
to AI-based applications like chatbots and virtual assistants.

### What is NLI?

Given two pieces of text, called the premise and the hypothesis, NLI is the task of determining the logical relationship between them, as a human reader would judge it.
The relationship is typically expressed as one of three labels:

- `"entailment"`: the hypothesis is supported by the premise
- `"contradiction"`: the hypothesis is contradicted by the premise
- `"neutral"`: the hypothesis is neither supported nor contradicted by the premise

NLI is useful in many applications, such as fact-checking the output of large language models (LLMs) and verifying the answers generated by
question-answering systems.
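
Concretely, an NLI model typically assigns a score to each of the three labels, and the prediction is the highest-scoring one. A minimal sketch of that decision step (the scores below are made up for illustration):

```python
# Illustrative only: an NLI model scores each label, and the
# prediction is the argmax. These numbers are made up.
scores = {"entailment": 0.03, "contradiction": 0.95, "neutral": 0.02}

label = max(scores, key=scores.get)  # highest-scoring label
confidence = scores[label]

print(label, confidence)  # → contradiction 0.95
```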

### Features

- Supports models provided by different backends, including Ollama, HuggingFace, and OpenRouter
- Supports REST API (for traditional applications) and MCP (for AI agents) interfaces
- Fully configurable and horizontally scalable (the server itself is stateless)

See [ROADMAP.md](ROADMAP.md) for the list of implemented and planned features.

> [!IMPORTANT]
> Omni-NLI is in early development, so bugs and breaking changes are expected.
> Please use the [issues page](https://github.com/CogitatorTech/omni-nli/issues) to report bugs or request features.

---

### Quickstart

#### 1. Installation

```sh
pip install omni-nli
```

#### 2. Configure Backends

Copy the example config file and add your API keys and other settings to the resulting `.env` file.

```sh
cp .env.example .env
```
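
A `.env` file might look like the sketch below. The variable names here are purely illustrative guesses, not taken from the project; `.env.example` and the project documentation are the authoritative reference.

```sh
# Hypothetical settings -- check .env.example for the real variable names.
# Backend selection (e.g., ollama, huggingface, or openrouter):
# OMNI_NLI_BACKEND=ollama
# API key for a hosted backend:
# OPENROUTER_API_KEY=sk-...
```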

#### 3. Start the Server

```sh
omni-nli
```

By default, the server listens on `http://127.0.0.1:8000`.

#### 4. Evaluate NLI

```sh
curl -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "premise": "A football player kicks a ball into the goal.",
    "hypothesis": "The football player is asleep on the field."
  }' \
  http://127.0.0.1:8000/api/v1/nli/evaluate
```

Example response:

```json
{
    "label": "contradiction",
    "confidence": 0.99,
    "model": "microsoft/Phi-3.5-mini-instruct",
    "backend": "huggingface"
}
```
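
The same call can be made from Python. The sketch below uses `httpx` (already one of Omni-NLI's dependencies); the endpoint path and response fields are taken from the curl example and sample response above, so treat it as a sketch rather than a definitive client.

```python
import json

# Endpoint path as shown in the curl example above.
API_URL = "http://127.0.0.1:8000/api/v1/nli/evaluate"

def build_request(premise: str, hypothesis: str) -> dict:
    """Build the JSON body expected by the evaluate endpoint."""
    return {"premise": premise, "hypothesis": hypothesis}

payload = build_request(
    "A football player kicks a ball into the goal.",
    "The football player is asleep on the field.",
)

# With a running server, the request would be:
#   import httpx
#   result = httpx.post(API_URL, json=payload).json()
# Here we parse the sample response shown above instead:
result = json.loads(
    '{"label": "contradiction", "confidence": 0.99,'
    ' "model": "microsoft/Phi-3.5-mini-instruct", "backend": "huggingface"}'
)
print(result["label"], result["confidence"])  # → contradiction 0.99
```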

---

### Documentation

Check out the [Omni-NLI Documentation](https://cogitatortech.github.io/omni-nli/) for more information, including configuration options, API
reference, and examples.

---

### Contributing

Contributions are always welcome!
Please see [CONTRIBUTING.md](CONTRIBUTING.md) for details on how to get started.

### License

Omni-NLI is licensed under the MIT License (see [LICENSE](LICENSE)).

### Acknowledgements

- The logo is from [SVG Repo](https://www.svgrepo.com/svg/480613/puzzle-9) with some modifications.


<!-- Need to add this line for MCP registry publication -->
<!-- mcp-name: io.github.cogitatortech/omni-nli -->
