Metadata-Version: 2.4
Name: ovos-wolfram-alpha-plugin
Version: 1.0.0a1
Summary: ovos wolfram-alpha plugin
Author-email: JarbasAi <jarbasai@mailfence.com>
License: Apache-2.0
Project-URL: Homepage, https://github.com/OpenVoiceOS/ovos-wolfram-alpha-plugin
Keywords: ovos,plugin,wolfram
Description-Content-Type: text/markdown
Requires-Dist: ovos-utils<1.0.0,>=0.7.0
Requires-Dist: ovos-plugin-manager<3.0.0,>=2.3.0a1
Requires-Dist: ovos-config<2.0.0,>=0.0.12
Requires-Dist: requests>=2.0.0
Requires-Dist: pydantic<3.0.0,>=2.0.0

# ovos-wolfram-alpha-plugin

[![PyPI](https://img.shields.io/pypi/v/ovos-wolfram-alpha-plugin)](https://pypi.org/project/ovos-wolfram-alpha-plugin/)
[![License](https://img.shields.io/badge/license-Apache%202.0-blue)](LICENSE)
[![Python](https://img.shields.io/badge/python-%3E%3D3.10-blue)](https://www.python.org/)

[Wolfram Alpha](https://www.wolframalpha.com/) integration for [OpenVoiceOS](https://openvoiceos.org). Provides a **retrieval engine** for RAG pipelines and an **agent toolbox** for tool-using agents, both exposed as standard OPM (OVOS Plugin Manager) plugins.

Wolfram Alpha excels at questions with a single definitive answer: maths, unit conversions, scientific constants, chemical properties, astronomy, nutrition, geography, and historical dates. It is not a search engine — it computes answers from curated data.

An [API key](https://products.wolframalpha.com/api/) is required. A demo key is bundled for development but is rate-limited and should not be used in production.

---

## Installation

```bash
pip install ovos-wolfram-alpha-plugin
```

---

## OPM Entry Points

| Entry point | Class | Use case |
|---|---|---|
| `opm.agents.retrieval` — `ovos-wolfram-alpha-solver` | `WolframAlphaRetrievalEngine` | RAG — returns `(answer, score)` tuples |
| `opm.agents.toolbox` — `ovos-wolfram-alpha-tools` | `WolframAlphaToolbox` | Agent tool use — exposes `search_wolfram_alpha` |

---

## Retrieval Engine

`WolframAlphaRetrievalEngine` implements the `RetrievalEngine` OPM interface. It calls the Wolfram Alpha spoken-answer endpoint and handles non-English queries by translating them to English before the request and translating the answer back afterwards.

```python
from ovos_wolfram_alpha_plugin import WolframAlphaRetrievalEngine

engine = WolframAlphaRetrievalEngine(config={"appid": "YOUR-KEY"})

# Maths & conversions
engine.get_spoken_answer("integral of x^2 sin(x)", lang="en")
# "x^2 (-cos(x)) + 2 x sin(x) + 2 cos(x) + constant"

engine.get_spoken_answer("100 miles in kilometers", lang="en")
# "160.934 kilometers"

engine.get_spoken_answer("1000 USD in EUR", lang="en")
# "approximately 923 euros"  (live rate)

# Science & constants
engine.get_spoken_answer("speed of light", lang="en")
# "about 2.998 × 10^8 meters per second"

engine.get_spoken_answer("boiling point of ethanol", lang="en")
# "78.37 degrees Celsius"

engine.get_spoken_answer("distance from Earth to Mars", lang="en")
# "currently about 1.69 AU"  (live ephemeris)

# Factual lookups
engine.get_spoken_answer("population of Brazil", lang="en")
# "approximately 215.3 million people"

engine.get_spoken_answer("calories in 100g of almonds", lang="en")
# "579 kilocalories"

engine.get_spoken_answer("when was the Eiffel Tower built", lang="en")
# "construction was from January 28, 1887 to March 31, 1889"

# Non-English — translated automatically
engine.get_spoken_answer("massa do Sol", lang="pt")
# "aproximadamente 1,989 × 10^30 kg"

# Image result — returns a local file path to a Wolfram visual
engine.get_image("benzene molecular structure", lang="en")

# Full structured pod results: a list of dicts with "title" and either "summary" or "img"
for pod in engine.get_expanded_answer("Neptune", lang="en"):
    print(pod["title"], "—", pod.get("summary", pod.get("img")))
# "Orbital period — 164.8 years"
# "Surface gravity — 11.15 m/s²"
# ...

# RAG interface: List[Tuple[str, float]]  (answer, score)
results = engine.query("half-life of carbon-14", lang="en")
# [("5730 years", 0.9)]
```
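
Downstream RAG code typically keeps only confident answers. A minimal illustrative helper (not part of the plugin) that picks the top-scoring result above a threshold from the `(answer, score)` tuples `query()` returns:

```python
from typing import List, Optional, Tuple

def best_answer(results: List[Tuple[str, float]],
                threshold: float = 0.5) -> Optional[str]:
    """Return the highest-scoring answer at or above `threshold`, else None."""
    scored = [r for r in results if r[1] >= threshold]
    if not scored:
        return None
    return max(scored, key=lambda r: r[1])[0]

# Same shape as engine.query() output
results = [("5730 years", 0.9), ("unrelated fact", 0.2)]
print(best_answer(results))  # prints "5730 years"
```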

### Translation

Non-English queries require a translation plugin. Configure it by passing `translate_plugin` in the config:

```python
engine = WolframAlphaRetrievalEngine(config={
    "appid": "YOUR-KEY",
    "translate_plugin": "ovos-translate-plugin-server",
})
```

If no translation plugin is available, only English queries are answered.
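
That fallback can be sketched roughly as follows; `answer`, `ask_wolfram`, and the `translator.translate` signature are hypothetical stand-ins for illustration, not the plugin's real internals:

```python
from typing import Callable, Optional

def answer(query: str, lang: str,
           ask_wolfram: Callable[[str], str],
           translator=None) -> Optional[str]:
    """Forward English queries directly; translate others only if a
    translator is configured, otherwise skip the query entirely."""
    if lang.split("-")[0] == "en":  # "en", "en-us", ... go straight through
        return ask_wolfram(query)
    if translator is None:
        return None  # no translation plugin: non-English queries unanswered
    english = translator.translate(query, target="en", source=lang)
    result = ask_wolfram(english)
    return translator.translate(result, target=lang, source="en")
```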

---

## Agent Toolbox

`WolframAlphaToolbox` exposes a single `search_wolfram_alpha` tool that any OPM-compatible agent loop (e.g. [ovos-agentic-loop](https://github.com/OpenVoiceOS/ovos-agentic-loop)) can discover and call. The tool uses the LLM-optimised Wolfram endpoint, which returns a more structured answer than the spoken endpoint.

### Persona JSON

Reference the toolbox by its entry point name inside any agentic persona. Pass a `system_prompt` to the brain plugin so the LLM knows how to query Wolfram correctly:

```json
{
  "name": "Wolfram Alpha",
  "solvers": ["ovos-react-loop"],
  "ovos-react-loop": {
    "brain": "ovos-chat-openai-plugin",
    "toolboxes": ["ovos-wolfram-alpha-tools"],
    "ovos-chat-openai-plugin": {
      "api_url": "http://localhost:11434/v1/chat/completions",
      "system_prompt": "You have access to Wolfram Alpha. Use it for maths, science, unit conversions, and factual questions with a definite answer. Always send queries in English as concise keywords (e.g. 'France population', not 'how many people live in France'). Use the exponent notation 6*10^14, never 6e14. If the result is not relevant, retry with a more specific query rather than rephrasing."
    }
  }
}
```

> 💡 The [official LLM API docs](https://products.wolframalpha.com/llm-api/documentation) have more tips on writing effective Wolfram system prompts.

### Direct usage

```python
from ovos_wolfram_alpha_plugin import WolframAlphaToolbox, SearchWolframAlphaArgs

tb = WolframAlphaToolbox(config={"appid": "YOUR-KEY"})

tools = tb.discover_tools()
# [AgentTool(name="search_wolfram_alpha", ...)]

output = tb.search_wolfram_alpha(SearchWolframAlphaArgs(query="France population", units="metric"))
print(output.result)
```
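
An agent loop consumes the `discover_tools()` output by dispatching on tool names. A minimal self-contained sketch with a stand-in `Tool` type (the real OPM `AgentTool` class and its fields may differ):

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict, List

@dataclass
class Tool:
    """Stand-in for a discovered agent tool: a name plus a callable."""
    name: str
    run: Callable[..., Any]

def dispatch(tools: List[Tool], name: str, **kwargs: Any) -> Any:
    """Look up a discovered tool by name and invoke it with kwargs."""
    registry: Dict[str, Tool] = {t.name: t for t in tools}
    if name not in registry:
        raise KeyError(f"unknown tool: {name}")
    return registry[name].run(**kwargs)

# Dummy tool standing in for search_wolfram_alpha
tools = [Tool("search_wolfram_alpha",
              lambda query, units="metric": f"[{units}] {query}")]
print(dispatch(tools, "search_wolfram_alpha", query="France population"))
# prints "[metric] France population"
```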

---

## License

Apache 2.0 — see [LICENSE](LICENSE).
