Metadata-Version: 2.4
Name: ipyai
Version: 0.0.5
Summary: Minimal IPython backtick-to-AI extension
Author-email: Jeremy Howard <info@answer.ai>
License: Apache-2.0
Project-URL: Source, https://github.com/answerdotai/ipyai
Requires-Python: >=3.10
Description-Content-Type: text/markdown
Requires-Dist: fastcore
Requires-Dist: ipython>=9
Requires-Dist: lisette>=0.0.47
Requires-Dist: rich
Provides-Extra: dev
Requires-Dist: build>=1.0.0; extra == "dev"
Requires-Dist: fastship>=0.0.6; extra == "dev"
Requires-Dist: pytest; extra == "dev"
Requires-Dist: twine>=5.0.0; extra == "dev"

# ipyai

`ipyai` is an IPython extension that turns any input starting with `` ` `` into an AI prompt.

It is aimed at terminal IPython, not notebook frontends. Prompts stream through `lisette`, final output is rendered with `rich`, and prompt history is stored alongside normal IPython history in the same SQLite database.

When imported, `ipyai` also applies two small IPython compatibility fixes borrowed from `ipykernel_helper` for traceback and `inspect.getfile` edge cases.

## Install

```bash
pip install ipyai
```

## Load

```python
%load_ext ipyai
```

If you change the package in a running shell:

```python
%reload_ext ipyai
```

## How To Auto-Load `ipyai`

`ipyai` is designed for terminal IPython. To auto-load it, add this to an `ipython_config.py` file used by terminal `ipython`:

```python
c.TerminalIPythonApp.extensions = ["ipyai.core"]
```

Good places for that file include:

- env-local: `{sys.prefix}/etc/ipython/ipython_config.py`
- user-local: `~/.ipython/profile_default/ipython_config.py`
- system-wide IPython config directories

In a virtualenv, the env-local path is usually:

- `.venv/etc/ipython/ipython_config.py`

To see which config paths your current `ipython` is searching, run:

```bash
ipython --debug -c 'exit()' 2>&1 | grep Searching
```

## Usage

Only the leading backtick is special. There is no closing delimiter.

Single line:

```python
`write a haiku about sqlite
```

Multiline paste:

```python
`summarize this module:
focus on state management
and persistence behavior
```

Backslash-Enter continuation in the terminal:

```python
`draft a migration plan \
with risks and rollback steps
```

`ipyai` also provides an `%ipyai` line magic and an `%%ipyai` cell magic.

## `%ipyai` commands

```python
%ipyai
%ipyai model claude-sonnet-4-6
%ipyai think m
%ipyai search h
%ipyai code_theme monokai
%ipyai save
%ipyai reset
```

Behavior:

- `%ipyai` prints the active model, think level, search level, code theme, logging flag, and the current config file paths
- `%ipyai model ...`, `%ipyai think ...`, `%ipyai search ...`, `%ipyai code_theme ...` change the current session only
- `%ipyai save` writes the current session's code and AI prompts to `startup.json`
- `%ipyai reset` deletes AI prompt history for the current IPython session and resets the code-context baseline
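The argument grammar above can be parsed in a few lines. This is a hypothetical sketch, not ipyai's actual parser; it shows the `l`/`m`/`h` validation that applies to `think` and `search`:

```python
# Hypothetical sketch of %ipyai argument parsing; not ipyai's actual code.
LEVELS = {'l', 'm', 'h'}

def parse_ipyai(line):
    "Split a %ipyai argument line into a (command, value) pair."
    parts = line.split(None, 1)
    if not parts:
        return ('show', None)              # bare %ipyai: print current settings
    cmd = parts[0]
    val = parts[1].strip() if len(parts) > 1 else None
    if cmd in ('think', 'search') and val not in LEVELS:
        raise ValueError(f'{cmd} must be one of l, m, h')
    return (cmd, val)
```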

## Tools

To expose a function from the active IPython namespace as a tool for the current conversation, reference it as `` &`name` `` in the prompt:

```python
def weather(city): return f"Sunny in {city}"

`use &`weather` to answer the question about Brisbane
```

The tool name exposed to the model is the namespace name you referenced, so callable objects bound in `user_ns` also work as expected. Async callables are also supported.
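Resolving those references could look roughly like the sketch below, assuming a simple regex scan over the prompt text and a lookup in `user_ns`; this is not ipyai's actual code:

```python
import re

# Hypothetical sketch of resolving &`name` tool references; not ipyai's actual code.
TOOL_RE = re.compile(r'&`([^`]+)`')

def extract_tools(prompt, namespace):
    "Return the callables referenced as &`name` in the prompt text, in order."
    names = TOOL_RE.findall(prompt)
    return [namespace[n] for n in names if callable(namespace.get(n))]
```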

## Output Rendering

Responses are streamed directly to the terminal during generation.

- in a TTY, `ipyai` uses Rich `Live(Markdown(...))` so the visible response is rendered as markdown while it streams
- the stored response remains the original full `lisette` output
- tool call detail blocks are compacted in the visible output to a short single-line form such as `🔧 f(x=1) => 2`
- streamed AI responses are intentionally suppressed from IPython's normal `output_history`; `ipyai` stores them in `ai_prompts` instead
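The compact tool-call line can be illustrated with a small formatter. This is a sketch only, assuming keyword arguments, and is not ipyai's real rendering pipeline:

```python
# Hypothetical sketch of the compact single-line tool-call form;
# not ipyai's actual formatting code.
def compact_tool_call(name, kwargs, result):
    "Render one tool call as a short single-line summary."
    args = ', '.join(f'{k}={v!r}' for k, v in kwargs.items())
    return f'🔧 {name}({args}) => {result!r}'
```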

## Startup Replay

On first load, `ipyai` creates `~/.config/ipyai/startup.json`.

`%ipyai save` snapshots the current IPython session into that file:

- normal code cells are saved as code events
- AI prompts are saved as prompt/response events

When `ipyai` loads into a fresh terminal IPython session:

- saved code events are replayed with `run_cell(..., store_history=True)`
- saved prompt/response pairs are inserted into `ai_prompts` for the new session

This is intended for priming new sessions with imports, helper definitions, tools, and prior AI context without re-running the prompts themselves.
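The replay loop can be sketched as a dispatch over saved events. The event shape and callback names below are assumptions for illustration, not ipyai's actual format:

```python
# Hypothetical sketch of replaying startup.json events; the event schema and
# function names are assumptions, not ipyai's actual code.
def replay(events, run_code, seed_prompt):
    "Dispatch saved events: code cells are re-run, prompts are seeded as history."
    for ev in events:
        if ev['type'] == 'code':
            # In IPython this would be ip.run_cell(src, store_history=True)
            run_code(ev['source'])
        elif ev['type'] == 'prompt':
            # Insert the pair into ai_prompts without re-running the prompt
            seed_prompt(ev['prompt'], ev['response'])
```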

## Configuration

At import time, `ipyai` defines module-level variables pointing at these XDG-backed paths:

- `~/.config/ipyai/config.json`
- `~/.config/ipyai/sysp.txt`
- `~/.config/ipyai/startup.json`
- `~/.config/ipyai/exact-log.jsonl`

Those files are created on demand when `ipyai` first needs them.

`config.json` currently supports:

```json
{
  "model": "claude-sonnet-4-6",
  "think": "l",
  "search": "l",
  "code_theme": "monokai",
  "log_exact": false
}
```

Notes:

- `model` defaults from `IPYAI_MODEL` if that environment variable is set when the config file is first created
- `think` and `search` must be one of `l`, `m`, or `h`
- `code_theme` is passed to Rich for fenced and inline code styling
- `log_exact`, when true, appends the exact full prompt sent to the model and the exact raw response returned by the model to `~/.config/ipyai/exact-log.jsonl`
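A loader for this schema might merge on-disk values over defaults. The sketch below mirrors the documented keys and the `IPYAI_MODEL` fallback, but it is not ipyai's actual loader:

```python
import json
import os
from pathlib import Path

# Sketch of loading config.json with defaults mirroring the documented schema;
# not ipyai's actual loader.
DEFAULTS = {'model': os.environ.get('IPYAI_MODEL', 'claude-sonnet-4-6'),
            'think': 'l', 'search': 'l',
            'code_theme': 'monokai', 'log_exact': False}

def load_config(path):
    "Merge on-disk settings over the defaults; a missing file yields all defaults."
    cfg = dict(DEFAULTS)
    p = Path(path)
    if p.exists():
        cfg.update(json.loads(p.read_text()))
    return cfg
```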

`sysp.txt` is used as the system prompt passed to `lisette.AsyncChat`.

## Development

See [DEV.md](DEV.md) for project layout, architecture, persistence details, and development workflow.
