jeevesagent.model.anthropic

Adapter for Anthropic’s Claude models via the official anthropic SDK.

Streams via messages.stream; normalises Anthropic’s event types into our ModelChunk shape:

  • text_delta -> ModelChunk(kind="text", text=...)

  • input_json_delta accumulates partial tool-use JSON; on content_block_stop we emit ModelChunk(kind="tool_call", ...)

  • message_delta carries the final stop_reason and output token count; message_start carries the input token count

  • a single trailing ModelChunk(kind="finish", ...) is emitted when the stream ends, regardless of whether tools were called
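The mapping above can be sketched as a single loop over stream events. Event and chunk shapes are simplified to plain dicts for illustration (the real adapter consumes the SDK's typed events and emits ModelChunk objects); the logic mirrors the bullet list: buffer partial tool JSON, emit on content_block_stop, and finish with one trailing chunk.

```python
import json
from typing import Any, Iterable, Iterator


def normalize_events(events: Iterable[dict[str, Any]]) -> Iterator[dict[str, Any]]:
    """Sketch: fold Anthropic-style stream events into flat chunk dicts."""
    input_tokens = output_tokens = 0
    stop_reason = ""
    tool_name: str | None = None
    tool_id: str | None = None
    json_buf: list[str] = []

    for ev in events:
        kind = ev["type"]
        if kind == "message_start":
            # message_start carries the input token count
            input_tokens = ev["usage"]["input_tokens"]
        elif kind == "content_block_start" and ev["block"]["type"] == "tool_use":
            tool_name, tool_id = ev["block"]["name"], ev["block"]["id"]
        elif kind == "text_delta":
            yield {"kind": "text", "text": ev["text"]}
        elif kind == "input_json_delta":
            # accumulate partial tool-use JSON until the block closes
            json_buf.append(ev["partial_json"])
        elif kind == "content_block_stop" and tool_name is not None:
            args = json.loads("".join(json_buf) or "{}")
            yield {"kind": "tool_call", "id": tool_id, "name": tool_name, "args": args}
            tool_name, tool_id, json_buf = None, None, []
        elif kind == "message_delta":
            # message_delta carries the final stop_reason and output tokens
            stop_reason = ev["stop_reason"]
            output_tokens = ev["usage"]["output_tokens"]

    # single trailing finish chunk, emitted whether or not tools were called
    yield {
        "kind": "finish",
        "stop_reason": stop_reason,
        "usage": {"input_tokens": input_tokens, "output_tokens": output_tokens},
    }
```

Accumulating input_json_delta fragments before parsing matters because individual fragments are rarely valid JSON on their own; only the concatenation of all fragments for a block parses.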

The SDK is imported lazily inside __init__, so users can do from jeevesagent.model import AnthropicModel without the anthropic extra installed; the import only fires when the constructor needs to build a default client.

Attributes

Classes

AnthropicModel

Talks to Claude via anthropic.AsyncAnthropic.

Module Contents

class jeevesagent.model.anthropic.AnthropicModel(model: str = 'claude-opus-4-7', *, client: Any = None, api_key: str | None = None, max_tokens: int = DEFAULT_MAX_TOKENS)[source]

Talks to Claude via anthropic.AsyncAnthropic.

async complete(messages: list[jeevesagent.core.types.Message], *, tools: list[jeevesagent.core.types.ToolDef] | None = None, temperature: float = 1.0, max_tokens: int | None = None) tuple[str, list[jeevesagent.core.types.ToolCall], jeevesagent.core.types.Usage, str][source]

Single-shot non-streaming completion.

Calls client.messages.create(...) without stream=True or a stream context manager; Anthropic returns the full Message in a single HTTP response. We walk its content blocks once to assemble (text, tool_calls, usage, stop_reason). This is the non-streaming hot path used by agent.run(); agent.stream() continues to use stream().

Falls back to consuming stream() if the underlying client raises (test fakes that only support streaming, or transports that don’t honour single-shot creation).
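The create-then-fall-back shape can be sketched as below. The helper name and the bare list return are illustrative only; the real method assembles the (text, tool_calls, usage, stop_reason) tuple on both paths.

```python
from typing import Any


async def complete_with_fallback(client: Any, stream_fn, **kwargs) -> Any:
    """Try single-shot creation; drain the stream if the client raises.

    stream_fn stands in for stream(): an async-iterator factory that
    yields chunks. Fakes that only implement streaming, or transports
    that refuse single-shot creation, land on the second path.
    """
    try:
        return await client.messages.create(**kwargs)
    except Exception:
        # e.g. a test fake whose create() is unimplemented
        return [chunk async for chunk in stream_fn(**kwargs)]
```

Catching a broad Exception here is a deliberate trade-off: the fallback exists precisely for clients whose failure mode is unknown, at the cost of also masking genuine errors behind a second request.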

async stream(messages: list[jeevesagent.core.types.Message], *, tools: list[jeevesagent.core.types.ToolDef] | None = None, temperature: float = 1.0, max_tokens: int | None = None) collections.abc.AsyncIterator[jeevesagent.core.types.ModelChunk][source]
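A minimal consumption sketch for stream(), assuming only the chunk kinds documented above; the model and chunk objects here are stand-ins, not the library's real types.

```python
async def collect_text(model, messages) -> str:
    """Drain a chunk stream, keeping text and stopping at the finish chunk."""
    parts: list[str] = []
    async for chunk in model.stream(messages):
        if chunk.kind == "text":
            parts.append(chunk.text)
        elif chunk.kind == "finish":
            # the single trailing finish chunk marks the end of the stream
            break
    return "".join(parts)
```

Tool-call chunks are silently skipped here; a real caller would dispatch on kind == "tool_call" as well.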
name = 'claude-opus-4-7'
jeevesagent.model.anthropic.DEFAULT_MAX_TOKENS = 4096