Metadata-Version: 2.1
Name: ghoshell-moss
Version: 0.1.0.dev0
Summary: LLM-oriented operating system shell, providing an interpreter for LLMs to control everything
Author: thirdgerb, 17wang
License: MIT
Requires-Python: >=3.10
Requires-Dist: ghoshell-common>=0.4.0.dev1
Requires-Dist: ghoshell-container>=0.3.0.dev1
Requires-Dist: numpy>=2.2.6
Requires-Dist: click>=8.3.0
Provides-Extra: zmq
Requires-Dist: zmq>=0.0.0; extra == "zmq"
Requires-Dist: aiozmq>=1.0.0; extra == "zmq"
Provides-Extra: mcp
Requires-Dist: mcp[cli]>=1.17.0; extra == "mcp"
Provides-Extra: wss
Requires-Dist: fastapi>=0.121.1; extra == "wss"
Requires-Dist: websockets>=15.0.1; extra == "wss"
Requires-Dist: uvicorn>=0.37.0; extra == "wss"
Provides-Extra: redis
Requires-Dist: fakeredis>=2.32.1; extra == "redis"
Requires-Dist: redis>=7.0.1; extra == "redis"
Description-Content-Type: text/markdown

# MOSShell

`MOSShell` (Model-Operated System Shell) is a Bash-like shell not for humans, but for AI models:
a dedicated runtime that translates model reasoning into structured,
executable commands for real-time tool and robot coordination.

In short, MOSShell:

* `Present`: Presents functions' source code directly as model-readable prompts.
* `Parse`: Requires the model's output to be structured `CTML` (Command Token Marked Language) and parses that stream.
* `Execute`: Schedules and executes parsed commands for streaming execution, either synchronously and blocking (same channel) or asynchronously and in parallel (across channels).
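To make the Parse and Execute stages concrete, here is a minimal, self-contained sketch of the idea. The `<cmd …/>` token syntax, the `parse_ctml` function, and the handler registry below are all illustrative assumptions, not the actual CTML grammar or MOSShell API:

```python
import re

# Hypothetical CTML-like token syntax, for illustration only; the real
# CTML grammar is defined by MOSShell and may differ substantially.
CMD_PATTERN = re.compile(r'<cmd\s+name="(?P<name>\w+)"\s+args="(?P<args>[^"]*)"\s*/>')

def parse_ctml(stream_text: str):
    """Yield (command_name, args_dict) pairs embedded in the model's output."""
    for m in CMD_PATTERN.finditer(stream_text):
        args = dict(
            pair.split("=", 1) for pair in m.group("args").split() if "=" in pair
        )
        yield m.group("name"), args

# Toy handler registry standing in for the Execute stage.
HANDLERS = {
    "move": lambda args: f"moving to {args.get('x')},{args.get('y')}",
    "grip": lambda args: f"grip force {args.get('force')}",
}

# Model reasoning interleaved with command tokens.
output = 'approach target <cmd name="move" args="x=1 y=2"/> then <cmd name="grip" args="force=0.5"/>'
results = [HANDLERS[name](args) for name, args in parse_ctml(output)]
print(results)  # ['moving to 1,2', 'grip force 0.5']
```

In a real deployment the parser would consume the token stream incrementally rather than a completed string, so that execution can begin before the model finishes generating.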

This allows the model not just to think, but to act in real time, providing a foundational layer for building Embodied AI.