Metadata-Version: 2.4
Name: aria-cli
Version: 1.0.0
Summary: ARIA CLI - Natural Language Quantum AI Command Line Interface with Q0-Q38 System
Project-URL: Homepage, https://github.com/universal-crown-prime/aria-cli
Project-URL: Documentation, https://aria-cli.readthedocs.io
Project-URL: Repository, https://github.com/universal-crown-prime/aria-cli
Project-URL: Issues, https://github.com/universal-crown-prime/aria-cli/issues
Project-URL: Changelog, https://github.com/universal-crown-prime/aria-cli/blob/main/CHANGELOG.md
Author-email: Universal Crown Prime <aria@universalcrownprime.com>
Maintainer-email: Universal Crown Prime <aria@universalcrownprime.com>
License-Expression: MIT
License-File: LICENSE
Keywords: ai,automation,cli,cognitive-architecture,command-line,f1-cycle,github,llm,natural-language,nlp,q-system,q38,quantum
Classifier: Development Status :: 4 - Beta
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Utilities
Requires-Python: >=3.9
Provides-Extra: dev
Requires-Dist: black>=23.0.0; extra == 'dev'
Requires-Dist: mypy>=1.0.0; extra == 'dev'
Requires-Dist: pytest-cov>=4.0.0; extra == 'dev'
Requires-Dist: pytest>=7.0.0; extra == 'dev'
Requires-Dist: ruff>=0.1.0; extra == 'dev'
Provides-Extra: embeddings
Requires-Dist: numpy>=1.24.0; extra == 'embeddings'
Requires-Dist: sentence-transformers>=2.2.0; extra == 'embeddings'
Provides-Extra: full
Requires-Dist: anthropic>=0.18.0; extra == 'full'
Requires-Dist: httpx>=0.25.0; extra == 'full'
Requires-Dist: llama-cpp-python>=0.2.0; extra == 'full'
Requires-Dist: numpy>=1.24.0; extra == 'full'
Requires-Dist: openai>=1.0.0; extra == 'full'
Requires-Dist: rich>=13.0.0; extra == 'full'
Requires-Dist: sentence-transformers>=2.2.0; extra == 'full'
Requires-Dist: torch>=2.0.0; extra == 'full'
Requires-Dist: transformers>=4.35.0; extra == 'full'
Requires-Dist: typer>=0.9.0; extra == 'full'
Provides-Extra: llm
Requires-Dist: anthropic>=0.18.0; extra == 'llm'
Requires-Dist: httpx>=0.25.0; extra == 'llm'
Requires-Dist: openai>=1.0.0; extra == 'llm'
Provides-Extra: local
Requires-Dist: llama-cpp-python>=0.2.0; extra == 'local'
Requires-Dist: torch>=2.0.0; extra == 'local'
Requires-Dist: transformers>=4.35.0; extra == 'local'
Provides-Extra: rich
Requires-Dist: rich>=13.0.0; extra == 'rich'
Requires-Dist: typer>=0.9.0; extra == 'rich'
Description-Content-Type: text/markdown

# ARIA CLI v0.2.2

<div align="center">

```
    █████╗ ██████╗ ██╗ █████╗      ██████╗██╗     ██╗
   ██╔══██╗██╔══██╗██║██╔══██╗    ██╔════╝██║     ██║
   ███████║██████╔╝██║███████║    ██║     ██║     ██║
   ██╔══██║██╔══██╗██║██╔══██║    ██║     ██║     ██║
   ██║  ██║██║  ██║██║██║  ██║    ╚██████╗███████╗██║
   ╚═╝  ╚═╝╚═╝  ╚═╝╚═╝╚═╝  ╚═╝     ╚═════╝╚══════╝╚═╝
```

**Advanced Recursive Intelligence Architecture**

[![PyPI version](https://badge.fury.io/py/aria-cli.svg)](https://badge.fury.io/py/aria-cli)
[![npm version](https://badge.fury.io/js/@alphamatt%2Faria-cli.svg)](https://badge.fury.io/js/@alphamatt%2Faria-cli)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Python 3.9+](https://img.shields.io/badge/python-3.9+-blue.svg)](https://www.python.org/downloads/)

</div>

---

**Natural Language Quantum AI Command Line Interface** with full **Q0-Q38 Cognitive Architecture**

## ✨ What's New in v0.2.1

- **🤖 ARIA Agents** - Autonomous agents with Q-System integration
- **📡 Agent-to-Agent Communication** - Multi-agent messaging, channels, and swarms
- **🐝 Agent Swarm** - Coordinated multi-agent task execution

## ✨ What's New in v0.2.0

- **39 Cognitive Layers (Q0-Q38)** - Four domains: Foundation, Reasoning, Action, Transcendence
- **Multi-backend LLM Integration** - OpenAI, Anthropic, Ollama, llama.cpp, Transformers
- **Q38 Cluster Connector** - Native integration with 64-direction distributed processing
- **CLI-GUI Sync** - Real-time synchronization between CLI and GUI interfaces
- **Plugin System** - Extensible architecture for custom functionality
- **Rich Terminal UI** - Beautiful output with typer + rich
- **🔐 Authentication System** - Secure login with local, API key, and SSH key auth
- **🔗 SSH Router** - Multi-host SSH management, tunnels, and remote execution

## 📦 Installation

### From PyPI (Recommended)

```bash
pip install aria-cli
```

### With LLM Support
```bash
pip install aria-cli[llm]     # Cloud LLM backends (OpenAI, Anthropic)
pip install aria-cli[local]   # Local models (llama.cpp, transformers)
pip install aria-cli[rich]    # Rich terminal UI (typer + rich)
pip install aria-cli[full]    # All backends + rich UI
```

### From npm (Node.js wrapper)
```bash
npm install -g @alphamatt/aria-cli
```

### From Source

```bash
git clone https://github.com/universal-crown-prime/aria-cli.git
cd aria-cli
pip install -e .
```

## 🚀 Quick Start

```bash
# Ask a question
aria ask "What is quantum computing?"

# Search for information
aria search "trading systems"

# Show Q-System layers
aria layers

# Show system status
aria status

# Enter interactive chat mode
aria chat

# Remember something
aria remember "Meeting at 3pm tomorrow" --key meeting

# Recall memories
aria recall --key meeting

# Login
aria login

# SSH to a host
aria ssh connect production

# Create and manage agents
aria agent create my-assistant --type reasoning
aria agent list
aria agent send my-assistant "Analyze this data"
```

## 🤖 ARIA Agents

Autonomous AI agents with Q-System integration for distributed processing.

### Agent Types

| Type | Q-Layer | Description |
|------|---------|-------------|
| `task` | Q23 | Task execution (shell, files) |
| `reasoning` | Q10 | Logical analysis and inference |
| `coordinator` | Q21 | Multi-agent coordination |
| `generator` | Q19 | Content and code generation |

### CLI Commands

```bash
# Create an agent
aria agent create my-agent --type task --start

# List all agents
aria agent list

# Get agent status
aria agent status my-agent

# Send message to agent
aria agent send my-agent "Process this request" --priority HIGH

# Start/stop agent
aria agent start my-agent
aria agent stop my-agent

# View agent types and capabilities
aria agent types
aria agent capabilities
```

### Agent Channels

```bash
# Create communication channel
aria agent channel create task-channel --type direct --agent-a agent1 --agent-b agent2

# List channels
aria agent channel list
```

### Agent Swarm

```bash
# Create a coordinated swarm
aria agent swarm create my-swarm
```

### Python API

```python
from aria_cli import (
    AriaAgent, AgentRegistry, AgentSwarm,
    create_agent, get_registry,
    TaskAgent, ReasoningAgent, CoordinatorAgent,
    AgentMessage, MessageType, AgentCapability,
)
import asyncio

# Create agents
task_agent = create_agent("task", "file-processor")
reasoning_agent = create_agent("reasoning", "analyzer")

# Register with global registry
registry = get_registry()
asyncio.run(registry.register(task_agent))
asyncio.run(registry.register(reasoning_agent))

# Start agents
asyncio.run(task_agent.start())
asyncio.run(reasoning_agent.start())

# Send message between agents
message = AgentMessage(
    type=MessageType.REQUEST,
    sender=task_agent.id,
    recipient=reasoning_agent.id,
    payload={"action": "analyze", "data": {...}}
)
asyncio.run(reasoning_agent.receive(message))

# Create a swarm for coordinated tasks
swarm = AgentSwarm("processing-swarm")
swarm.add_agent(task_agent)
swarm.add_agent(reasoning_agent)

asyncio.run(swarm.initialize())

# Execute complex task
result = asyncio.run(swarm.execute({
    "goal": "Process and analyze data",
    "data": {...}
}))
```

### Custom Agents

```python
from aria_cli import AriaAgent, QLayer, AgentCapability, AgentMessage
from typing import Optional

class MyCustomAgent(AriaAgent):
    def __init__(self):
        super().__init__(
            name="my-custom-agent",
            q_layer=QLayer.Q15_CAUSATION,
            capabilities=[
                AgentCapability.ANALYSIS,
                AgentCapability.REASONING,
            ]
        )
    
    async def process_message(self, message: AgentMessage) -> Optional[AgentMessage]:
        action = message.payload.get("action")
        
        if action == "analyze_causation":
            result = await self.analyze_causes(message.payload.get("data"))
            return message.reply({"causes": result})
        
        return message.reply({"error": "Unknown action"})
    
    async def analyze_causes(self, data):
        # Your custom logic here
        return {"primary_cause": "...", "contributing_factors": [...]}
```

### Agent Communication Channels

```python
import asyncio

from aria_cli import (
    ChannelManager, DirectChannel, BroadcastChannel,
    StreamChannel, get_channel_manager,
)

# Get channel manager
manager = get_channel_manager()

# Create direct channel between two agents
direct = manager.create_direct("task-comm", agent_a, agent_b)
asyncio.run(direct.open())

# Create broadcast channel for events
broadcast = manager.create_broadcast("events", publisher_agent)
broadcast.subscribe(subscriber1.id)
broadcast.subscribe(subscriber2.id)
asyncio.run(broadcast.open())

# Broadcast an event
asyncio.run(broadcast.publish({"event": "task_complete", "result": {...}}))

# Create streaming channel for continuous data
stream = manager.create_stream("data-stream", producer, consumer)
asyncio.run(stream.open())

# Stream data (data_generator() stands in for your own chunk source)
for chunk in data_generator():
    asyncio.run(stream.stream(chunk))
asyncio.run(stream.end_stream())
```

## 🔐 Authentication

ARIA CLI includes a secure authentication system for managing credentials and sessions.

### Login Commands

```bash
# Interactive login
aria login

# Login with API key
aria login --api-key sk-... --provider openai

# Login with SSH key
aria login --ssh-key ~/.ssh/id_rsa

# Check current user
aria whoami

# Logout
aria logout
```

### Python API

```python
from aria_cli import AuthManager, get_auth

# Get auth manager
auth = get_auth()

# Register a new user
auth.register("username", "password", email="user@example.com")

# Login
session = auth.login("username", "password")
print(f"Logged in as {session.user.username}")

# Check authentication
if auth.is_authenticated:
    print(f"Welcome, {auth.current_user.username}")

# Login with API key
session = auth.login_with_api_key("sk-...", provider="openai")

# Logout
auth.logout()
```

### Credential Storage

Credentials are stored securely using:
- System keyring (when available)
- Encrypted file storage (fallback)

```
~/.aria/
├── session.json         # Current session (encrypted token)
├── .credentials         # Encrypted credential storage
└── .key                 # Encryption key (restricted permissions)
```
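
The file-storage fallback described above follows a common pattern: a JSON store written with owner-only permissions. The sketch below illustrates that pattern only; the function names are hypothetical, and ARIA's real `.credentials` file is additionally encrypted with the key in `~/.aria/.key`.

```python
import json
import os
import stat
from pathlib import Path
from typing import Optional

def store_credential(store: Path, name: str, value: str) -> None:
    """Write a credential to a JSON file readable only by the owner."""
    data = json.loads(store.read_text()) if store.exists() else {}
    data[name] = value
    store.write_text(json.dumps(data))
    # 0o600: owner read/write only, like ~/.aria/.key's restricted permissions
    os.chmod(store, stat.S_IRUSR | stat.S_IWUSR)

def load_credential(store: Path, name: str) -> Optional[str]:
    """Return a stored credential, or None if absent."""
    if not store.exists():
        return None
    return json.loads(store.read_text()).get(name)
```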

## 🔗 SSH Router

Manage SSH connections, execute remote commands, and create tunnels.

### CLI Commands

```bash
# Add a host
aria ssh add production --hostname prod.example.com --user deploy --key ~/.ssh/id_rsa

# List hosts
aria ssh list

# Test connection
aria ssh test production

# Connect interactively
aria ssh connect production

# Execute remote command
aria ssh exec production "ls -la /var/www"

# Execute on multiple hosts
aria ssh exec production,staging "uptime"

# Create tunnel (local:8080 → remote:80)
aria ssh tunnel production --local 8080 --remote 80

# Upload file
aria ssh upload production ./local-file.txt /remote/path/

# Download file
aria ssh download production /remote/file.txt ./local-path/
```

### Python API

```python
from aria_cli import SSHRouter, SSHHost, get_router
from pathlib import Path

# Get router
router = get_router()

# Add a host
router.add_host(SSHHost(
    name="production",
    hostname="prod.example.com",
    username="deploy",
    key_file=Path("~/.ssh/id_rsa").expanduser(),
    port=22,
    tags=["prod", "web"],
))

# List hosts
hosts = router.list_hosts()
for host in hosts:
    print(f"{host.name}: {host.hostname}")

# Test connection
success, message = router.test_connection("production")
print(f"Connection: {message}")

# Execute command
result = router.execute("production", "ls -la")
print(result.stdout)
print(f"Exit code: {result.exit_code}")

# Execute on multiple hosts in parallel
results = router.execute_multi(
    ["production", "staging"],
    "uptime",
    parallel=True
)
for host, result in results.items():
    print(f"{host}: {result.stdout}")

# Create SSH tunnel
router.create_tunnel(
    "production",
    local_port=8080,
    remote_port=80,
)

# Create reverse tunnel
router.create_reverse_tunnel(
    "production",
    remote_port=9000,
    local_port=3000,
)

# Upload file
router.upload("production", "./local-file.txt", "/remote/path/")

# Download file
router.download("production", "/remote/file.txt", "./local-path/")

# Interactive session
router.connect_interactive("production")

# Close tunnels
router.close_tunnel("production", 8080)
router.disconnect_all()
```

### SSH Configuration

The router automatically loads hosts from `~/.ssh/config`:

```
# ~/.ssh/config
Host production
    HostName prod.example.com
    User deploy
    IdentityFile ~/.ssh/id_rsa
    Port 22

Host staging
    HostName staging.example.com
    User deploy
    ProxyJump bastion
```

Additional hosts can be stored in `~/.aria/ssh/hosts.json`.
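
Loading hosts from an OpenSSH config boils down to grouping directives under each `Host` alias. The minimal parser below sketches that idea for the sample config above; the router's actual loader may handle more (Match blocks, Include, wildcards, quoting), which this deliberately ignores.

```python
from typing import Dict

def parse_ssh_config(text: str) -> Dict[str, Dict[str, str]]:
    """Parse Host blocks from an OpenSSH config into {alias: {directive: value}}."""
    hosts: Dict[str, Dict[str, str]] = {}
    current = None
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition(" ")
        if key.lower() == "host":
            # Start a new block; directives until the next Host belong to it
            current = hosts.setdefault(value.strip(), {})
        elif current is not None:
            current[key.lower()] = value.strip()
    return hosts
```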

## 📱 AriaSpace - Phone ↔ Computer Sync

AriaSpace is a personal codespace system for securely syncing files between your phone (Android/Termux) and computer. Think GitHub Codespaces, but for your personal devices.

### Architecture

```
    Phone (Termux)                    Computer
    ┌─────────────────┐              ┌─────────────────┐
    │  AriaSpace      │    SSH       │  AriaSpace      │
    │  Client         │◄────────────►│  Server         │
    │                 │              │                 │
    │  /storage/      │   Sync       │  ~/aria-space/  │
    │  emulated/0/    │◄────────────►│  workspaces/    │
    │  Meta AI/       │              │                 │
    └─────────────────┘              └─────────────────┘
```

### Termux Setup (On Android)

1. Install Termux from F-Droid
2. Run the ARIA setup script:

```bash
# Get the setup script
aria termux setup-script | bash

# Or manually:
pkg update && pkg install openssh -y
termux-setup-storage
sshd
```

3. Note your IP address: `ip addr show wlan0`

### Connect from Computer

```bash
# Discover devices on network
aria termux discover

# Or add device manually
aria termux add-host termux-phone 192.168.1.100 --port 8022

# Test connection
aria termux test termux-phone

# Find Meta AI folder
aria termux find-meta-ai termux-phone

# List Android folders
aria termux folders termux-phone
```

### Create Workspace

```bash
# Quick setup for Meta AI folder
aria space create-meta-ai termux-phone

# Or create custom workspace
aria space create my-docs termux-phone "/storage/emulated/0/Documents" \
    --local ~/aria-space/my-docs

# List workspaces
aria space list
```

### Sync Files

```bash
# Bidirectional sync
aria space sync meta-ai

# Download only (phone → computer)
aria space pull meta-ai

# Upload only (computer → phone)
aria space push meta-ai

# Preview changes without syncing
aria space sync meta-ai --dry-run
```
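
At its core, bidirectional sync decides per file whether to upload, download, or skip by comparing each side's modification time against the last-synced timestamp. The function below is a simplified last-writer-wins sketch of that decision; AriaSpace's real state tracking (in `~/.aria/spaces/state/`) is more involved.

```python
from typing import Optional

def sync_action(local_mtime: Optional[float],
                remote_mtime: Optional[float],
                last_synced: Optional[float]) -> str:
    """Decide what a bidirectional sync should do with one file.

    local = computer, remote = phone; "upload" pushes computer -> phone,
    "download" pulls phone -> computer. None means the file is absent.
    """
    if local_mtime is None and remote_mtime is None:
        return "skip"
    if remote_mtime is None:
        return "upload"      # exists only on the computer
    if local_mtime is None:
        return "download"    # exists only on the phone
    if (last_synced is not None
            and local_mtime <= last_synced
            and remote_mtime <= last_synced):
        return "skip"        # neither side changed since the last sync
    # Both exist and at least one changed: newer side wins
    return "upload" if local_mtime >= remote_mtime else "download"
```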

### Browse & Manage

```bash
# List files with status
aria space files meta-ai

# Browse remote directory
aria space browse meta-ai --path images

# Check workspace status
aria space status meta-ai
```

### Python API

```python
from aria_cli import AriaSpace, TermuxSetup, get_space

# Setup Termux connection
setup = TermuxSetup()
setup.wizard()  # Interactive setup

# Or programmatically
setup.append_ssh_config("termux-phone", "192.168.1.100", port=8022)

# Create workspace
space = get_space()
ws = space.create_workspace(
    name="meta-ai",
    local_root="~/aria-space/meta-ai",
    remote_root="/storage/emulated/0/Meta AI",
    remote_host="termux-phone",
)

# List files
files = space.list_files("meta-ai")
for f in files:
    print(f"{f.status.value}: {f.relative_path}")

# Sync workspace
result = space.sync("meta-ai")
print(f"Uploaded: {result.files_uploaded}, Downloaded: {result.files_downloaded}")

# Download specific file
space.download_file("meta-ai", "images/photo.jpg")

# Upload file
space.upload_file("meta-ai", "notes.txt")

# Browse remote
entries = space.browse_remote("meta-ai", "images")
for entry in entries:
    print(f"{'📁' if entry['is_dir'] else '📄'} {entry['name']}")

# Watch for changes (auto-sync)
space.watch("meta-ai", interval=60)
```

### Supported Android Folders

| Folder | Path |
|--------|------|
| Meta AI | `/storage/emulated/0/Meta AI` |
| Downloads | `/storage/emulated/0/Download` |
| Documents | `/storage/emulated/0/Documents` |
| Pictures | `/storage/emulated/0/Pictures` |
| DCIM | `/storage/emulated/0/DCIM` |
| Movies | `/storage/emulated/0/Movies` |
| Music | `/storage/emulated/0/Music` |
| Termux Home | `/data/data/com.termux/files/home` |

### Storage Structure

```
~/.aria/
├── spaces/
│   ├── workspaces.json      # Workspace configurations
│   ├── state/               # Sync state per workspace
│   │   ├── meta-ai.json
│   │   └── ...
│   └── snapshots/           # Workspace snapshots
│       └── meta-ai/
│           └── 20241222_143052.json
└── termux/
    ├── devices.json         # Discovered devices
    └── aria_termux_setup.sh # Setup script
```

## 🧠 Q-System Architecture

The Q-System is a 39-layer recursive cognitive architecture spanning four domains (Foundation, Reasoning, Action, Transcendence); the full layer tables appear in the Q-System Layers section below.

## ⚙️ Configuration

ARIA CLI stores configuration in `~/.aria/`:

```
~/.aria/
├── config.json          # CLI configuration
├── sync_state.json      # CLI-GUI sync state
└── q-memory/            # Q-System memory storage
    ├── memory_*.json    # Saved memories
    └── ...
```

### Configuration Options

Edit `~/.aria/config.json`:

```json
{
  "llm": {
    "backend": "openai",
    "model": "gpt-4o-mini",
    "temperature": 0.7
  },
  "q_system": {
    "default_layer": 8,
    "trace_enabled": true
  },
  "ui": {
    "theme": "dark",
    "show_layer_info": true
  }
}
```

### Environment Variables

```bash
# LLM backends
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export OLLAMA_HOST="http://localhost:11434"

# Q38 Cluster
export Q38_API_URL="http://localhost:8080"
export Q38_SYNC_PORT="9000"
```
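
A common way to combine these environment variables with `~/.aria/config.json` is precedence-based merging: environment overrides the config file, which overrides built-in defaults. The sketch below illustrates that pattern; the key names and the `ARIA_LLM_BACKEND` variable are hypothetical, not aria-cli's actual schema.

```python
import os
from typing import Dict

DEFAULTS = {
    "backend": "openai",
    "ollama_host": "http://localhost:11434",
}

def resolve_settings(file_config: Dict[str, str]) -> Dict[str, str]:
    """Merge settings with the usual precedence: env > config file > defaults."""
    merged = {**DEFAULTS, **file_config}
    env_overrides = {
        "backend": os.environ.get("ARIA_LLM_BACKEND"),  # hypothetical variable
        "ollama_host": os.environ.get("OLLAMA_HOST"),
    }
    # Only apply environment values that are actually set
    merged.update({k: v for k, v in env_overrides.items() if v is not None})
    return merged
```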

## 🧩 Q-System Layers

### Foundation (Q0-Q9)
| Layer | Name | Description |
|-------|------|-------------|
| Q0 | VOID | The null state - pure potential |
| Q1 | PERCEPTION | Raw sensory input processing |
| Q2 | ATTENTION | Focus and salience detection |
| Q3 | PATTERN | Pattern recognition |
| Q4 | MEMORY_SHORT | Working memory |
| Q5 | MEMORY_LONG | Long-term storage |
| Q6 | ASSOCIATION | Concept linking |
| Q7 | CONTEXT | Contextual understanding |
| Q8 | LANGUAGE | Linguistic processing |
| Q9 | EMOTION | Affective processing |

### Reasoning (Q10-Q19)
| Layer | Name | Description |
|-------|------|-------------|
| Q10 | LOGIC | Formal logical reasoning |
| Q11 | INFERENCE | Drawing conclusions |
| Q12 | HYPOTHESIS | Generating hypotheses |
| Q13 | ABSTRACTION | Abstract concept formation |
| Q14 | ANALOGY | Analogical reasoning |
| Q15 | CAUSATION | Causal reasoning |
| Q16 | PREDICTION | Future projection |
| Q17 | EVALUATION | Assessment |
| Q18 | SYNTHESIS | Combining ideas |
| Q19 | CREATIVITY | Novel generation |

### Action (Q20-Q29)
| Layer | Name | Description |
|-------|------|-------------|
| Q20 | INTENTION | Goal setting |
| Q21 | PLANNING | Strategy formation |
| Q22 | DECISION | Choice making |
| Q23 | EXECUTION | Action taking |
| Q24 | MONITORING | Progress tracking |
| Q25 | FEEDBACK | Loop processing |
| Q26 | CORRECTION | Error correction |
| Q27 | OPTIMIZATION | Performance tuning |
| Q28 | LEARNING | Knowledge acquisition |
| Q29 | ADAPTATION | Behavioral adjustment |

### Transcendence (Q30-Q38)
| Layer | Name | Description |
|-------|------|-------------|
| Q30 | AWARENESS | Self-awareness |
| Q31 | REFLECTION | Self-reflection |
| Q32 | METACOGNITION | Thinking about thinking |
| Q33 | IDENTITY | Self-identity modeling |
| Q34 | VALUES | Value system and ethics |
| Q35 | WISDOM | Applied knowledge |
| Q36 | INTEGRATION | Holistic processing |
| Q37 | EMERGENCE | Emergent properties |
| Q38 | TRANSCENDENCE | Beyond individual layers |
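
Since each domain covers a contiguous range of layers, mapping a layer index back to its domain is a simple range check. The helper below is a standalone illustration of the tables above, not part of the `aria_cli` API.

```python
def q_domain(layer: int) -> str:
    """Return the domain a Q-layer belongs to, per the tables above."""
    if not 0 <= layer <= 38:
        raise ValueError(f"Q-layer must be 0-38, got {layer}")
    if layer <= 9:
        return "Foundation"      # Q0-Q9
    if layer <= 19:
        return "Reasoning"       # Q10-Q19
    if layer <= 29:
        return "Action"          # Q20-Q29
    return "Transcendence"       # Q30-Q38
```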

## 🔌 Python API

```python
from aria_cli import QSystem, QLayer, QCommand, QOperator

# Initialize Q-System
q = QSystem()

# Create and execute a command
cmd = QCommand(
    operator=QOperator.ASK,
    payload={"question": "What is consciousness?"},
    layer=QLayer.Q30_AWARENESS,
)
result = q.execute(cmd)
print(result)
```

### With LLM Integration

```python
from aria_cli import QSystem, LLMConnector, LLMProvider

# Initialize with OpenAI
q = QSystem()
llm = LLMConnector(LLMProvider.OPENAI)

# Generate with Q-System context
response = llm.generate(
    "Explain quantum consciousness",
    system="You are ARIA at Q-Layer 38"
)
print(response)
```

### NLP Processing

```python
from aria_cli import NLPEngine

nlp = NLPEngine()

# Parse natural language
parsed = nlp.parse("Search for files containing quantum algorithms")
print(f"Intent: {parsed.intent}")       # Intent.SEARCH
print(f"Keywords: {parsed.keywords}")   # ['files', 'quantum', 'algorithms']
print(f"Q-Layer: Q{parsed.q_layer}")    # Q3 (PATTERN)
```

### CLI-GUI Synchronization

```python
from aria_cli import AriaConnector, SyncMode

# File-based sync
connector = AriaConnector(mode=SyncMode.FILE, source='cli')
connector.start()

# Update state
connector.update_state(current_layer=30)

# Stop sync
connector.stop()
```

## 🔄 Migration from v0.1.x

See [MIGRATION.md](MIGRATION.md) for detailed upgrade instructions.

### Key Changes
| Feature | v0.1.x | v0.2.0 |
|---------|--------|--------|
| Q-System Layers | 8 | **39 (Q0-Q38)** |
| LLM Integration | Optional | Multi-backend |
| Package Structure | `aria_cli/` | `src/aria_cli/` |
| CLI Framework | argparse | typer + rich |
| Async Support | ❌ | ✅ |
| GUI Sync | ❌ | ✅ |

## 📚 Commands Reference

| Command | Description |
|---------|-------------|
| `aria ask <question>` | Ask a natural language question |
| `aria search <query>` | Search for information |
| `aria chat` | Enter interactive chat mode |
| `aria layers` | Show all 39 Q-System layers |
| `aria status` | Show system status |
| `aria remember <content>` | Save to memory |
| `aria recall` | List/retrieve memories |
| `aria help` | Show help |

## 🔧 Development

### Setup

```bash
git clone https://github.com/universal-crown-prime/aria-cli.git
cd aria-cli
pip install -e ".[dev]"
```

### Testing

```bash
pytest
```

### Building

```bash
pip install build
python -m build
```

## 📄 License

MIT License - see [LICENSE](LICENSE) for details.

## 🤝 Contributing

Contributions are welcome! Please see [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.

## 📬 Support

- **Issues**: [GitHub Issues](https://github.com/universal-crown-prime/aria-cli/issues)
- **Discussions**: [GitHub Discussions](https://github.com/universal-crown-prime/aria-cli/discussions)
- **Documentation**: [Read the Docs](https://aria-cli.readthedocs.io)
