Metadata-Version: 2.4
Name: ignis_lfx
Version: 0.1.0
Summary: IGNIS LFX - Langflow Executor for advanced flow execution and component management
Author-email: Infogain <nishanth.p@infogain.com>
License: MIT
Project-URL: Homepage, https://github.com/Infogain-GenAI/ignis-lfx
Project-URL: Repository, https://github.com/Infogain-GenAI/ignis-lfx.git
Project-URL: Issues, https://github.com/Infogain-GenAI/ignis-lfx/issues
Project-URL: Documentation, https://github.com/Infogain-GenAI/ignis-lfx#readme
Keywords: langflow,executor,lfx,flow-execution,automation,ignis
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Utilities
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: fastapi>=0.128.0
Requires-Dist: pydantic>=2.0.0
Requires-Dist: langchain-core>=0.3.0
Requires-Dist: langchain-ollama>=0.1.0
Requires-Dist: langchain-openai>=0.1.0
Requires-Dist: langchain-ibm>=0.1.0
Requires-Dist: orjson>=3.10.0
Requires-Dist: typer>=0.12.0
Requires-Dist: httpx>=0.25.0
Provides-Extra: dev
Requires-Dist: pytest>=7.0; extra == "dev"
Requires-Dist: pytest-cov>=4.0; extra == "dev"
Requires-Dist: black>=23.0; extra == "dev"
Requires-Dist: ruff>=0.1.0; extra == "dev"
Requires-Dist: mypy>=1.0; extra == "dev"
Provides-Extra: docs
Requires-Dist: sphinx>=7.0; extra == "docs"
Requires-Dist: sphinx-rtd-theme>=1.3.0; extra == "docs"
Dynamic: license-file

# IGNIS LFX - Langflow Executor

[![Python Version](https://img.shields.io/badge/Python-3.9+-blue.svg)](https://www.python.org/downloads/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![GitHub Repository](https://img.shields.io/badge/GitHub-Infogain--GenAI%2Fignis--lfx-blue)](https://github.com/Infogain-GenAI/ignis-lfx)

**IGNIS LFX** is a Python package for executing Langflow flows programmatically, with advanced component management, session-based memory handling, and ready-made web-service integration.

## Features

- 🚀 **Flow Execution**: Execute Langflow flows programmatically with full control
- 💾 **Memory Management**: Built-in session-based memory management for conversational flows
- 🔧 **Component System**: Access an extensive component library for flow building
- 📊 **MCP Integration**: Model Context Protocol support for advanced integrations
- 🛡️ **Type Safety**: Full type hints for better IDE support and development experience
- ⚡ **FastAPI Integration**: Ready-to-use FastAPI integration for web services
- 🔌 **Multi-LLM Support**: Support for OpenAI, Ollama, IBM LangChain, and more

## Installation

Install **ignis_lfx** from PyPI:

```bash
pip install ignis_lfx
```

Or with optional dependencies:

```bash
# Development tools and testing
# (quotes keep shells like zsh from expanding the brackets)
pip install "ignis_lfx[dev]"

# Documentation tools
pip install "ignis_lfx[docs]"

# All optional dependencies
pip install "ignis_lfx[dev,docs]"
```

## Quick Start

### Basic Flow Execution

```python
from ignis_lfx import execute_flow

# Load and execute a flow
result = execute_flow(
    flow_name="my_flow.json",
    input_data={"question": "What is Python?"}
)

print(result)
```

### FastAPI Integration

```python
from fastapi import FastAPI
from ignis_lfx import execute_flow

app = FastAPI()

# execute_flow is called synchronously above, so use a plain `def`
# endpoint; FastAPI runs it in a worker thread automatically.
@app.post("/execute")
def run_flow(input_data: dict):
    result = execute_flow(
        flow_name="assistant.json",
        input_data=input_data
    )
    return {"result": result}
```

### Memory-Based Chat

```python
from ignis_lfx.memory import SessionMemory

# Initialize session memory
memory = SessionMemory(session_id="user_123")

# Store conversation history
memory.save("user", "Hello, how are you?")
memory.save("assistant", "I'm doing great! How can I help?")

# Load conversation history for the bound session
history = memory.load()
```
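Conceptually, session memory is a per-session list of role/content messages. A minimal standard-library sketch of that pattern (illustrative only, not the actual `SessionMemory` implementation):

```python
from collections import defaultdict

class SimpleSessionMemory:
    """Illustrative per-session message store (not the real SessionMemory)."""

    def __init__(self):
        # Maps session_id -> list of {"role", "content"} messages
        self._sessions = defaultdict(list)

    def save(self, session_id: str, role: str, content: str) -> None:
        self._sessions[session_id].append({"role": role, "content": content})

    def load(self, session_id: str) -> list:
        # Return a copy so callers cannot mutate the stored history
        return list(self._sessions[session_id])
```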

## Configuration

### Environment Variables

```bash
# LFX Configuration
LANGFLOW_DEV=false
LFX_API_KEY=your_api_key_here
LFX_BASE_URL=http://localhost:7860

# LLM Configuration
OPENAI_API_KEY=sk-...
OLLAMA_BASE_URL=http://localhost:11434
```
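In application code these variables are typically read with `os.getenv`, falling back to the defaults shown above. A sketch of that pattern (the helper name is illustrative; the variable names are the ones listed in this README):

```python
import os

def load_env_config() -> dict:
    """Read LFX settings from the environment, with documented defaults."""
    return {
        "dev_mode": os.getenv("LANGFLOW_DEV", "false").lower() == "true",
        "api_key": os.getenv("LFX_API_KEY", ""),
        "base_url": os.getenv("LFX_BASE_URL", "http://localhost:7860"),
    }
```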

### Configuration File (ignis_lfx_config.json)

```json
{
  "LFX_URL": "http://localhost:7860",
  "LFX_API_KEY": "your-api-key",
  "DEFAULT_FLOW": "default.json",
  "MEMORY_TYPE": "session",
  "DEBUG": false
}
```
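Loading such a file needs nothing beyond the standard library. A sketch that merges on-disk settings over defaults (keys are from the example above; the helper itself is illustrative):

```python
import json
from pathlib import Path

DEFAULTS = {
    "LFX_URL": "http://localhost:7860",
    "DEFAULT_FLOW": "default.json",
    "MEMORY_TYPE": "session",
    "DEBUG": False,
}

def load_config(path: str = "ignis_lfx_config.json") -> dict:
    """Overlay on-disk settings on the defaults; a missing file means defaults."""
    config = dict(DEFAULTS)
    p = Path(path)
    if p.exists():
        config.update(json.loads(p.read_text()))
    return config
```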

## Project Structure

```
ignis_lfx/
├── __init__.py           # Package initialization
├── core/                 # Core functionality
│   ├── flow.py          # Flow execution engine
│   ├── executor.py      # Flow executor
│   └── schema.py        # Data schemas
├── components/          # Component library
│   ├── __init__.py
│   └── base.py          # Base component class
├── memory/              # Memory management
│   ├── __init__.py
│   ├── base.py          # Base memory class
│   └── session.py       # Session memory implementation
├── integrations/        # External integrations
│   ├── fastapi.py       # FastAPI integration
│   ├── mcp.py           # MCP protocol support
│   └── llm.py           # LLM provider support
├── cli/                 # Command-line interface
│   ├── __init__.py
│   └── commands.py      # CLI commands
└── utils/               # Utility functions
    ├── __init__.py
    ├── logger.py        # Logging configuration
    └── validators.py    # Input validation
```

## Dependencies

### Core Dependencies
- **fastapi** (≥0.128.0) - Web framework
- **pydantic** (≥2.0.0) - Data validation
- **langchain-core** (≥0.3.0) - LangChain core library
- **orjson** (≥3.10.0) - Fast JSON serialization

### LLM Provider Support
- **langchain-openai** - OpenAI API support
- **langchain-ollama** - Ollama local LLM support
- **langchain-ibm** - IBM LangChain support

### CLI & HTTP
- **typer** (≥0.12.0) - CLI framework
- **httpx** (≥0.25.0) - Async HTTP client

## Advanced Usage

### Custom Memory Backend

```python
from ignis_lfx.memory import BaseMemory

class CustomMemory(BaseMemory):
    def __init__(self):
        # Example backend: an in-process dict keyed by session_id
        self._store: dict[str, list] = {}

    def save(self, session_id: str, role: str, content: str) -> None:
        self._store.setdefault(session_id, []).append(
            {"role": role, "content": content}
        )

    def load(self, session_id: str) -> list:
        return self._store.get(session_id, [])

# Use custom memory
from ignis_lfx import execute_flow
result = execute_flow(
    flow_name="my_flow.json",
    memory_backend=CustomMemory()
)
```

### Component Development

```python
from ignis_lfx.components import BaseComponent
from pydantic import Field

class MyCustomComponent(BaseComponent):
    name: str = "MyComponent"
    description: str = "A custom component"
    
    input_param: str = Field(..., description="Input parameter")
    
    def run(self, **kwargs) -> dict:
        # Implement component logic
        return {"result": f"Processed: {self.input_param}"}
```
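The same declarative pattern can be sketched without ignis_lfx installed, using a standard-library dataclass as a stand-in for `BaseComponent` (the class name here is illustrative):

```python
from dataclasses import dataclass

@dataclass
class MyComponentSketch:
    """Stdlib stand-in mirroring the BaseComponent pattern above."""
    input_param: str                          # required input
    name: str = "MyComponent"
    description: str = "A custom component"

    def run(self, **kwargs) -> dict:
        # Component logic: transform the input into a result payload
        return {"result": f"Processed: {self.input_param}"}
```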

## Contributing

We welcome contributions! Please follow these steps:

1. Fork the repository at [Infogain-GenAI/ignis-lfx](https://github.com/Infogain-GenAI/ignis-lfx)
2. Create a feature branch: `git checkout -b feature/amazing-feature`
3. Commit changes: `git commit -m 'Add amazing feature'`
4. Push to branch: `git push origin feature/amazing-feature`
5. Open a Pull Request

### Development Setup

```bash
# Clone repository
git clone https://github.com/Infogain-GenAI/ignis-lfx.git
cd ignis-lfx

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install in development mode
pip install -e ".[dev]"

# Run tests
pytest

# Format code
black .

# Lint code
ruff check . --fix
```

## Testing

```bash
# Run all tests
pytest

# Run tests with coverage
pytest --cov=ignis_lfx --cov-report=html

# Run specific test file
pytest tests/test_execution.py

# Run tests matching pattern
pytest -k "memory" -v
```

## Troubleshooting

### Common Issues

**Issue**: `ImportError: cannot import name 'execute_flow'`
- **Solution**: Ensure ignis_lfx is properly installed: `pip install --upgrade ignis_lfx`

**Issue**: `Connection refused to LFX server`
- **Solution**: Verify the LFX server is running and that `LFX_URL` (config file) or `LFX_BASE_URL` (environment) points to it

**Issue**: `API Key authentication failed`
- **Solution**: Check your API key in the configuration file or environment variable

## Documentation

For detailed documentation, examples, and API reference, visit:
- [GitHub Wiki](https://github.com/Infogain-GenAI/ignis-lfx/wiki)
- [API Documentation](https://github.com/Infogain-GenAI/ignis-lfx#api-reference)

## License

This project is licensed under the MIT License - see the [LICENSE](https://github.com/Infogain-GenAI/ignis-lfx/blob/main/LICENSE) file for details.

## Support

- 📧 **Email**: nishanth.p@infogain.com
- 🐛 **Issue Tracker**: [GitHub Issues](https://github.com/Infogain-GenAI/ignis-lfx/issues)
- 💬 **Discussions**: [GitHub Discussions](https://github.com/Infogain-GenAI/ignis-lfx/discussions)

## Changelog

### Version 0.1.0 (Initial Release)
- Initial release of ignis_lfx
- Core flow execution engine
- Memory management system
- FastAPI integration
- MCP protocol support
- Comprehensive documentation

## Acknowledgments

Built by the [Infogain GenAI](https://github.com/Infogain-GenAI) team using the [Langflow](https://github.com/langflow-ai/langflow) framework.

---

**Made with ❤️ by Infogain**
