Metadata-Version: 2.4
Name: evm-log-father
Version: 0.2.0
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Rust
Classifier: Topic :: Software Development :: Libraries
Summary: Fast EVM log decoding library
Keywords: ethereum,evm,abi,decoding,logs,blockchain
Home-Page: https://rehashed.work
Author: adidonato
License: MIT
Requires-Python: >=3.8
Description-Content-Type: text/markdown; charset=UTF-8; variant=GFM
Project-URL: Homepage, https://rehashed.work
Project-URL: Repository, https://github.com/adidonato/evm-log-father

<p align="center">
  <img src="assets/logo.png" alt="evm-log-father" width="200">
</p>

# evm-log-father

Fast EVM log decoding library with Python bindings.

## Performance

![Benchmark Results](assets/benchmark.png)

**400-500k logs/second** with parallel decoding on large parquet files.

## Features

- Decode Ethereum event logs using alloy's dynamic ABI
- Read logs from parquet files (multiple schema formats supported)
- Batch decode raw logs from any source (RPC, databases, etc.)
- Parallel decoding with rayon
- Python bindings via PyO3
- CLI for quick testing

## Installation

### Python (from PyPI)

```bash
pip install evm-log-father
```

### Python (from source)

```bash
pip install maturin
maturin develop --features python
```

### Rust

```toml
[dependencies]
evm-log-father = "0.2"
```

## Usage

### Python

```python
from evm_log_father import EventSchema, decode_parquet, decode_logs

# Create schema from event signature
schema = EventSchema("Transfer(address indexed from, address indexed to, uint256 value)")

# Decode logs from parquet file
logs = decode_parquet("transfers.parquet", schema, parallel=True)

for log in logs:
    print(f"Block {log['block_number']}: {log['params']['from']} -> {log['params']['to']}")

# Batch decode raw logs (e.g. from RPC or other sources)
raw_logs = [
    {
        "topics": ["0xddf252ad...", "0x000...sender", "0x000...receiver"],
        "data": "0x00000000000000000000000000000000000000000000000000000000000f4240",
        "block_number": 12345678,
        "tx_hash": "0xabc...",
        "log_index": 0,
        "contract": "0xdac17f958d2ee523a2206206994597c13d831ec7",
    }
]
decoded = decode_logs(schema, raw_logs, parallel=True)
```
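For intuition, the non-indexed `value` in the raw log above is just a 32-byte big-endian integer in the `data` field. Decoding it by hand in plain Python (independent of the library, which handles this via the ABI) looks like:

```python
# The `data` field from the raw log above: a single non-indexed uint256,
# ABI-encoded as 32 big-endian bytes.
data = "0x00000000000000000000000000000000000000000000000000000000000f4240"

# Strip the 0x prefix and interpret the hex as an unsigned integer.
value = int(data[2:], 16)

print(value)  # 1000000 (e.g. 1 USDT, which has 6 decimals)
```

Indexed parameters (`from`, `to`) live in `topics[1..]` instead of `data`, which is why the example log carries three topics.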

### Rust

```rust
use evm_log_father::{EventSchema, decode_parquet_parallel};

let schema = EventSchema::new("Transfer(address indexed from, address indexed to, uint256 value)")?;
let logs = decode_parquet_parallel("transfers.parquet", &schema)?;

for log in logs {
    println!("Block {}: {:?}", log.block_number, log.params);
}
```

### CLI

```bash
# Decode logs and output JSON
evm-log-father decode \
  --parquet transfers.parquet \
  --event "Transfer(address indexed from, address indexed to, uint256 value)" \
  --output decoded.json \
  --parallel \
  --timing

# Show event info
evm-log-father info --event "Transfer(address indexed from, address indexed to, uint256 value)"
```
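The `info` subcommand implies parsing the event signature into a name plus typed, possibly-indexed parameters. A rough pure-Python sketch of that parsing (an illustration only, not the library's actual implementation, which uses alloy's dynamic ABI):

```python
def parse_event_signature(sig: str):
    """Split a Solidity event signature into its name and parameter list.

    Hypothetical helper for illustration; not part of evm-log-father's API.
    """
    name, _, rest = sig.partition("(")
    params = []
    for part in rest.rstrip(")").split(","):
        tokens = part.split()
        params.append({
            "type": tokens[0],                # e.g. "address", "uint256"
            "indexed": "indexed" in tokens,   # indexed params go into topics
            "name": tokens[-1],               # parameter name
        })
    return name, params

name, params = parse_event_signature(
    "Transfer(address indexed from, address indexed to, uint256 value)"
)
print(name)                                          # Transfer
print([p["name"] for p in params if p["indexed"]])   # ['from', 'to']
```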

## Parquet Schema Support

Flexible schema support for various parquet formats:

### Column Names
Both snake_case and camelCase are supported:
- `block_number` / `blockNumber`
- `transaction_hash` / `transactionHash` / `tx_hash`
- `log_index` / `logIndex`
- `contract` / `address`

### Topics Format
- Individual columns: `topic0`, `topic1`, `topic2`, `topic3`
- List column: `topics` (Spark format)

### Data Types
- `block_number`: u64 or i64
- `log_index`: u32, u64, or i64
- `data`: binary or hex string
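The column-name aliasing above can be sketched in plain Python. This is a hypothetical normalization step mirroring the table, not the library's actual code:

```python
# Canonical column name -> accepted aliases, mirroring the list above.
ALIASES = {
    "block_number": {"block_number", "blockNumber"},
    "tx_hash": {"transaction_hash", "transactionHash", "tx_hash"},
    "log_index": {"log_index", "logIndex"},
    "contract": {"contract", "address"},
}

def normalize_columns(columns):
    """Map a parquet file's column names onto canonical names."""
    mapping = {}
    for col in columns:
        for canonical, accepted in ALIASES.items():
            if col in accepted:
                mapping[col] = canonical
    return mapping

print(normalize_columns(["blockNumber", "transactionHash", "logIndex", "address"]))
# {'blockNumber': 'block_number', 'transactionHash': 'tx_hash',
#  'logIndex': 'log_index', 'address': 'contract'}
```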

## Benchmarking

```bash
python examples/benchmark.py transfers.parquet
```

## License

MIT

