Metadata-Version: 2.4
Name: philiprehberger-timerfunc
Version: 0.1.6
Summary: Measure execution time of any function
Project-URL: Homepage, https://github.com/philiprehberger/py-timerfunc#readme
Project-URL: Repository, https://github.com/philiprehberger/py-timerfunc
Project-URL: Issues, https://github.com/philiprehberger/py-timerfunc/issues
Project-URL: Changelog, https://github.com/philiprehberger/py-timerfunc/blob/main/CHANGELOG.md
Author: Philip Rehberger
License-Expression: MIT
License-File: LICENSE
Keywords: benchmark,decorator,performance,profiling,timer
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Typing :: Typed
Requires-Python: >=3.10
Description-Content-Type: text/markdown

# philiprehberger-timerfunc

[![Tests](https://github.com/philiprehberger/py-timerfunc/actions/workflows/publish.yml/badge.svg)](https://github.com/philiprehberger/py-timerfunc/actions/workflows/publish.yml)
[![PyPI version](https://img.shields.io/pypi/v/philiprehberger-timerfunc.svg)](https://pypi.org/project/philiprehberger-timerfunc/)
[![License](https://img.shields.io/github/license/philiprehberger/py-timerfunc)](LICENSE)

Measure the execution time of any function, via a context manager, a decorator, or a one-shot benchmark runner.

## Installation

```bash
pip install philiprehberger-timerfunc
```

## Usage

### Context Manager

```python
from philiprehberger_timerfunc import timer

with timer() as t:
    do_work()
print(f"Took {t.elapsed_ms:.1f}ms")
```
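The package's internals aren't shown in this README; as a rough sketch, a context manager like this is typically built on `time.perf_counter()` (monotonic, highest available resolution). The `TimerResult` fields below are assumed from the example above, not taken from the actual source:

```python
import time
from contextlib import contextmanager
from dataclasses import dataclass


@dataclass
class TimerResult:
    # Elapsed wall-clock time in milliseconds; filled in when the block exits.
    elapsed_ms: float = 0.0


@contextmanager
def timer():
    result = TimerResult()
    start = time.perf_counter()
    try:
        yield result
    finally:
        # Record the elapsed time even if the block raised.
        result.elapsed_ms = (time.perf_counter() - start) * 1000


with timer() as t:
    sum(range(100_000))
print(f"Took {t.elapsed_ms:.1f}ms")
```

Using `perf_counter()` rather than `time.time()` avoids skew from system clock adjustments during the measured block.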

### Decorator

```python
from philiprehberger_timerfunc import timed

@timed
def process_data():
    ...  # logs: "process_data took 42.3ms"

@timed(threshold_ms=100)
def api_call():
    ...  # only logs if slower than 100ms
```
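Supporting both the bare `@timed` form and the parameterized `@timed(threshold_ms=100)` form requires the decorator to detect which way it was called. A minimal sketch of that pattern, with the logging behavior assumed from the comments above:

```python
import functools
import logging
import time

logger = logging.getLogger(__name__)


def timed(fn=None, *, threshold_ms=0.0):
    def decorate(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                elapsed_ms = (time.perf_counter() - start) * 1000
                # With the default threshold of 0, every call is logged.
                if elapsed_ms >= threshold_ms:
                    logger.info("%s took %.1fms", func.__name__, elapsed_ms)
        return wrapper

    if fn is None:
        return decorate      # called as @timed(threshold_ms=...)
    return decorate(fn)      # called bare as @timed
```

`functools.wraps` preserves the decorated function's name and docstring, so the log line reports `process_data`, not `wrapper`.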

### Benchmark

```python
from philiprehberger_timerfunc import benchmark

result = benchmark(my_function, args=(data,), iterations=1000)
print(f"Mean: {result.mean_ms:.2f}ms, P95: {result.p95_ms:.2f}ms")
print(result)  # full stats summary
```
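A benchmark runner along these lines collects per-iteration timings, discards warmup runs, and reports summary statistics. This is a stdlib-only sketch; the `BenchmarkResult` field names come from the example above, and the nearest-rank p95 calculation is an assumption about how the package computes percentiles:

```python
import statistics
import time
from dataclasses import dataclass


@dataclass
class BenchmarkResult:
    mean_ms: float
    p95_ms: float
    min_ms: float
    max_ms: float
    iterations: int

    def __str__(self):
        return (f"{self.iterations} iterations: mean {self.mean_ms:.2f}ms, "
                f"p95 {self.p95_ms:.2f}ms, min {self.min_ms:.2f}ms, "
                f"max {self.max_ms:.2f}ms")


def benchmark(fn, args=(), kwargs=None, iterations=1000, warmup=10):
    kwargs = kwargs or {}
    # Warmup runs prime caches and bytecode specialization before measuring.
    for _ in range(warmup):
        fn(*args, **kwargs)
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        fn(*args, **kwargs)
        samples.append((time.perf_counter() - start) * 1000)
    samples.sort()
    return BenchmarkResult(
        mean_ms=statistics.fmean(samples),
        p95_ms=samples[int(0.95 * (len(samples) - 1))],  # nearest-rank p95
        min_ms=samples[0],
        max_ms=samples[-1],
        iterations=iterations,
    )
```

Sorting the samples once gives min, max, and the percentile in a single pass; for small iteration counts the p95 is noisy, so raising `iterations` tightens the estimate.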

## API

| Function / Class | Description |
|------------------|-------------|
| `timer()` | Context manager returning `TimerResult` |
| `@timed` / `@timed(threshold_ms=0)` | Decorator logging execution time |
| `benchmark(fn, args, kwargs, iterations, warmup)` | Returns `BenchmarkResult` |

## Development

```bash
pip install -e .
python -m pytest tests/ -v
```

## License

MIT. See [LICENSE](LICENSE) for the full text.
