Metadata-Version: 2.4
Name: repp-tapping
Version: 1.4.0rc0
Summary: Online technology for measuring sensorimotor synchronization
Project-URL: Documentation, https://computational-audition.gitlab.io/repp/
Project-URL: Homepage, https://gitlab.com/computational-audition/repp
Project-URL: Issues, https://gitlab.com/computational-audition/repp/-/issues
Project-URL: Repository, https://gitlab.com/computational-audition/repp
Project-URL: Dataset, https://osf.io/r2pxd/
Author-email: Manuel Anglada-Tort <manel.anglada.tort@gmail.com>, Peter Harrison <pmch2@cam.ac.uk>, Nori Jacoby <nori.jacoby@ae.mpg.de>
Maintainer-email: Manuel Anglada-Tort <manel.anglada.tort@gmail.com>, Frank Höger <fh337@cornell.edu>
License: MIT License
        
        Copyright (c) 2026, Manuel Anglada-Tort, Peter Harrison, and Nori Jacoby
        
        Permission is hereby granted, free of charge, to any person obtaining a copy
        of this software and associated documentation files (the "Software"), to deal
        in the Software without restriction, including without limitation the rights
        to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
        copies of the Software, and to permit persons to whom the Software is
        furnished to do so, subject to the following conditions:
        
        The above copyright notice and this permission notice shall be included in all
        copies or substantial portions of the Software.
        
        THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
        IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
        FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
        AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
        LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
        OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
        SOFTWARE.
        
License-File: LICENSE
Keywords: experiments,psychology,sensorimotor,synchronization,tapping
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Scientific/Engineering
Requires-Python: >=3.9
Requires-Dist: click
Requires-Dist: matplotlib
Requires-Dist: numpy
Requires-Dist: scipy
Provides-Extra: dev
Requires-Dist: black; extra == 'dev'
Requires-Dist: flake8; extra == 'dev'
Requires-Dist: pytest; extra == 'dev'
Provides-Extra: docs
Requires-Dist: sphinx; extra == 'docs'
Requires-Dist: sphinx-rtd-theme; extra == 'docs'
Provides-Extra: notebook
Requires-Dist: jupyter; extra == 'notebook'
Requires-Dist: sounddevice; extra == 'notebook'
Requires-Dist: soundfile; extra == 'notebook'
Description-Content-Type: text/markdown

# REPP: Large-Scale Online Tapping Experiments

**Manuel Anglada-Tort, Peter Harrison, and Nori Jacoby**\
[Computational Auditory Perception Group](https://www.aesthetics.mpg.de/en/research/research-group-computational-auditory-perception.html)\
Max Planck Institute for Empirical Aesthetics

_REPP (Rhythm ExPeriment Platform)_ is a Python package for measuring sensorimotor synchronization (SMS) in laboratory and online settings.
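
In SMS tasks, the central quantity is the asynchrony between each tap and the corresponding stimulus onset. As a rough illustration of what such measures look like (a pure-NumPy sketch for intuition only, not REPP's actual API), mean signed asynchrony and its variability can be computed like this:

```python
import numpy as np

def asynchronies(taps_ms, stimulus_ms):
    """Match each stimulus onset to its nearest tap and return
    signed asynchronies (tap - stimulus) in milliseconds."""
    taps = np.asarray(taps_ms, dtype=float)
    stim = np.asarray(stimulus_ms, dtype=float)
    # For each stimulus onset, pick the closest tap
    idx = np.abs(taps[None, :] - stim[:, None]).argmin(axis=1)
    return taps[idx] - stim

# Metronome at 500 ms inter-onset interval; taps slightly early on average
stim = np.arange(0, 4000, 500)
taps = stim + np.array([-30, -25, -10, -20, -15, -35, -20, -25])
async_ = asynchronies(taps, stim)
print(round(async_.mean(), 1))        # → -22.5 (negative = anticipatory tapping)
print(round(async_.std(ddof=1), 1))   # → 8.0 (tap-to-tap variability, ms)
```

A negative mean asynchrony (taps preceding the stimulus) is the classic anticipation effect in metronome tapping; REPP's analysis pipeline builds on this kind of measure but additionally handles onset extraction, alignment, and failure detection.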

## Quick Links

- [Documentation](https://computational-audition.gitlab.io/repp/)
- [Source Code](https://gitlab.com/computational-audition/repp)
- [Dataset](https://osf.io/r2pxd/) - Tapping datasets supporting the 2022 paper

## Citation

If you use this package, please cite:

```text
Anglada-Tort, M., Harrison, P.M.C. & Jacoby, N. REPP: A robust cross-platform solution for online sensorimotor synchronization experiments. Behavior Research Methods (2022). https://doi.org/10.3758/s13428-021-01722-2
```
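
For reference managers, an equivalent BibTeX entry (fields transcribed from the citation above; volume and page numbers are omitted rather than guessed):

```bibtex
@article{anglada2022repp,
  author  = {Anglada-Tort, Manuel and Harrison, Peter M. C. and Jacoby, Nori},
  title   = {{REPP}: A robust cross-platform solution for online sensorimotor
             synchronization experiments},
  journal = {Behavior Research Methods},
  year    = {2022},
  doi     = {10.3758/s13428-021-01722-2}
}
```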

## Installation

### Prerequisites

- Python 3.9 or newer (tested with 3.9–3.13)
- macOS (primary testing platform; the package itself is OS-independent)

### Setting up a Virtual Environment

1. Install virtualenv and virtualenvwrapper:

```bash
pip3 install virtualenv virtualenvwrapper
```

2. Configure virtualenvwrapper:

```bash
export WORKON_HOME=$HOME/.virtualenvs
mkdir -p $WORKON_HOME
export VIRTUALENVWRAPPER_PYTHON=$(which python3)
source $(which virtualenvwrapper.sh)
```

3. Create and activate virtual environment:

```bash
mkvirtualenv repp --python $(which python3)
```

Optional: Add to shell configuration (~/.zshrc):

```bash
echo "export VIRTUALENVWRAPPER_PYTHON=$(which python3)" >> ~/.zshrc
echo "source $(which virtualenvwrapper.sh)" >> ~/.zshrc
```

To activate the environment later:

```bash
workon repp
```

### Installing REPP

The published distribution name is `repp-tapping`, while the Python import
namespace and CLI remain `repp` for compatibility.

1. Clone the repository:

```bash
git clone git@gitlab.com:computational-audition/repp.git
```

2. Navigate to the project directory:

```bash
cd repp
```

3. Install REPP in editable mode (this also installs all runtime dependencies declared in `pyproject.toml`):

```bash
pip3 install -e .
```

To install the published package from an index instead of a local checkout:

```bash
pip3 install repp-tapping
```

4. Verify installation:

```bash
repp --version
```

### Running Tests

Basic test execution:

```bash
# Run all tests in the project
pytest -v -s

# Run specific test files
pytest tests/test_repp.py
pytest tests/test_signal_processing.py -v
pytest tests/test_analysis.py -v
pytest tests/test_stimulus.py -v
```

## Requirements

Runtime dependencies (installed automatically with `pip install repp-tapping`):

- numpy
- scipy
- matplotlib
- click

Optional extras:

- `dev` — pytest, flake8, black
- `docs` — sphinx, sphinx-rtd-theme
- `notebook` — jupyter, sounddevice, soundfile (needed to run the demo notebooks)

Install an extra (here, `dev`) with:

```bash
pip install -e ".[dev]"
```

The full list and any version constraints are declared in [`pyproject.toml`](pyproject.toml).

## Contributors

- **Vani Rajendran**: Developed analysis and plotting methods for beat-detection tasks (see the `beatfinding_cyclic` demo)

## Contributing

We welcome contributions supporting new paradigms or improvements to the current code. Please contact the authors if you would like to contribute.

## Running Demos

REPP comes with several demo notebooks to help you get started. The
demos live in the source repository (under `demos/`) and rely on
sample audio assets in `input/`. Those assets are kept in the
repository but are **not shipped with the PyPI distribution** (they
are too large for PyPI's per-file limits). To run the demos you need
a source checkout of the
[GitLab repository](https://gitlab.com/computational-audition/repp).

1. Activate your virtual environment if not already active:

```bash
workon repp
```

2. Install the notebook extras:

```bash
pip install -e ".[notebook]"
```

3. Start Jupyter Notebook:

```bash
jupyter notebook
```

4. Navigate to the `demos/` directory in the Jupyter interface and open any of the available demos:

- `sms_tapping.ipynb`: SMS tapping experiments, including metronome and music
- `unconstrained_tapping.ipynb`: Unconstrained tapping paradigms, including free and fast tapping
- `debug_recording.ipynb`: Debugging and troubleshooting examples
- `beat_detection.ipynb`: Beat-detection workflow using `repp.extensions.beat_detection`
- `beat_finding.ipynb`: Beat-finding demo using a repeating audio token
- `beat_finding_prerecorded.ipynb`: Beat-finding analysis using prerecorded audio
- `beatfinding_cyclic.ipynb`: Cyclic beat-finding analysis and visualization
- `iterated_tapping.ipynb`: Iterated tapping workflow using `repp.extensions.iterated_tapping`
- `itap_memory.ipynb`: Memory-based iterated tapping workflow
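
The beat-oriented demos all start from the same primitive: extracting onset times from a recorded audio signal. The following is a minimal, hedged sketch of threshold-based onset detection in pure NumPy (illustrative only; REPP's actual signal-processing pipeline is considerably more robust):

```python
import numpy as np

def detect_onsets(signal, sr, threshold=0.5, min_gap_ms=100):
    """Return onset times (ms) where the amplitude envelope first
    crosses `threshold`, enforcing a refractory gap between onsets."""
    env = np.abs(signal)
    above = env >= threshold
    # Rising edges: samples where `above` switches from False to True
    edges = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    min_gap = min_gap_ms / 1000 * sr
    onsets, last = [], -np.inf
    for i in edges:
        if i - last >= min_gap:
            onsets.append(i / sr * 1000)
            last = i
    return np.array(onsets)

# Synthetic click track: three 5 ms clicks, 500 ms apart, at 8 kHz
sr = 8000
sig = np.zeros(2 * sr)
for t_ms in (100, 600, 1100):
    i = int(t_ms / 1000 * sr)
    sig[i:i + 40] = 1.0
print(detect_onsets(sig, sr))  # → [ 100.  600. 1100.]
```

The refractory gap (`min_gap_ms`) prevents a single click's ringing from being counted as multiple onsets, a problem the debug notebook (`debug_recording.ipynb`) is designed to help diagnose in real recordings.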

## License

MIT License (see [`LICENSE`](LICENSE))

## Future Improvements

- [ ] Add type hints and a `py.typed` marker
- [ ] Add code-formatting and linting configuration (black/ruff/flake8)
- [ ] Add `CONTRIBUTING.md` and `CODE_OF_CONDUCT.md`
- [ ] Add a tag-triggered PyPI publish job to the GitLab CI pipeline (using PyPI Trusted Publishing)
- [ ] Add more robust error handling
- [ ] Improve demos
