Metadata-Version: 2.4
Name: vr-gesture-drum-trainer-tools
Version: 1.0.0
Summary: Python utilities for VR Gesture Drum Trainer - lesson generation, pattern validation, and Phantom Rack export tools
Author: VR Gesture Drum Trainer Team
License: MIT
Keywords: vr,drums,music-education,unity,hand-tracking
Requires-Python: >=3.9
Description-Content-Type: text/markdown
Requires-Dist: stripe>=7.0.0
Provides-Extra: dev
Requires-Dist: pytest>=7.4.0; extra == "dev"
Requires-Dist: black>=23.0.0; extra == "dev"

# VR Gesture Drum Trainer

Learn drumming through spatial muscle memory in VR before graduating to your real drum kit.

## What is this?

VR Gesture Drum Trainer is a rhythm training application for Meta Quest 3 that teaches fundamental drumming patterns through hand-tracked gesture following. Players follow animated gesture trails in 3D space, building coordination and muscle memory in a low-stakes, gamified environment. It bridges the gap between "no drumming experience" and "ready to play Phantom Rack," taking advantage of Quest 3's improved hand tracking and the growing demand for skill-building rhythm games.

## Features

- **Hand-tracked gesture following** – Follow floating 3D trails using precise hand tracking
- **Progressive lesson system** – Structured curriculum from basic patterns to complex combinations
- **Real-time accuracy feedback** – Instant scoring on timing, spatial precision, and consistency
- **Spatial muscle memory** – Build coordination through repetition in 3D space
- **Ready-to-graduate path** – Transition smoothly to Phantom Rack or other VR drum instruments
- **Quest 3 optimized** – Leverages latest hand tracking hardware for responsive gameplay

## Quick Start

### Installation

1. Clone the repository:
   ```bash
   git clone https://github.com/yourusername/vr-gesture-drum-trainer.git
   cd vr-gesture-drum-trainer
   ```

2. Open in Unity (2022 LTS or later) with Meta XR SDK installed

3. Install dependencies:
   ```bash
   pip install -r requirements.txt
   ```

4. Build and deploy to Quest 3:
   ```bash
   unity -quit -batchmode -projectPath . -buildTarget Android -executeMethod BuildScript.Build
   ```

## Usage

### Starting a Lesson

1. Launch the app on your Quest 3
2. Select a lesson from the main menu (`Assets/Scenes/MainMenu.unity`)
3. Follow the animated gesture trails with your hands
4. Receive real-time accuracy scores and feedback

### Lesson Structure

Lessons are defined in JSON (e.g., `Assets/Resources/Lessons/lesson_01.json`):

```json
{
  "id": "lesson_01",
  "name": "Basic Grip",
  "difficulty": 1,
  "patterns": [
    {
      "gesture": "vertical_tap",
      "hand": "both",
      "duration": 2.0,
      "target_height": 1.5
    }
  ]
}
```
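The Python utilities in this package can sanity-check lesson files like the one above before they ship in a build. A minimal validation sketch, assuming the field names shown in the example JSON (the `validate_lesson` function and its error messages are illustrative, not the packaged validator's API):

```python
import json

# Fields taken from the example lesson JSON above.
REQUIRED_LESSON_KEYS = {"id", "name", "difficulty", "patterns"}
REQUIRED_PATTERN_KEYS = {"gesture", "hand", "duration", "target_height"}


def validate_lesson(text: str) -> list[str]:
    """Return a list of problems found in a lesson JSON document (empty means OK)."""
    try:
        lesson = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]

    errors = []
    for key in sorted(REQUIRED_LESSON_KEYS - lesson.keys()):
        errors.append(f"missing lesson field: {key}")

    for i, pattern in enumerate(lesson.get("patterns", [])):
        for key in sorted(REQUIRED_PATTERN_KEYS - pattern.keys()):
            errors.append(f"pattern {i}: missing field: {key}")
        if pattern.get("duration", 0) <= 0:
            errors.append(f"pattern {i}: duration must be positive")
    return errors
```

Running this over `Assets/Resources/Lessons/*.json` in CI catches malformed lessons before they reach the headset.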

### Core API

**GesturePattern** – Defines a single drumming motion:
```csharp
GesturePattern pattern = new GesturePattern(
  gestureType: "tap",
  targetPosition: new Vector3(0, 1.5f, 0),
  duration: 1.0f
);
```

**AccuracyTracker** – Scores player performance:
```csharp
float score = accuracyTracker.CalculateAccuracy(
  detectedPosition: handPosition,
  targetPosition: patternTarget,
  timingOffset: deltaTime
);
```
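The same scoring idea can be prototyped outside Unity, for example when tuning lesson difficulty with the Python tools. A sketch that decays the score with spatial and timing error (the exponential falloff and the tolerance values are illustrative assumptions, not the shipped C# formula):

```python
import math


def calculate_accuracy(detected, target, timing_offset,
                       position_tolerance=0.10, timing_tolerance=0.15):
    """Score a single hit in [0, 1]: 1.0 is a perfect hit, decaying
    with spatial distance (metres) and timing offset (seconds)."""
    distance = math.dist(detected, target)  # Euclidean error in metres
    spatial_score = math.exp(-distance / position_tolerance)
    timing_score = math.exp(-abs(timing_offset) / timing_tolerance)
    return spatial_score * timing_score
```

A dead-centre, on-beat hit scores 1.0; a hit 10 cm off or 150 ms late drops the score by a factor of roughly 1/e on the corresponding axis.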

**HandPoseDetector** – Recognizes hand positions and gestures via Quest tracking:
```csharp
HandPose pose = handPoseDetector.DetectPose(Hand.Right);
```

## Tech Stack

- **Engine**: Unity 2022 LTS
- **VR Platform**: Meta Quest 3 (Oculus XR)
- **Hand Tracking**: Meta Hand Tracking SDK
- **Scripting**: C#
- **Lesson Data**: JSON
- **Build/Config**: Python

## Project Structure

```
Assets/
├── Scenes/
│   └── MainMenu.unity
├── Scripts/
│   ├── Core/
│   │   ├── LessonManager.cs
│   │   ├── GesturePattern.cs
│   │   └── AccuracyTracker.cs
│   ├── HandTracking/
│   │   └── HandPoseDetector.cs
│   └── Visualization/
│       └── GestureTrail.cs
└── Resources/
    └── Lessons/
        ├── lesson_01.json
        └── lesson_02.json
```

## License

MIT

---

**Questions?** See [OVERVIEW.md](OVERVIEW.md) for design philosophy and [MONETIZATION.md](MONETIZATION.md) for business strategy.
