Metadata-Version: 2.1
Name: blackmagic-io
Version: 0.17.0b2
Summary: Python library for video I/O with Blackmagic DeckLink devices
Keywords: blackmagic,decklink,video,output,input,io,sdi,hdmi,broadcast
Author-Email: Nick Shaw <nick@antlerpost.com>
Maintainer-Email: Nick Shaw <nick@antlerpost.com>
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: Topic :: Multimedia :: Video
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Classifier: Programming Language :: C++
Classifier: Operating System :: OS Independent
Project-URL: Homepage, https://github.com/nick-shaw/blackmagic-output
Project-URL: Repository, https://github.com/nick-shaw/blackmagic-output
Project-URL: Issues, https://github.com/nick-shaw/blackmagic-output/issues
Requires-Python: >=3.8
Requires-Dist: numpy<2.3,>=1.19.0
Requires-Dist: pybind11>=2.6.0
Description-Content-Type: text/markdown

# Blackmagic DeckLink Python I/O Library

*Continuation of [blackmagic-output](https://github.com/nick-shaw/blackmagic-output) (now archived) with input/capture support added; the package was renamed in 0.16.0b0 to reflect the broader scope.*

A Python library for video I/O with Blackmagic DeckLink devices using the official DeckLink SDK. This library provides a simple interface for displaying static frames, solid colors, and dynamic content from NumPy arrays, as well as for capturing frames to NumPy arrays.

Written by Nick Shaw, www.antlerpost.com, with a lot of help from [Claude Code](https://www.claude.com/product/claude-code)!

**⚠️ Note:** The library has had only minimal testing so far and is under active development. Please report any issues you encounter. I am particularly interested in feedback from Linux and Windows users.

## Features

### Output
- **Static Frame Output**: Display static images from NumPy arrays
- **Solid Color Output**: Display solid colors for testing and calibration
- **Dynamic Updates**: Update the currently displayed frame
- **Multiple Resolutions**: Every display mode your DeckLink device offers (SD, HD, 2K, 4K, 8K, and PC modes)
- **8 and 10-bit Y'CbCr Output**: 2vuy and v210 (default for uint16/float data)
- **10 and 12-bit R'G'B' Output**: 4:4:4 R'G'B' at 10 or 12 bits
- **HDR Support**: SMPTE ST 2086 / CEA-861.3 HDR static metadata
- **Y'CbCr matrix control**: Rec.601 (SD only), Rec.709 (HD+), and Rec.2020 (HD+) matrix support

### Input
- **Video Capture**: Capture video frames from DeckLink devices
- **Automatic Format Conversion**: Convert all DeckLink pixel formats to RGB float
- **Format Detection**: Automatic detection of input signal format (resolution, frame rate, colorspace, EOTF)
- **Metadata Access**: Access to format metadata (pixel format, colorspace, EOTF, source range)
- **Timecode Capture**: Automatic extraction of embedded timecode (RP188 VITC/LTC/HFRTC)
- **HDMI EDID Configuration**: Advertises SDR, HDR PQ, and HDR HLG support over HDMI by default so HDR sources transmit HDR Static Metadata (the SDK default omits HLG); the advertised bitmask is configurable via `set_hdmi_input_dynamic_ranges()`

### General
- **Cross-Platform**: Works on Windows, macOS, and Linux (in theory: only the macOS build has been fully tested so far)

## Requirements

### System Requirements
- Python 3.8 or higher
- Blackmagic DeckLink device (DeckLink, UltraStudio, or Intensity series)
- Blackmagic Desktop Video software installed

### Python Dependencies
Python dependencies (NumPy >= 1.19.0, pybind11 >= 2.6.0) are automatically installed (if needed) during the build process.

### DeckLink SDK
SDK v14.1 headers for all platforms are included in the repository - no separate download needed.

## Installation

### 1. DeckLink SDK

**All Platforms (macOS, Windows, Linux):**

- `_vendor/decklink_sdk/Mac/include/` - macOS headers
- `_vendor/decklink_sdk/Win/include/` - Windows headers
- `_vendor/decklink_sdk/Linux/include/` - Linux headers

The build system (CMake + scikit-build-core) automatically uses the correct platform-specific headers.

**⚠️ Important:** This library was built against SDK v14.1 to maintain compatibility with older macOS versions. If you need to download the SDK separately, ensure you get v14.1 from the [Blackmagic Design developer site](https://www.blackmagicdesign.com/developer/). Newer versions (v15.0+) may cause API compatibility issues and build failures.

### 2. Install the Library

**Option A (recommended for most users): from PyPI**

```bash
pip install blackmagic-io==0.17.0b2
```

The explicit version pin is required because this is a beta release and `pip` skips pre-releases by default. (Once a non-beta version is published, `pip install blackmagic-io` without the version will work.) Pre-built wheels are available for Python 3.8–3.14 on macOS, Linux, and Windows; pip falls back to a source build on unsupported Pythons (which requires a C++ compiler — Xcode Command Line Tools on macOS, build-essential on Linux, or MSVC on Windows).

To use the library at runtime you also need Blackmagic Desktop Video installed on your system (separate from this Python package) — the runtime DeckLink driver and framework are provided by Desktop Video, available from [blackmagicdesign.com/support](https://www.blackmagicdesign.com/support).

**Option B (for contributors or for modifying the source): from a clone**

```bash
# Clone the repository
git clone https://github.com/nick-shaw/blackmagic-io.git
cd blackmagic-io

# Initialize submodules (required for the advanced T-Pat example only)
git submodule update --init --recursive

# Install in development mode (this also installs numpy and pybind11 dependencies)
pip install -e .

# If upgrading from a previous development version, force reinstall:
pip install --force-reinstall -e .
```

### 3. Install Optional Dependencies

For examples and additional functionality:
```bash
pip install opencv-python imageio pillow jsonschema
```

**Note:** While imageio / PIL can load 16-bit TIFF files correctly, 16-bit PNG files are often converted to 8-bit during loading due to PIL limitations. For reliable 16-bit workflows, use TIFF format.
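
As a sketch of a reliable 16-bit workflow (the `imageio` call and filename below are illustrative), integer images can be normalized to the float range the library expects:

```python
import numpy as np

def to_float_frame(img: np.ndarray) -> np.ndarray:
    """Normalize an integer image to float32 in 0.0-1.0 (full range)."""
    if np.issubdtype(img.dtype, np.integer):
        return img.astype(np.float32) / np.iinfo(img.dtype).max
    return img.astype(np.float32)

# Typical usage with a 16-bit TIFF (requires imageio; filename illustrative):
#   import imageio.v3 as iio
#   frame = to_float_frame(iio.imread("ramp_16bit.tif"))
```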

## Quick Start

### Output Example

```python
import numpy as np
from blackmagic_io import BlackmagicOutput, DisplayMode

# Create a simple test image (1080p R'G'B', normalized float)
frame = np.ones((1080, 1920, 3), dtype=np.float32)
frame[:, :] = [1.0, 0.0, 0.0]  # Red frame

# Display the frame
with BlackmagicOutput() as output:
    # Initialize device (uses first available device)
    output.initialize()

    # Display static frame at 1080p25
    output.display_static_frame(frame, DisplayMode.HD1080p25)

    # Keep displaying (Enter to stop)
    input("Press Enter to stop...")
```

**Note:** The explicit `initialize()` call is optional - `display_static_frame()` will automatically initialize the first available device (device_index=0) if not already initialized. Explicit initialization is useful for device selection, better error handling, and timing control:

```python
# Simpler alternative - auto-initialization (uses first device)
with BlackmagicOutput() as output:
    output.display_static_frame(frame, DisplayMode.HD1080p25)
    input("Press Enter to stop...")
```

### Input Example

```python
from blackmagic_io import BlackmagicInput

# Capture a frame from DeckLink input
with BlackmagicInput() as input_device:
    # Initialize device (uses first available device)
    input_device.initialize()

    # Capture frame as RGB float array (0.0-1.0 range)
    rgb_frame = input_device.capture_frame_as_rgb(timeout_ms=5000)

    if rgb_frame is not None:
        print(f"Captured frame: {rgb_frame.shape}, dtype: {rgb_frame.dtype}")
        print(f"Value range: {rgb_frame.min():.3f} - {rgb_frame.max():.3f}")
        # Process frame data...
    else:
        print("No signal or timeout")
```

**Capture with metadata:**

```python
from blackmagic_io import BlackmagicInput

with BlackmagicInput() as input_device:
    input_device.initialize()

    # Capture frame with format information
    frame_data = input_device.capture_frame_with_metadata(timeout_ms=5000)

    if frame_data is not None:
        print(f"Resolution: {frame_data['width']}x{frame_data['height']}")
        print(f"Format: {frame_data['format']}")
        print(f"Colorspace: {frame_data['colorspace']}")
        print(f"EOTF: {frame_data['eotf']}")
        print(f"Narrow range source: {frame_data['input_narrow_range']}")

        # Access timecode if present
        if 'timecode' in frame_data:
            tc = frame_data['timecode']
            separator = ';' if tc['is_drop_frame'] else ':'
            print(f"Timecode: {tc['hours']:02d}:{tc['minutes']:02d}:{tc['seconds']:02d}{separator}{tc['frames']:02d}")

        # Access HDR metadata if present
        if 'hdr_metadata' in frame_data:
            hdr = frame_data['hdr_metadata']
            if 'display_primaries' in hdr:
                print(f"Display Primaries:")
                print(f"  Red:   ({hdr['display_primaries']['red']['x']:.4f}, "
                      f"{hdr['display_primaries']['red']['y']:.4f})")
                print(f"  Green: ({hdr['display_primaries']['green']['x']:.4f}, "
                      f"{hdr['display_primaries']['green']['y']:.4f})")
                print(f"  Blue:  ({hdr['display_primaries']['blue']['x']:.4f}, "
                      f"{hdr['display_primaries']['blue']['y']:.4f})")
            if 'mastering_luminance' in hdr:
                print(f"Mastering Luminance: {hdr['mastering_luminance']['max']:.1f} / "
                      f"{hdr['mastering_luminance']['min']:.4f} cd/m²")

        # Access RGB data
        rgb = frame_data['rgb']  # float32 array (H×W×3)
        # Process frame...
```

## API Reference

The library provides APIs for both output and input:

**High-level Python wrappers:**
- `blackmagic_io.BlackmagicOutput` - Convenient API for video output
- `blackmagic_io.BlackmagicInput` - Convenient API for video capture

**Low-level direct access:**
- `decklink_io.DeckLinkOutput` - Fine-grained control for output
- `decklink_io.DeckLinkInput` - Fine-grained control for capture

### High-Level API: BlackmagicOutput Class

Convenient Python wrapper for most video output operations.

#### Methods

**`get_available_devices() -> List[str]`**
Get list of available DeckLink device names.

**`get_device_capabilities(device_index=0) -> dict`**
Get device capabilities (name and supported input/output).
- `device_index`: Index of device to query (default: 0)
- Returns: Dictionary with:
  - `'name'`: Device name
  - `'supports_input'`: True if device can capture video
  - `'supports_output'`: True if device can output video

**Example:**
```python
from blackmagic_io import BlackmagicOutput

output = BlackmagicOutput()
caps = output.get_device_capabilities(0)
print(f"Device: {caps['name']}")
print(f"Supports input: {caps['supports_input']}")
print(f"Supports output: {caps['supports_output']}")
```

**Note:** Some DeckLink devices are single-direction only (e.g., UltraStudio Monitor 3G supports output only, UltraStudio Recorder 3G supports input only). Use this method to check device capabilities before attempting to use them for input or output.

**`initialize(device_index=0) -> bool`**
Initialize the specified DeckLink device.
- `device_index`: Index of device to use (default: 0)
- Returns: True if successful

**Note:** Explicit initialization is optional. Methods like `display_static_frame()` and `display_solid_color()` will automatically initialize the first available device (device_index=0) if not already initialized. Explicit initialization is recommended when you need:
- To select a specific device (when multiple DeckLink devices are present)
- Separate error handling for device initialization vs. frame display
- Control over initialization timing (e.g., to avoid delays during first frame display)
- To verify device availability before preparing frame data

**`get_supported_display_modes() -> List[dict]`**
Get list of supported display modes for the initialized device.
- Returns: List of dictionaries, each containing:
  - `display_mode`: DisplayMode enum value
  - `name`: Human-readable mode name (e.g., "1080p25", "2160p59.94")
  - `width`: Frame width in pixels
  - `height`: Frame height in pixels
  - `framerate`: Frame rate in frames per second
- Raises: RuntimeError if device not initialized

**Example:**
```python
from blackmagic_io import BlackmagicOutput

with BlackmagicOutput() as output:
    output.initialize()

    modes = output.get_supported_display_modes()
    for mode in modes:
        print(f"{mode['name']}: {mode['width']}x{mode['height']} @ {mode['framerate']:.2f} fps")
```

**`is_pixel_format_supported(display_mode, pixel_format) -> bool`**
Check if a pixel format is supported for a given display mode.
- `display_mode`: Display mode to check
- `pixel_format`: Pixel format to check
- Returns: True if the mode / format combination is supported

**`display_static_frame(frame_data, display_mode, pixel_format=PixelFormat.YUV10, matrix=None, hdr_metadata=None, input_narrow_range=False, output_narrow_range=True) -> bool`**
Display a static frame continuously.
- `frame_data`: NumPy array with image data:
  - RGB: shape (height, width, 3), dtype uint8 / uint16 / float32 / float64
  - BGRA: shape (height, width, 4), dtype uint8
- `display_mode`: Video resolution and frame rate
- `pixel_format`: Pixel format (default: YUV10, automatically uses BGRA for uint8 data)
- `matrix`: Optional R'G'B' to Y'CbCr conversion matrix (`Matrix.Rec601`, `Matrix.Rec709` or `Matrix.Rec2020`). Only used with YUV10 format. If not specified, auto-detects based on resolution: SD modes (NTSC, PAL) use Rec.601, HD and higher use Rec.709
- `hdr_metadata`: Optional HDR metadata dict with keys:
  - `'eotf'`: Eotf enum (SDR, PQ, or HLG)
  - `'static_metadata'`: Optional HdrStaticMetadata object with explicit display primaries, white point, mastering luminance, and content light level fields
- `input_narrow_range`: Whether to interpret integer `frame_data` as narrow range (float is always interpreted as full range). Default: False
- `output_narrow_range`: Whether to output a narrow range signal. Default: True
- Returns: True if successful
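
**Example:** A hedged sketch of HDR output. The ST 2084 (PQ) inverse EOTF below encodes a light level in nits into the normalized signal value expected for float data; the commented device calls assume `Eotf` is importable from `blackmagic_io`:

```python
import numpy as np

def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 inverse EOTF: absolute luminance (nits) -> PQ signal (0-1)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# A full-screen 100 nit grey frame, PQ-encoded (~0.508 signal level)
frame = np.full((1080, 1920, 3), pq_encode(100.0), dtype=np.float32)

# With a device attached (import path for Eotf assumed):
#   from blackmagic_io import BlackmagicOutput, DisplayMode, Eotf
#   with BlackmagicOutput() as output:
#       output.display_static_frame(frame, DisplayMode.HD1080p25,
#                                   hdr_metadata={'eotf': Eotf.PQ})
```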

**`display_solid_color(color, display_mode, pixel_format=PixelFormat.YUV10, matrix=None, hdr_metadata=None, input_narrow_range=False, output_narrow_range=True, patch=None, background_color=None) -> bool`**
Display a solid color continuously.
- `color`: R'G'B' tuple (r, g, b) with values:
  - Integer values (0-1023): Interpreted as 10-bit values
  - Float values (0.0-1.0): Interpreted as normalized full range values
- `display_mode`: Video resolution and frame rate
- `pixel_format`: Pixel format (default: YUV10)
- `matrix`: Optional R'G'B' to Y'CbCr conversion matrix (`Matrix.Rec601`, `Matrix.Rec709` or `Matrix.Rec2020`). Only applies when pixel_format is YUV10. If not specified, auto-detects based on resolution: SD modes (NTSC, PAL) use Rec.601, HD and higher use Rec.709
- `hdr_metadata`: Optional HDR metadata dict with 'eotf' (and optional 'static_metadata') keys
- `input_narrow_range`: Whether to interpret integer `color` values as narrow range (float is always interpreted as full range). Default: False
- `output_narrow_range`: Whether to output a narrow range signal. Default: True
- `patch`: Optional tuple (center_x, center_y, width, height) with normalized coordinates (0.0-1.0):
  - center_x, center_y: Center position of the patch (0.5, 0.5 = center of screen)
  - width, height: Patch dimensions (1.0, 1.0 = full screen)
  - If None, displays full screen solid color. Default: None
- `background_color`: R'G'B' tuple for background when using patch parameter. Uses same format as `color` parameter (respecting `input_narrow_range`). If None, defaults to black. Default: None
- Returns: True if successful
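
**Example:** To make the normalized `patch` coordinates concrete, this standalone helper (not part of the library) converts a `(center_x, center_y, width, height)` tuple into pixel bounds:

```python
def patch_to_pixels(patch, frame_width, frame_height):
    """Convert a normalized (center_x, center_y, width, height) patch tuple
    to (left, top, right, bottom) pixel bounds. Standalone illustration of
    the patch geometry only."""
    cx, cy, w, h = patch
    left = int(round((cx - w / 2) * frame_width))
    top = int(round((cy - h / 2) * frame_height))
    right = int(round((cx + w / 2) * frame_width))
    bottom = int(round((cy + h / 2) * frame_height))
    return left, top, right, bottom

# A 10% x 10% centered patch on a 1920x1080 frame:
# patch_to_pixels((0.5, 0.5, 0.1, 0.1), 1920, 1080) -> (864, 486, 1056, 594)
```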

**`update_frame(frame_data) -> bool`**
Update currently displayed frame with new data.
- `frame_data`: New frame data as NumPy array
- Returns: True if successful
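
**Example:** A minimal sketch of dynamic updates. The generator below is plain NumPy; the commented loop shows how it would feed `update_frame()` on a device:

```python
import numpy as np

def fade_frames(n, height=1080, width=1920):
    """Yield n float32 R'G'B' frames fading from black to white."""
    for i in range(n):
        level = i / max(n - 1, 1)
        yield np.full((height, width, 3), level, dtype=np.float32)

# With a device attached:
#   with BlackmagicOutput() as output:
#       frames = fade_frames(50)
#       output.display_static_frame(next(frames), DisplayMode.HD1080p25)
#       for f in frames:
#           output.update_frame(f)
```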

**`get_display_mode_info(display_mode) -> dict`**
Get information about a display mode.
- Returns: Dictionary with 'width', 'height', 'framerate'

**`get_current_output_info() -> dict`**
Get information about the current output configuration.
- Returns: Dictionary with 'display_mode_name', 'pixel_format_name', 'width', 'height', 'framerate', 'rgb444_mode_enabled'

**`stop() -> bool`**
Stop video output.
- Returns: True if successful

Stops displaying frames but keeps the device initialized and ready for immediate reuse. After calling `stop()`, you can call `display_static_frame()` or `display_solid_color()` again without needing to re-initialize.

**`cleanup()`**
Cleanup resources and stop output.

Stops video output (if running) and releases all device resources. After `cleanup()`, the device must be re-initialized with `initialize()` before it can be used again. This method automatically calls `stop()` internally, so there is no need to call `stop()` first.

**Context Manager Support:**
```python
with BlackmagicOutput() as output:
    output.initialize()
    # ... use output ...
# Automatic cleanup() called on exit
```

The context manager automatically calls `cleanup()` when exiting, so explicit cleanup is not needed when using the `with` statement.

#### Utility Functions

**`create_test_pattern(width, height, pattern='gradient', grad_start=0.0, grad_end=1.0) -> np.ndarray`**
Create test patterns for display testing and calibration.
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- `pattern`: Pattern type - `'gradient'`, `'bars'`, or `'checkerboard'`
- `grad_start`: Float starting value for gradient pattern (default: 0.0, use <0.0 for sub-black)
- `grad_end`: Float ending value for gradient pattern (default: 1.0, use >1.0 for super-white)
- Returns: R'G'B' array (H×W×3), dtype float32
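
**Example:** The gradient pattern is essentially a horizontal ramp. A standalone equivalent (not the library's implementation) showing how sub-black and super-white values arise:

```python
import numpy as np

def simple_gradient(width, height, grad_start=0.0, grad_end=1.0):
    """Horizontal float32 ramp from grad_start to grad_end, repeated per row."""
    ramp = np.linspace(grad_start, grad_end, width, dtype=np.float32)
    return np.tile(ramp[None, :, None], (height, 1, 3))

# A ramp from 5% sub-black to 5% super-white:
pattern = simple_gradient(1920, 1080, grad_start=-0.05, grad_end=1.05)
```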

### High-Level API: BlackmagicInput Class

Convenient Python wrapper for video capture operations.

#### Methods

**`get_available_devices() -> List[str]`**
Get list of available DeckLink device names.

**`get_device_capabilities(device_index=0) -> dict`**
Get device capabilities (name and supported input/output).
- `device_index`: Index of device to query (default: 0)
- Returns: Dictionary with:
  - `'name'`: Device name
  - `'supports_input'`: True if device can capture video
  - `'supports_output'`: True if device can output video

**`get_available_input_connections(device_index=0) -> List[InputConnection]`**
Get available input connections for a DeckLink device.
- `device_index`: Index of device to query (default: 0)
- Returns: List of InputConnection enum values (e.g., `[InputConnection.SDI, InputConnection.HDMI]`)

Use this to check which physical inputs are available on a device before selecting one with `initialize()`.
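
**Example:** A small helper sketch (the function is not part of the library) for choosing a preferred connection when available:

```python
def pick_connection(input_device, preferred, device_index=0):
    """Return `preferred` if the device offers it, else the first available
    connection, else None. `input_device` is a BlackmagicInput instance."""
    connections = input_device.get_available_input_connections(device_index)
    if not connections:
        return None
    return preferred if preferred in connections else connections[0]

# Usage with a device attached:
#   conn = pick_connection(input_device, InputConnection.HDMI)
#   input_device.initialize(0, input_connection=conn)
```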

**`initialize(device_index=0, input_connection=None, pixel_format=None) -> bool`**
Initialize the specified DeckLink device for input and start capture.
- `device_index`: Index of device to use (default: 0)
- `input_connection`: Optional InputConnection enum to select specific input (e.g., `InputConnection.SDI`, `InputConnection.HDMI`). If None, uses the device's current/default input.
- `pixel_format`: Optional PixelFormat to request from hardware. Use `PixelFormat.BGRA` for fast real-time preview (default: YUV10 for quality capture)
- Returns: True if successful

Immediately activates capture mode, which will:
- Start accepting input signal
- Activate front panel display (if present)
- Enable format detection

**Performance Note:** Requesting `PixelFormat.BGRA` enables real-time ~25fps preview by having the hardware deliver 8-bit BGRA frames directly, avoiding expensive colorspace conversions. This is ideal for monitoring and preview workflows. For quality capture workflows, use the default YUV10 format (or explicitly specify it) to capture full 10-bit precision, then use `capture_frame_as_rgb()` or `capture_frame_with_metadata()` for processing.

**`capture_frame_as_uint8(timeout_ms=5000, input_narrow_range=True) -> Optional[np.ndarray]`**
Capture a single frame and convert to RGB uint8 (faster than float conversion).
- `timeout_ms`: Timeout in milliseconds (default: 5000)
- `input_narrow_range`: Whether input uses narrow range encoding (default: True)
- Returns: RGB uint8 array (H×W×3) with values 0-255, or None if timeout/no signal
- Automatically converts from any DeckLink pixel format to RGB
- Faster than `capture_frame_as_rgb()` due to uint8 output, ideal for preview workflows

**`capture_frame_as_uint8_with_metadata(timeout_ms=5000, input_narrow_range=True) -> Optional[dict]`**
Capture a frame as RGB uint8 with format metadata (fast preview with metadata access).
- `timeout_ms`: Timeout in milliseconds (default: 5000)
- `input_narrow_range`: Whether input uses narrow range encoding (default: True)
- Returns: Dictionary with frame data and metadata, or None if timeout/no signal

Dictionary keys:
- `'rgb'`: RGB uint8 array (H×W×3), values 0-255
- `'width'`: Frame width in pixels
- `'height'`: Frame height in pixels
- `'format'`: Pixel format name (e.g., "YUV10", "RGB10")
- `'mode'`: Display mode name (e.g., "HD1080p25")
- `'colorspace'`: Color matrix name (e.g., "Rec709", "Rec2020")
- `'eotf'`: Transfer function name (e.g., "SDR", "PQ", "HLG")
- `'input_narrow_range'`: Boolean indicating if input was narrow range
- `'hdr_metadata'`: Dictionary with HDR metadata (only present if HDR metadata is in the signal)
  - Same structure as `capture_frame_with_metadata()` below

This function combines the performance of `capture_frame_as_uint8()` with metadata access, making it ideal for real-time preview applications that need to detect signal changes (resolution, colorspace, EOTF) without the overhead of float conversion.
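
**Example:** For a preview loop that reacts to signal changes, comparing the format-describing metadata keys is enough. This helper is a sketch, not part of the library:

```python
def signal_changed(prev, current):
    """True if the format-describing metadata differs between two captures.

    `prev` may be None (first frame); both are dicts as returned by
    capture_frame_as_uint8_with_metadata().
    """
    keys = ('width', 'height', 'mode', 'colorspace', 'eotf')
    return prev is None or any(prev.get(k) != current.get(k) for k in keys)

# Sketch of a preview loop with a device attached:
#   prev = None
#   while running:
#       data = input_device.capture_frame_as_uint8_with_metadata(timeout_ms=1000)
#       if data is None:
#           continue
#       if signal_changed(prev, data):
#           print(f"Signal: {data['width']}x{data['height']} {data['eotf']}")
#       prev = data
#       show_preview(data['rgb'])  # hypothetical display function
```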

**`capture_frame_as_rgb(timeout_ms=5000) -> Optional[np.ndarray]`**
Capture a single frame and convert to RGB.
- `timeout_ms`: Timeout in milliseconds (default: 5000)
- Returns: RGB float32 array (H×W×3) with values 0.0-1.0, or None if timeout/no signal
- Automatically converts from any DeckLink pixel format to RGB
- Output is always full range (0.0-1.0)

**`capture_frame_with_metadata(timeout_ms=5000) -> Optional[dict]`**
Capture a frame with format metadata.
- `timeout_ms`: Timeout in milliseconds (default: 5000)
- Returns: Dictionary with frame data and metadata, or None if timeout/no signal

Dictionary keys:
- `'rgb'`: RGB float32 array (H×W×3), values 0.0-1.0
- `'width'`: Frame width in pixels
- `'height'`: Frame height in pixels
- `'format'`: Pixel format name (e.g., "YUV10", "RGB10")
- `'mode'`: Display mode name (e.g., "1080p25")
- `'colorspace'`: Color matrix name (e.g., "Rec709", "Rec2020")
- `'eotf'`: Transfer function name (e.g., "SDR", "PQ", "HLG")
- `'input_narrow_range'`: Boolean indicating if input was narrow range
- `'hdr_metadata'`: Dictionary with HDR metadata (only present if HDR metadata is in the signal)
  - `'display_primaries'`: Dictionary with display primaries (if present)
    - `'red'`: Dictionary with `'x'` and `'y'` chromaticity coordinates
    - `'green'`: Dictionary with `'x'` and `'y'` chromaticity coordinates
    - `'blue'`: Dictionary with `'x'` and `'y'` chromaticity coordinates
  - `'white_point'`: Dictionary with `'x'` and `'y'` chromaticity coordinates (if present)
  - `'mastering_luminance'`: Dictionary with `'max'` and `'min'` luminance in cd/m² (if present)
  - `'content_light'`: Dictionary with content light levels (if present)
    - `'max_cll'`: Maximum content light level in cd/m² (MaxCLL) (if present)
    - `'max_fall'`: Maximum frame average light level in cd/m² (MaxFALL) (if present)

**`get_detected_format() -> Optional[dict]`**
Get information about the detected input signal.
- Returns: Dictionary with format information, or None if no signal

Dictionary keys:
- `'mode'`: Display mode name
- `'width'`: Frame width in pixels
- `'height'`: Frame height in pixels
- `'framerate'`: Frame rate in fps

**`cleanup()`**
Cleanup resources and stop capture.

**Context Manager Support:**
```python
with BlackmagicInput() as input_device:
    input_device.initialize()
    # ... use input ...
# Automatic cleanup() called on exit
```

The context manager automatically calls `cleanup()` when exiting.

### Low-Level API: DeckLinkOutput Class

Direct C++ API for more fine-grained control.

#### Methods

**`get_device_list() -> List[str]`**
Get list of available DeckLink devices.

**`initialize(device_index=0) -> bool`**
Initialize the specified DeckLink device.

**`get_supported_display_modes() -> List[DisplayModeInfo]`**
Get list of supported display modes for the initialized device.
- Returns: List of DisplayModeInfo objects with display_mode, name, width, height, framerate

**`is_pixel_format_supported(display_mode, pixel_format) -> bool`**
Check if a pixel format is supported for a given display mode.

**`get_video_settings(display_mode) -> VideoSettings`**
Get video settings object for a display mode.

**`set_hdr_metadata(colorimetry: Gamut, eotf: Eotf)`**
Set HDR metadata with default values. Must be called before `setup_output()`.

**`set_hdr_static_metadata(colorimetry: Gamut, eotf: Eotf, static_metadata: HdrStaticMetadata)`**
Set HDR Static Metadata (per SMPTE ST 2086 / CEA-861.3 Type 1) with explicit display primaries, white point, mastering display luminance, and content light level fields. Must be called before `setup_output()`.

**`clear_hdr_metadata()`**
Clear HDR metadata and reset to SDR. Call before `setup_output()` if you want to ensure no HDR metadata is present.

**HDMI vs SDI when changing HDR metadata mid-stream:** SDI carries metadata per-frame in the VPID, so SDI consumers see updated metadata on the very next frame after a `set_hdr_metadata*()` call. On HDMI, the HDR Static Metadata InfoFrame is sticky — it does not update until new video data is sent — so after changing metadata mid-stream you must call `display_frame()` again for HDMI consumers to see the new values. The frame contents do not need to change; pushing the same frame is enough to refresh the InfoFrame.
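
A sketch of the HDMI refresh sequence described above (the call ordering follows this document; treat the details as illustrative):

```python
def refresh_hdmi_hdr_metadata(output, frame, colorimetry, eotf):
    """Update HDR metadata mid-stream and re-push the current frame so HDMI
    consumers see the new InfoFrame. `output` is a running
    decklink_io.DeckLinkOutput; `frame` is the frame data already on screen.
    """
    output.set_hdr_metadata(colorimetry, eotf)
    output.set_frame_data(frame)  # same pixels are fine
    output.display_frame()        # pushing a frame refreshes the InfoFrame
```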

**`setup_output(settings: VideoSettings) -> bool`**
Setup output with detailed settings.

**`set_frame_data(data: np.ndarray) -> bool`**
Set frame data from NumPy array (must be in correct format).

**`display_frame() -> bool`**
Display the current frame synchronously. Call this after `set_frame_data()` to update the display.

**`get_current_output_info() -> OutputInfo`**
Get information about the current output configuration.
- Returns: OutputInfo struct with display_mode_name, pixel_format_name, width, height, framerate, rgb444_mode_enabled

**`stop_output() -> bool`**
Stop video output.
- Returns: True if successful

Stops displaying frames but keeps the device initialized and ready for immediate reuse. After calling `stop_output()`, you can call `setup_output()` and `display_frame()` again without needing to re-initialize.

**`cleanup()`**
Cleanup all resources.

Stops video output (if running) and releases all device resources. After `cleanup()`, the device must be re-initialized with `initialize()` before it can be used again. This method automatically calls `stop_output()` internally, so there is no need to call `stop_output()` first.

### Data Structures

**`VideoSettings`**
```python
class VideoSettings:
    mode: DisplayMode      # Video mode (resolution / framerate)
    format: PixelFormat    # Pixel format
    width: int             # Frame width in pixels
    height: int            # Frame height in pixels
    framerate: float       # Frame rate (e.g., 25.0, 29.97, 60.0)
    colorimetry: Gamut     # Y'CbCr matrix (Rec601 / Rec709 / Rec2020)
    eotf: Eotf             # Transfer function (SDR / PQ / HLG)
```

**Note:** What the Blackmagic SDK refers to as the "color space" (BMDColorspace) is in fact the matrix used for R'G'B' to Y'CbCr conversion, not the gamut of the image data. For example, ARRI Wide Gamut data would typically be converted using a Rec.709 matrix.
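
To make the matrix/gamut distinction concrete: the matrix only defines how R'G'B' is weighted into Y' (and differenced into CbCr), regardless of which gamut the data occupies. For example, the standard Rec.709 luma weighting:

```python
def rec709_luma(r, g, b):
    """Y' from full-range R'G'B' using the Rec.709 luma coefficients."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Whatever gamut the R'G'B' data actually occupies, a "Rec709" colorspace
# flag just means these weights (and the matching CbCr scaling) were used.
```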

**`HdrStaticMetadata`**

The fields described in SMPTE ST 2086 (mastering display) and CEA-861.3 (HDR Static Metadata Type 1 InfoFrame: MaxCLL, MaxFALL).

```python
class HdrStaticMetadata:
    # Display primaries (xy chromaticity coordinates)
    display_primaries_red_x: float
    display_primaries_red_y: float
    display_primaries_green_x: float
    display_primaries_green_y: float
    display_primaries_blue_x: float
    display_primaries_blue_y: float
    white_point_x: float
    white_point_y: float

    # Luminance values (nits)
    max_display_mastering_luminance: float
    min_display_mastering_luminance: float
    max_content_light_level: float
    max_frame_average_light_level: float
```
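
As an illustration, typical values for a 1000-nit P3-D65 mastering display (the chromaticity and luminance numbers are the standard P3-D65 values; constructing `HdrStaticMetadata` by default construction plus attribute assignment is an assumption):

```python
def p3d65_mastering_metadata():
    """Typical ST 2086 values for a 1000-nit P3-D65 mastering display.
    Assumes HdrStaticMetadata is default-constructible with settable fields.
    """
    from decklink_io import HdrStaticMetadata  # import deferred; path assumed
    m = HdrStaticMetadata()
    m.display_primaries_red_x, m.display_primaries_red_y = 0.680, 0.320
    m.display_primaries_green_x, m.display_primaries_green_y = 0.265, 0.690
    m.display_primaries_blue_x, m.display_primaries_blue_y = 0.150, 0.060
    m.white_point_x, m.white_point_y = 0.3127, 0.3290
    m.max_display_mastering_luminance = 1000.0
    m.min_display_mastering_luminance = 0.0001
    m.max_content_light_level = 1000.0
    m.max_frame_average_light_level = 400.0
    return m
```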

**`OutputInfo`**
```python
class OutputInfo:
    display_mode: DisplayMode         # Current display mode
    pixel_format: PixelFormat         # Current pixel format
    width: int                        # Frame width in pixels
    height: int                       # Frame height in pixels
    framerate: float                  # Frame rate (e.g., 25.0, 29.97, 60.0)
    rgb444_mode_enabled: bool         # Whether RGB 4:4:4 mode is enabled
    display_mode_name: str            # Human-readable display mode name
    pixel_format_name: str            # Human-readable pixel format name
```

**`DisplayModeInfo`**
```python
class DisplayModeInfo:
    display_mode: DisplayMode         # Display mode enum value
    name: str                         # Human-readable mode name
    width: int                        # Frame width in pixels
    height: int                       # Frame height in pixels
    framerate: float                  # Frame rate (e.g., 25.0, 29.97, 60.0)
```

**`DeviceCapabilities`**
```python
class DeviceCapabilities:
    name: str                         # Device name
    supports_input: bool              # True if device can capture video
    supports_output: bool             # True if device can output video
```

Returned by `get_device_capabilities()` to query what a device supports before initializing it.

**`CapturedFrame`**
```python
class CapturedFrame:
    # Frame data
    data: List[uint8]                 # Raw frame data
    width: int                        # Frame width in pixels
    height: int                       # Frame height in pixels
    format: PixelFormat               # Pixel format
    mode: DisplayMode                 # Display mode
    valid: bool                       # Whether frame is valid

    # Format metadata
    colorspace: Gamut                 # Color matrix (Rec601/Rec709/Rec2020)
    eotf: Eotf                        # Transfer function (SDR/PQ/HLG)
    has_metadata: bool                # Whether metadata is present

    # Timecode (if present)
    has_timecode: bool                # Whether timecode is present
    timecode_hours: int               # Timecode hours (0-23)
    timecode_minutes: int             # Timecode minutes (0-59)
    timecode_seconds: int             # Timecode seconds (0-59)
    timecode_frames: int              # Timecode frames (frame number within second)
    timecode_is_drop_frame: bool      # True for drop frame timecode

    # HDR metadata (if present)
    display_primaries_red_x: float
    display_primaries_red_y: float
    display_primaries_green_x: float
    display_primaries_green_y: float
    display_primaries_blue_x: float
    display_primaries_blue_y: float
    has_display_primaries: bool

    white_point_x: float
    white_point_y: float
    has_white_point: bool

    max_display_mastering_luminance: float
    min_display_mastering_luminance: float
    has_mastering_luminance: bool

    max_content_light_level: float
    has_max_cll: bool

    max_frame_average_light_level: float
    has_max_fall: bool
```

Used by the low-level `DeckLinkInput.capture_frame()` method. Contains raw frame data plus all detected metadata including timecode and HDR information.
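
The timecode fields are plain integers; the small helper below is illustrative (it is not part of the library) and shows how they are conventionally formatted, with a semicolon separating the frame count for drop-frame timecode:

```python
def format_timecode(hours, minutes, seconds, frames, drop_frame):
    """Format timecode fields as HH:MM:SS:FF (or HH:MM:SS;FF for drop frame)."""
    sep = ";" if drop_frame else ":"
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}{sep}{frames:02d}"

# e.g. fields copied from a CapturedFrame with has_timecode == True
print(format_timecode(1, 2, 3, 4, False))  # 01:02:03:04
print(format_timecode(1, 2, 3, 4, True))   # 01:02:03;04
```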

### Low-Level API: DeckLinkInput Class

Direct C++ API for more fine-grained control over video capture.

#### Methods

**`get_device_list() -> List[str]`**
Get list of available DeckLink devices.

**`get_available_input_connections(device_index=0) -> List[InputConnection]`**
Get available input connections for a DeckLink device.
- `device_index`: Index of device to query (default: 0)
- Returns: List of InputConnection enum values

**`initialize(device_index=0, input_connection=None) -> bool`**
Initialize the specified DeckLink device for input.
- `device_index`: Index of device to use (default: 0)
- `input_connection`: Optional InputConnection enum to select specific input. If None, uses device's current/default input.
- Returns: True if successful

**`start_capture(format=PixelFormat.Format10BitYUV) -> bool`**
Start capturing with specified or auto-detected format.
- `format`: Optional PixelFormat to request from hardware (default: Format10BitYUV)
- Returns: True if successful

Use `PixelFormat.Format8BitBGRA` for fast real-time preview workflows where 8-bit precision is acceptable. This avoids expensive colorspace conversions and enables ~25fps capture rates.

**`capture_frame(frame, timeout_ms=5000) -> bool`**
Capture a single frame.
- `frame`: CapturedFrame object to populate
- `timeout_ms`: Timeout in milliseconds (default: 5000)
- Returns: True if successful

**`stop_capture() -> bool`**
Stop video capture.
- Returns: True if successful

**`get_detected_format() -> VideoSettings`**
Get the detected video format.
- Returns: VideoSettings object with format information

**`get_detected_pixel_format() -> PixelFormat`**
Get the detected pixel format.
- Returns: PixelFormat enum value

**`get_video_settings(mode) -> VideoSettings`**
Get video settings for a display mode.

**`get_supported_display_modes() -> List[DisplayModeInfo]`**
Get list of supported display modes for the initialized device.

**`set_hdmi_input_dynamic_ranges(dynamic_range_mask: int) -> bool`**
Set the BMDDynamicRange bitmask advertised in the HDMI input EDID. Sources read this to decide which transfer functions they may transmit. Pass any combination of BMDDynamicRange bits as a single int; values are passed straight through to the SDK, so additional bits defined by newer SDKs work without library changes.
- `dynamic_range_mask`: Bitwise OR of `bmdDynamicRangeSDR` (0), `bmdDynamicRangeHDRStaticPQ` (1 << 29), and/or `bmdDynamicRangeHDRStaticHLG` (1 << 30)
- Returns: True if the mask was stored or applied successfully

The library defaults to advertising `SDR | HDR Static PQ | HDR Static HLG`. The SDK default omits HLG, which causes many HDMI sources to strip HDR Static Metadata when transmitting HLG; the library's default fixes that. The method may be called before or after `initialize()`. It has no effect on non-HDMI connections or on older hardware that does not expose `IDeckLinkHDMIInputEDID`; in these cases it soft-fails and capture proceeds normally. The library releases its EDID interface in `cleanup()`, which restores the default EDID per the SDK.
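
The mask is plain bit arithmetic; the constants below reproduce the BMDDynamicRange values quoted above:

```python
# BMDDynamicRange bits, as documented above
bmdDynamicRangeSDR = 0
bmdDynamicRangeHDRStaticPQ = 1 << 29
bmdDynamicRangeHDRStaticHLG = 1 << 30

# The library's default: advertise SDR, PQ and HLG
mask = bmdDynamicRangeSDR | bmdDynamicRangeHDRStaticPQ | bmdDynamicRangeHDRStaticHLG
print(hex(mask))  # 0x60000000

# On real hardware:
# input_device.set_hdmi_input_dynamic_ranges(mask)
```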

**`cleanup()`**
Cleanup and release resources.

### Utility Functions

**`rgb_to_bgra(rgb_array, width, height) -> np.ndarray`**
Convert RGB to BGRA format.
- `rgb_array`: NumPy array (H×W×3), dtype uint8
- Note: 8-bit R'G'B' input is always treated as full range; over SDI the hardware outputs it as narrow range 8-bit Y'CbCr
- Returns: BGRA array (H×W×4), dtype uint8

**`rgb_uint8_to_yuv8(rgb_array, width, height, matrix=Matrix.Rec709, input_narrow_range=False, output_narrow_range=True) -> np.ndarray`**
Convert R'G'B' uint8 to 8-bit Y'CbCr 2vuy format.
- `rgb_array`: NumPy array (H×W×3), dtype uint8 (0-255 range)
- `matrix`: R'G'B' to Y'CbCr conversion matrix (Matrix.Rec601, Matrix.Rec709 or Matrix.Rec2020). Default: Matrix.Rec709
- `input_narrow_range`: Whether to interpret the `rgb_array` as narrow range (16-235). Default: False
- `output_narrow_range`: Whether to encode the Y'CbCr as narrow range (Y: 16-235, CbCr: 16-240). Default: True
- Returns: Packed 2vuy array
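
For reference, the narrow-range quantisation these converters apply follows the standard pattern; the sketch below illustrates it for full-range uint8 input using the Rec.709 luma coefficients (this is illustrative, not the library's implementation):

```python
def luma_narrow_rec709(r, g, b):
    """Full-range 8-bit R'G'B' -> narrow-range 8-bit Y' (Rec.709 coefficients)."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # full-range luma, 0-255
    return int(round(16 + 219 * y / 255))      # scale into Y' 16-235

print(luma_narrow_rec709(255, 255, 255))  # 235 (narrow-range white)
print(luma_narrow_rec709(0, 0, 0))        # 16 (narrow-range black)
```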

**`rgb_uint16_to_yuv8(rgb_array, width, height, matrix=Matrix.Rec709, input_narrow_range=False, output_narrow_range=True) -> np.ndarray`**
Convert R'G'B' uint16 to 8-bit Y'CbCr 2vuy format.
- `rgb_array`: NumPy array (H×W×3), dtype uint16 (0-65535 range)
- `matrix`: R'G'B' to Y'CbCr conversion matrix (Matrix.Rec601, Matrix.Rec709 or Matrix.Rec2020). Default: Matrix.Rec709
- `input_narrow_range`: Whether to interpret the `rgb_array` as narrow range. Default: False
- `output_narrow_range`: Whether to encode the Y'CbCr as narrow range (Y: 16-235, CbCr: 16-240). Default: True
- Returns: Packed 2vuy array

**`rgb_float_to_yuv8(rgb_array, width, height, matrix=Matrix.Rec709, output_narrow_range=True) -> np.ndarray`**
Convert R'G'B' float to 8-bit Y'CbCr 2vuy format.
- `rgb_array`: NumPy array (H×W×3), dtype float32 (0.0-1.0 range)
- `matrix`: R'G'B' to Y'CbCr conversion matrix (Matrix.Rec601, Matrix.Rec709 or Matrix.Rec2020). Default: Matrix.Rec709
- `output_narrow_range`: Whether to encode the Y'CbCr as narrow range (Y: 16-235, CbCr: 16-240). Default: True
- Returns: Packed 2vuy array

**`rgb_uint16_to_yuv10(rgb_array, width, height, matrix=Matrix.Rec709, input_narrow_range=False, output_narrow_range=True) -> np.ndarray`**
Convert R'G'B' uint16 to 10-bit Y'CbCr v210 format.
- `rgb_array`: NumPy array (H×W×3), dtype uint16 (0-65535 range)
- `matrix`: R'G'B' to Y'CbCr conversion matrix (Matrix.Rec601, Matrix.Rec709 or Matrix.Rec2020). Default: Matrix.Rec709
- `input_narrow_range`: Whether to interpret the `rgb_array` as narrow range. Default: False
- `output_narrow_range`: Whether to encode the Y'CbCr as narrow range. Default: True
- Returns: Packed v210 array

**`rgb_float_to_yuv10(rgb_array, width, height, matrix=Matrix.Rec709, output_narrow_range=True) -> np.ndarray`**
Convert R'G'B' float to 10-bit Y'CbCr v210 format.
- `rgb_array`: NumPy array (H×W×3), dtype float32 (0.0-1.0 range)
- `matrix`: R'G'B' to Y'CbCr conversion matrix (Matrix.Rec601, Matrix.Rec709 or Matrix.Rec2020). Default: Matrix.Rec709
- `output_narrow_range`: Whether to encode the Y'CbCr as narrow range. Default: True
- Returns: Packed v210 array

**`rgb_uint16_to_rgb10(rgb_array, width, height, input_narrow_range=True, output_narrow_range=True) -> np.ndarray`**
Convert R'G'B' uint16 to 10-bit R'G'B' (bmdFormat10BitRGBXLE) format.
- `rgb_array`: NumPy array (H×W×3), dtype uint16 (0-65535 range)
- `input_narrow_range`: Whether to interpret the `rgb_array` as narrow range. Default: True
- `output_narrow_range`: Whether to output narrow range. Default: True
- Returns: Packed 10-bit R'G'B' array

**`rgb_float_to_rgb10(rgb_array, width, height, output_narrow_range=True) -> np.ndarray`**
Convert R'G'B' float to 10-bit R'G'B' (bmdFormat10BitRGBXLE) format.
- `rgb_array`: NumPy array (H×W×3), dtype float32 (0.0-1.0 range)
- `output_narrow_range`: Whether to output narrow range. Default: True
- Returns: Packed 10-bit R'G'B' array

**`rgb_uint16_to_rgb12(rgb_array, width, height, input_narrow_range=False, output_narrow_range=False) -> np.ndarray`**
Convert R'G'B' uint16 to 12-bit R'G'B' (bmdFormat12BitRGBLE) format.
- `rgb_array`: NumPy array (H×W×3), dtype uint16 (0-65535 range)
- `input_narrow_range`: Whether to interpret the `rgb_array` as narrow range. Default: False
- `output_narrow_range`: Whether to output narrow range. Default: False
- Returns: Packed 12-bit R'G'B' array

**`rgb_float_to_rgb12(rgb_array, width, height, output_narrow_range=False) -> np.ndarray`**
Convert R'G'B' float to 12-bit R'G'B' (bmdFormat12BitRGBLE) format.
- `rgb_array`: NumPy array (H×W×3), dtype float32 (0.0-1.0 range)
- `output_narrow_range`: Whether to output narrow range. Default: False
- Returns: Packed 12-bit R'G'B' array

**`yuv10_to_rgb_uint16(yuv_array, width, height, matrix=Matrix.Rec709, input_narrow_range=True) -> np.ndarray`**
Convert 10-bit Y'CbCr v210 format to R'G'B' uint16.
- `yuv_array`: NumPy array containing packed v210 data
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- `matrix`: Y'CbCr to R'G'B' conversion matrix (Matrix.Rec601, Matrix.Rec709 or Matrix.Rec2020). Default: Matrix.Rec709
- `input_narrow_range`: Whether to interpret the Y'CbCr as narrow range. Default: True
- Returns: NumPy array (H×W×3), dtype uint16 (0-65535 range)

**`yuv10_to_rgb_float(yuv_array, width, height, matrix=Matrix.Rec709, input_narrow_range=True) -> np.ndarray`**
Convert 10-bit Y'CbCr v210 format to R'G'B' float.
- `yuv_array`: NumPy array containing packed v210 data
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- `matrix`: Y'CbCr to R'G'B' conversion matrix (Matrix.Rec601, Matrix.Rec709 or Matrix.Rec2020). Default: Matrix.Rec709
- `input_narrow_range`: Whether to interpret the Y'CbCr as narrow range. Default: True
- Returns: NumPy array (H×W×3), dtype float32 (0.0-1.0 range)

**`yuv8_to_rgb_uint16(yuv_array, width, height, matrix=Matrix.Rec709) -> np.ndarray`**
Convert 8-bit Y'CbCr 4:2:2 (2vuy) format to R'G'B' uint16.
- `yuv_array`: NumPy array containing packed 2vuy data
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- `matrix`: Y'CbCr to R'G'B' conversion matrix (Matrix.Rec601, Matrix.Rec709 or Matrix.Rec2020). Default: Matrix.Rec709
- Returns: NumPy array (H×W×3), dtype uint16 (0-65535 range)

**`yuv8_to_rgb_float(yuv_array, width, height, matrix=Matrix.Rec709) -> np.ndarray`**
Convert 8-bit Y'CbCr 4:2:2 (2vuy) format to R'G'B' float.
- `yuv_array`: NumPy array containing packed 2vuy data
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- `matrix`: Y'CbCr to R'G'B' conversion matrix (Matrix.Rec601, Matrix.Rec709 or Matrix.Rec2020). Default: Matrix.Rec709
- Returns: NumPy array (H×W×3), dtype float32 (0.0-1.0 range)

**`rgb10_to_uint16(rgb_array, width, height, input_narrow_range=True, output_narrow_range=False) -> np.ndarray`**
Convert 10-bit R'G'B' (bmdFormat10BitRGBXLE) format to R'G'B' uint16.
- `rgb_array`: NumPy array containing packed 10-bit R'G'B' data
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- `input_narrow_range`: Whether to interpret the 10-bit R'G'B' as narrow range. Default: True
- `output_narrow_range`: Whether to output narrow range. Default: False
- Returns: NumPy array (H×W×3), dtype uint16 (0-65535 range)

**`rgb10_to_float(rgb_array, width, height, input_narrow_range=True) -> np.ndarray`**
Convert 10-bit R'G'B' (bmdFormat10BitRGBXLE) format to R'G'B' float.
- `rgb_array`: NumPy array containing packed 10-bit R'G'B' data
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- `input_narrow_range`: Whether to interpret the 10-bit R'G'B' as narrow range. Default: True
- Returns: NumPy array (H×W×3), dtype float32 (0.0-1.0 range)

**`rgb12_to_uint16(rgb_array, width, height, input_narrow_range=False, output_narrow_range=False) -> np.ndarray`**
Convert 12-bit R'G'B' (bmdFormat12BitRGBLE) format to R'G'B' uint16.
- `rgb_array`: NumPy array containing packed 12-bit R'G'B' data
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- `input_narrow_range`: Whether to interpret the 12-bit R'G'B' as narrow range. Default: False
- `output_narrow_range`: Whether to output narrow range. Default: False
- Returns: NumPy array (H×W×3), dtype uint16 (0-65535 range)

**`rgb12_to_float(rgb_array, width, height, input_narrow_range=False) -> np.ndarray`**
Convert 12-bit R'G'B' (bmdFormat12BitRGBLE) format to R'G'B' float.
- `rgb_array`: NumPy array containing packed 12-bit R'G'B' data
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- `input_narrow_range`: Whether to interpret the 12-bit R'G'B' as narrow range. Default: False
- Returns: NumPy array (H×W×3), dtype float32 (0.0-1.0 range)

**`unpack_v210(v210_array, width, height) -> tuple[np.ndarray, np.ndarray, np.ndarray]`**
Unpack 10-bit Y'CbCr v210 format to Y', Cb, Cr component arrays.
- `v210_array`: NumPy array containing packed v210 data
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- Returns: Tuple of (Y, Cb, Cr) NumPy arrays, each dtype uint16 (0-1023 range, 10-bit values)
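
v210 packs three 10-bit components into each 32-bit little-endian word (bits 0-9, 10-19, 20-29; bits 30-31 unused), six pixels per 16-byte group. A sketch of extracting the components from one such word:

```python
import numpy as np

# One v210 word with components 0x040, 0x200, 0x3FF in bits 0-9, 10-19, 20-29
word = np.uint32((0x3FF << 20) | (0x200 << 10) | 0x040)

c0 = int(word) & 0x3FF
c1 = (int(word) >> 10) & 0x3FF
c2 = (int(word) >> 20) & 0x3FF
print(c0, c1, c2)  # 64 512 1023
```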

**`unpack_2vuy(yuv_array, width, height) -> tuple[np.ndarray, np.ndarray, np.ndarray]`**
Unpack 8-bit Y'CbCr 4:2:2 (2vuy) format to Y', Cb, Cr component arrays.
- `yuv_array`: NumPy array containing packed 2vuy data
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- Returns: Tuple of (Y, Cb, Cr) NumPy arrays, each dtype uint8
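
2vuy stores bytes in Cb, Y'0, Cr, Y'1 order (two pixels per four bytes), so the per-component split can be sketched with plain NumPy slicing:

```python
import numpy as np

# Two pixels of 2vuy data: Cb, Y0, Cr, Y1
buf = np.array([128, 16, 128, 235], dtype=np.uint8)

cb = buf[0::4]
y = buf[1::2]   # Y' samples are every other byte, starting at offset 1
cr = buf[2::4]
print(y.tolist(), cb.tolist(), cr.tolist())  # [16, 235] [128] [128]
```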

**`unpack_rgb10(rgb_array, width, height) -> tuple[np.ndarray, np.ndarray, np.ndarray]`**
Unpack 10-bit R'G'B' (bmdFormat10BitRGBXLE) format to R', G', B' component arrays.
- `rgb_array`: NumPy array containing packed 10-bit R'G'B' data
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- Returns: Tuple of (R, G, B) NumPy arrays, each dtype uint16 (0-1023 range, 10-bit values)

**`unpack_rgb12(rgb_array, width, height) -> tuple[np.ndarray, np.ndarray, np.ndarray]`**
Unpack 12-bit R'G'B' (bmdFormat12BitRGBLE) format to R', G', B' component arrays.
- `rgb_array`: NumPy array containing packed 12-bit R'G'B' data
- `width`: Frame width in pixels
- `height`: Frame height in pixels
- Returns: Tuple of (R, G, B) NumPy arrays, each dtype uint16 (0-4095 range, 12-bit values)

### Enums

**`DisplayMode`**

The library supports all display modes available on your DeckLink device. Display mode settings (resolution, framerate) are queried dynamically from the hardware. Common examples include:
- `HD1080p25`: 1920×1080 @ 25fps
- `HD1080p30`: 1920×1080 @ 30fps
- `HD1080p50`: 1920×1080 @ 50fps
- `HD1080p60`: 1920×1080 @ 60fps
- `HD720p50`: 1280×720 @ 50fps
- `HD720p60`: 1280×720 @ 60fps

Additional modes are available including SD (NTSC, PAL), 2K, 4K, 8K, and PC display modes. The complete list of DisplayMode values can be found in `src/blackmagic_io/blackmagic_io.py`.

**Querying Available Display Modes:**

You can query which display modes are supported by your specific DeckLink device using `get_supported_display_modes()`:

```python
from blackmagic_io import BlackmagicOutput

with BlackmagicOutput() as output:
    output.initialize()

    # Get all supported display modes
    modes = output.get_supported_display_modes()
    print(f"Device supports {len(modes)} display modes:\n")
    for mode in modes:
        print(f"{mode['name']}: {mode['width']}x{mode['height']} @ {mode['framerate']:.2f} fps")
```

To determine which pixel formats are supported for a specific display mode, use `is_pixel_format_supported()`:

```python
from blackmagic_io import BlackmagicOutput, DisplayMode, PixelFormat

with BlackmagicOutput() as output:
    output.initialize()

    # Test pixel format support for a specific mode
    print("Pixel formats supported for HD1080p25:")
    test_formats = [PixelFormat.YUV10, PixelFormat.RGB10, PixelFormat.RGB12]
    for fmt in test_formats:
        supported = output.is_pixel_format_supported(DisplayMode.HD1080p25, fmt)
        status = "✓" if supported else "✗"
        print(f"{status} {fmt.name}")
```

**`PixelFormat`**
- `BGRA`: 8-bit BGRA (automatically used for uint8 data)
  - **Note**: Over SDI, BGRA data is output as narrow range 8-bit Y'CbCr 4:2:2, not RGB. The BGRA name refers to the input buffer format, not the SDI wire format.
- `YUV8`: 8-bit Y'CbCr 4:2:2 (2vuy) - direct 8-bit Y'CbCr output
  - uint8 input: Configurable interpretation via `input_narrow_range` parameter
  - uint16 input: Configurable interpretation via `input_narrow_range` parameter
  - float input: Always interpreted as full range (0.0-1.0)
  - Output range configurable via `output_narrow_range` parameter
  - Defaults: `input_narrow_range=False, output_narrow_range=True`
  - More efficient than BGRA for 8-bit Y'CbCr workflows (avoids hardware conversion)
- `YUV10`: 10-bit Y'CbCr 4:2:2 (v210) - default for uint16 / float data
  - Defaults to narrow range: Y': 64-940, CbCr: 64-960. Supports full range Y'CbCr (0-1023, as per [Rec. ITU-T H.273](https://www.itu.int/rec/T-REC-H.273)) if `output_narrow_range` is False in the high-level API
- `RGB10`: 10-bit R'G'B' (bmdFormat10BitRGBXLE) - native R'G'B' output without Y'CbCr conversion
  - uint16 input: Configurable interpretation via `input_narrow_range` parameter
  - float input: Always interpreted as full range (0.0-1.0)
  - Output range configurable via `output_narrow_range` parameter
  - Defaults: `input_narrow_range=True, output_narrow_range=True`
- `RGB12`: 12-bit R'G'B' (bmdFormat12BitRGBLE) - native R'G'B' output with 12-bit precision
  - uint16 input: Configurable interpretation via `input_narrow_range` parameter
  - float input: Always interpreted as full range (0.0-1.0)
  - Output range configurable via `output_narrow_range` parameter
  - Defaults: `input_narrow_range=False, output_narrow_range=False`

### Range Signaling Limitations

**Important:** While this library supports both narrow and full range output encoding via the `output_narrow_range` parameter, the Blackmagic DeckLink SDK (v14.1) does not provide APIs to control the full range flag in the VPID, as per SMPTE ST 425-1 (byte 4, bit 7):

- **YUV10**: The library can encode full range Y'CbCr (0-1023) with `output_narrow_range=False`, but cannot set the full range flag in the VPID. Downstream devices will typically assume narrow range.

- **RGB10**: The convention is that 10-bit RGB is narrow range, as described in the Blackmagic SDK, so using `output_narrow_range=False` may cause downstream devices to misinterpret the signal.

- **RGB12**: The convention is that 12-bit RGB is full range, as described in the Blackmagic SDK, so using `output_narrow_range=True` may cause downstream devices to misinterpret the signal.

The `output_narrow_range` parameter controls the **actual encoded values** in the output stream, not metadata signaling. Use it when you know the downstream device will correctly interpret the range, or when the receiving device allows manual range configuration.
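
In concrete terms, for 10-bit output the parameter changes the code values themselves; a sketch of the mapping for normalized float input (illustrative, not the library's internal code):

```python
def encode10(v, narrow=True):
    """Map a normalized 0.0-1.0 value to a 10-bit code value."""
    if narrow:
        return round(64 + v * (940 - 64))   # narrow range: 64-940
    return round(v * 1023)                  # full range: 0-1023

print(encode10(1.0, narrow=True))   # 940
print(encode10(1.0, narrow=False))  # 1023
print(encode10(0.0, narrow=True))   # 64
```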

**`Matrix`** (High-level API)
- `Rec601`: ITU-R BT.601 R'G'B' to Y'CbCr conversion matrix (SD)
- `Rec709`: ITU-R BT.709 R'G'B' to Y'CbCr conversion matrix (standard HD)
- `Rec2020`: ITU-R BT.2020 R'G'B' to Y'CbCr conversion matrix (wide color gamut for HDR)

**`Gamut`** (Low-level API, same values as Matrix)
- `Rec601`: ITU-R BT.601 colorimetry (SD)
- `Rec709`: ITU-R BT.709 colorimetry (standard HD)
- `Rec2020`: ITU-R BT.2020 colorimetry (wide color gamut for HDR)

**`Eotf`**
- `SDR`: Standard Dynamic Range (BT.1886 transfer function)
- `PQ`: Perceptual Quantizer (SMPTE ST 2084, HDR10)
- `HLG`: Hybrid Log-Gamma (HDR broadcast standard)

**`InputConnection`**

Physical input connections available on DeckLink devices. Use `get_available_input_connections()` to query which inputs are available on a specific device.

- `SDI`: SDI input (Serial Digital Interface)
- `HDMI`: HDMI input
- `OpticalSDI`: Optical SDI input
- `Component`: Component video input (Y, Pb, Pr)
- `Composite`: Composite video input (CVBS)
- `SVideo`: S-Video input (Y/C)

**Example:**
```python
from blackmagic_io import BlackmagicInput, InputConnection

with BlackmagicInput() as input_device:
    # Query available inputs
    inputs = input_device.get_available_input_connections(0)
    print(f"Available inputs: {[str(inp) for inp in inputs]}")

    # Initialize with HDMI input
    if InputConnection.HDMI in inputs:
        input_device.initialize(0, InputConnection.HDMI)
    else:
        # Use default input
        input_device.initialize(0)

    # Capture frame
    rgb = input_device.capture_frame_as_rgb()
```

## Examples

### Example 1: Color Bars Test Pattern

```python
from blackmagic_io import BlackmagicOutput, DisplayMode, create_test_pattern

# Create color bars test pattern
frame = create_test_pattern(1920, 1080, 'bars')

with BlackmagicOutput() as output:
    output.initialize()  # Optional - auto-initializes on first display if omitted
    output.display_static_frame(frame, DisplayMode.HD1080p25)
    input("Press Enter to stop...")
```

### Example 2: Dynamic Animation

```python
import numpy as np
import time
from blackmagic_io import BlackmagicOutput, DisplayMode

with BlackmagicOutput() as output:
    # Start with black frame
    frame = np.zeros((1080, 1920, 3), dtype=np.float32)
    output.display_static_frame(frame, DisplayMode.HD1080p25)

    # Animate
    for i in range(100):
        # Create moving pattern
        frame.fill(0.0)
        offset = i * 10
        frame[:, offset:offset+100] = [1.0, 1.0, 1.0]  # White bar

        output.update_frame(frame)
        time.sleep(1 / 25)  # Limit update rate (actual rate will be lower due to processing overhead)
```

### Example 3: Load Image from File

```python
import imageio.v3 as iio
import numpy as np
from blackmagic_io import BlackmagicOutput, DisplayMode

# Load image (preserves bit depth for 16-bit TIFFs, etc.)
# Note: Use TIFF for reliable 16-bit support (PNGs may convert to 8-bit)
frame = iio.imread('your_image.tif')

# Resize if needed (note: PIL cannot build images from 16-bit RGB arrays,
# so use another resizer, e.g. OpenCV, for uint16 data)
if frame.shape[0] != 1080 or frame.shape[1] != 1920:
    from PIL import Image
    img = Image.fromarray(frame)
    img = img.resize((1920, 1080), Image.Resampling.LANCZOS)
    frame = np.array(img)

# Remove alpha channel if present
if frame.ndim == 3 and frame.shape[2] == 4:
    frame = frame[:, :, :3]

# Display image (format auto-detected from dtype)
with BlackmagicOutput() as output:
    output.display_static_frame(frame, DisplayMode.HD1080p25)
    input("Press Enter to stop...")
```

### Example 4: 10-bit Y'CbCr Output with Float Data

```python
import numpy as np
from blackmagic_io import BlackmagicOutput, DisplayMode

# Create float R'G'B' image (0.0-1.0 range)
frame = np.zeros((1080, 1920, 3), dtype=np.float32)

# Example: gradient in float space (vectorized)
frame[:, :, 0] = np.linspace(0.0, 1.0, 1920, endpoint=False)                 # Red gradient
frame[:, :, 1] = np.linspace(0.0, 1.0, 1080, endpoint=False)[:, np.newaxis]  # Green gradient
frame[:, :, 2] = 0.5                                                         # Blue constant

# Output as 10-bit Y'CbCr (automatically selected for float data)
with BlackmagicOutput() as output:
    output.display_static_frame(frame, DisplayMode.HD1080p25)
    input("Press Enter to stop...")
```

### Example 5: 10-bit Y'CbCr with uint16 Data

```python
import numpy as np
from blackmagic_io import BlackmagicOutput, DisplayMode

# Create uint16 R'G'B' image (0-65535 range)
# Useful for 10-bit / 12-bit / 16-bit image processing pipelines
frame = np.zeros((1080, 1920, 3), dtype=np.uint16)

# Full range gradient
for x in range(1920):
    frame[:, x, 0] = int(x / 1920 * 65535)  # Red gradient

# Output as 10-bit Y'CbCr (automatically selected for uint16 data)
with BlackmagicOutput() as output:
    output.display_static_frame(frame, DisplayMode.HD1080p25)
    input("Press Enter to stop...")
```

### Example 5a: 10-bit R'G'B' with uint16 Data

```python
import numpy as np
from blackmagic_io import BlackmagicOutput, DisplayMode, PixelFormat

# Create uint16 R'G'B' image (0-65535 range)
frame = np.zeros((1080, 1920, 3), dtype=np.uint16)

# Full range gradient
for x in range(1920):
    frame[:, x, 0] = int(x / 1920 * 65535)  # Red gradient

# Output as 10-bit R'G'B' (bit-shifted from 16-bit to 10-bit)
with BlackmagicOutput() as output:
    output.display_static_frame(frame, DisplayMode.HD1080p25, PixelFormat.RGB10)
    input("Press Enter to stop...")
```

### Example 5b: 10-bit R'G'B' with Float Data (Narrow Range)

```python
import numpy as np
from blackmagic_io import BlackmagicOutput, DisplayMode, PixelFormat

# Create float R'G'B' image (0.0-1.0 range)
frame = np.zeros((1080, 1920, 3), dtype=np.float32)

# Gradient
for x in range(1920):
    frame[:, x, 0] = x / 1920  # Red gradient

# Output as 10-bit R'G'B' with narrow range (0.0-1.0 maps to 64-940)
with BlackmagicOutput() as output:
    output.display_static_frame(
        frame,
        DisplayMode.HD1080p25,
        PixelFormat.RGB10,
        output_narrow_range=True  # Default: narrow range
    )
    input("Press Enter to stop...")
```

### Example 5c: 10-bit R'G'B' with Float Data (Full Range)

```python
import numpy as np
from blackmagic_io import BlackmagicOutput, DisplayMode, PixelFormat

# Create float R'G'B' image (0.0-1.0 range)
frame = np.zeros((1080, 1920, 3), dtype=np.float32)

# Gradient
for x in range(1920):
    frame[:, x, 0] = x / 1920  # Red gradient

# Output as 10-bit R'G'B' with full range (0.0-1.0 maps to 0-1023)
with BlackmagicOutput() as output:
    output.display_static_frame(
        frame,
        DisplayMode.HD1080p25,
        PixelFormat.RGB10,
        output_narrow_range=False  # Full range
    )
    input("Press Enter to stop...")
```

### Example 6: 8-bit BGRA Output

For simple applications or quick testing, 8-bit RGB data can be used directly without conversion to float or uint16. Note that 8-bit data is always treated as full range R'G'B' on input, and is output over SDI as narrow range 8-bit Y'CbCr 4:2:2.

```python
import numpy as np
from blackmagic_io import BlackmagicOutput, DisplayMode

# Create 8-bit R'G'B' image (0-255 range, full range)
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)

# Simple gradient
for x in range(1920):
    frame[:, x, 0] = int(255 * x / 1920)  # Red gradient

# Output as 8-bit (BGRA format automatically selected for uint8 data)
with BlackmagicOutput() as output:
    output.display_static_frame(frame, DisplayMode.HD1080p25)
    input("Press Enter to stop...")
```

### Example 7: HDR Output with Rec.2020 and HLG (Simplified API)

```python
import numpy as np
from blackmagic_io import BlackmagicOutput, DisplayMode, Matrix, Eotf

# Create HDR content in normalised float (0.0-1.0 range)
frame = np.zeros((1080, 1920, 3), dtype=np.float32)

# Example: HDR gradient with extended range (vectorized)
frame[:, :, 0] = np.linspace(0.0, 1.0, 1920, endpoint=False)                 # Red gradient
frame[:, :, 1] = np.linspace(0.0, 1.0, 1080, endpoint=False)[:, np.newaxis]  # Green gradient
frame[:, :, 2] = 0.5                                                         # Blue constant

# Configure for HLG HDR output using the simplified API
with BlackmagicOutput() as output:
    # Single call with matrix and HDR metadata
    # YUV10 automatically selected for float data
    output.display_static_frame(
        frame,
        DisplayMode.HD1080p25,
        matrix=Matrix.Rec2020,           # Use Rec.2020 matrix
        hdr_metadata={'eotf': Eotf.HLG}  # HLG with default metadata
    )

    input("Press Enter to stop...")
```

**Alternative: Low-level API for more control**

```python
import numpy as np
import decklink_io as dl

# Create HDR content
frame = np.zeros((1080, 1920, 3), dtype=np.float32)
# ... fill frame ...

# Configure for HLG HDR output using low-level API
output = dl.DeckLinkOutput()
output.initialize()

# Set HDR metadata BEFORE setup_output()
output.set_hdr_metadata(dl.Gamut.Rec2020, dl.Eotf.HLG)

# Setup output
settings = output.get_video_settings(dl.DisplayMode.HD1080p25)
settings.format = dl.PixelFormat.YUV10
output.setup_output(settings)

# Convert R'G'B' to Y'CbCr using Rec.2020 matrix
yuv_data = dl.rgb_float_to_yuv10(frame, 1920, 1080, dl.Matrix.Rec2020)
output.set_frame_data(yuv_data)

# Display the frame
output.display_frame()
input("Press Enter to stop...")
output.stop_output()
output.cleanup()
```

### Example 8: HDR10 (PQ) Output (Simplified API)

```python
import numpy as np
from blackmagic_io import BlackmagicOutput, DisplayMode, Matrix, Eotf

# Create HDR10 content with PQ transfer function applied
frame = np.zeros((1080, 1920, 3), dtype=np.float32)

# Fill with PQ-encoded HDR content
# A library such as Colour Science for Python (https://www.colour-science.org/) is needed for PQ encoding
# frame = colour.eotf(linear_rgb_data, 'ST 2084')

# Configure for HDR10 (PQ) output using the simplified API
with BlackmagicOutput() as output:
    # Single call with Rec.2020 matrix and PQ metadata
    # YUV10 automatically selected for float data
    output.display_static_frame(
        frame,
        DisplayMode.HD1080p25,
        matrix=Matrix.Rec2020,
        hdr_metadata={'eotf': Eotf.PQ}
    )

    input("Press Enter to stop...")
```
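
If you prefer not to add a dependency, the SMPTE ST 2084 inverse EOTF (absolute luminance in nits to PQ code value) is compact enough to sketch directly; this is an illustrative implementation, and a library such as Colour Science handles edge cases more carefully:

```python
import numpy as np

def pq_encode(nits):
    """SMPTE ST 2084 inverse EOTF: absolute luminance (nits) -> PQ signal (0.0-1.0)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    n = np.power(np.clip(nits, 0, None) / 10000.0, m1)
    return np.power((c1 + c2 * n) / (1 + c3 * n), m2)

print(round(float(pq_encode(100.0)), 3))  # ~0.508 (100 nit SDR white)
print(float(pq_encode(10000.0)))          # 1.0 (PQ peak luminance)
```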

**Alternative: Low-level API**

```python
import numpy as np
import decklink_io as dl

# Create HDR10 content
frame = np.zeros((1080, 1920, 3), dtype=np.float32)
# ... fill frame ...

# Configure for HDR10 (PQ) output
output = dl.DeckLinkOutput()
output.initialize()

# IMPORTANT: Set HDR metadata BEFORE setup_output()
# This embeds HDR metadata (including Rec.2020 primaries, EOTF, mastering display info) in frames
output.set_hdr_metadata(dl.Gamut.Rec2020, dl.Eotf.PQ)

# Setup output settings
settings = output.get_video_settings(dl.DisplayMode.HD1080p25)
settings.format = dl.PixelFormat.YUV10
output.setup_output(settings)

# Convert to Y'CbCr with Rec.2020 matrix
yuv_data = dl.rgb_float_to_yuv10(frame, 1920, 1080, dl.Matrix.Rec2020)
output.set_frame_data(yuv_data)

# Display the frame
output.display_frame()
input("Press Enter to stop...")
output.stop_output()
output.cleanup()
```

### Example 9: Color Patches for Testing and Calibration

The `display_solid_color()` method supports displaying color patches smaller than full screen, useful for testing, calibration, and creating custom test patterns.

```python
import time
from blackmagic_io import BlackmagicOutput, DisplayMode

with BlackmagicOutput() as output:
    # Full screen white (default behavior)
    output.display_solid_color((1.0, 1.0, 1.0), DisplayMode.HD1080p25)
    time.sleep(2)

    # Centered 50% white patch on black background
    output.display_solid_color(
        (1.0, 1.0, 1.0),
        DisplayMode.HD1080p25,
        patch=(0.5, 0.5, 0.5, 0.5)  # (center_x, center_y, width, height)
    )
    time.sleep(2)

    # Small centered white patch (10% size) on gray background
    output.display_solid_color(
        (1.0, 1.0, 1.0),
        DisplayMode.HD1080p25,
        patch=(0.5, 0.5, 0.1, 0.1),
        background_color=(0.5, 0.5, 0.5)
    )
    time.sleep(2)

    # Red patch in top-left quadrant on blue background
    output.display_solid_color(
        (1.0, 0.0, 0.0),
        DisplayMode.HD1080p25,
        patch=(0.25, 0.25, 0.3, 0.3),
        background_color=(0.0, 0.0, 1.0)
    )
    time.sleep(2)

    # Horizontal bar across center (full width, half height)
    # Using integer 10-bit values with narrow range
    output.display_solid_color(
        (940, 940, 64),
        DisplayMode.HD1080p25,
        patch=(0.5, 0.5, 1.0, 0.5),
        background_color=(400, 400, 400),
        input_narrow_range=True
    )
    time.sleep(2)
```

**Patch coordinates:**
- All values are normalized (0.0-1.0) for resolution independence
- `center_x, center_y`: Position of patch center (0.0 = left/top, 1.0 = right/bottom)
- `width, height`: Patch dimensions as fraction of screen (1.0 = full width/height)
- Example: `(0.5, 0.5, 0.25, 0.25)` = centered patch, 25% of screen size

**Background color:**
- Uses same format as foreground `color` (integers 0-1023 or floats 0.0-1.0)
- Defaults to black if not specified
- For integer colors with `input_narrow_range=True`, black defaults to 64 instead of 0
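
The normalized patch geometry maps to pixel coordinates straightforwardly. The `render_patch` helper below is a hypothetical sketch of how such a patch could be rasterized into a NumPy frame (it is not the library's internal code):

```python
import numpy as np

def render_patch(width, height, color, patch, background=(0.0, 0.0, 0.0)):
    """Fill a frame with `background` and draw a `color` rectangle.

    patch = (center_x, center_y, patch_width, patch_height), all normalized 0.0-1.0.
    """
    frame = np.empty((height, width, 3), dtype=np.float32)
    frame[:] = background
    cx, cy, pw, ph = patch
    x0 = int(round((cx - pw / 2) * width))
    x1 = int(round((cx + pw / 2) * width))
    y0 = int(round((cy - ph / 2) * height))
    y1 = int(round((cy + ph / 2) * height))
    frame[max(y0, 0):y1, max(x0, 0):x1] = color
    return frame

# Centered 50% white patch on a black background
frame = render_patch(1920, 1080, (1.0, 1.0, 1.0), (0.5, 0.5, 0.5, 0.5))
```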

## HDR Metadata

HDR metadata is embedded into each video frame using the DeckLink SDK's `IDeckLinkVideoFrameMetadataExtensions` interface. When you call `set_hdr_metadata()`, the library automatically wraps each output frame with the specified metadata.

### Metadata Includes:

- **Display Primaries**: Automatically set to match the matrix parameter (unless explicitly specified via `HdrStaticMetadata`)
  - Matrix.Rec709 → Rec.709 primaries (x,y): R(0.64, 0.33), G(0.30, 0.60), B(0.15, 0.06)
  - Matrix.Rec2020 → Rec.2020 primaries (x,y): R(0.708, 0.292), G(0.170, 0.797), B(0.131, 0.046)
- **White Point**: D65 (0.3127, 0.3290) for all matrices (unless explicitly specified)
- **EOTF**: Electro-Optical Transfer Function (SDR / Rec.709, PQ / SMPTE ST 2084, or HLG)
- **Mastering Display Info**: Default values for max / min luminance
- **Content Light Levels**: Max content light level and max frame average

### Default HDR Metadata Values (PQ only):

**When using Matrix.Rec709:**
```
Display Primaries: Rec.709 (ITU-R BT.709)
  Red:   (0.64, 0.33)
  Green: (0.30, 0.60)
  Blue:  (0.15, 0.06)
White Point: D65 (0.3127, 0.3290)
```

**When using Matrix.Rec2020:**
```
Display Primaries: Rec.2020 (ITU-R BT.2020)
  Red:   (0.708, 0.292)
  Green: (0.170, 0.797)
  Blue:  (0.131, 0.046)
White Point: D65 (0.3127, 0.3290)
```

**Luminance values:**
```
Max Mastering Luminance: 1000 nits
Min Mastering Luminance: 0.0001 nits
Max Content Light Level: 1000 nits
Max Frame Average Light Level: 50 nits
```

### Customizing HDR Metadata Values:

**High-level API:**

```python
from blackmagic_io import BlackmagicOutput, DisplayMode, PixelFormat, Matrix, Eotf
import decklink_io as dl

# Build HDR Static Metadata for the source's mastering display
static_metadata = dl.HdrStaticMetadata()
static_metadata.display_primaries_red_x = 0.708
static_metadata.display_primaries_red_y = 0.292
# ... set other values ...
static_metadata.max_display_mastering_luminance = 1000.0
static_metadata.min_display_mastering_luminance = 0.0001

# Use in simplified API
with BlackmagicOutput() as output:
    output.initialize()
    output.display_static_frame(
        frame,
        DisplayMode.HD1080p25,
        pixel_format=PixelFormat.YUV10,
        matrix=Matrix.Rec2020,
        hdr_metadata={'eotf': Eotf.PQ, 'static_metadata': static_metadata}
    )
    input("Press Enter to stop...")
```

**Low-level API:**

For precise control over HDR Static Metadata with the low-level API, use `set_hdr_static_metadata()`:

```python
import decklink_io as dl

# Build HDR Static Metadata for the mastering display
static_metadata = dl.HdrStaticMetadata()

# Display primaries (chromaticity coordinates)
static_metadata.display_primaries_red_x = 0.708
static_metadata.display_primaries_red_y = 0.292
static_metadata.display_primaries_green_x = 0.170
static_metadata.display_primaries_green_y = 0.797
static_metadata.display_primaries_blue_x = 0.131
static_metadata.display_primaries_blue_y = 0.046
static_metadata.white_point_x = 0.3127
static_metadata.white_point_y = 0.3290

# Mastering display luminance
static_metadata.max_display_mastering_luminance = 4000.0      # 4000 nits peak (e.g., for HDR10+ content)
static_metadata.min_display_mastering_luminance = 0.0005      # 0.0005 nits black level

# Content light levels
static_metadata.max_content_light_level = 2000.0     # 2000 nits max content (MaxCLL)
static_metadata.max_frame_average_light_level = 400.0 # 400 nits average (MaxFALL)

output = dl.DeckLinkOutput()
output.initialize()
output.set_hdr_static_metadata(dl.Gamut.Rec2020, dl.Eotf.PQ, static_metadata)
```

### Available HDR Metadata Fields:

All 14 SMPTE ST 2086 / CEA-861.3 HDR static metadata fields are supported:

**Display Primaries (xy chromaticity coordinates):**
- `display_primaries_red_x`, `display_primaries_red_y`
- `display_primaries_green_x`, `display_primaries_green_y`
- `display_primaries_blue_x`, `display_primaries_blue_y`
- `white_point_x`, `white_point_y`

**Mastering Display Luminance:**
- `max_display_mastering_luminance` (nits) - Peak luminance of mastering display
- `min_display_mastering_luminance` (nits) - Minimum luminance of mastering display

**Content Light Levels:**
- `max_content_light_level` (nits) - Maximum luminance of any pixel (MaxCLL)
- `max_frame_average_light_level` (nits) - Maximum average luminance of any frame (MaxFALL)
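Before handing metadata to the device it can be worth sanity-checking the values. The sketch below is a hypothetical helper (not part of the library) encoding the constraints implied by SMPTE ST 2086 / CEA-861.3, using a `SimpleNamespace` as a stand-in for `dl.HdrStaticMetadata()`:

```python
from types import SimpleNamespace

# Hypothetical validation helper: chromaticities must lie in [0, 1],
# min mastering luminance must be below max, and MaxFALL cannot exceed MaxCLL.
def check_hdr_static_metadata(md):
    chroma = [
        md.display_primaries_red_x, md.display_primaries_red_y,
        md.display_primaries_green_x, md.display_primaries_green_y,
        md.display_primaries_blue_x, md.display_primaries_blue_y,
        md.white_point_x, md.white_point_y,
    ]
    assert all(0.0 <= c <= 1.0 for c in chroma), "chromaticity outside [0, 1]"
    assert md.min_display_mastering_luminance < md.max_display_mastering_luminance, \
        "min mastering luminance must be below max"
    assert md.max_frame_average_light_level <= md.max_content_light_level, \
        "MaxFALL cannot exceed MaxCLL"

# Stand-in object populated with Rec.2020 primaries and the library defaults
md = SimpleNamespace(
    display_primaries_red_x=0.708, display_primaries_red_y=0.292,
    display_primaries_green_x=0.170, display_primaries_green_y=0.797,
    display_primaries_blue_x=0.131, display_primaries_blue_y=0.046,
    white_point_x=0.3127, white_point_y=0.3290,
    max_display_mastering_luminance=1000.0,
    min_display_mastering_luminance=0.0001,
    max_content_light_level=1000.0,
    max_frame_average_light_level=50.0,
)
check_hdr_static_metadata(md)  # passes silently
```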

### Important Notes:

1. **Simplified API**: With `display_static_frame()`, HDR metadata and matrix are set in a single call
2. **Low-level API call order**: When using the low-level API, `set_hdr_metadata()` must be called before `setup_output()`
3. **Frame-level metadata**: Metadata is embedded in every video frame, not set globally
4. **Matrix consistency**: When using the simplified API, the same `matrix` parameter is used for both metadata and R'G'B' → Y'CbCr conversion. With the low-level API, ensure consistency between `set_hdr_metadata()` and conversion functions.
5. **Transfer function**: The library only sets the metadata - you must apply the actual transfer function (PQ / HLG curve) to your RGB data before conversion
6. **All 14 metadata fields supported**: The library implements all SMPTE ST 2086 / CEA-861.3 HDR metadata fields including display primaries, white point, mastering display luminance, and content light levels
7. **Matrix / Resolution restrictions**:
   - **Rec.601** is only supported for SD display modes (NTSC, PAL, etc.) and is the only matrix supported for SD
   - **Rec.709** and **Rec.2020** are only supported for HD and higher resolutions (720p, 1080p, 2K, 4K, 8K, etc.)
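Since the library only signals the EOTF (note 5 above), the PQ curve itself must be applied to your data beforehand. A minimal sketch of the SMPTE ST 2084 inverse EOTF, using the standard constants (illustrative, not part of the library):

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Encode absolute luminance (0-10000 nits) to a PQ signal in [0, 1]."""
    y = np.clip(np.asarray(nits, dtype=np.float64) / 10000.0, 0.0, 1.0)
    y_m1 = np.power(y, M1)
    return np.power((C1 + C2 * y_m1) / (1.0 + C3 * y_m1), M2)

# 100 nits (SDR reference white) encodes to roughly 0.508
print(float(pq_encode(100.0)))
```

For HLG the equivalent step is the HLG OETF from ITU-R BT.2100 rather than the PQ curve.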

## Troubleshooting

### Common Issues

**"DeckLink output module not found"**
- Build and install the C++ extension: `pip install -e .`
- Check that pybind11 is installed: `pip install pybind11`

**"Could not create DeckLink iterator"**
- Install Blackmagic Desktop Video software
- Ensure DeckLink device is connected and recognized by the system
- Check device drivers are properly installed

**"Could not find DeckLink device"**
- Verify device is connected and powered
- Check device appears in Blackmagic software (Media Express, etc.)
- Try different device index: `output.initialize(device_index=1)`

**Build errors about missing headers**
- The SDK headers are included in the repository under `_vendor/decklink_sdk/`
- If you need to use a different SDK version, update the paths in `CMakeLists.txt`
- On Linux, ensure headers are accessible to the build system

**Permission errors (Linux)**
- Add user to appropriate groups: `sudo usermod -a -G video $USER`
- Log out and back in for group changes to take effect

**HDR output not displaying correctly**
- **Simplified API**: Specify both `matrix` and `hdr_metadata` in `display_static_frame()` - they're automatically set correctly
- **Low-level API**: Call `set_hdr_metadata()` BEFORE `setup_output()` - metadata is embedded in each frame
- Ensure matrix consistency: same value in both metadata and R'G'B' → Y'CbCr conversion

### Testing Your Installation

Run the example script to test your installation:

```bash
python example_usage.py
```

This will show available devices and let you test various output modes.

## Tools

### pixel_reader

The `pixel_reader` tool captures and analyzes video input from a DeckLink device, displaying pixel values and metadata. This is useful for verifying output from the library by looping a DeckLink output back to its own input.

**Build:**
```bash
cd tools
make
```

**Usage:**
```bash
./pixel_reader [device_index]
```
See `tools/README.md` for more detail.

The tool displays:
- **Pixel format** and **color space** (RGB 4:4:4, YCbCr 4:2:2, etc.)
- **Resolution** and **frame rate**
- **Metadata**: EOTF (SDR / PQ / HLG), matrix (Rec.601 / Rec.709 / Rec.2020)
- **Pixel values** at selected coordinates in native format (code values)

Use this tool to verify that matrix and EOTF metadata are being set correctly by the output library.

## Platform-Specific Notes

### Windows
- Requires Visual Studio Build Tools or Visual Studio with C++ support
- DeckLink SDK typically installs to Program Files

### macOS  
- Requires Xcode Command Line Tools
- May need to codesign the built extension on some macOS versions

### Linux
- Requires build-essential package
- May need to configure udev rules for device access
- Some distributions require additional video group membership

## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Submit a pull request

## License

This project is licensed under the MIT License - see the LICENSE file for details.

### Blackmagic DeckLink SDK License

This repository includes header files from the Blackmagic DeckLink SDK v14.1. These header files are redistributable under the terms of the Blackmagic DeckLink SDK End User License Agreement (Section 0.1), which specifically exempts the Include folder headers from the more restrictive licensing terms that apply to other parts of the SDK.

**Important notes about the SDK headers:**
- The header files in `_vendor/decklink_sdk/{Mac,Win,Linux}/include/` directories are from the Blackmagic DeckLink SDK
- These headers are required only for **building** the library from source
- **Runtime usage requires** the Blackmagic Desktop Video software to be installed separately
- The SDK headers are provided under Blackmagic Design's EULA - see `_vendor/Blackmagic Design EULA.pdf` for full terms
- Download the complete SDK and Desktop Video software from: https://www.blackmagicdesign.com/developer

The Blackmagic DeckLink SDK is © Blackmagic Design Pty. Ltd. All rights reserved.

## Support

- Check the [Issues](https://github.com/nick-shaw/blackmagic-io/issues) page for known problems
- Review Blackmagic's official DeckLink SDK documentation
- Ensure your DeckLink device is supported by the SDK version

## Acknowledgments

- Blackmagic Design for the DeckLink SDK
- pybind11 project for the C++/Python bindings
- Contributors and testers
- Special thanks to [Zach Lewis](https://github.com/zachlewis) and [Gino Bollaert](https://github.com/yergin)