pysdtw normalized

A small extension of the original pysdtw package that implements the normalized version of SoftDTW (i.e. the “divergence” version of SoftDTW).

Note that the original pysdtw implementation is not bundled with this package; it is listed as a dependency instead.

Authors

Alberto Zancanaro <alberto.zancanaro@uni.lu>

class dtw_loss_functions.soft_dtw_implementations.pysdtw_normalize.pysdtw_normalized(use_cuda: bool, gamma: float = 1, bandwidth: int = None, dist_func: callable = None)[source]

Bases: SoftDTW

Extension of the original pysdtw implementation that adds normalization (i.e. the “divergence” version of SoftDTW).

Methods

forward(x, y)

Computes the normalized SoftDTW (i.e. SoftDTW Divergence) distance between two time series.

forward(x: Tensor, y: Tensor) Tensor[source]

Computes the normalized SoftDTW (i.e. SoftDTW Divergence) distance between two time series. The final value is computed as SDTW(x, y) - (SDTW(x, x) + SDTW(y, y)) / 2.

Parameters:
  • x (torch.Tensor) – First input tensor of shape B x T x C

  • y (torch.Tensor) – Second input tensor of shape B x T x C

Returns:

sdtw_divergence – Normalized SoftDTW distance between the two input tensors, one value per batch element (shape B).

Return type:

torch.Tensor
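To illustrate the divergence formula without requiring the pysdtw or torch dependencies, here is a minimal pure-Python sketch of SoftDTW (soft-minimum dynamic programming with a squared-error local cost) and the normalized “divergence” combination. The function names soft_dtw and soft_dtw_divergence are illustrative, not part of the pysdtw API, and the sketch handles a single one-channel series rather than a B x T x C batch.

```python
import math

def _softmin(values, gamma):
    # Numerically stable soft-minimum: -gamma * log(sum(exp(-v / gamma))).
    m = min(values)
    return m - gamma * math.log(sum(math.exp(-(v - m) / gamma) for v in values))

def soft_dtw(x, y, gamma=1.0):
    # x, y: lists of scalars (single channel); squared-error local cost.
    # Standard SoftDTW recursion over the alignment matrix.
    n, m = len(x), len(y)
    inf = float("inf")
    R = [[inf] * (m + 1) for _ in range(n + 1)]
    R[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            R[i][j] = cost + _softmin(
                (R[i - 1][j], R[i][j - 1], R[i - 1][j - 1]), gamma
            )
    return R[n][m]

def soft_dtw_divergence(x, y, gamma=1.0):
    # Normalized SoftDTW: SDTW(x, y) - (SDTW(x, x) + SDTW(y, y)) / 2.
    return soft_dtw(x, y, gamma) - 0.5 * (
        soft_dtw(x, x, gamma) + soft_dtw(y, y, gamma)
    )
```

Unlike plain SoftDTW, the divergence is zero when the two inputs are identical and symmetric in its arguments, which is what makes it better behaved as a training loss.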