callcut.training.BaseLoss

class callcut.training.BaseLoss[source]

Abstract base class for training loss functions.

Subclasses must implement forward() to compute the loss between raw model logits and ground-truth labels.

All loss functions expect:

  • logits: Raw model output of shape (batch, time) or (batch,)

  • targets: Binary labels of shape (batch, time) or (batch,), values in [0, 1]
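The two accepted layouts can be illustrated with a short sketch (PyTorch is assumed here, since the page speaks of Tensors and logits; the variable names are illustrative only):

```python
import torch

# Frame-level case: one logit per (batch, time) cell.
logits_seq = torch.randn(8, 100)                      # shape (batch, time)
targets_seq = torch.randint(0, 2, (8, 100)).float()   # binary labels in [0, 1]

# Clip-level case: one logit per batch element.
logits_clip = torch.randn(8)                          # shape (batch,)
targets_clip = torch.randint(0, 2, (8,)).float()
```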

Methods

forward(logits, targets)

Compute the loss.

Examples

Create a custom loss by subclassing:

>>> from torch import Tensor
>>> class MyLoss(BaseLoss):
...     def __init__(self, weight: float = 1.0):
...         super().__init__()
...         self._weight = weight
...
...     def forward(self, logits: Tensor, targets: Tensor) -> Tensor:
...         # Custom loss computation
...         ...
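A runnable version of this pattern might look like the sketch below: a weighted BCE-with-logits loss. Note the assumptions: BaseLoss is stood in for by a plain torch.nn.Module with an unimplemented forward(), and the name WeightedBCELoss is hypothetical, not part of callcut.

```python
import torch
from torch import Tensor


class BaseLoss(torch.nn.Module):
    """Minimal stand-in for callcut.training.BaseLoss (assumed nn.Module ABC)."""

    def forward(self, logits: Tensor, targets: Tensor) -> Tensor:
        raise NotImplementedError


class WeightedBCELoss(BaseLoss):
    """Hypothetical subclass: BCE-with-logits scaled by a constant weight."""

    def __init__(self, weight: float = 1.0):
        super().__init__()
        self._weight = weight

    def forward(self, logits: Tensor, targets: Tensor) -> Tensor:
        # binary_cross_entropy_with_logits applies the sigmoid internally,
        # matching the "raw logits, before sigmoid" contract of this class.
        bce = torch.nn.functional.binary_cross_entropy_with_logits(logits, targets)
        return self._weight * bce


loss_fn = WeightedBCELoss(weight=0.5)
logits = torch.randn(4, 10)                      # (batch, time)
targets = torch.randint(0, 2, (4, 10)).float()   # binary labels
loss = loss_fn(logits, targets)                  # scalar tensor
```

Because the reduction defaults to the mean, the result is a single scalar regardless of whether the inputs are (batch, time) or (batch,).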
abstractmethod forward(logits, targets)[source]

Compute the loss.

Parameters:
logits : Tensor

Raw model output (before sigmoid).

targets : Tensor

Ground truth binary labels.

Returns:
loss : Tensor

Scalar loss value.