callcut.training.FocalLoss

class callcut.training.FocalLoss(alpha=0.25, gamma=2.0)[source]

Focal loss for handling class imbalance.

Down-weights easy examples to focus training on hard negatives. Particularly useful when positive (call) frames are rare.

The focal loss is defined as:

\[FL(p_t) = -\alpha_t (1 - p_t)^\gamma \log(p_t)\]

where \(p_t\) is the model's estimated probability for the correct class.
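To see the down-weighting at work, the formula can be evaluated directly with the default alpha=0.25, gamma=2.0 (a small numeric sketch, separate from the library code):

```python
import math

def focal(p_t, alpha=0.25, gamma=2.0):
    # FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t)
    return -alpha * (1 - p_t) ** gamma * math.log(p_t)

easy = focal(0.95)  # well-classified: modulating factor (0.05)^2 nearly zeroes the loss
hard = focal(0.30)  # misclassified: factor (0.7)^2 keeps most of the loss
```

The easy example contributes orders of magnitude less loss than plain cross-entropy would assign it, while the hard example's loss is largely preserved.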

Parameters:
alpha : float

Weighting factor for the positive class, in [0, 1]. Use alpha > 0.5 to weight positives more heavily.

gamma : float

Focusing parameter. gamma = 0 recovers standard cross-entropy. gamma > 0 reduces the loss for well-classified examples, focusing on hard examples. Typical values are 1.0 to 5.0.
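The two claims about gamma can be checked against the definition (a standalone sketch; alpha is fixed at 1 to isolate the focusing term):

```python
import math

def focal(p_t, gamma, alpha=1.0):
    # FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t)
    return -alpha * (1 - p_t) ** gamma * math.log(p_t)

p_t = 0.9  # a well-classified example
ce = -math.log(p_t)
# gamma = 0: the modulating factor is 1, recovering plain cross-entropy
assert focal(p_t, gamma=0.0) == ce
# larger gamma -> smaller loss contribution from this easy example
assert focal(p_t, 5.0) < focal(p_t, 2.0) < focal(p_t, 0.0)
```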

Attributes

alpha

Weighting factor for positive class.

gamma

Focusing parameter.

Methods

forward(logits, targets)

Compute focal loss.

Examples

>>> import torch
>>> loss_fn = FocalLoss(alpha=0.75, gamma=2.0)
>>> logits, targets = torch.randn(16), torch.randint(0, 2, (16,)).float()
>>> loss = loss_fn(logits, targets)
forward(logits, targets)[source]

Compute focal loss.

Parameters:
logits : Tensor

Raw model output (before sigmoid).

targets : Tensor

Ground truth binary labels.

Returns:
loss : Tensor

Scalar loss value.
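A minimal sketch of how such a forward pass could compute binary focal loss in PyTorch, assuming sigmoid probabilities and mean reduction (the actual library implementation may differ):

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    # -log(p_t) per element, computed stably from raw logits
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    p_t = p * targets + (1 - p) * (1 - targets)              # prob. of the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)  # class-balancing weight
    # (1 - p_t)^gamma down-weights well-classified examples
    return (alpha_t * (1 - p_t) ** gamma * ce).mean()
```

Computing the cross-entropy term from logits rather than from `torch.sigmoid(logits).log()` avoids numerical overflow for large-magnitude logits.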

property alpha

Weighting factor for positive class.

Type:

float

property gamma

Focusing parameter.

Type:

float