callcut.training.BCEWithLogitsLoss🔗

class callcut.training.BCEWithLogitsLoss(pos_weight=None)[source]🔗

Binary cross-entropy loss with logits.

Wraps torch.nn.BCEWithLogitsLoss with the BaseLoss interface.

Parameters:
pos_weight : float | None

Weight applied to the positive class. Values greater than 1 penalize false negatives more heavily, increasing recall; values less than 1 increase precision. Can be computed using compute_pos_weight().
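The effect of pos_weight can be seen directly in the underlying formula. The sketch below (a minimal pure-Python illustration, not the library's implementation, which delegates to torch.nn.BCEWithLogitsLoss) shows how the positive term of the loss is scaled:

```python
import math

def bce_with_logits(logit, target, pos_weight=1.0):
    # Weighted binary cross-entropy with logits, per element:
    # loss = -[pos_weight * y * log(sigmoid(x)) + (1 - y) * log(1 - sigmoid(x))]
    p = 1.0 / (1.0 + math.exp(-logit))
    return -(pos_weight * target * math.log(p) + (1 - target) * math.log(1 - p))

# With pos_weight=2.0, a positive example (target=1) is penalized twice as
# hard, pushing the model toward predicting positives (higher recall).
base = bce_with_logits(0.0, 1.0)           # sigmoid(0) = 0.5, so -log(0.5)
weighted = bce_with_logits(0.0, 1.0, 2.0)  # 2 * -log(0.5)
```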

Attributes

pos_weight

Positive class weight.

Methods

forward(logits, targets)

Compute binary cross-entropy loss.

Examples

>>> loss_fn = BCEWithLogitsLoss(pos_weight=2.0)
>>> loss = loss_fn(logits, targets)
forward(logits, targets)[source]🔗

Compute binary cross-entropy loss.

Parameters:
logits : Tensor

Raw model output (before sigmoid).

targets : Tensor

Ground truth binary labels.

Returns:
loss : Tensor

Scalar loss value.
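The per-element losses are reduced to a single scalar. Assuming the default 'mean' reduction of the wrapped torch.nn.BCEWithLogitsLoss, the computation can be sketched in plain Python as:

```python
import math

def forward_sketch(logits, targets):
    # Mean-reduced BCE with logits over a batch (illustrative sketch only;
    # the real forward() operates on tensors via torch.nn.BCEWithLogitsLoss).
    def elem(x, y):
        p = 1.0 / (1.0 + math.exp(-x))
        return -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return sum(elem(x, y) for x, y in zip(logits, targets)) / len(logits)

# Confident positive logit and mildly negative logit, both labeled correctly:
loss = forward_sketch([2.0, -1.0], [1.0, 0.0])
```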

property pos_weight🔗

Positive class weight.

Type:

float | None