espnet2.legacy.nets.pytorch_backend.transformer.label_smoothing_loss.LabelSmoothingLoss
class espnet2.legacy.nets.pytorch_backend.transformer.label_smoothing_loss.LabelSmoothingLoss(size, padding_idx, smoothing, normalize_length=False, criterion=KLDivLoss())
Bases: Module
Label-smoothing loss.
- Parameters:
- size (int) – number of classes
- padding_idx (int) – ignored class id
- smoothing (float) – smoothing rate (0.0 means the conventional CE)
- normalize_length (bool) – normalize loss by sequence length if True
- criterion (torch.nn.Module) – loss function to be smoothed
Construct a LabelSmoothingLoss object.
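A minimal construction sketch; the vocabulary size, padding id, and smoothing rate below are illustrative assumptions, not library defaults:

```python
import torch
from espnet2.legacy.nets.pytorch_backend.transformer.label_smoothing_loss import (
    LabelSmoothingLoss,
)

# Illustrative values: a 5000-class output vocabulary with id 0 used for padding.
vocab_size = 5000
pad_id = 0

criterion = LabelSmoothingLoss(
    size=vocab_size,         # number of classes
    padding_idx=pad_id,      # class id ignored in the loss
    smoothing=0.1,           # 0.0 reduces to conventional cross-entropy
    normalize_length=False,  # divide by batch size rather than token count
)
```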
forward(x, target)
Compute loss between x and target.
- Parameters:
- x (torch.Tensor) – prediction (batch, seqlen, class)
- target (torch.Tensor) – target signal masked with self.padding_idx (batch, seqlen)
- Returns: scalar float value
- Return type: torch.Tensor
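A usage sketch continuing the constructor example above; the random tensors and shapes are assumptions chosen only to match the documented (batch, seqlen, class) and (batch, seqlen) layouts:

```python
batch, seqlen = 2, 7

# Model predictions for each position (here random scores, just for illustration).
x = torch.randn(batch, seqlen, vocab_size, requires_grad=True)

# Integer class ids; padded positions carry pad_id and are excluded from the loss.
target = torch.randint(1, vocab_size, (batch, seqlen))
target[:, -2:] = pad_id

loss = criterion(x, target)  # scalar torch.Tensor
loss.backward()
```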
