espnet2.legacy.nets.pytorch_backend.transducer.transformer_decoder_layer.TransformerDecoderLayer
class espnet2.legacy.nets.pytorch_backend.transducer.transformer_decoder_layer.TransformerDecoderLayer(hdim: int, self_attention: MultiHeadedAttention, feed_forward: PositionwiseFeedForward, dropout_rate: float)
Bases: Module
Transformer decoder layer module for custom Transducer model.
- Parameters:
- hdim – Hidden dimension.
- self_attention – Self-attention module.
- feed_forward – Feed forward module.
- dropout_rate – Dropout rate.
Construct a TransformerDecoderLayer object.
forward(sequence: Tensor, mask: Tensor, cache: Tensor | None = None)
Compute decoder output sequences, optionally reusing cached outputs from previous steps.
- Parameters:
- sequence – Transformer input sequences. (B, U, D_dec)
- mask – Transformer input mask sequences. (B, U)
- cache – Cached decoder output sequences. (B, (U - 1), D_dec)
- Returns:
  - sequence – Transformer output sequences. (B, U, D_dec)
  - mask – Transformer output mask sequences. (B, U)
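
A minimal usage sketch follows. The import paths for MultiHeadedAttention and PositionwiseFeedForward are assumed to mirror the documented module location (older releases keep them under espnet.nets.pytorch_backend.transformer), and the hidden size, head count, feed-forward units, and the (B, U, U) causal mask are illustrative choices rather than values prescribed by this API; the docstring above writes the mask as (B, U).

```python
import torch

# Assumed import paths, mirroring the documented module location; adjust to your ESPnet version.
from espnet2.legacy.nets.pytorch_backend.transducer.transformer_decoder_layer import (
    TransformerDecoderLayer,
)
from espnet2.legacy.nets.pytorch_backend.transformer.attention import MultiHeadedAttention
from espnet2.legacy.nets.pytorch_backend.transformer.positionwise_feed_forward import (
    PositionwiseFeedForward,
)

hdim, n_head, units, dropout = 256, 4, 1024, 0.1  # illustrative sizes, not defaults

layer = TransformerDecoderLayer(
    hdim=hdim,
    self_attention=MultiHeadedAttention(n_head, hdim, dropout),
    feed_forward=PositionwiseFeedForward(hdim, units, dropout),
    dropout_rate=dropout,
)

B, U = 2, 5
sequence = torch.randn(B, U, hdim)  # (B, U, D_dec)

# Causal self-attention mask over the label sequence, broadcast over the batch.
mask = torch.tril(torch.ones(U, U, dtype=torch.bool)).unsqueeze(0).expand(B, -1, -1)

out_seq, out_mask = layer(sequence, mask)
print(out_seq.shape)  # torch.Size([2, 5, 256])

# Calling convention with a cache (illustration only): pass previously computed
# layer outputs of length U - 1 so that only the newest position is recomputed.
step_out, _ = layer(sequence, mask, cache=out_seq[:, :-1, :])
```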
