espnet2.enh.layers.tcn.TemporalBlock
class espnet2.enh.layers.tcn.TemporalBlock(in_channels, out_channels, skip_channels, kernel_size, stride, padding, dilation, norm_type='gLN', causal=False)
Bases: Module
Temporal Block for Temporal Convolutional Network.
This class implements a temporal block that applies, in sequence, a 1x1 convolution, an activation function, normalization, and a depthwise separable convolution. It can optionally produce a skip-connection output that is aggregated across blocks.
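A rough, self-contained sketch of that layer sequence is given below. It is an illustration only, not the actual implementation: the real class uses ESPnet's depthwise separable convolution module and configurable normalization helpers, while the PReLU activation, the GroupNorm stand-in for 'gLN', and the explicit residual addition here are assumptions made for clarity.

import torch
import torch.nn as nn

class SimplifiedTemporalBlock(nn.Module):
    """Hypothetical, simplified stand-in for TemporalBlock (illustration only)."""

    def __init__(self, in_channels, out_channels, kernel_size, dilation):
        super().__init__()
        padding = (kernel_size - 1) * dilation // 2  # keeps the length unchanged (stride=1)
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, out_channels, 1, bias=False),  # 1x1 convolution
            nn.PReLU(),                                           # activation
            nn.GroupNorm(1, out_channels),                        # stand-in for gLN/cLN/BN
            # depthwise separable convolution = depthwise conv + pointwise conv
            nn.Conv1d(out_channels, out_channels, kernel_size, padding=padding,
                      dilation=dilation, groups=out_channels, bias=False),
            nn.PReLU(),
            nn.GroupNorm(1, out_channels),
            nn.Conv1d(out_channels, in_channels, 1, bias=False),  # project back to in_channels
        )

    def forward(self, x):
        # residual connection around the whole block, as in Conv-TasNet-style TCNs
        return x + self.net(x)

x = torch.randn(2, 64, 100)
print(SimplifiedTemporalBlock(64, 128, kernel_size=3, dilation=2)(x).shape)
# torch.Size([2, 64, 100])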
skip_connection
Indicates if skip connections are used.
- Type: bool
net
Sequential container for the convolutional layers and activation functions.
- Type: nn.Sequential
Parameters:
- in_channels (int) – Number of input channels.
- out_channels (int) – Number of output channels.
- skip_channels (int or None) – Number of channels for the skip connection; if None, no skip output is produced.
- kernel_size (int) – Size of the convolutional kernel.
- stride (int) – Stride for the convolution.
- padding (int) – Padding for the convolution.
- dilation (int) – Dilation for the convolution.
- norm_type (str) – Type of normalization to use (‘gLN’, ‘cLN’, ‘BN’).
- causal (bool) – If True, applies causal convolution.
Returns:
- res_out (Tensor) – Output tensor after passing through the block, with the same shape as the input.
- skip_out (Tensor) – Output tensor for the skip connection, only returned if skip connections are enabled.
Return type: Tensor, or a tuple (res_out, skip_out) when skip connections are enabled.
Examples
>>> temporal_block = TemporalBlock(64, 128, 32, kernel_size=3,
...                                stride=1, padding=1,
...                                dilation=1, norm_type='gLN',
...                                causal=False)
>>> x = torch.randn(10, 64, 100)  # Batch of 10, 64 channels, length 100
>>> res_out, skip_out = temporal_block(x)  # skip_channels is set, so a tuple is returned
>>> res_out.shape
torch.Size([10, 64, 100])  # Residual output keeps the input shape
>>> skip_out.shape
torch.Size([10, 32, 100])  # Skip output has skip_channels channels
NOTE
The output length matches the input length when stride=1 and the padding compensates for the dilation, e.g. padding = (kernel_size - 1) * dilation // 2 for a non-causal convolution.
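A quick way to verify this, assuming stride=1 and the non-causal padding formula above (the depthwise Conv1d here only mimics the dilated convolution inside the block):

>>> import torch, torch.nn as nn
>>> kernel_size, dilation = 3, 4
>>> padding = (kernel_size - 1) * dilation // 2  # non-causal, stride=1
>>> conv = nn.Conv1d(64, 64, kernel_size, stride=1, padding=padding,
...                  dilation=dilation, groups=64, bias=False)
>>> conv(torch.randn(10, 64, 100)).shape  # length preserved
torch.Size([10, 64, 100])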
forward(x)
Forward.
- Parameters: x (Tensor) – Input of shape [M, B, K] (batch, channels, length).
- Returns: res_out of shape [M, B, K]; a tuple (res_out, skip_out) when skip connections are enabled.
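A usage sketch of this contract follows. The shapes and the tuple return are taken from the sections above; treating skip_channels=None as disabling the skip output is an assumption based on the skip_connection attribute.

>>> import torch
>>> from espnet2.enh.layers.tcn import TemporalBlock
>>> x = torch.randn(10, 64, 100)  # [M, B, K]
>>> block = TemporalBlock(64, 128, 32, kernel_size=3, stride=1,
...                       padding=1, dilation=1, norm_type='gLN', causal=False)
>>> res_out, skip_out = block(x)  # skip_channels set: (res_out, skip_out) pair
>>> res_out.shape
torch.Size([10, 64, 100])
>>> block_no_skip = TemporalBlock(64, 128, None, kernel_size=3, stride=1,
...                               padding=1, dilation=1, norm_type='gLN', causal=False)
>>> block_no_skip(x).shape  # skip_channels=None: a single residual output
torch.Size([10, 64, 100])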