espnet2.enh.layers.ncsnpp_utils.layers.RCUBlock
class espnet2.enh.layers.ncsnpp_utils.layers.RCUBlock(features, n_blocks, n_stages, act=ReLU())
Bases: Module
Residual Convolutional Unit Block (RCUBlock).
This block is designed to facilitate residual learning through multiple convolutional stages. Each stage applies an activation function followed by a convolution, and the output of each block is added back to its input (residual connection). This architecture helps in training deeper networks by mitigating the vanishing gradient problem.
n_blocks
The number of residual blocks to be created.
- Type: int
n_stages
The number of convolutional stages within each block.
- Type: int
act
The activation function to use, default is ReLU.
- Type: callable
Parameters:
- features (int) – The number of input and output channels for the convolutional layers.
- n_blocks (int) – The number of blocks in the RCU.
- n_stages (int) – The number of stages in each block.
- act (callable , optional) – The activation function to apply. Defaults to nn.ReLU().
Example
>>> rcu_block = RCUBlock(features=64, n_blocks=2, n_stages=3)
>>> input_tensor = torch.randn(1, 64, 32, 32) # Batch of 1, 64 channels
>>> output_tensor = rcu_block(input_tensor)
>>> print(output_tensor.shape)
torch.Size([1, 64, 32, 32]) # Output shape matches input shape
NOTE
The number of input channels must match the features argument.
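The residual structure described above can be sketched framework-free. This is a hypothetical illustration of the control flow only: the real layer uses PyTorch Conv2d modules, and `conv_stub` and `rcu_forward` here are stand-in names, with the convolution replaced by an identity transform so the stage/block nesting is easy to follow.

```python
def relu(v):
    """Element-wise ReLU over a flat list of floats."""
    return [max(0.0, e) for e in v]

def conv_stub(v):
    """Placeholder for a channel-preserving convolution (identity here)."""
    return v

def rcu_forward(x, n_blocks=2, n_stages=2, act=relu):
    """Sketch of the RCUBlock forward pass: n_blocks residual blocks,
    each running n_stages of activation-then-convolution."""
    for _ in range(n_blocks):
        residual = x  # save the block input
        for _ in range(n_stages):
            x = act(x)        # activation first
            x = conv_stub(x)  # then convolution
        x = [a + b for a, b in zip(x, residual)]  # residual connection
    return x
```

With the identity stand-in for the convolution, only the ReLU and the residual additions shape the output, which makes the per-block accumulation easy to trace by hand.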
Initialize internal Module state, shared by both nn.Module and ScriptModule.
forward(x)
Apply the residual convolutional stages to the input tensor.
Each of the n_blocks blocks runs n_stages stages, where every stage applies the activation function followed by a convolution. The output of each block is added back to its input (residual connection), allowing for effective gradient flow and learning.
Parameters:
- x (torch.Tensor) – Input tensor with features channels, e.g. of shape (batch, features, height, width).
Returns: The output tensor, with the same shape as the input.
- Return type: torch.Tensor
Example
>>> rcu_block = RCUBlock(features=64, n_blocks=2, n_stages=3)
>>> input_tensor = torch.randn(1, 64, 32, 32)
>>> output_tensor = rcu_block(input_tensor)
>>> output_tensor.shape
torch.Size([1, 64, 32, 32])
NOTE
The RCUBlock can be used as a building block for more complex neural network architectures, especially in tasks such as image processing or feature extraction in deep learning.
- Raises: ValueError – If n_stages or n_blocks is not a positive integer.