espnet2.enh.layers.ncsnpp_utils.layers.CondRCUBlock
class espnet2.enh.layers.ncsnpp_utils.layers.CondRCUBlock(features, n_blocks, n_stages, num_classes, normalizer, act=ReLU())
Bases: Module
Conditional Residual Convolutional Unit Block.
This class implements a block consisting of multiple residual convolutional sub-blocks, each combining normalization, activation, and convolution, with a skip connection around each sub-block. It is designed to process input features while conditioning on an additional input (e.g., class labels).
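At a high level, each of the n_blocks blocks applies n_stages of conditional normalization, activation, and convolution, then adds the block's input back through a skip connection. Below is a minimal sketch of that control flow (illustrative only, not the verbatim implementation; cond_rcu_sketch and its arguments are hypothetical names):

def cond_rcu_sketch(x, y, blocks, act):
    """blocks: n_blocks lists of n_stages (norm, conv) pairs."""
    for stages in blocks:          # one iteration per residual block
        residual = x
        for norm, conv in stages:  # one iteration per stage
            x = act(norm(x, y))    # normalization conditioned on y
            x = conv(x)
        x = x + residual           # skip connection closes the block
    return x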
n_blocks
The number of residual blocks in the unit.
- Type: int
n_stages
The number of convolutional stages within each block.
- Type: int
act
The activation function to be used in the block.
- Type: callable
normalizer
The normalization function to be applied to the input.
- Type: callable
Parameters:
- features (int) – The number of input and output features for the convolutions.
- n_blocks (int) – The number of blocks in the CondRCU.
- n_stages (int) – The number of stages in each block.
- num_classes (int) – The number of classes for the conditional input.
- normalizer (callable) – The normalization layer to be used.
- act (callable) – The activation function (default: nn.ReLU()).
Examples
>>> import torch
>>> # SomeNormalizer is a placeholder for any conditional normalization
>>> # layer compatible with this block (i.e., one called as norm(x, y)).
>>> cond_rcu = CondRCUBlock(features=64, n_blocks=2, n_stages=2,
...                         num_classes=10, normalizer=SomeNormalizer)
>>> x = torch.randn(1, 64, 32, 32)  # Input tensor (B, C, H, W)
>>> y = torch.randint(0, 10, (1,))  # Conditional input (class labels)
>>> output = cond_rcu(x, y)
NOTE
The conditional input y is expected to be compatible with the normalization layer and should typically represent class labels or similar categorical data.
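For illustration, a compatible conditional normalizer might look as follows. This is a hypothetical sketch, assuming the block builds each normalization layer from the normalizer callable with the feature count and number of classes and invokes it as norm(x, y); ClassConditionalNorm2d is not part of ESPnet:

import torch.nn as nn

class ClassConditionalNorm2d(nn.Module):
    """Hypothetical class-conditional normalizer (illustration only)."""

    def __init__(self, features, num_classes, bias=True):
        super().__init__()
        self.norm = nn.InstanceNorm2d(features, affine=False)
        # One (scale, shift) pair per class, looked up from integer labels.
        self.embed = nn.Embedding(num_classes, features * 2)
        self.embed.weight.data[:, :features].fill_(1.0)  # scales start at 1
        self.embed.weight.data[:, features:].zero_()     # shifts start at 0
        self.use_bias = bias

    def forward(self, x, y):
        h = self.norm(x)
        gamma, beta = self.embed(y).chunk(2, dim=1)
        out = gamma[:, :, None, None] * h
        if self.use_bias:
            out = out + beta[:, :, None, None]
        return out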
- Raises:
  - ValueError – If the number of features is not positive, or if the number of blocks or stages is not positive.
Initialize internal Module state, shared by both nn.Module and ScriptModule.
forward(x, y)
Conditional Residual Unit Block.
This class implements a conditional residual unit block that applies a series of convolutional layers with normalization and activation. The block consists of multiple stages of normalization, activation, and convolution, with a residual connection around each block. It is particularly useful for tasks where conditioning on additional information (e.g., class labels) is required.
n_blocks
The number of blocks in the unit.
- Type: int
n_stages
The number of stages in each block.
- Type: int
act
Activation function to be applied after normalization.
- Type: callable
normalizer
Normalization function used for conditioning.
- Type: callable
Parameters:
- features (int) – Number of input and output features for the convolutions.
- n_blocks (int) – Number of residual blocks.
- n_stages (int) – Number of stages in each block.
- num_classes (int) – Number of classes for conditional normalization.
- normalizer (callable) – Normalization layer used for conditioning.
- act (callable, optional) – Activation function (default is nn.ReLU()).
Forward Method:
The forward method takes two inputs:
- x (torch.Tensor) – The input tensor of shape (B, C, H, W).
- y (torch.Tensor) – The conditioning tensor, typically integer class labels of shape (B,), as in the example below.
Returns:
- torch.Tensor – The output tensor of shape (B, C, H, W) after applying the conditional residual block.
Examples
>>> import torch
>>> # some_normalizer is a placeholder for a conditional normalization
>>> # layer compatible with this block (see the sketch above).
>>> cond_rcu_block = CondRCUBlock(features=64, n_blocks=2,
...                               n_stages=2, num_classes=10,
...                               normalizer=some_normalizer)
>>> x = torch.randn(8, 64, 32, 32) # Example input
>>> y = torch.randint(0, 10, (8,)) # Example conditioning input
>>> output = cond_rcu_block(x, y)
>>> print(output.shape)
torch.Size([8, 64, 32, 32])
NOTE
This block is used in conditional generative models such as NCSN++ (from which this module is adapted), where conditioning the normalization on class labels can improve sample quality.