espnet2.diar.layers.tcn_nomask.GlobalLayerNorm
class espnet2.diar.layers.tcn_nomask.GlobalLayerNorm(channel_size)
Bases: Module
Global Layer Normalization (gLN).
This module applies global layer normalization to the input tensor. It normalizes the input across all channels and spatial dimensions so that, before the learnable affine transform, the result has zero mean and unit variance.
gamma
Scale parameter of shape [1, N, 1], initialized to 1.
- Type: nn.Parameter
beta
Shift parameter of shape [1, N, 1], initialized to 0.
- Type: nn.Parameter
Parameters: channel_size (int) – The number of channels in the input tensor.
Returns: Normalized output tensor of shape [M, N, K], where M is the batch size, N is the channel size, and K is the length.
Return type: gLN_y (torch.Tensor)
######### Examples
>>> gLN = GlobalLayerNorm(channel_size=10)
>>> input_tensor = torch.randn(32, 10, 50) # Batch of 32, 10 channels, length 50
>>> output_tensor = gLN(input_tensor)
>>> output_tensor.shape
torch.Size([32, 10, 50])
####### NOTE The normalization is computed as gLN_y = gamma * (y - mean) / sqrt(var + EPS) + beta, where mean and var are calculated over all channels and spatial dimensions.
Initialize internal Module state, shared by both nn.Module and ScriptModule.
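The normalization in the NOTE above can be sketched in plain PyTorch as follows. This is a minimal illustration, not the module's actual implementation; the `EPS` value and the helper name `global_layer_norm` are assumptions for the sake of the example.

```python
import torch

EPS = 1e-8  # assumed epsilon; the real constant is defined in tcn_nomask.py


def global_layer_norm(y, gamma, beta):
    """Sketch of gLN: standardize over channel and length dims, then rescale."""
    # y: [M, N, K]; mean and var over dims (1, 2), keepdim for broadcasting
    mean = y.mean(dim=(1, 2), keepdim=True)                  # [M, 1, 1]
    var = ((y - mean) ** 2).mean(dim=(1, 2), keepdim=True)   # [M, 1, 1]
    return gamma * (y - mean) / torch.sqrt(var + EPS) + beta


M, N, K = 32, 10, 50
y = torch.randn(M, N, K)
gamma = torch.ones(1, N, 1)   # scale parameter, initialized to 1
beta = torch.zeros(1, N, 1)   # shift parameter, initialized to 0
out = global_layer_norm(y, gamma, beta)
print(out.shape)  # torch.Size([32, 10, 50])
```

With `gamma` at 1 and `beta` at 0, each sample in the batch comes out with approximately zero mean and unit variance over its channel and length dimensions, matching the class description.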
forward(y)
Applies global layer normalization to the input tensor.
- Parameters: y (torch.Tensor) – Input tensor of shape [M, N, K], where M is the batch size, N is the channel size, and K is the length.
- Returns: Normalized output tensor of shape [M, N, K].
- Return type: gLN_y (torch.Tensor)
######### Examples
>>> gLN = GlobalLayerNorm(channel_size=64)
>>> input_tensor = torch.randn(32, 64, 128) # [M, N, K]
>>> output_tensor = gLN(input_tensor) # [M, N, K]
####### NOTE The input tensor y is expected to have the shape [M, N, K], where M is the batch size, N is the channel size, and K is the length.
- Raises:ValueError – If the input tensor does not have the expected shape.
reset_parameters()
Resets the parameters gamma and beta to their initial values.