espnet2.enh.layers.ncsnpp_utils.layers.variance_scaling
espnet2.enh.layers.ncsnpp_utils.layers.variance_scaling(scale, mode, distribution, in_axis=1, out_axis=0, dtype=torch.float32, device='cpu')
Initialize a tensor using variance scaling.
This function provides a variance scaling initializer, which can be useful for initializing weights in neural networks. It computes the fan-in and fan-out of the tensor shape and scales the variance according to the specified mode and distribution.
- Parameters:
- scale (float) – Scaling factor for the variance.
- mode (str) – One of {'fan_in', 'fan_out', 'fan_avg'} that determines how to calculate the variance.
- distribution (str) – One of {'normal', 'uniform'} that determines the distribution of the initialized values.
- in_axis (int, optional) – Axis that corresponds to the input dimension. Defaults to 1.
- out_axis (int, optional) – Axis that corresponds to the output dimension. Defaults to 0.
- dtype (torch.dtype, optional) – Desired data type of the tensor. Defaults to torch.float32.
- device (str, optional) – Device to allocate the tensor on. Defaults to 'cpu'.
- Returns: A function that initializes a tensor of the specified shape with variance scaling.
- Return type: function
- Raises: ValueError – If an invalid mode or distribution is specified.
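
The actual ESPnet source is not reproduced here, but a variance-scaling initializer of this kind is typically built in three steps: compute the fan-in and fan-out from the shape, pick a denominator according to `mode`, then sample from the chosen `distribution` with the resulting variance. The following is a minimal sketch under those assumptions; the name `variance_scaling_sketch` and its internals are illustrative, not the library's actual code.

```python
import numpy as np
import torch

def variance_scaling_sketch(scale, mode, distribution,
                            in_axis=1, out_axis=0,
                            dtype=torch.float32, device="cpu"):
    """Hedged sketch of a variance-scaling initializer (not ESPnet's exact code)."""

    def _compute_fans(shape):
        # Receptive-field size: product of all dims except the in/out axes.
        receptive_field = np.prod(shape) / (shape[in_axis] * shape[out_axis])
        fan_in = shape[in_axis] * receptive_field
        fan_out = shape[out_axis] * receptive_field
        return fan_in, fan_out

    def init(shape):
        fan_in, fan_out = _compute_fans(shape)
        if mode == "fan_in":
            denominator = fan_in
        elif mode == "fan_out":
            denominator = fan_out
        elif mode == "fan_avg":
            denominator = (fan_in + fan_out) / 2
        else:
            raise ValueError(f"invalid mode: {mode}")
        variance = scale / denominator
        if distribution == "normal":
            return torch.randn(*shape, dtype=dtype, device=device) * np.sqrt(variance)
        elif distribution == "uniform":
            # Uniform on [-a, a] has variance a**2 / 3, so a = sqrt(3 * variance).
            return (torch.rand(*shape, dtype=dtype, device=device) * 2.0 - 1.0) \
                * np.sqrt(3 * variance)
        else:
            raise ValueError(f"invalid distribution: {distribution}")

    return init
```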
Examples

```python
# Initialize weights with a normal distribution
init_normal = variance_scaling(1.0, 'fan_in', 'normal')
weights_normal = init_normal((64, 128))

# Initialize weights with a uniform distribution
init_uniform = variance_scaling(1.0, 'fan_avg', 'uniform')
weights_uniform = init_uniform((64, 128))
```
NOTE
Variance scaling is a technique for maintaining the variance of activations across layers, which helps stabilize the training of deep neural networks.
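
As a quick check of this property, one can initialize a square weight matrix with `fan_in` scaling and compare input and output variances. This uses the hypothetical `variance_scaling_sketch` from the sketch above, not the library API:

```python
import torch

# Illustrative check only; names assume the sketch defined earlier.
init = variance_scaling_sketch(1.0, "fan_in", "normal")
W = init((256, 256))                    # (out_features, in_features)
x = torch.randn(1000, 256)              # unit-variance inputs
y = x @ W.t()                           # one linear layer, no bias
print(x.var().item(), y.var().item())   # both should be close to 1.0
```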