espnet2.asr.state_spaces.components.SquaredReLU
class espnet2.asr.state_spaces.components.SquaredReLU(*args, **kwargs)
Bases: Module
Squared ReLU activation function.
Applies the squared rectified linear unit (ReLU) activation to the input tensor: the output is the square of the input where the input is positive, and zero otherwise.
This is useful when a non-linear activation is needed that suppresses negative values while amplifying positive ones.
- Parameters: None
- Returns: The squared output of the ReLU activation function applied element-wise to the input tensor.
- Return type: Tensor
####### Examples
>>> activation = SquaredReLU()
>>> input_tensor = torch.tensor([-1.0, 0.0, 2.0, 3.0])
>>> output_tensor = activation(input_tensor)
>>> print(output_tensor)
tensor([0., 0., 4., 9.])
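For reference, below is a minimal sketch of how a module with this behavior can be written using standard PyTorch primitives. The class name `SquaredReLUSketch` is illustrative only; the actual ESPnet implementation may differ in details.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SquaredReLUSketch(nn.Module):
    """Sketch of a squared ReLU activation: (ReLU(x))^2."""

    def forward(self, x):
        # Zero out negative entries, then square element-wise.
        return torch.square(F.relu(x))


activation = SquaredReLUSketch()
print(activation(torch.tensor([-1.0, 0.0, 2.0, 3.0])))  # tensor([0., 0., 4., 9.])
```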
Initialize internal Module state, shared by both nn.Module and ScriptModule.
forward(x)
Apply the squared ReLU activation to the input.
Applies the ReLU activation function and squares the result. The function is defined as:
SquaredReLU(x) = (ReLU(x))^2
This activation is useful when non-negative outputs are required; squaring scales the gradient of positive inputs in proportion to their magnitude during backpropagation (see the gradient sketch at the end of this section).
####### Examples
>>> activation = SquaredReLU()
>>> input_tensor = torch.tensor([-1.0, 0.0, 1.0, 2.0])
>>> output_tensor = activation(input_tensor)
>>> print(output_tensor)
tensor([0., 0., 1., 4.])
- Returns: The result of applying the squared ReLU element-wise to the input tensor.
- Return type: Tensor
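As a rough check of the gradient behavior noted above, the sketch below uses plain PyTorch autograd: since d/dx (ReLU(x))^2 = 2 · ReLU(x), the gradient is zero for negative inputs and grows linearly with positive inputs.
```python
import torch

# Gradient of (ReLU(x))^2 is 2 * ReLU(x): zero for negative inputs,
# increasing with the magnitude of positive inputs.
x = torch.tensor([-1.0, 0.5, 2.0], requires_grad=True)
y = torch.square(torch.relu(x)).sum()
y.backward()
print(x.grad)  # tensor([0., 1., 4.])
```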