espnet2.asr.state_spaces.components.Activation
espnet2.asr.state_spaces.components.Activation(activation=None, size=None, dim=-1)
Create an activation function module.
This function returns an activation module selected by name, for use in neural network architectures. The supported activations include identity, tanh, relu, gelu, swish (silu), glu, sigmoid, squared relu, and a transposed layer norm.
- Parameters:
- activation (str or None) – The type of activation function to create. Options include: “id”, “identity”, “linear”, “tanh”, “relu”, “gelu”, “swish”, “silu”, “glu”, “sigmoid”, “sqrelu”, “ln”. If None, it defaults to an identity function.
- size (int , optional) – Size of the output (not used currently).
- dim (int , optional) – The dimension along which to apply the GLU activation. Default is -1.
- Returns: A PyTorch activation function module corresponding to the specified activation type.
- Return type: nn.Module
- Raises: NotImplementedError – If the specified activation type is not supported.
Examples
>>> from espnet2.asr.state_spaces.components import Activation
>>> relu_activation = Activation("relu")
>>> tanh_activation = Activation("tanh")
>>> glu_activation = Activation("glu", dim=1)
NOTE
The size parameter is currently unused; it is accepted so the signature can accommodate future enhancements.
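The returned object is an ordinary nn.Module and can be applied directly to tensors. Below is a minimal usage sketch with illustrative shapes; keep in mind that glu halves the specified dimension, so the input size along dim must be even.

>>> import torch
>>> from espnet2.asr.state_spaces.components import Activation
>>> x = torch.randn(2, 8)
>>> Activation("relu")(x).shape  # elementwise activation, shape preserved
torch.Size([2, 8])
>>> Activation("glu", dim=-1)(x).shape  # GLU gates one half of dim with the other, halving it
torch.Size([2, 4])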