espnet2.enh.layers.dcunet.get_activation
espnet2.enh.layers.dcunet.get_activation(name)
Get the activation function based on the provided name.
This function returns the corresponding activation function class from PyTorch based on the input string. The following activation functions are supported:
- 'silu': SiLU (Sigmoid Linear Unit)
- 'relu': ReLU (Rectified Linear Unit)
- 'leaky_relu': Leaky ReLU
- Parameters: name (str) – The name of the activation function to retrieve.
- Returns: The corresponding activation function class.
- Return type: Callable
- Raises: NotImplementedError – If the provided name does not match any supported activation function.
Examples
>>> activation = get_activation("relu")
>>> print(activation) # Output: <class 'torch.nn.modules.activation.ReLU'>
>>> activation = get_activation("silu")
>>> print(activation) # Output: <class 'torch.nn.modules.activation.SiLU'>
>>> activation = get_activation("unknown")
Traceback (most recent call last):
...
NotImplementedError: Unknown activation: unknown
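The behavior documented above can be captured with a small name-to-class registry. The sketch below is a minimal, dependency-free illustration of that pattern, not ESPnet's actual implementation: the placeholder classes stand in for the `torch.nn` activation classes (`torch.nn.SiLU`, `torch.nn.ReLU`, `torch.nn.LeakyReLU`) that the real function returns.

```python
# Placeholder stand-ins for the torch.nn activation classes,
# used here so the sketch runs without a PyTorch install.
class SiLU: ...
class ReLU: ...
class LeakyReLU: ...

# Registry mapping supported names to activation classes.
_ACTIVATIONS = {
    "silu": SiLU,
    "relu": ReLU,
    "leaky_relu": LeakyReLU,
}

def get_activation(name):
    """Return the activation class registered under ``name``.

    Raises NotImplementedError for unsupported names, matching the
    documented behavior.
    """
    try:
        return _ACTIVATIONS[name]
    except KeyError:
        raise NotImplementedError(f"Unknown activation: {name}") from None
```

A registry dict keeps the lookup table and the error path in one place; adding a new activation only requires a new entry rather than another branch in an if/elif chain.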