espnet2.tts.prodiff.denoiser.Mish
class espnet2.tts.prodiff.denoiser.Mish(*args, **kwargs)
Bases: Module
Mish Activation Function.
Introduced in *Mish: A Self Regularized Non-Monotonic Activation Function* (Misra, 2019).
The Mish activation function is a smooth, non-monotonic function that can improve the performance of deep learning models compared to ReLU-style activations. It is defined as:
Mish(x) = x * tanh(softplus(x)), where softplus(x) = log(1 + exp(x))
- Parameters: x (torch.Tensor) – Input tensor.
- Returns: Output tensor after applying the Mish activation function.
- Return type: torch.Tensor
#### Examples
>>> import torch
>>> mish = Mish()
>>> input_tensor = torch.tensor([-1.0, 0.0, 1.0])
>>> output_tensor = mish(input_tensor)
>>> print(output_tensor)
tensor([-0.3034,  0.0000,  0.8651])
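For reference, the same computation can be written as a standalone function. This is a minimal sketch built on torch.nn.functional.softplus; the function name `mish` here is illustrative and not part of the ESPnet API:

```python
import torch
import torch.nn.functional as F

def mish(x: torch.Tensor) -> torch.Tensor:
    # Elementwise Mish: x * tanh(softplus(x)), softplus(x) = log(1 + exp(x)).
    return x * torch.tanh(F.softplus(x))

x = torch.tensor([-1.0, 0.0, 1.0])
print(mish(x))  # tensor([-0.3034,  0.0000,  0.8651])
```

Note that recent PyTorch versions (1.9+) also ship this activation natively as torch.nn.Mish and torch.nn.functional.mish.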
Initialize internal Module state, shared by both nn.Module and ScriptModule.
forward(x: Tensor) → Tensor
Apply the Mish activation function elementwise.
Introduced in *Mish: A Self Regularized Non-Monotonic Activation Function* (Misra, 2019). This activation is a smooth, non-monotonic function that has shown improved performance on various deep learning tasks compared to activations such as ReLU or Swish.
- Parameters: x (torch.Tensor) – Input tensor.
- Returns: Output tensor after applying the Mish activation function.
- Return type: torch.Tensor
#### Examples
>>> import torch
>>> mish = Mish()
>>> input_tensor = torch.tensor([-1.0, 0.0, 1.0])
>>> output_tensor = mish(input_tensor)
>>> print(output_tensor)
tensor([-0.3034,  0.0000,  0.8651])
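To illustrate where this module typically sits in a model, here is a hedged sketch dropping Mish into a small feed-forward stack; the layer widths are arbitrary and do not reflect the actual ProDiff denoiser architecture:

```python
import torch
from torch import nn
from espnet2.tts.prodiff.denoiser import Mish

# Illustrative only: 80/256 are arbitrary widths, not ProDiff's real dimensions.
net = nn.Sequential(
    nn.Linear(80, 256),
    Mish(),              # smooth, non-monotonic activation between layers
    nn.Linear(256, 80),
)

out = net(torch.randn(4, 80))
print(out.shape)  # torch.Size([4, 80])
```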