espnet2.asr.encoder.beats_encoder.gelu_accurate
espnet2.asr.encoder.beats_encoder.gelu_accurate(x)
Computes the accurate Gaussian Error Linear Unit (GELU) activation.
The GELU activation function is defined as:

GELU(x) = 0.5 * x * (1 + tanh(sqrt(2 / π) * (x + 0.044715 * x^3)))
This tanh-based form evaluates the activation in a numerically stable way, avoiding the overflow that direct evaluation of the underlying exponentials can cause (see the sketch after the parameter list below).
- Parameters: x (torch.Tensor) – The input tensor on which to compute the GELU activation.
- Returns: A tensor of the same shape as the input, containing the GELU activations.
- Return type: torch.Tensor
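As a concrete reading of the formula above, here is a minimal PyTorch sketch of the tanh-based approximation. The name `gelu_accurate_sketch` is hypothetical; the actual implementation lives in `espnet2.asr.encoder.beats_encoder` and may differ in minor details.

```python
import math

import torch


def gelu_accurate_sketch(x: torch.Tensor) -> torch.Tensor:
    """Tanh approximation of GELU, mirroring the formula above (a sketch)."""
    a = math.sqrt(2.0 / math.pi)  # sqrt(2 / pi) ≈ 0.7978845608
    return 0.5 * x * (1.0 + torch.tanh(a * (x + 0.044715 * torch.pow(x, 3))))


print(gelu_accurate_sketch(torch.tensor([-1.0, 0.0, 1.0])))
# tensor([-0.1587,  0.0000,  0.8413])
```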
Examples
>>> import torch
>>> from espnet2.asr.encoder.beats_encoder import gelu_accurate
>>> input_tensor = torch.tensor([-1.0, 0.0, 1.0])
>>> output_tensor = gelu_accurate(input_tensor)
>>> print(output_tensor)
tensor([-0.1587, 0.0000, 0.8413])
NOTE
This implementation remains accurate over a wide range of input values and can be used wherever a GELU activation is desired in a neural network.
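For a quick sanity check, PyTorch 1.12 and later expose the same tanh-based approximation through `torch.nn.functional.gelu`. Assuming ESPnet is installed, the module's `gelu_accurate` should agree with it to within floating-point tolerance:

```python
import torch
import torch.nn.functional as F

from espnet2.asr.encoder.beats_encoder import gelu_accurate

x = torch.randn(8)
# approximate="tanh" selects the tanh-based GELU formula (PyTorch >= 1.12)
reference = F.gelu(x, approximate="tanh")
print(torch.allclose(gelu_accurate(x), reference, atol=1e-6))  # expected: True
```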