espnet2.gan_svs.visinger2.ddsp.gru
espnet2.gan_svs.visinger2.ddsp.gru(n_input, hidden_size)
Create a Gated Recurrent Unit (GRU) layer.
This function constructs a GRU layer using PyTorch's `nn.GRU`. A GRU is a recurrent neural network layer well-suited to sequence modeling: its gating mechanism lets it retain information over long sequences while mitigating the vanishing gradient problem.
- Parameters:
- n_input (int) – The number of input features for each time step.
- hidden_size (int) – The number of features in the hidden state.
- Returns: A GRU layer configured with the specified input size and hidden size.
- Return type: nn.GRU
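The helper itself is a thin convenience wrapper around `nn.GRU`. Below is a minimal sketch of what such a factory might look like; the exact configuration (for instance, whether `batch_first=True` is set, or how `n_input` maps to the GRU's input size) is an assumption here, not something this page guarantees.

```python
import torch.nn as nn


def gru(n_input: int, hidden_size: int) -> nn.GRU:
    """Sketch of a GRU factory; the actual ESPnet helper may differ.

    Assumes `n_input` features per time step and batch-first tensors
    of shape (batch_size, seq_len, n_input).
    """
    return nn.GRU(n_input, hidden_size, batch_first=True)
```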
Examples
>>> import torch
>>> from espnet2.gan_svs.visinger2.ddsp import gru
>>> gru_layer = gru(n_input=10, hidden_size=20)
>>> input_tensor = torch.randn(5, 3, 10)  # (batch_size, seq_len, n_input)
>>> output, hidden = gru_layer(input_tensor)
>>> output.shape  # (batch_size, seq_len, hidden_size)
torch.Size([5, 3, 20])
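The shapes above assume the returned GRU was constructed with `batch_first=True`; with PyTorch's default (`batch_first=False`), inputs would instead be expected as `(seq_len, batch_size, n_input)`. A hidden state can also be carried across calls, which is standard `nn.GRU` behavior rather than anything specific to this helper:

>>> h0 = torch.zeros(1, 5, 20)  # (num_layers, batch_size, hidden_size)
>>> out1, h1 = gru_layer(input_tensor, h0)
>>> out2, h2 = gru_layer(input_tensor, h1)  # continues from the previous state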