espnet2.gan_svs.visinger2.ddsp.mlp
espnet2.gan_svs.visinger2.ddsp.mlp(in_size, hidden_size, n_layers)
Construct a multi-layer perceptron (MLP) model.
This function creates a multi-layer perceptron built from n_layers repeated blocks, each consisting of a fully connected (Linear) layer followed by layer normalization and a LeakyReLU activation. The architecture is determined by the number of input features, the hidden layer size, and the number of layers.
- Parameters:
- in_size (int) – The number of input features for the MLP.
- hidden_size (int) – The number of neurons in each hidden layer.
- n_layers (int) – The total number of hidden layers in the MLP.
- Returns: A PyTorch sequential model representing the MLP.
- Return type: nn.Sequential
Examples
>>> model = mlp(10, 20, 3)
>>> print(model)
Sequential(
(0): Linear(in_features=10, out_features=20, bias=True)
(1): LayerNorm((20,), eps=1e-05, elementwise_affine=True)
(2): LeakyReLU(negative_slope=0.01)
(3): Linear(in_features=20, out_features=20, bias=True)
(4): LayerNorm((20,), eps=1e-05, elementwise_affine=True)
(5): LeakyReLU(negative_slope=0.01)
(6): Linear(in_features=20, out_features=20, bias=True)
(7): LayerNorm((20,), eps=1e-05, elementwise_affine=True)
(8): LeakyReLU(negative_slope=0.01)
)
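Based on the printed architecture above, the builder can be sketched as follows (a minimal re-implementation consistent with the documented signature and output, not necessarily the exact ESPnet source), together with a forward pass showing that the output feature dimension equals `hidden_size`:

```python
import torch
import torch.nn as nn


def mlp(in_size: int, hidden_size: int, n_layers: int) -> nn.Sequential:
    """Sketch: n_layers blocks of Linear -> LayerNorm -> LeakyReLU."""
    sizes = [in_size] + n_layers * [hidden_size]
    layers = []
    for i in range(n_layers):
        layers.append(nn.Linear(sizes[i], sizes[i + 1]))
        layers.append(nn.LayerNorm(sizes[i + 1]))
        layers.append(nn.LeakyReLU())
    return nn.Sequential(*layers)


# Usage: reproduces the example above and runs a forward pass.
model = mlp(10, 20, 3)
x = torch.randn(4, 10)   # batch of 4 samples, 10 input features
y = model(x)
print(y.shape)           # torch.Size([4, 20])
```

Note that only the first Linear layer maps `in_size` to `hidden_size`; every subsequent block maps `hidden_size` to `hidden_size`, so the sequential model always contains `3 * n_layers` modules.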