espnet2.optimizers.optim_groups.configure_optimizer
espnet2.optimizers.optim_groups.configure_optimizer(model, optim_class, optim_conf, weight_decay_conf)
Configure an optimizer for the given model with the specified hyperparameters.
This function builds the optimizer by separating the model's parameters into groups that should and should not receive weight decay. Individual parameters can carry special hyperparameters while the rest of the model shares the common optimizer configuration.
- Parameters:
- model (nn.Module) – The PyTorch model whose parameters will be optimized.
- optim_class (type) – The optimizer class (e.g., torch.optim.SGD) to instantiate.
- optim_conf (dict) – A dictionary of keyword arguments to pass to the optimizer constructor (e.g., learning rate, momentum).
- weight_decay_conf (dict) – A dictionary with boolean keys 'bias_weight_decay' and 'normalization_weight_decay' that determine whether weight decay is applied to bias parameters and to normalization layers.
- Returns: An instance of the configured optimizer.
- Return type: torch.optim.Optimizer
Examples
>>> import torch.nn as nn
>>> import torch.optim as optim
>>> from espnet2.optimizers.optim_groups import configure_optimizer
>>> model = nn.Sequential(nn.Linear(10, 10), nn.LayerNorm(10))
>>> optimizer = configure_optimizer(
...     model,
...     optim.SGD,
...     {'lr': 0.01, 'momentum': 0.9},
...     {'bias_weight_decay': False, 'normalization_weight_decay': True}
... )
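One way to confirm the grouping (assuming the returned object exposes the standard torch.optim param_groups list) is to inspect the per-group weight_decay values:
>>> for group in optimizer.param_groups:
...     print(len(group['params']), group.get('weight_decay', 0.0))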
NOTE
This function attaches a special '_optim' attribute to individual model parameters, recording the optimizer settings that apply to each of them.
- Raises: ValueError – If the optimizer class does not accept the provided configuration.
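For intuition, below is a minimal sketch of the general decay/no-decay grouping technique, not ESPnet's actual implementation. The helper split_decay_groups, the set of normalization types it checks, and the name-based bias test are all assumptions for illustration:

```python
import torch
import torch.nn as nn

def split_decay_groups(model, weight_decay, bias_weight_decay=False,
                       normalization_weight_decay=False):
    """Sketch: partition parameters into decay / no-decay groups.

    Assumed heuristics: a parameter named '*bias' is a bias, and any
    parameter owned by one of `norm_types` belongs to a normalization layer.
    """
    norm_types = (nn.LayerNorm, nn.BatchNorm1d, nn.BatchNorm2d, nn.GroupNorm)
    decay, no_decay = [], []
    for module in model.modules():
        # recurse=False so each parameter is visited exactly once,
        # by the module that directly owns it.
        for name, param in module.named_parameters(recurse=False):
            is_bias = name.endswith("bias")
            is_norm = isinstance(module, norm_types)
            if (is_bias and not bias_weight_decay) or \
               (is_norm and not normalization_weight_decay):
                no_decay.append(param)
            else:
                decay.append(param)
    # Standard torch.optim parameter groups: per-group settings
    # override the defaults passed to the optimizer constructor.
    return [
        {"params": decay, "weight_decay": weight_decay},
        {"params": no_decay, "weight_decay": 0.0},
    ]

model = nn.Sequential(nn.Linear(8, 8), nn.LayerNorm(8))
groups = split_decay_groups(model, weight_decay=1e-4)
optimizer = torch.optim.SGD(groups, lr=0.01, momentum=0.9)
```

Per-group entries like these are what any torch.optim optimizer consumes directly, which is why the decayed and non-decayed parameters can share a single optimizer instance.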