espnet3.components.multiple_scheduler.MultipleScheduler
class espnet3.components.multiple_scheduler.MultipleScheduler(multiple_optimizer: MultipleOptim, lr_scheduler: LRScheduler, optimizer_idx: int)
Bases: _LRScheduler
Wrap a scheduler so Lightning accepts the MultipleOptim wrapper.
Lightning expects a scheduler to expose an optimizer attribute. This shim forwards everything to the wrapped scheduler while reporting the MultipleOptim instance. It allows one scheduler per optimizer inside MultipleOptim without triggering Lightning’s per-optimizer training_step logic.
- Parameters:
- multiple_optimizer (MultipleOptim) – The wrapper that owns the underlying optimizers.
- lr_scheduler (torch.optim.lr_scheduler.LRScheduler) – Scheduler configured for a single optimizer inside multiple_optimizer.
- optimizer_idx (int) – Index of the optimizer the scheduler controls.
NOTE
Only minimal delegation is implemented. Any attribute not explicitly intercepted is passed through to the wrapped scheduler.
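The delegation behavior described above can be sketched in plain Python. This is a simplified, hypothetical stand-in (class and attribute names are illustrative, not the actual espnet3 implementation) showing only what the doc states: the `optimizer` attribute reports the wrapper, and any other attribute is passed through to the wrapped scheduler.

```python
class SchedulerShim:
    """Sketch of the shim pattern: report a wrapper object as the
    optimizer while forwarding everything else to the wrapped scheduler.
    Hypothetical stand-in, not the real MultipleScheduler."""

    def __init__(self, multiple_optimizer, lr_scheduler, optimizer_idx):
        self._multiple_optimizer = multiple_optimizer
        self._scheduler = lr_scheduler
        self.optimizer_idx = optimizer_idx

    @property
    def optimizer(self):
        # Lightning inspects scheduler.optimizer; report the wrapper,
        # not the single optimizer this scheduler actually controls.
        return self._multiple_optimizer

    def __getattr__(self, name):
        # Called only when normal lookup fails, so any attribute not
        # intercepted above (step, get_last_lr, state_dict, ...) is
        # passed through to the wrapped scheduler.
        return getattr(self._scheduler, name)


# Tiny stand-ins to demonstrate the forwarding without torch:
class FakeScheduler:
    def __init__(self):
        self.calls = 0

    def step(self):
        self.calls += 1

    def get_last_lr(self):
        return [0.01]


class FakeMultipleOptim:
    pass


wrapper = FakeMultipleOptim()
shim = SchedulerShim(wrapper, FakeScheduler(), optimizer_idx=0)
shim.step()                       # forwarded to the wrapped scheduler
assert shim.optimizer is wrapper  # Lightning sees the wrapper
```

The real class additionally subclasses `_LRScheduler`, so it also satisfies type checks that Lightning may perform on schedulers.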
Example
>>> ms = MultipleScheduler(multi_optim, StepLR(opt_a, step_size=10), 0)
>>> lr = ms.get_last_lr()