espnet2.schedulers.exponential_decay_warmup.ExponentialDecayWarmup
class espnet2.schedulers.exponential_decay_warmup.ExponentialDecayWarmup(optimizer: Optimizer, max_lr: float, min_lr: float, total_steps: int, warmup_steps: int = 0, warm_from_zero: bool = False, last_epoch: int = -1)
Bases: _LRScheduler, AbsBatchStepScheduler
Exponential Decay with Warmup.
if step < warmup_steps:
    if warm_from_zero:
        lr = initial_lr * (step / warmup_steps)
    else:
        lr = initial_lr
else:
    decay_factor = (step - warmup_steps) / (total_steps - warmup_steps)
    lr = initial_lr * exp(decay_factor * log(final_lr / initial_lr))
- Parameters:
- optimizer (Optimizer) – Wrapped optimizer.
- max_lr (float) – Initial learning rate (before decay).
- min_lr (float) – Final learning rate (after decay).
- total_steps (int) – Total number of steps (epochs * iterations per epoch).
- warmup_steps (int) – Number of warmup steps. Default: 0.
- warm_from_zero (bool) – If True, the warmup phase ramps linearly from 0 up to the initial learning rate.
- last_epoch (int) – The index of the last step. Default: -1.
get_lr()
Compute learning rate using chainable form of the scheduler.
init_lr()
step(epoch: int = None)
Perform a step.
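The schedule above can be sketched as a small standalone function. This is an illustrative reimplementation of the formula, not the ESPnet code itself; the function name `exponential_decay_warmup_lr` is hypothetical.

```python
import math


def exponential_decay_warmup_lr(step, max_lr, min_lr, total_steps,
                                warmup_steps=0, warm_from_zero=False):
    """Learning rate at `step` under warmup + exponential decay (sketch)."""
    if step < warmup_steps:
        if warm_from_zero:
            # Linear ramp from 0 up to max_lr over the warmup phase.
            return max_lr * (step / warmup_steps)
        # Constant warmup at the initial learning rate.
        return max_lr
    # Fraction of the decay phase completed, in [0, 1].
    decay_factor = (step - warmup_steps) / (total_steps - warmup_steps)
    # Exponential interpolation from max_lr down to min_lr.
    return max_lr * math.exp(decay_factor * math.log(min_lr / max_lr))
```

At `step == warmup_steps` this returns `max_lr`, and at `step == total_steps` it returns exactly `min_lr`, with a geometric (log-linear) decay in between.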
