espnet2.schedulers.abs_scheduler.AbsBatchStepScheduler
class espnet2.schedulers.abs_scheduler.AbsBatchStepScheduler
Bases: AbsScheduler
Abstract base class for batch step learning rate schedulers.
This class serves as a blueprint for creating custom learning rate schedulers that operate on a batch basis. It inherits from the AbsScheduler class and mandates the implementation of the step, state_dict, and load_state_dict methods.
- Parameters: epoch (int, optional) – The current epoch number. Default is None.
- Returns: None
- Raises: NotImplementedError – If the method is not implemented in a subclass.
########### Examples
To create a custom batch step scheduler, inherit from this class and implement the required methods:
```python
class CustomBatchScheduler(AbsBatchStepScheduler):
    def step(self, epoch: int | None = None):
        # Custom implementation for step
        ...

    def state_dict(self):
        # Custom implementation to return state dict
        ...

    def load_state_dict(self, state):
        # Custom implementation to load state dict
        ...
```
abstract load_state_dict(state)
Loads the state dictionary into the scheduler.
This method is intended to restore the state of the scheduler from a previously saved state dictionary. It allows for resuming training from a specific point while maintaining the learning rate schedule.
- Parameters: state (dict) – A dictionary containing the state of the scheduler, which typically includes the learning rate and any other relevant parameters that need to be restored.
- Raises: ValueError – If the state dictionary is invalid or does not match the expected structure.
########### Examples
```python
# Create a scheduler instance
scheduler = MyCustomScheduler()

# Load a previously saved state
state_dict = torch.load('scheduler_state.pth')
scheduler.load_state_dict(state_dict)
```
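The save/restore round trip can be shown without touching the filesystem. `CounterScheduler` and `num_steps` below are hypothetical names used only for illustration; in espnet2 training the dictionary would typically be persisted with `torch.save` and read back with `torch.load`, as sketched in the comments.

```python
class CounterScheduler:
    """Hypothetical scheduler that only tracks how many batch steps have run."""

    def __init__(self):
        self.num_steps = 0

    def step(self, epoch=None):
        self.num_steps += 1

    def state_dict(self):
        return {"num_steps": self.num_steps}

    def load_state_dict(self, state):
        self.num_steps = state["num_steps"]


a = CounterScheduler()
for _ in range(3):
    a.step()
saved = a.state_dict()   # in practice: torch.save(saved, "scheduler_state.pth")

b = CounterScheduler()   # fresh instance, e.g. when resuming training
b.load_state_dict(saved) # in practice: saved = torch.load("scheduler_state.pth")
print(b.num_steps)       # 3
```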
abstract state_dict()
Returns the state of the scheduler as a dictionary.
The returned dictionary should contain everything needed to restore the scheduler via load_state_dict, such as the current learning rate and any internal counters.
- Parameters: None
- Returns: dict – The current state of the scheduler.
########### Examples
To create a custom scheduler, inherit from AbsBatchStepScheduler and implement the required methods.
```python
class CustomBatchStepScheduler(AbsBatchStepScheduler):
    def step(self, epoch: int | None = None):
        # Custom step logic
        pass

    def state_dict(self):
        # Return the state of the scheduler
        return {}

    def load_state_dict(self, state):
        # Load the state into the scheduler
        pass
```
abstract step(epoch: int | None = None)
Performs a single scheduler step.
Batch step schedulers adjust the learning rate based on the number of batches processed. Classes inheriting from this abstract class must implement the step method, which updates the learning rate according to the specific scheduling strategy.
- Parameters: epoch (int, optional) – The current epoch number. If None, the epoch will not be used in the scheduling logic.
- Returns: None
- Raises:NotImplementedError – If the step method is not implemented in a subclass.
########### Examples
To create a custom batch step scheduler, inherit from this class and implement the step method:
```python
class CustomBatchStepScheduler(AbsBatchStepScheduler):
    def step(self, epoch: int | None = None):
        # Implement your custom step logic here
        pass

    def state_dict(self):
        # Return the state of the scheduler
        pass

    def load_state_dict(self, state):
        # Load the state into the scheduler
        pass
```