espnet2.gan_codec.shared.quantizer.modules.distrib.is_distributed
espnet2.gan_codec.shared.quantizer.modules.distrib.is_distributed()
Check if the current process is part of a distributed training setup.
A process is considered to be in a distributed setup when the world size is greater than 1, i.e. when multiple processes are communicating with each other as part of a process group.
- Returns: True if the world size is greater than 1 (distributed training is in use); otherwise, False.
- Return type: bool
Examples
>>> is_distributed()
True  # when running in a distributed environment
>>> is_distributed()
False  # when running in a non-distributed environment
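As a rough illustration, a helper like this is typically built on top of `torch.distributed`: the world size defaults to 1 when no process group has been initialized, and the check simply asks whether more than one process is present. This is a hedged sketch of that pattern, not the exact espnet2 implementation.

```python
import torch


def world_size() -> int:
    """Number of processes in the current process group,
    or 1 when distributed training has not been initialized."""
    if torch.distributed.is_available() and torch.distributed.is_initialized():
        return torch.distributed.get_world_size()
    return 1


def is_distributed() -> bool:
    """True only when more than one process is communicating."""
    return world_size() > 1


# In a plain single-process run no process group is initialized,
# so the check reports a non-distributed environment.
print(is_distributed())  # False
```

In a script launched with `torchrun` across several processes, `torch.distributed.init_process_group` would have been called, the world size would exceed 1, and the same call would return True.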