espnet3.systems.tts.gan_trainer.GANTTSLightningTrainer
class espnet3.systems.tts.gan_trainer.GANTTSLightningTrainer(model=None, exp_dir: str | None = None, config=None, best_model_criterion=None)
Bases: ESPnet3LightningTrainer
ESPnet3 trainer wrapper that strips TTS GAN-only trainer config.
Initialize GANTTSLightningTrainer, stripping GAN-specific config keys.
Removes the gan sub-config from config before delegating to the parent ESPnet3LightningTrainer, so GAN-only keys (e.g. discriminator update intervals) do not interfere with the base Lightning trainer.
- Parameters:
  - model – The Lightning module to train. Typically a GANTTSLightningModule instance.
  - exp_dir – Path to the experiment output directory where checkpoints and logs are written. None disables checkpoint saving.
  - config – Trainer configuration (OmegaConf DictConfig or plain dict). The gan key, if present, is stripped before use.
  - best_model_criterion – Sequence of (metric, weight, mode) triples used to select the best checkpoint. Pass None to disable.
- Returns: None
- Raises: TypeError – If config is neither a DictConfig nor a dict, and the parent class cannot accept it.
Notes
The gan key is stripped on a deep copy of config, so the caller’s object is never mutated.
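The stripping behavior described above can be sketched as follows. Note that strip_gan_key is a hypothetical helper written for illustration, not part of the ESPnet3 API, and this sketch assumes a plain dict config rather than an OmegaConf DictConfig:

```python
import copy


def strip_gan_key(config):
    """Return a deep copy of ``config`` with the ``gan`` sub-config removed.

    Illustrative only: the real trainer also accepts OmegaConf DictConfig
    objects and raises TypeError for unsupported types.
    """
    if not isinstance(config, dict):
        raise TypeError(f"config must be a dict, got {type(config)!r}")
    cfg = copy.deepcopy(config)  # deep copy so the caller's object is never mutated
    cfg.pop("gan", None)  # drop GAN-only keys (e.g. discriminator update intervals)
    return cfg


cfg = {"accelerator": "cpu", "gan": {"ratio": 2}}
stripped = strip_gan_key(cfg)
print("gan" in stripped)  # the copy passed to the base trainer lacks the key
print("gan" in cfg)       # the original config still contains it
```

Because the copy is deep, mutating nested values in the stripped config also leaves the caller's original sub-dicts untouched.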
Examples
>>> import torch.nn as nn
>>> from omegaconf import OmegaConf
>>> cfg = OmegaConf.create({"accelerator": "cpu", "gan": {"ratio": 2}})
>>> trainer = GANTTSLightningTrainer(config=cfg)
>>> "gan" in cfg  # the caller's config is never mutated; a stripped copy is used
True