espnet2.layers.create_adapter_fn.create_new_lora_module
espnet2.layers.create_adapter_fn.create_new_lora_module(target_module: Module, rank: int, alpha: int, dropout_rate: float)
Create a new LoRA module for the given target module.
This function constructs a LoRA (Low-Rank Adaptation) module based on the provided target module’s type. It supports both embedding and linear layers, and integrates the specified rank, alpha, and dropout rate parameters.
- Parameters:
- target_module (torch.nn.Module) – The target module for which to create the LoRA adaptation. Must be an instance of torch.nn.Embedding or torch.nn.Linear.
- rank (int) – Rank of the LoRA matrices. This controls the low-rank approximation.
- alpha (int) – Scaling constant for the LoRA update.
- dropout_rate (float) – Dropout probability for LoRA layers.
- Returns: A new LoRA module that adapts the target module.
- Return type: torch.nn.Module
- Raises: ValueError – If the target module is not of a supported type.
Examples
>>> import torch
>>> from espnet2.layers.create_adapter_fn import create_new_lora_module
>>> target_linear = torch.nn.Linear(10, 5)
>>> lora_module = create_new_lora_module(target_linear, rank=8, alpha=8,
...                                      dropout_rate=0.1)
>>> print(lora_module)
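As the description notes, embedding layers are also supported, and unsupported module types raise ValueError. A minimal sketch of both cases follows, assuming the imports from the doctest above; the layer sizes are illustrative.

>>> target_emb = torch.nn.Embedding(1000, 64)  # illustrative vocabulary/dim
>>> lora_emb = create_new_lora_module(target_emb, rank=4, alpha=4,
...                                   dropout_rate=0.0)
>>> # An unsupported module type (e.g. a convolution) raises ValueError
>>> conv = torch.nn.Conv1d(3, 3, kernel_size=3)
>>> create_new_lora_module(conv, rank=4, alpha=4, dropout_rate=0.0)
Traceback (most recent call last):
    ...
ValueError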