espnet2.layers.create_adapter_fn.create_lora_adapter
espnet2.layers.create_adapter_fn.create_lora_adapter(model: Module, rank: int = 8, alpha: int = 8, dropout_rate: float = 0.0, target_modules: List[str] = ['query'], bias_type: str | None = 'none')
Create LoRA adapter for the base model.
This function adds Low-Rank Adaptation (LoRA) layers to specified target modules within a given model, allowing for efficient fine-tuning of the model parameters. The implementation is based on the concept outlined in the paper: https://arxiv.org/pdf/2106.09685.pdf.
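For intuition, here is a minimal sketch of the mechanism the paper describes (illustrative only; this function installs loralib layers rather than the class below): the pretrained weight W stays frozen while a low-rank update B A, scaled by alpha / rank, is trained on top of it.

```python
import math

import torch
import torch.nn as nn


class LoRALinearSketch(nn.Module):
    """Illustrative only: y = x W^T + (alpha / rank) * dropout(x) A^T B^T."""

    def __init__(self, in_features, out_features, rank=8, alpha=8, dropout_rate=0.0):
        super().__init__()
        # Frozen stand-in for the pretrained weight.
        self.weight = nn.Parameter(
            torch.randn(out_features, in_features), requires_grad=False
        )
        # Trainable low-rank factors; B starts at zero so training begins
        # from the unmodified pretrained behaviour.
        self.lora_A = nn.Parameter(torch.empty(rank, in_features))
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        nn.init.kaiming_uniform_(self.lora_A, a=math.sqrt(5))
        self.dropout = nn.Dropout(dropout_rate)
        self.scaling = alpha / rank

    def forward(self, x):
        base = x @ self.weight.T
        update = self.dropout(x) @ self.lora_A.T @ self.lora_B.T
        return base + self.scaling * update
```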
- Parameters:
- model (torch.nn.Module) – Base model to be adapted.
- rank (int) – Rank of LoRA matrices. Defaults to 8.
- alpha (int) – Scaling constant for LoRA; the low-rank update is scaled by alpha / rank. Defaults to 8.
- dropout_rate (float) – Dropout probability for LoRA layers. Defaults to 0.0.
- target_modules (List[str]) – List of module name(s) to apply LoRA adaptation to, e.g. ["query", "key", "value"] to adapt those projections in all layers, or ["encoder.encoders.blocks.0.attn.key"] to target a specific layer.
- bias_type (str) – Bias training type for LoRA adaptation; one of ["none", "all", "lora_only"]. "none" trains no bias vectors; "all" trains all bias vectors, including LayerNorm biases; "lora_only" trains only the bias vectors in LoRA-adapted modules.
- Returns: None
- Raises:
- ImportError – If loralib is not installed.
- ValueError – If target modules are not found in the base model.
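To make the ValueError concrete, here is a hedged sketch of the kind of name matching involved (`_check_targets` is a hypothetical helper, not part of the ESPnet API): a short name such as "query" matches by suffix across all layers, while a fully qualified name matches a single module.

```python
def _check_targets(model, target_modules):
    # Hypothetical helper mirroring the ValueError above: each requested
    # target must match at least one submodule name in the base model.
    names = [name for name, _ in model.named_modules()]
    for key in target_modules:
        if not any(name == key or name.endswith("." + key) for name in names):
            raise ValueError(f"Target module {key} not found in the base model.")
```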
Examples
```python
# Create a model instance (example)
model = SomeModel()

# Apply LoRA adaptation
create_lora_adapter(
    model=model,
    rank=16,
    alpha=32,
    dropout_rate=0.1,
    target_modules=["query", "key"],
    bias_type="lora_only",
)
```
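After adaptation, only the LoRA factors (plus biases, per bias_type) should remain trainable; loralib exposes this freezing via its mark_only_lora_as_trainable helper, which bias_type plausibly maps onto. A quick way to verify what is left trainable:

```python
# Inspect which parameters still require gradients after adaptation.
# With bias_type="lora_only", expect the lora_A / lora_B factors plus
# the biases of the adapted modules only.
trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)
```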
NOTE
This function requires the loralib library to be installed. Please follow the instructions at https://github.com/microsoft/LoRA for installation.