espnet2.layers.create_adapter.create_adapter
espnet2.layers.create_adapter.create_adapter(model: Module, adapter: str, adapter_conf: dict)
Create an adapter (e.g., low-rank adaptation, LoRA) and attach it to a large base model.
References
- LoRA: Low-Rank Adaptation of Large Language Models (https://arxiv.org/pdf/2106.09685.pdf)
- https://github.com/microsoft/LoRA.git
- https://github.com/huggingface/peft/blob/main/src/peft/tuners/lora.py
- Parameters:
- model (torch.nn.Module) – Base model to be adapted.
- adapter (str) – Name of the adapter to create. Supported types are "lora" and "houlsby".
- adapter_conf (dict) – Configuration for the adapter, e.g. {"rank": 8, "alpha": 8} for the LoRA adapter (see the sketch after this list).
- Raises: AssertionError – If the specified adapter type is not supported.
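For context on the rank and alpha keys: LoRA (see the references above) freezes the pretrained weight and learns a low-rank update scaled by alpha / rank. Below is a minimal illustrative sketch of the idea, not ESPnet's actual layer:

>>> import torch
>>> import torch.nn as nn
>>> class LoRALinear(nn.Module):
...     """Illustrative LoRA layer; not ESPnet's implementation."""
...     def __init__(self, base: nn.Linear, rank: int = 8, alpha: int = 8):
...         super().__init__()
...         self.base = base  # pretrained layer, kept frozen
...         self.base.weight.requires_grad = False
...         # low-rank update: W x + (alpha / rank) * B A x
...         self.lora_A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
...         self.lora_B = nn.Parameter(torch.zeros(base.out_features, rank))
...         self.scaling = alpha / rank
...     def forward(self, x):
...         return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)
>>> layer = LoRALinear(nn.Linear(16, 16), rank=8, alpha=8)
>>> layer(torch.randn(2, 16)).shape
torch.Size([2, 16])

Because lora_B is initialized to zeros, the adapted layer initially behaves exactly like the base layer, so fine-tuning starts from the pretrained behavior.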
Examples
>>> import torch
>>> from espnet2.layers.create_adapter import create_adapter
>>> model = SomeModel()  # Replace with an actual model instance
>>> adapter_conf = {"rank": 8, "alpha": 8}
>>> create_adapter(model, "lora", adapter_conf)
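After create_adapter returns, a common next step is to train only the injected adapter parameters. A hedged sketch of that pattern follows; matching on the substring "lora" in parameter names is an assumption about naming, not something this API guarantees:

>>> for name, param in model.named_parameters():
...     param.requires_grad = "lora" in name  # assumption: adapter params contain "lora"
>>> trainable = [n for n, p in model.named_parameters() if p.requires_grad]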
NOTE
The function dispatches to the corresponding adapter creation function for the specified adapter type.
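Conceptually, that dispatch can be pictured as the sketch below; the helper names create_lora_adapter and create_houlsby_adapter are stand-ins for illustration, not a claim about ESPnet's internals:

>>> from typing import Callable, Dict
>>> import torch.nn as nn
>>> def create_lora_adapter(model: nn.Module, **conf) -> None:
...     ...  # stand-in for the LoRA creation routine (name assumed)
>>> def create_houlsby_adapter(model: nn.Module, **conf) -> None:
...     ...  # stand-in for the Houlsby creation routine (name assumed)
>>> ADAPTER_FNS: Dict[str, Callable[..., None]] = {
...     "lora": create_lora_adapter,
...     "houlsby": create_houlsby_adapter,
... }
>>> def create_adapter_sketch(model, adapter, adapter_conf):
...     # unsupported adapter types fail the assertion, as documented above
...     assert adapter in ADAPTER_FNS, f"Unsupported adapter type: {adapter}"
...     ADAPTER_FNS[adapter](model=model, **adapter_conf)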