GraLoRA merge_and_unload with modules_to_save error #3114

@JBurtn

Description

System Info

Transformers==4.57.1
Peft==0.18.2.dev0 via pip install git+https://github.com/huggingface/peft.git

Who can help?

No response

Reproduction

Issue:

Unable to run merge_and_unload with GraLoRA on peft 0.18.2.dev0 when modules_to_save is active.

Error (with Traceback):


---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[17], line 1
----> 1 model = model.merge_and_unload(safe_merge=True)

File ~/miniconda3/envs/model_refactoring/lib/python3.11/site-packages/peft/tuners/tuners_utils.py:703, in BaseTuner.merge_and_unload(self, progressbar, safe_merge, adapter_names)
    669 def merge_and_unload(
    670     self, progressbar: bool = False, safe_merge: bool = False, adapter_names: Optional[list[str]] = None
    671 ) -> torch.nn.Module:
    672     r"""
    673     This method merges the adapter layers into the base model.
    674 
   (...)    701     ```
    702     """
--> 703     return self._unload_and_optionally_merge(
    704         progressbar=progressbar, safe_merge=safe_merge, adapter_names=adapter_names
    705     )

File ~/miniconda3/envs/model_refactoring/lib/python3.11/site-packages/peft/tuners/tuners_utils.py:656, in BaseTuner._unload_and_optionally_merge(self, merge, progressbar, safe_merge, adapter_names)
    651 if hasattr(target, "unload_and_optionally_merge_module"):
    652     # if layers have special unloading method, like MultiheadAttention, use that
    653     unloaded_module = target.unload_and_optionally_merge_module(
    654         merge=merge, safe_merge=safe_merge, adapter_names=adapter_names
    655     )
--> 656     self._replace_module(parent, target_name, unloaded_module, target)
    657 elif hasattr(target, "base_layer"):
    658     if merge:

File ~/miniconda3/envs/model_refactoring/lib/python3.11/site-packages/peft/tuners/tuners_utils.py:1128, in BaseTuner._replace_module(self, parent, child_name, new_module, child)
   1125     child = child.base_layer
   1127 if not hasattr(new_module, "base_layer"):
-> 1128     new_module.weight = child.weight
   1129     if hasattr(child, "bias"):
   1130         new_module.bias = child.bias

File ~/miniconda3/envs/model_refactoring/lib/python3.11/site-packages/peft/utils/other.py:373, in AuxiliaryTrainingWrapper.__getattr__(self, name)
    371     return getattr(self.original_module, name)
    372 elif self._hasattr_wrapped(name, modules):
--> 373     return self._getattr_wrapped(name, modules)
    375 # For some reason, there is no module corresponding to the active adapter; this should normally not be
    376 # reached and exists as a failsafe (otherwise, a KeyError would be raised)
    377 raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")

File ~/miniconda3/envs/model_refactoring/lib/python3.11/site-packages/peft/utils/other.py:592, in ModulesToSaveWrapper._getattr_wrapped(self, name, modules)
    591 def _getattr_wrapped(self, name, modules):
--> 592     return getattr(modules["modules_to_save"][self.active_adapters[0]], name)

File ~/miniconda3/envs/model_refactoring/lib/python3.11/site-packages/torch/nn/modules/module.py:1965, in Module.__getattr__(self, name)
   1963     if name in modules:
   1964         return modules[name]
-> 1965 raise AttributeError(
   1966     f"'{type(self).__name__}' object has no attribute '{name}'"
   1967 )

AttributeError: 'Siglip2EncoderLayer' object has no attribute 'weight'
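
The failing line is peft/tuners/tuners_utils.py:1128, where _replace_module assumes the replaced child exposes a .weight tensor. Here the ModulesToSaveWrapper wraps a whole Siglip2EncoderLayer, a composite module whose parameters live only in its submodules, so the attribute lookup falls through to the wrapped layer and raises. A minimal sketch of the mismatch (TinyEncoderLayer is a hypothetical stand-in, plain PyTorch):

import torch.nn as nn

# Hypothetical stand-in for Siglip2EncoderLayer: a composite module whose
# parameters live only in its submodules, so it exposes no .weight itself.
class TinyEncoderLayer(nn.Module):
    def __init__(self):
        super().__init__()
        self.self_attn = nn.Linear(8, 8)
        self.mlp = nn.Linear(8, 8)

layer = TinyEncoderLayer()
print(hasattr(layer, "weight"))            # False -> composite layer
print(hasattr(layer.self_attn, "weight"))  # True  -> leaf Linear

# _replace_module does `new_module.weight = child.weight`, the same access
# pattern, which raises AttributeError for a composite module like this.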

Reproduction:

I am using the Hugging Face Trainer with the base model Ovis2.5 2B. I am saving with the default settings using model.save_pretrained and the trainer's save_every.

GraLoRA config:

GraloraConfig(
    r=64,
    gralora_k=4,
    alpha=32,
    target_modules=model_config.peft_weights,
    gralora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    modules_to_save=modules_to_save,
)

Modules To Save:

['vit.vision_model.encoder.layers.24', 'vit.vision_model.encoder.layers.25', 'vit.vision_model.encoder.layers.26', 'head.0', 'head.1', 'vte']

modules_to_save covers the last 3 SigLIP encoder layers of the model, plus 2 linear layers and 1 embedding layer.

Model Loading:

import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    device_map={'': 'cuda'},
    dtype=torch.bfloat16,
    trust_remote_code=True,
)

model = PeftModel.from_pretrained(
    model,
    PEFT_ID,
    adapter_name='default',
    device_map={'': 'cuda'},
)
model = model.merge_and_unload(safe_merge=True)
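
A workaround I am considering (untested sketch, assuming the trained copies are stored under the 'default' adapter key in each wrapper's modules_to_save ModuleDict, as the traceback suggests) is to swap every ModulesToSaveWrapper for its saved submodule first, so merge_and_unload only has to merge the GraLoRA layers:

from peft.utils.other import ModulesToSaveWrapper

# Untested sketch: replace each ModulesToSaveWrapper with the trained copy it
# holds for the 'default' adapter before merging the GraLoRA layers.
for name, module in list(model.named_modules()):
    if isinstance(module, ModulesToSaveWrapper):
        parent_name, _, child_name = name.rpartition(".")
        parent = model.get_submodule(parent_name)
        setattr(parent, child_name, module.modules_to_save["default"])

model = model.merge_and_unload(safe_merge=True)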

Expected behavior

The adapter is merged into the base model and unloaded.
