Bias92 committed
Commit 8abc725 · verified · 1 Parent(s): e949c91

Fix _tied_weights_keys mapping for Transformers v5


This PR fixes the `_tied_weights_keys` compatibility issue with Transformers v5.0.0+.

## Problem
- `_tied_weights_keys` was a list, but Transformers v5+ expects a dict-like mapping
- This caused `AttributeError: 'list' object has no attribute 'keys'`

## Solution
- Changed `_tied_weights_keys` from list to dict format
- Maps `lm_head.weight` to `transformer.wte.weight`

## Related
- Discussion #9

If backward compatibility with v4 is needed, I can add a version check.
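A version check could look like the following minimal sketch. This is a hypothetical helper (not part of the PR); it assumes the dict format is required starting with Transformers major version 5 and falls back to the list format otherwise:

```python
def tied_weights_keys_for(transformers_version: str):
    """Return _tied_weights_keys in the format the given Transformers
    version expects: a dict mapping tied parameter -> source weight for
    v5+, a plain list of tied parameter names for v4 and earlier.

    Hypothetical helper for illustration; the version threshold is an
    assumption based on this PR's description.
    """
    major = int(transformers_version.split(".")[0])
    if major >= 5:
        return {"lm_head.weight": "transformer.wte.weight"}
    return ["lm_head.weight"]
```

The class attribute could then be set at import time, e.g. `_tied_weights_keys = tied_weights_keys_for(transformers.__version__)`.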

Files changed (1)
  1. modeling_exaone.py +1 -1
modeling_exaone.py CHANGED

```diff
@@ -988,7 +988,7 @@ class ExaoneModel(ExaonePreTrainedModel):
     EXAONE_START_DOCSTRING,
 )
 class ExaoneForCausalLM(ExaonePreTrainedModel, GenerationMixin):
-    _tied_weights_keys = ["lm_head.weight"]
+    _tied_weights_keys = {"lm_head.weight": "transformer.wte.weight"}

     def __init__(self, config):
         super().__init__(config)
```