lm_head.weight is missing from weight_map in model.safetensors.index.json

#15
by zhaosiyuan - opened

model.safetensors.index.json from Qwen3-4B is missing the lm_head.weight entry in its weight_map. This causes an issue when loading the model with tie_word_embeddings = False: lm_head.weight is randomly initialized instead of being copied from model.embed_tokens.weight.
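For anyone wanting to verify this on their own checkpoint, a minimal sketch of the check (the helper name and the toy index below are illustrative, not part of the actual repo):

```python
import json

def missing_head_keys(index_json: str) -> list[str]:
    """Return output-head keys that are absent from a safetensors index's weight_map."""
    index = json.loads(index_json)
    weight_map = index.get("weight_map", {})
    expected = ["lm_head.weight"]  # key the loader looks up when embeddings are untied
    return [k for k in expected if k not in weight_map]

# Toy index mimicking the reported layout: embeddings mapped, head entry missing.
sample = json.dumps({
    "weight_map": {"model.embed_tokens.weight": "model-00001-of-00003.safetensors"}
})

print(missing_head_keys(sample))  # a non-empty list here reproduces the report
```

If the list is non-empty and tie_word_embeddings is False, the loader has no shard to read lm_head.weight from, which matches the random-initialization behavior described above.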
