Update config.json #2
by Daemontatox - opened
https://huggingface.co/google/functiongemma-270m-it/discussions/5#694457b5ad6f76a0d3c43917
"""
vLLM has started using the Transformers v5 style rope_parameters and patches it into configs loaded with Transformers v4.
vLLM 0.12.0 uses the existence of rope_parameters to decide whether or not to set a default value for it, i.e.:
if rope_theta is not None:
    if not hasattr(config, "rope_parameters"):
        config.rope_parameters = {"rope_type": "default"}
    config.rope_parameters["rope_theta"] = rope_theta
This causes an error with this model: rope_parameters already exists on the config but is None, so hasattr returns True, the default is never set, and the dict assignment on the last line fails.
https://github.com/vllm-project/vllm/pull/30983 fixes this on the vLLM side for (hopefully) v0.13.0 onwards.
Removing these fields from the checkpoint should fix this model for vLLM v0.12.0 in the meantime.
"""
Thank you, appreciate it!
danielhanchen changed pull request status to merged