---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---

# old-MakeGemma3

**Note: A version with improved multilingual support has been published. Please use Testament200156/MakeGemma3-abliterated; this repository is deprecated.**

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). GGUF files are available in the GGUF folder.

Since this is an experimental model, it has some defects, mainly in its multilingual functionality. Please use it with caution.

## Merge Details

### Merge Method

This model was merged using the NuSLERP merge method.

### Models Merged

The following models were included in the merge:

* drwlf/medgemma-27b-it-abliterated
* test_base (summykai/gemma3-27b-abliterated-dpo with additional layers added)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: test_base
    parameters:
      weight: 1.0
  - model: medgemma-27b-it-abliterated
    parameters:
      weight: 1.0
merge_method: nuslerp
dtype: bfloat16
```
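For intuition, NuSLERP is based on spherical linear interpolation (SLERP) of model weights. Below is a minimal sketch of plain SLERP applied to two toy weight tensors with equal contribution, matching the equal `weight: 1.0` values in the config above. This is only an illustration of the underlying math; mergekit's actual NuSLERP implementation differs in detail (e.g. per-tensor handling and additional parameters), and the `slerp` helper here is hypothetical, not part of any library.

```python
import numpy as np

def slerp(a: np.ndarray, b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors of the same shape.

    Interpolates along the great-circle arc between the flattened tensors,
    falling back to plain linear interpolation when they are (near-)parallel.
    """
    a_flat, b_flat = a.ravel(), b.ravel()
    a_dir = a_flat / np.linalg.norm(a_flat)
    b_dir = b_flat / np.linalg.norm(b_flat)
    dot = np.clip(np.dot(a_dir, b_dir), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two weight directions
    if np.isclose(omega, 0.0):
        return (1 - t) * a + t * b  # parallel vectors: degenerate to lerp
    so = np.sin(omega)
    out = (np.sin((1 - t) * omega) / so) * a_flat + (np.sin(t * omega) / so) * b_flat
    return out.reshape(a.shape)

# Toy example: merge two small "weight" matrices with equal contribution (t = 0.5),
# analogous to the two equally weighted models in the YAML config.
w1 = np.array([[1.0, 0.0], [0.0, 1.0]])
w2 = np.array([[0.0, 1.0], [1.0, 0.0]])
merged = slerp(w1, w2, t=0.5)
```

With `t = 0.0` the result is the first model's weights and with `t = 1.0` the second's; intermediate values trace the arc between them rather than the straight line a plain weighted average would follow.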