---
base_model: []
library_name: transformers
tags:
  - mergekit
  - merge
---

# MakeGemma3

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit). I have also created GGUF files; please check the GGUF folder. Since this is an experimental model, it has some defects, mainly in its multilingual capability, so please use it with caution.

## Merge Details

### Merge Method

This model was merged using the NuSLERP merge method.

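Roughly speaking, NuSLERP interpolates the two models' weights along the arc between them (spherical linear interpolation) rather than along a straight line, with the per-model `weight` values setting the interpolation point. The snippet below is only a minimal illustration of that idea, not mergekit's implementation (which also supports row-wise interpolation and task-vector SLERP around a base model); the `slerp` helper and the random stand-in tensors are hypothetical.

```python
import torch

def slerp(a: torch.Tensor, b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors of the same shape."""
    a_flat = a.flatten().float()
    b_flat = b.flatten().float()
    # Angle between the two weight vectors (directions only).
    cos_omega = torch.dot(a_flat / (a_flat.norm() + eps), b_flat / (b_flat.norm() + eps))
    omega = torch.acos(torch.clamp(cos_omega, -1.0, 1.0))
    if omega < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1 - t) * a + t * b
    sin_omega = torch.sin(omega)
    mixed = (torch.sin((1 - t) * omega) / sin_omega) * a_flat \
        + (torch.sin(t * omega) / sin_omega) * b_flat
    return mixed.reshape(a.shape).to(a.dtype)

# Equal per-model weights (1.0 and 1.0), as in the config below, give t = 0.5.
w_a = torch.randn(512, 512)  # stand-in for a test_base weight matrix
w_b = torch.randn(512, 512)  # stand-in for a medgemma weight matrix
merged = slerp(w_a, w_b, t=1.0 / (1.0 + 1.0))
```
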
### Models Merged

The following models were included in the merge:

* drwlf/medgemma-27b-it-abliterated
* test_base (summykai/gemma3-27b-abliterated-dpo with additional layers added)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: test_base
    parameters:
      weight: 1.0
  - model: medgemma-27b-it-abliterated
    parameters:
      weight: 1.0
merge_method: nuslerp
dtype: bfloat16
```
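
If you want to reproduce the merge, the sketch below shows one way to run this configuration through mergekit's Python API, following the pattern used in mergekit's example notebook. The `config.yaml` path, the output directory, and the exact `MergeOptions` flags are assumptions and may vary between mergekit versions; the `mergekit-yaml` command-line tool is the more common entry point.

```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (assumed to be saved as config.yaml).
with open("config.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Run the merge; the options here are illustrative and version-dependent.
run_merge(
    merge_config,
    "./MakeGemma3",  # assumed output directory
    options=MergeOptions(
        cuda=True,            # use a GPU for the merge if available
        copy_tokenizer=True,  # copy tokenizer files into the output directory
        lazy_unpickle=True,   # reduce peak RAM while loading shards
    ),
)
```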