---
base_model:
- SuperbEmphasis/MN-12b-RP-Ink-RP-Longform
- Sicarius-Prototyping/Impish_Longtail_12B
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
model-index:
- name: Impish-LongPen-12B
  results:
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: IFEval (0-Shot)
      type: HuggingFaceH4/ifeval
      args:
        num_few_shot: 0
    metrics:
    - type: prompt_level_strict_acc
      value: 55.82
      name: strict accuracy
      comment: self-reported
---
# Impish-LongPen-12B
A Karcher merge of Sicarius-Prototyping/Impish_Longtail_12B and SuperbEmphasis/MN-12b-RP-Ink-RP-Longform, the same pairing used in KansenSakura-Erosion-RP-12b, but with better quality.

The merge itself took a very long time, so I probably won't repeat similar experiments. Expect more experimental models in the meantime.
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the Karcher Mean merge method.
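To give an intuition for what a Karcher (Riemannian) mean is, here is a minimal, self-contained sketch of the classic fixed-point iteration on the unit sphere: map the points into the tangent space at the current estimate, average, and step back via the exponential map. This is only an illustration of the underlying math, not mergekit's actual implementation; the `max_iter` and `tol` parameters mirror the ones in the configuration below.

```python
import math


def karcher_mean_sphere(points, max_iter=100000, tol=1e-9):
    """Karcher (Riemannian) mean of unit vectors on the sphere.

    Illustrative sketch only; assumes the points are not antipodal
    and lie in a region where the mean is unique.
    """
    dim = len(points[0])

    def normalize(v):
        n = math.sqrt(sum(c * c for c in v))
        return [c / n for c in v]

    # Start from the normalized Euclidean mean.
    m = normalize([sum(p[i] for p in points) / len(points)
                   for i in range(dim)])

    for _ in range(max_iter):
        # Average the log-maps (tangent vectors) of all points at m.
        t = [0.0] * dim
        for p in points:
            cos_th = max(-1.0, min(1.0, sum(a * b for a, b in zip(m, p))))
            theta = math.acos(cos_th)
            if theta < 1e-15:
                continue  # p coincides with m: zero tangent vector
            # Component of p orthogonal to m, rescaled to length theta.
            u = [p[i] - cos_th * m[i] for i in range(dim)]
            un = math.sqrt(sum(c * c for c in u))
            for i in range(dim):
                t[i] += theta * u[i] / un
        t = [c / len(points) for c in t]
        tn = math.sqrt(sum(c * c for c in t))
        if tn < tol:
            break  # converged: mean tangent vector is ~zero
        # Exponential map: step from m along the mean tangent direction.
        m = normalize([math.cos(tn) * m[i] + math.sin(tn) * t[i] / tn
                       for i in range(dim)])
    return m
```

For two points the Karcher mean is simply the midpoint of the great-circle arc between them, which makes the behavior easy to check by hand.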
### Models Merged

The following models were included in the merge:
* SuperbEmphasis/MN-12b-RP-Ink-RP-Longform
* Sicarius-Prototyping/Impish_Longtail_12B
### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: karcher
models:
  - model: SuperbEmphasis/MN-12b-RP-Ink-RP-Longform
  - model: Sicarius-Prototyping/Impish_Longtail_12B
parameters:
  max_iter: 100000
  tol: 1e-9
dtype: bfloat16
```
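To reproduce the merge, a configuration like the one above can be fed to mergekit's `mergekit-yaml` CLI. A rough sketch (the config filename and output directory are just examples; running the actual merge downloads both base models and needs substantial disk space and RAM):

```shell
# Save the merge configuration from this card to a file.
cat > config.yaml <<'EOF'
merge_method: karcher
models:
  - model: SuperbEmphasis/MN-12b-RP-Ink-RP-Longform
  - model: Sicarius-Prototyping/Impish_Longtail_12B
parameters:
  max_iter: 100000
  tol: 1e-9
dtype: bfloat16
EOF

# Run the merge with mergekit (requires `pip install mergekit`).
# mergekit-yaml config.yaml ./Impish-LongPen-12B
```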
