# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the Model Stock merge method, with anthracite-org/magnum-v2-4b as the base.
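As background on the method: Model Stock (Jang et al., 2024) averages the fine-tuned models, then interpolates that average back toward the base model by a ratio derived from the angle between the models' task vectors. The sketch below is a rough per-tensor illustration of that formula in plain PyTorch, not mergekit's actual implementation; the function name and tensor layout are hypothetical.

```python
import torch
import torch.nn.functional as F

def model_stock_tensor(base: torch.Tensor, tuned: list[torch.Tensor]) -> torch.Tensor:
    """Illustrative per-tensor Model Stock interpolation (assumed formula:
    t = N*cos(theta) / (1 + (N-1)*cos(theta)), where theta is the average
    pairwise angle between task vectors)."""
    n = len(tuned)
    # Task vectors: each fine-tuned model's delta from the base weights.
    deltas = [(w - base).flatten() for w in tuned]
    # Average pairwise cosine similarity between task vectors.
    pairs = [
        F.cosine_similarity(deltas[i], deltas[j], dim=0)
        for i in range(n) for j in range(i + 1, n)
    ]
    cos_theta = torch.stack(pairs).mean().clamp(-1.0, 1.0)
    # Interpolation ratio pulling the fine-tuned average back toward the base.
    t = n * cos_theta / (1 + (n - 1) * cos_theta)
    avg = torch.stack(tuned).mean(dim=0)
    return t * avg + (1 - t) * base
```

In this merge, N = 2, so the ratio reduces to t = 2·cos(θ) / (1 + cos(θ)).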

### Models Merged

The following models were included in the merge:

- Magpie-Align/MagpieLM-4B-Chat-v0.1
- FourOhFour/Deedlit_4B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: model_stock
models:
  - model: Magpie-Align/MagpieLM-4B-Chat-v0.1
    parameters:
      weight: 1.0
  - model: FourOhFour/Deedlit_4B
    parameters:
      weight: 1.0
base_model: anthracite-org/magnum-v2-4b
dtype: bfloat16
normalize: true
```
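To reproduce the merge, this configuration can be saved to a file and passed to mergekit's CLI (for example, `mergekit-yaml config.yaml ./output-dir`). For inference, the merged model should load like any transformers causal LM; below is a minimal sketch, assuming the standard `AutoModelForCausalLM`/`AutoTokenizer` interface and the repo id from this card. The prompt and generation settings are illustrative only.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nexesenex/Nemotron_W_4b_MagLight_0.1"

# Load tokenizer and weights; bfloat16 matches the dtype used in the merge.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Illustrative generation call.
inputs = tokenizer("Write a short greeting.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```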

## Open LLM Leaderboard Evaluation Results

Detailed results can be found here.

| Metric              | Value |
|---------------------|------:|
| Avg.                | 16.19 |
| IFEval (0-Shot)     | 42.30 |
| BBH (3-Shot)        | 19.29 |
| MATH Lvl 5 (4-Shot) |  4.00 |
| GPQA (0-Shot)       |  4.47 |
| MuSR (0-Shot)       |  9.93 |
| MMLU-PRO (5-Shot)   | 17.17 |