mono_self_amh

This model is a fine-tuned version of castorini/afriteva_v2_base on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0401
  • Accuracy: 0.2159

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 64
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
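For reference, the hyperparameters above map onto a transformers.TrainingArguments configuration roughly as follows. This is a reconstruction, not the training script: the output_dir is a placeholder, the Adam betas/epsilon listed above are the Transformers defaults, and anything not listed (eval/save cadence, warmup, weight decay) is left at its default.

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameter list above (assumptions noted in the lead-in).
args = TrainingArguments(
    output_dir="mono_self_amh",       # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
)
```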

Training results

Training Loss   Epoch     Step    Validation Loss   Accuracy
3.3329          1.1655    500     1.7980            0.1844
1.8310          2.3310    1000    1.5109            0.1970
1.5538          3.4965    1500    1.3840            0.2011
1.4090          4.6620    2000    1.2974            0.2048
1.3144          5.8275    2500    1.2348            0.2074
1.2412          6.9930    3000    1.1953            0.2086
1.1535          8.1585    3500    1.1696            0.2093
1.1184          9.3240    4000    1.1589            0.2102
1.0845          10.4895   4500    1.1263            0.2116
1.0656          11.6550   5000    1.1136            0.2116
1.0409          12.8205   5500    1.1089            0.2121
1.0193          13.9860   6000    1.1017            0.2126
0.9882          15.1515   6500    1.0911            0.2131
0.9713          16.3170   7000    1.0774            0.2139
0.9578          17.4825   7500    1.0679            0.2137
0.9462          18.6480   8000    1.0619            0.2145
0.9370          19.8135   8500    1.0629            0.2148
0.9259          20.9790   9000    1.0580            0.2146
0.9075          22.1445   9500    1.0550            0.2151
0.9032          23.3100   10000   1.0549            0.2151
0.8940          24.4755   10500   1.0494            0.2157
0.8849          25.6410   11000   1.0435            0.2155
0.8825          26.8065   11500   1.0481            0.2158
0.8758          27.9720   12000   1.0400            0.2157
0.8672          29.1375   12500   1.0401            0.2159

Framework versions

  • PEFT 0.7.1
  • Transformers 4.43.3
  • Pytorch 2.4.0+cu121
  • Datasets 2.15.0
  • Tokenizers 0.19.1