arabic-hs-group-prediction

This model is a fine-tuned version of aubmindlab/bert-base-arabert on an unspecified dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows):

  • Loss: 0.3012
  • Accuracy: 0.9444
  • Macro F1: 0.9147
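
The snippet below is a minimal inference sketch, not taken from the card itself. It assumes the checkpoint is published on the Hugging Face Hub under HrantDinkFoundation/arabic-hs-group-prediction (the repo id shown on the model page); the label names come from the checkpoint's id2label config, which this card does not document.

```python
from transformers import pipeline

# Sketch: load the fine-tuned checkpoint from the Hub.
# The repo id is taken from the model page; the labels returned depend on
# the checkpoint's id2label mapping, which is not documented in this card.
classifier = pipeline(
    "text-classification",
    model="HrantDinkFoundation/arabic-hs-group-prediction",
)

# Arabic input text ("This is a test sentence").
print(classifier("هذه جملة تجريبية"))
# -> [{'label': '...', 'score': ...}]
```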

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 5e-06
  • train_batch_size: 16
  • eval_batch_size: 20
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 10
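
As a sketch only, the hyperparameters above map onto transformers TrainingArguments roughly as follows. The output path is a placeholder, and the model, dataset, and data collator are not specified in this card; the evaluation cadence is inferred from the 100-step intervals in the results table below.

```python
from transformers import TrainingArguments

# Assumed reconstruction of the training configuration listed above;
# the actual training script is not included in this card.
training_args = TrainingArguments(
    output_dir="arabic-hs-group-prediction",  # placeholder path
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=20,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    eval_strategy="steps",  # the results table evaluates every 100 steps
    eval_steps=100,
    logging_steps=100,
)
```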

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Macro F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|
| 2.07 | 0.1029 | 100 | 1.9058 | 0.4144 | 0.1548 |
| 1.7855 | 0.2058 | 200 | 1.6273 | 0.4907 | 0.2162 |
| 1.539 | 0.3086 | 300 | 1.4125 | 0.5683 | 0.2940 |
| 1.2834 | 0.4115 | 400 | 1.1540 | 0.6586 | 0.3591 |
| 1.1157 | 0.5144 | 500 | 0.9695 | 0.7095 | 0.4362 |
| 0.9701 | 0.6173 | 600 | 0.8559 | 0.7488 | 0.5098 |
| 0.848 | 0.7202 | 700 | 0.8231 | 0.7714 | 0.5732 |
| 0.7585 | 0.8230 | 800 | 0.6777 | 0.8171 | 0.6513 |
| 0.698 | 0.9259 | 900 | 0.6211 | 0.8281 | 0.6726 |
| 0.6143 | 1.0288 | 1000 | 0.5631 | 0.8466 | 0.7037 |
| 0.5656 | 1.1317 | 1100 | 0.5575 | 0.8519 | 0.7296 |
| 0.4638 | 1.2346 | 1200 | 0.4738 | 0.8738 | 0.7532 |
| 0.4572 | 1.3374 | 1300 | 0.4279 | 0.8877 | 0.7689 |
| 0.3826 | 1.4403 | 1400 | 0.4296 | 0.8814 | 0.7680 |
| 0.3424 | 1.5432 | 1500 | 0.3643 | 0.9045 | 0.7900 |
| 0.3636 | 1.6461 | 1600 | 0.3541 | 0.9057 | 0.7895 |
| 0.3061 | 1.7490 | 1700 | 0.3264 | 0.9120 | 0.7982 |
| 0.3145 | 1.8519 | 1800 | 0.3232 | 0.9126 | 0.7974 |
| 0.3074 | 1.9547 | 1900 | 0.3124 | 0.9167 | 0.8140 |
| 0.3145 | 2.0576 | 2000 | 0.3228 | 0.9144 | 0.8238 |
| 0.2337 | 2.1605 | 2100 | 0.3152 | 0.9167 | 0.8236 |
| 0.2283 | 2.2634 | 2200 | 0.3145 | 0.9184 | 0.8286 |
| 0.2265 | 2.3663 | 2300 | 0.3230 | 0.9207 | 0.8212 |
| 0.2428 | 2.4691 | 2400 | 0.3038 | 0.9248 | 0.8434 |
| 0.2301 | 2.5720 | 2500 | 0.2946 | 0.9225 | 0.8523 |
| 0.2471 | 2.6749 | 2600 | 0.2937 | 0.9201 | 0.8516 |
| 0.1854 | 2.7778 | 2700 | 0.2864 | 0.9288 | 0.8594 |
| 0.2154 | 2.8807 | 2800 | 0.2896 | 0.9259 | 0.8526 |
| 0.2196 | 2.9835 | 2900 | 0.2891 | 0.9271 | 0.8727 |
| 0.1772 | 3.0864 | 3000 | 0.2884 | 0.9271 | 0.8537 |
| 0.1876 | 3.1893 | 3100 | 0.2827 | 0.9282 | 0.8730 |
| 0.1898 | 3.2922 | 3200 | 0.2698 | 0.9300 | 0.8584 |
| 0.1592 | 3.3951 | 3300 | 0.2923 | 0.9334 | 0.8752 |
| 0.1423 | 3.4979 | 3400 | 0.2738 | 0.9329 | 0.8862 |
| 0.1818 | 3.6008 | 3500 | 0.2811 | 0.9294 | 0.8825 |
| 0.1357 | 3.7037 | 3600 | 0.2758 | 0.9381 | 0.8940 |
| 0.1667 | 3.8066 | 3700 | 0.3017 | 0.9317 | 0.8941 |
| 0.1609 | 3.9095 | 3800 | 0.2881 | 0.9340 | 0.8914 |
| 0.141 | 4.0123 | 3900 | 0.3014 | 0.9329 | 0.9001 |
| 0.0912 | 4.1152 | 4000 | 0.2830 | 0.9346 | 0.9008 |
| 0.1152 | 4.2181 | 4100 | 0.3018 | 0.9329 | 0.8922 |
| 0.1576 | 4.3210 | 4200 | 0.2790 | 0.9375 | 0.9090 |
| 0.1255 | 4.4239 | 4300 | 0.2893 | 0.9329 | 0.8968 |
| 0.1186 | 4.5267 | 4400 | 0.3274 | 0.9265 | 0.8982 |
| 0.1362 | 4.6296 | 4500 | 0.2865 | 0.9363 | 0.9025 |
| 0.1171 | 4.7325 | 4600 | 0.2941 | 0.9358 | 0.9042 |
| 0.1494 | 4.8354 | 4700 | 0.2841 | 0.9352 | 0.9006 |
| 0.1065 | 4.9383 | 4800 | 0.3263 | 0.9300 | 0.9010 |
| 0.104 | 5.0412 | 4900 | 0.2949 | 0.9398 | 0.9146 |
| 0.0767 | 5.1440 | 5000 | 0.3041 | 0.9375 | 0.9025 |
| 0.0924 | 5.2469 | 5100 | 0.2908 | 0.9381 | 0.9125 |
| 0.1106 | 5.3498 | 5200 | 0.3003 | 0.9387 | 0.9006 |
| 0.1099 | 5.4527 | 5300 | 0.2844 | 0.9410 | 0.9084 |
| 0.0985 | 5.5556 | 5400 | 0.2936 | 0.9363 | 0.9070 |
| 0.087 | 5.6584 | 5500 | 0.2828 | 0.9404 | 0.9089 |
| 0.0997 | 5.7613 | 5600 | 0.2881 | 0.9387 | 0.9119 |
| 0.0982 | 5.8642 | 5700 | 0.2950 | 0.9363 | 0.9087 |
| 0.1242 | 5.9671 | 5800 | 0.3027 | 0.9398 | 0.9074 |
| 0.0947 | 6.0700 | 5900 | 0.2926 | 0.9398 | 0.9115 |
| 0.0845 | 6.1728 | 6000 | 0.2823 | 0.9427 | 0.9101 |
| 0.0881 | 6.2757 | 6100 | 0.3071 | 0.9363 | 0.9144 |
| 0.0806 | 6.3786 | 6200 | 0.2972 | 0.9392 | 0.9101 |
| 0.0676 | 6.4815 | 6300 | 0.2839 | 0.9381 | 0.8999 |
| 0.1106 | 6.5844 | 6400 | 0.3066 | 0.9358 | 0.9020 |
| 0.1031 | 6.6872 | 6500 | 0.2971 | 0.9387 | 0.9034 |
| 0.0691 | 6.7901 | 6600 | 0.2932 | 0.9416 | 0.9130 |
| 0.0779 | 6.8930 | 6700 | 0.3020 | 0.9410 | 0.9121 |
| 0.0676 | 6.9959 | 6800 | 0.2931 | 0.9416 | 0.9144 |
| 0.0721 | 7.0988 | 6900 | 0.2877 | 0.9398 | 0.9114 |
| 0.0552 | 7.2016 | 7000 | 0.2978 | 0.9450 | 0.9241 |
| 0.0502 | 7.3045 | 7100 | 0.2899 | 0.9416 | 0.9122 |
| 0.0572 | 7.4074 | 7200 | 0.2941 | 0.9421 | 0.9131 |
| 0.079 | 7.5103 | 7300 | 0.2913 | 0.9427 | 0.9161 |
| 0.0676 | 7.6132 | 7400 | 0.2965 | 0.9410 | 0.9135 |
| 0.0673 | 7.7160 | 7500 | 0.3055 | 0.9404 | 0.9076 |
| 0.0709 | 7.8189 | 7600 | 0.3005 | 0.9421 | 0.9122 |
| 0.0709 | 7.9218 | 7700 | 0.3023 | 0.9416 | 0.9160 |
| 0.0679 | 8.0247 | 7800 | 0.2973 | 0.9410 | 0.9108 |
| 0.0542 | 8.1276 | 7900 | 0.2913 | 0.9427 | 0.9174 |
| 0.0605 | 8.2305 | 8000 | 0.3070 | 0.9421 | 0.9154 |
| 0.0578 | 8.3333 | 8100 | 0.3053 | 0.9410 | 0.9140 |
| 0.0457 | 8.4362 | 8200 | 0.2946 | 0.9450 | 0.9214 |
| 0.0631 | 8.5391 | 8300 | 0.3010 | 0.9433 | 0.9094 |
| 0.0563 | 8.6420 | 8400 | 0.2957 | 0.9433 | 0.9177 |
| 0.0624 | 8.7449 | 8500 | 0.2965 | 0.9468 | 0.9209 |
| 0.0557 | 8.8477 | 8600 | 0.3052 | 0.9416 | 0.9135 |
| 0.058 | 8.9506 | 8700 | 0.3091 | 0.9416 | 0.9129 |
| 0.0319 | 9.0535 | 8800 | 0.3078 | 0.9439 | 0.9156 |
| 0.0454 | 9.1564 | 8900 | 0.3069 | 0.9433 | 0.9109 |
| 0.0583 | 9.2593 | 9000 | 0.2996 | 0.9450 | 0.9156 |
| 0.0668 | 9.3621 | 9100 | 0.3026 | 0.9427 | 0.9132 |
| 0.0615 | 9.4650 | 9200 | 0.3065 | 0.9421 | 0.9115 |
| 0.0506 | 9.5679 | 9300 | 0.3037 | 0.9439 | 0.9150 |
| 0.0514 | 9.6708 | 9400 | 0.3012 | 0.9450 | 0.9164 |
| 0.0447 | 9.7737 | 9500 | 0.3004 | 0.9450 | 0.9156 |
| 0.0532 | 9.8765 | 9600 | 0.3009 | 0.9444 | 0.9147 |
| 0.0568 | 9.9794 | 9700 | 0.3012 | 0.9444 | 0.9147 |
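
The Accuracy and Macro F1 columns are standard multi-class classification metrics. A sketch of the kind of compute_metrics function typically passed to the Hugging Face Trainer to produce these columns follows; this is an assumption, since the actual training code is not included in this card.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    """Assumed metric computation: accuracy and macro-averaged F1,
    matching the columns reported in the table above."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, predictions),
        "macro_f1": f1_score(labels, predictions, average="macro"),
    }
```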

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.6.0+cu124
  • Datasets 2.14.4
  • Tokenizers 0.21.1