mounir4

This model is a fine-tuned version of facebook/wav2vec2-base on an unspecified dataset. It achieves the following results on the evaluation set (a brief inference sketch follows the results):

  • Loss: 0.6829
  • Wer: 1

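Since the card reports a word error rate, the checkpoint was presumably fine-tuned for CTC-based speech recognition. The following is a minimal, hedged inference sketch; the Hub repository id, the audio file name, and the use of librosa for loading are assumptions, not part of the original card.

```python
# Minimal ASR inference sketch for a wav2vec2 CTC checkpoint.
# The repository id below is a hypothetical placeholder.
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "your-username/mounir4"  # assumption: replace with the actual Hub repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load a mono waveform at 16 kHz (the sampling rate wav2vec2-base expects).
speech, _ = librosa.load("sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token ids.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```
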
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • training_steps: 10000
  • mixed_precision_training: Native AMP

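A minimal sketch of how these settings map onto transformers TrainingArguments, assuming the standard Trainer-based fine-tuning loop; the output directory is a hypothetical placeholder and not part of the original card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mounir4",               # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,      # 16 * 4 = 64 effective train batch size
    lr_scheduler_type="linear",
    warmup_steps=1000,
    max_steps=10000,
    fp16=True,                          # "Native AMP" mixed-precision training
)
```
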
Training results

Training Loss | Epoch  | Step  | Validation Loss | Wer
3.3494        | 8.51   | 500   | 3.1482          | 1
2.9331        | 17.02  | 1000  | 2.9053          | 1
2.8691        | 25.53  | 1500  | 2.8793          | 1
2.8393        | 34.04  | 2000  | 2.8696          | 1
1.9588        | 42.55  | 2500  | 1.5982          | 1
0.9108        | 51.06  | 3000  | 0.8335          | 1
0.7196        | 59.57  | 3500  | 0.7443          | 1
0.6198        | 68.09  | 4000  | 0.6949          | 1
0.5558        | 76.6   | 4500  | 0.6862          | 1
0.5152        | 85.11  | 5000  | 0.6743          | 1
0.4781        | 93.62  | 5500  | 0.6668          | 1
0.4442        | 102.13 | 6000  | 0.6587          | 1
0.4255        | 110.64 | 6500  | 0.6498          | 1
0.408         | 119.15 | 7000  | 0.6698          | 1
0.3888        | 127.66 | 7500  | 0.6739          | 1
0.3815        | 136.17 | 8000  | 0.6754          | 1
0.3704        | 144.68 | 8500  | 0.6843          | 1
0.3625        | 153.19 | 9000  | 0.6707          | 1
0.356         | 161.7  | 9500  | 0.6812          | 1
0.3541        | 170.21 | 10000 | 0.6829          | 1

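The Wer column above is the word error rate, i.e. the fraction of word-level errors measured against the reference transcripts. Below is a minimal sketch of how WER is commonly computed with the Hugging Face evaluate library; whether that exact library produced the numbers above is an assumption.

```python
# Compute word error rate between predicted and reference transcripts.
import evaluate

wer_metric = evaluate.load("wer")
wer = wer_metric.compute(
    predictions=["hello world"],
    references=["hello word"],
)
print(wer)  # 0.5: one of the two reference words was transcribed incorrectly
```
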
Framework versions

  • Transformers 4.28.0
  • Pytorch 2.0.1+cu117
  • Datasets 2.12.0
  • Tokenizers 0.13.3
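
A small sketch to check that a local environment matches the versions listed above:

```python
# Verify the runtime matches the versions this model was trained with.
import datasets
import tokenizers
import torch
import transformers

print(transformers.__version__)  # expected: 4.28.0
print(torch.__version__)         # expected: 2.0.1+cu117
print(datasets.__version__)      # expected: 2.12.0
print(tokenizers.__version__)    # expected: 0.13.3
```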