ASR_Whisper_Degenerative_Brain_Argumentation

This model is a fine-tuned version of openai/whisper-small on the ASR_Preprocess_Degenerative_Brain_Dataset_Augmentation dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4547
  • Cer: 19.5682
  • Wer: 26.4223
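Cer and Wer above are character and word error rates on the percentage scale. As a reference for how these numbers are defined, here is a minimal sketch: both are edit (Levenshtein) distance normalized by reference length. This plain-Python version illustrates the formula only; it is not the exact evaluation code used for this card.

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (rolling 1-D DP)."""
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,       # deletion
                        dp[j - 1] + 1,   # insertion
                        prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev = cur
    return dp[n]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate as a percentage, matching the scale used in this card."""
    ref_words = reference.split()
    return 100.0 * edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate as a percentage."""
    return 100.0 * edit_distance(list(reference), list(hypothesis)) / len(reference)

print(round(wer("the quick brown fox", "the quick brown box"), 2))  # 25.0 (1 error in 4 words)
```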

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 320
  • training_steps: 3232
  • mixed_precision_training: Native AMP
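Note that the effective batch size of 16 is the per-device batch size of 8 times the 2 gradient accumulation steps. The learning-rate schedule implied by the settings above (linear with warmup) can be sketched as a small function; warmup_steps=320, training_steps=3232, and the 3e-05 peak come from the list, and the shape mirrors transformers' `get_linear_schedule_with_warmup`:

```python
def linear_lr(step, base_lr=3e-05, warmup_steps=320, training_steps=3232):
    """Linear warmup from 0 to base_lr, then linear decay to 0 at training_steps."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (training_steps - step) / (training_steps - warmup_steps))

print(linear_lr(160))   # halfway through warmup: 1.5e-05
print(linear_lr(320))   # peak learning rate: 3e-05
print(linear_lr(3232))  # end of training: 0.0
```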

Training results

| Training Loss | Epoch  | Step | Validation Loss | Cer     | Wer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|
| 0.9359        | 0.9913 | 400  | 0.9269          | 30.1532 | 43.4307 |
| 0.4058        | 1.9814 | 800  | 0.6448          | 30.0489 | 48.1589 |
| 0.1813        | 2.9715 | 1200 | 0.4917          | 34.0008 | 37.9538 |
| 0.1193        | 3.9616 | 1600 | 0.4691          | 28.8523 | 34.5848 |
| 0.0461        | 4.9517 | 2000 | 0.4717          | 19.7738 | 27.4559 |
| 0.0257        | 5.9418 | 2400 | 0.4440          | 19.5613 | 27.2727 |
| 0.01          | 6.9318 | 2800 | 0.4532          | 20.5277 | 27.4314 |
| 0.01          | 7.9219 | 3200 | 0.4547          | 19.5682 | 26.4223 |

Framework versions

  • Transformers 4.53.0.dev0
  • Pytorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.1

Model tree for yoona-J/ASR_Whisper_Cerebral_Palsy

  • Finetuned from: openai/whisper-small

Dataset used to train yoona-J/ASR_Whisper_Cerebral_Palsy

  • ASR_Preprocess_Degenerative_Brain_Dataset_Augmentation

Evaluation results

  • Wer on ASR_Preprocess_Degenerative_Brain_Dataset_Augmentation: 26.422 (self-reported)