timesformer-base-finetuned-k400-finetuned-MicroLens_classification

This model is a fine-tuned version of facebook/timesformer-base-finetuned-k400 on a MicroLens binary video-classification dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6900
  • Accuracy: 0.5327
  • Class 0 (support 88401): Precision 0.5219, Recall 0.8770, F1-score 0.6544
  • Class 1 (support 86829): Precision 0.5927, Recall 0.1822, F1-score 0.2787
  • Macro avg (support 175230): Precision 0.5573, Recall 0.5296, F1-score 0.4666
  • Weighted avg (support 175230): Precision 0.5570, Recall 0.5327, F1-score 0.4682
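
The per-class and averaged figures above follow the layout of scikit-learn's classification report. A minimal sketch of how such metrics can be computed from labels and predictions; the `y_true` and `y_pred` arrays here are illustrative placeholders, not the actual evaluation pipeline:

```python
# Minimal sketch: per-class and averaged metrics as reported above.
# `y_true` and `y_pred` are placeholder arrays standing in for the real evaluation data.
import numpy as np
from sklearn.metrics import accuracy_score, classification_report

y_true = np.array([0, 1, 0, 1, 1, 0])  # ground-truth binary labels
y_pred = np.array([0, 0, 0, 1, 0, 0])  # model predictions (argmax over logits)

print("Accuracy:", accuracy_score(y_true, y_pred))

# output_dict=True returns per-class precision/recall/F1/support
# plus the macro and weighted averages shown in the list above.
report = classification_report(y_true, y_pred, output_dict=True)
for label, stats in report.items():
    print(label, stats)
```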

Model description

More information needed

Intended uses & limitations

More information needed
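
Usage details are not documented here, but the base architecture is TimeSformer for video classification, so inference would presumably follow the standard Transformers video-classification flow. A minimal sketch under stated assumptions: 8 uniformly sampled RGB frames, the image processor taken from the base checkpoint, and a random placeholder clip instead of real decoded video:

```python
# Minimal inference sketch (assumptions: 8 sampled frames, processor from the base checkpoint,
# random placeholder frames in place of a real decoded video clip).
import numpy as np
import torch
from transformers import AutoImageProcessor, TimesformerForVideoClassification

ckpt = "Kartikeya/timesformer-base-finetuned-k400-finetuned-MicroLens_classification"
processor = AutoImageProcessor.from_pretrained("facebook/timesformer-base-finetuned-k400")
model = TimesformerForVideoClassification.from_pretrained(ckpt)
model.eval()

# Placeholder clip: 8 frames of shape (height, width, 3); replace with real video frames.
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(8)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label.get(pred, pred))
```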

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 11600
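
A minimal sketch of how these settings map onto `TrainingArguments` for the Hugging Face `Trainer`. The dataset objects, output directory, and evaluation cadence are placeholders or assumptions, not the actual training script:

```python
# Minimal sketch mapping the listed hyperparameters onto TrainingArguments.
# The base checkpoint and hyperparameter values come from this card; the datasets,
# output directory, and eval cadence are placeholders/assumptions.
from transformers import (
    TimesformerForVideoClassification,
    Trainer,
    TrainingArguments,
)

model = TimesformerForVideoClassification.from_pretrained(
    "facebook/timesformer-base-finetuned-k400",
    num_labels=2,
    ignore_mismatched_sizes=True,  # swap the 400-class Kinetics head for a binary head
)

args = TrainingArguments(
    output_dir="timesformer-microlens",  # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=11600,
    eval_strategy="epoch",  # assumption: the results table reports one evaluation per epoch
)

train_dataset = None  # placeholder: replace with a preprocessed video Dataset
eval_dataset = None   # placeholder: replace with a preprocessed video Dataset

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
)
# trainer.train()  # uncomment once real datasets are supplied
```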

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Class 0 Precision | Class 0 Recall | Class 0 F1 | Class 0 Support | Class 1 Precision | Class 1 Recall | Class 1 F1 | Class 1 Support | Macro Avg Precision | Macro Avg Recall | Macro Avg F1 | Macro Avg Support | Weighted Avg Precision | Weighted Avg Recall | Weighted Avg F1 | Weighted Avg Support |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.6962 | 0.05 | 580 | 0.6924 | 0.5439 | 0.5404 | 0.6402 | 0.5861 | 88401 | 0.5489 | 0.4457 | 0.4920 | 86829 | 0.5447 | 0.5430 | 0.5390 | 175230 | 0.5446 | 0.5439 | 0.5395 | 175230 |
| 0.7128 | 1.05 | 1160 | 0.6996 | 0.4982 | 0.5020 | 0.6570 | 0.5691 | 88401 | 0.4908 | 0.3366 | 0.3993 | 86829 | 0.4964 | 0.4968 | 0.4842 | 175230 | 0.4965 | 0.4982 | 0.4850 | 175230 |
| 0.7141 | 2.0501 | 1741 | 0.7067 | 0.5070 | 0.5611 | 0.1043 | 0.1759 | 88401 | 0.5014 | 0.9170 | 0.6483 | 86829 | 0.5313 | 0.5106 | 0.4121 | 175230 | 0.5315 | 0.5070 | 0.4100 | 175230 |
| 0.7048 | 3.0501 | 2322 | 0.6925 | 0.5304 | 0.5485 | 0.3911 | 0.4566 | 88401 | 0.5202 | 0.6722 | 0.5865 | 86829 | 0.5344 | 0.5317 | 0.5216 | 175230 | 0.5345 | 0.5304 | 0.5210 | 175230 |
| 0.6834 | 4.05 | 2902 | 0.6919 | 0.5342 | 0.5977 | 0.2343 | 0.3366 | 88401 | 0.5185 | 0.8395 | 0.6410 | 86829 | 0.5581 | 0.5369 | 0.4888 | 175230 | 0.5585 | 0.5342 | 0.4875 | 175230 |
| 0.6908 | 5.0501 | 3483 | 0.6900 | 0.5327 | 0.5219 | 0.8770 | 0.6544 | 88401 | 0.5927 | 0.1822 | 0.2787 | 86829 | 0.5573 | 0.5296 | 0.4666 | 175230 | 0.5570 | 0.5327 | 0.4682 | 175230 |

Framework versions

  • Transformers 4.46.3
  • Pytorch 2.0.0+cu117
  • Datasets 3.1.0
  • Tokenizers 0.20.3