jaffe_V2_200_1

This model is a fine-tuned version of WinKawaks/vit-tiny-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3747
  • Accuracy: 0.9
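
Since this checkpoint is a standard ViT image classifier, it should be loadable with the Transformers image-classification pipeline. The snippet below is a minimal usage sketch, assuming the model is published under ricardoSLabs/jaffe_V2_200_1 (the repository path shown on the model page); the input file name is a placeholder.

```python
# Minimal usage sketch; the repo id and "face.png" are illustrative.
from transformers import pipeline

classifier = pipeline("image-classification", model="ricardoSLabs/jaffe_V2_200_1")

predictions = classifier("face.png")  # accepts a local path, URL, or PIL image
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```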

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch mirroring them follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 200
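
For readers who want to reproduce this setup, here is a hedged sketch of the corresponding Hugging Face TrainingArguments. The output directory and the evaluation/logging strategies are assumptions (evaluation per epoch is inferred from the results table below); everything else mirrors the listed values.

```python
# Sketch of TrainingArguments matching the hyperparameters above.
# output_dir, eval_strategy and logging_strategy are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="jaffe_V2_200_1",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # effective train batch size: 32 * 4 = 128
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=200,
    eval_strategy="epoch",
    logging_strategy="epoch",
)
```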

Training results

Training Loss Epoch Step Validation Loss Accuracy
No log 1.0 1 2.4997 0.0667
No log 2.0 2 2.6037 0.1
No log 3.0 3 2.3924 0.0667
No log 4.0 4 2.3152 0.1
No log 5.0 5 2.1146 0.1667
No log 6.0 6 2.1610 0.2333
No log 7.0 7 2.1346 0.1333
No log 8.0 8 2.1400 0.1
No log 9.0 9 2.1422 0.0667
2.3217 10.0 10 2.0948 0.1333
2.3217 11.0 11 2.0994 0.2
2.3217 12.0 12 1.8570 0.3333
2.3217 13.0 13 1.9750 0.2667
2.3217 14.0 14 1.8089 0.3
2.3217 15.0 15 1.8738 0.3
2.3217 16.0 16 1.7751 0.3333
2.3217 17.0 17 1.7744 0.2
2.3217 18.0 18 1.7998 0.3333
2.3217 19.0 19 1.7048 0.2667
1.798 20.0 20 1.6367 0.4
1.798 21.0 21 1.6092 0.3
1.798 22.0 22 1.5605 0.3667
1.798 23.0 23 1.4219 0.5
1.798 24.0 24 1.5037 0.4
1.798 25.0 25 1.3966 0.4333
1.798 26.0 26 1.4327 0.4
1.798 27.0 27 1.3484 0.4
1.798 28.0 28 1.3958 0.4
1.798 29.0 29 1.2789 0.4667
1.1133 30.0 30 1.2002 0.4333
1.1133 31.0 31 1.1080 0.4667
1.1133 32.0 32 0.9814 0.6
1.1133 33.0 33 1.0498 0.5667
1.1133 34.0 34 0.9709 0.6333
1.1133 35.0 35 0.9985 0.5333
1.1133 36.0 36 0.8779 0.6667
1.1133 37.0 37 0.7959 0.7
1.1133 38.0 38 0.7583 0.7
1.1133 39.0 39 1.0074 0.5667
0.5945 40.0 40 0.6441 0.6667
0.5945 41.0 41 0.7701 0.6667
0.5945 42.0 42 0.8433 0.6667
0.5945 43.0 43 0.7998 0.6667
0.5945 44.0 44 0.7087 0.7
0.5945 45.0 45 0.5793 0.8333
0.5945 46.0 46 0.5024 0.8
0.5945 47.0 47 0.8088 0.7
0.5945 48.0 48 0.7690 0.7
0.5945 49.0 49 0.8561 0.6667
0.3008 50.0 50 0.4728 0.8667
0.3008 51.0 51 0.5935 0.6667
0.3008 52.0 52 0.3772 0.9
0.3008 53.0 53 0.6337 0.6333
0.3008 54.0 54 0.6097 0.7
0.3008 55.0 55 0.4838 0.8333
0.3008 56.0 56 0.5487 0.8333
0.3008 57.0 57 0.5395 0.8
0.3008 58.0 58 0.5078 0.7667
0.3008 59.0 59 0.4211 0.8
0.1792 60.0 60 0.4578 0.8333
0.1792 61.0 61 0.4603 0.8333
0.1792 62.0 62 0.2765 0.9
0.1792 63.0 63 0.6634 0.7333
0.1792 64.0 64 0.3247 0.9
0.1792 65.0 65 0.6290 0.6667
0.1792 66.0 66 0.5741 0.8
0.1792 67.0 67 0.3994 0.8333
0.1792 68.0 68 0.4273 0.8333
0.1792 69.0 69 0.4240 0.7333
0.1158 70.0 70 0.4269 0.8333
0.1158 71.0 71 0.4764 0.8333
0.1158 72.0 72 0.3892 0.8667
0.1158 73.0 73 0.5258 0.8
0.1158 74.0 74 0.3253 0.8333
0.1158 75.0 75 0.5055 0.7667
0.1158 76.0 76 0.6183 0.7667
0.1158 77.0 77 0.3801 0.9
0.1158 78.0 78 0.5568 0.7333
0.1158 79.0 79 0.3794 0.8333
0.0936 80.0 80 0.2896 0.9
0.0936 81.0 81 0.5924 0.7667
0.0936 82.0 82 0.5123 0.8333
0.0936 83.0 83 0.6333 0.8
0.0936 84.0 84 0.4452 0.7333
0.0936 85.0 85 0.4296 0.8333
0.0936 86.0 86 0.3000 0.8667
0.0936 87.0 87 0.3882 0.8667
0.0936 88.0 88 0.5478 0.7333
0.0936 89.0 89 0.3075 0.8667
0.0473 90.0 90 0.5298 0.8
0.0473 91.0 91 0.6640 0.7333
0.0473 92.0 92 0.4580 0.8333
0.0473 93.0 93 0.5458 0.7333
0.0473 94.0 94 0.4686 0.8333
0.0473 95.0 95 0.2982 0.8333
0.0473 96.0 96 0.4537 0.8333
0.0473 97.0 97 0.3308 0.8667
0.0473 98.0 98 0.4839 0.8
0.0473 99.0 99 0.4554 0.8
0.0443 100.0 100 0.2150 0.9667
0.0443 101.0 101 0.3185 0.9333
0.0443 102.0 102 0.2575 0.9
0.0443 103.0 103 0.3313 0.8667
0.0443 104.0 104 0.4836 0.8333
0.0443 105.0 105 0.3910 0.8667
0.0443 106.0 106 0.5569 0.8333
0.0443 107.0 107 0.4688 0.8667
0.0443 108.0 108 0.2292 0.9333
0.0443 109.0 109 0.4958 0.8
0.0353 110.0 110 0.3628 0.9
0.0353 111.0 111 0.6191 0.7333
0.0353 112.0 112 0.5096 0.8
0.0353 113.0 113 0.3478 0.9
0.0353 114.0 114 0.3585 0.8667
0.0353 115.0 115 0.3859 0.8
0.0353 116.0 116 0.3952 0.8333
0.0353 117.0 117 0.4491 0.8333
0.0353 118.0 118 0.4710 0.8
0.0353 119.0 119 0.5375 0.8
0.0292 120.0 120 0.6853 0.8333
0.0292 121.0 121 0.4836 0.8
0.0292 122.0 122 0.5246 0.8
0.0292 123.0 123 0.4446 0.8667
0.0292 124.0 124 0.4238 0.8
0.0292 125.0 125 0.3543 0.8333
0.0292 126.0 126 0.2007 0.9333
0.0292 127.0 127 0.2274 0.9333
0.0292 128.0 128 0.3778 0.8333
0.0292 129.0 129 0.4544 0.8333
0.0296 130.0 130 0.2613 0.8667
0.0296 131.0 131 0.3248 0.9
0.0296 132.0 132 0.4552 0.8
0.0296 133.0 133 0.4356 0.8333
0.0296 134.0 134 0.3427 0.9
0.0296 135.0 135 0.1513 1.0
0.0296 136.0 136 0.3139 0.8333
0.0296 137.0 137 0.3094 0.9
0.0296 138.0 138 0.3401 0.8667
0.0296 139.0 139 0.4339 0.9333
0.0178 140.0 140 0.2465 0.9
0.0178 141.0 141 0.4604 0.8667
0.0178 142.0 142 0.4860 0.8
0.0178 143.0 143 0.3710 0.8333
0.0178 144.0 144 0.4719 0.8333
0.0178 145.0 145 0.3030 0.9333
0.0178 146.0 146 0.6212 0.7667
0.0178 147.0 147 0.2716 0.9
0.0178 148.0 148 0.4297 0.8333
0.0178 149.0 149 0.3456 0.8333
0.0103 150.0 150 0.4718 0.8667
0.0103 151.0 151 0.3841 0.8333
0.0103 152.0 152 0.4124 0.9333
0.0103 153.0 153 0.2595 0.9333
0.0103 154.0 154 0.2666 0.8667
0.0103 155.0 155 0.4872 0.7333
0.0103 156.0 156 0.4039 0.8333
0.0103 157.0 157 0.3004 0.8667
0.0103 158.0 158 0.3021 0.9
0.0103 159.0 159 0.4477 0.9
0.0075 160.0 160 0.3548 0.9333
0.0075 161.0 161 0.2648 0.9333
0.0075 162.0 162 0.3269 0.9333
0.0075 163.0 163 0.5231 0.8
0.0075 164.0 164 0.2841 0.8667
0.0075 165.0 165 0.3145 0.9
0.0075 166.0 166 0.4291 0.8667
0.0075 167.0 167 0.5396 0.8333
0.0075 168.0 168 0.3873 0.9
0.0075 169.0 169 0.3150 0.9333
0.0062 170.0 170 0.3809 0.9
0.0062 171.0 171 0.2062 0.9
0.0062 172.0 172 0.3242 0.8667
0.0062 173.0 173 0.3500 0.9
0.0062 174.0 174 0.2784 0.9
0.0062 175.0 175 0.2553 0.8667
0.0062 176.0 176 0.4475 0.9
0.0062 177.0 177 0.3598 0.9333
0.0062 178.0 178 0.3488 0.8333
0.0062 179.0 179 0.2966 0.8333
0.0056 180.0 180 0.4635 0.8
0.0056 181.0 181 0.2402 0.9
0.0056 182.0 182 0.3984 0.8667
0.0056 183.0 183 0.2032 0.9
0.0056 184.0 184 0.2633 0.8333
0.0056 185.0 185 0.3015 0.9333
0.0056 186.0 186 0.3774 0.9
0.0056 187.0 187 0.5716 0.8333
0.0056 188.0 188 0.3961 0.8667
0.0056 189.0 189 0.3915 0.9
0.0048 190.0 190 0.3788 0.8333
0.0048 191.0 191 0.4823 0.8667
0.0048 192.0 192 0.3158 0.8667
0.0048 193.0 193 0.2184 0.8667
0.0048 194.0 194 0.3363 0.8667
0.0048 195.0 195 0.3996 0.9
0.0048 196.0 196 0.2263 0.8333
0.0048 197.0 197 0.4634 0.8333
0.0048 198.0 198 0.3492 0.8667
0.0048 199.0 199 0.3086 0.9
0.0034 200.0 200 0.3747 0.9

Framework versions

  • Transformers 4.47.0
  • Pytorch 2.5.1+cu121
  • Datasets 3.3.1
  • Tokenizers 0.21.0
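
To confirm that a local environment matches these versions, a small sanity-check snippet (purely illustrative, not part of the original training script):

```python
# Print installed versions of the frameworks listed above.
import transformers, torch, datasets, tokenizers

print("Transformers:", transformers.__version__)  # expected 4.47.0
print("PyTorch:", torch.__version__)              # expected 2.5.1+cu121
print("Datasets:", datasets.__version__)          # expected 3.3.1
print("Tokenizers:", tokenizers.__version__)      # expected 0.21.0
```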