Uploaded finetuned model

  • Developed by: AhmetSemih
  • License: apache-2.0
  • Finetuned from model: unsloth/gemma-3-4b-it-unsloth-bnb-4bit

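The card does not document the exact training setup. For orientation only, the sketch below shows one common way to attach LoRA adapters to the 4-bit base model listed above using transformers and peft; the hyperparameters and target modules are illustrative placeholders, not the settings used for this checkpoint.

```python
# Illustrative LoRA setup on the pre-quantized bnb-4bit base model
# (hyperparameters and target modules are placeholders, not the author's).
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_id = "unsloth/gemma-3-4b-it-unsloth-bnb-4bit"

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")

lora_config = LoraConfig(
    r=16,
    lora_alpha=16,
    lora_dropout=0.0,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```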
This model was trained on:

  1. Wikipedia (Turkish) – 21,399 articles
     Source: wikimedia/wikipedia (20231101.tr)
  2. MMLU-style QA Dataset in Turkish – 5,000 examples
     Source: alibayram/turkish_mmlu
     Format: Multiple-choice questions.
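Both sources are available on the Hugging Face Hub and can be pulled with the datasets library. The split names below are assumptions for illustration, and the card's counts (21,399 articles, 5,000 examples) imply a subset whose selection is not described here.

```python
# Illustrative loading of the two training sources named above
# (split names and any subsampling are assumptions, not the author's recipe).
from datasets import load_dataset

# Turkish Wikipedia dump (the model used a 21,399-article subset).
wiki_tr = load_dataset("wikimedia/wikipedia", "20231101.tr", split="train")

# Turkish MMLU-style multiple-choice questions (5,000 examples were used).
turkish_mmlu = load_dataset("alibayram/turkish_mmlu", split="train")

print(len(wiki_tr), "Wikipedia articles;", len(turkish_mmlu), "MMLU-style examples")
```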

  • Format: Safetensors
  • Model size: 4B params
  • Tensor type: BF16
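Because the weights ship as BF16 safetensors, the model can be run directly with transformers. The sketch below assumes the checkpoint loads through the standard auto classes (a checkpoint that keeps Gemma 3's multimodal architecture may need Gemma3ForConditionalGeneration and AutoProcessor instead); the Turkish prompt is purely illustrative.

```python
# Minimal inference sketch (class choice and prompt are assumptions, see above).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AhmetSemih/gemma-3-4b-finetuned"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",
)

# An MMLU-style Turkish multiple-choice question (illustrative content).
messages = [{
    "role": "user",
    "content": "Soru: Türkiye'nin başkenti neresidir?\n"
               "A) İstanbul  B) Ankara  C) İzmir  D) Bursa\n"
               "Yalnızca doğru seçeneğin harfini yaz.",
}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    out = model.generate(inputs, max_new_tokens=16)

print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```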