This is a Llama model with ~50M parameters.

The modeling files for this architecture are available in the accompanying GitHub repository.

  • Model Size: 52,177,152 parameters
  • Vocab Size: 32,768
  • Context Length: 512 tokens
  • Embedding Dimension: 256
  • Attention Heads: 128
  • KV Groups: 64
  • Hidden Dimension: 2,048
  • Number of Layers: 20
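
The stated model size can be checked against the configuration above. A minimal sketch, assuming the standard Llama layout (grouped-query attention, a SwiGLU MLP with gate/up/down projections, RMSNorm) and untied input/output embeddings:

```python
# Parameter count for a standard Llama architecture with the configuration
# listed above. Assumes untied embeddings and no biases, as in stock Llama.

VOCAB = 32_768
DIM = 256          # embedding dimension
HEADS = 128        # attention heads -> head_dim = DIM // HEADS = 2
KV_GROUPS = 64     # key/value heads (grouped-query attention)
HIDDEN = 2_048     # MLP hidden dimension
LAYERS = 20

head_dim = DIM // HEADS

# Attention projections: q and o are DIM x DIM; k and v are DIM x (KV_GROUPS * head_dim)
attn = 2 * DIM * DIM + 2 * DIM * KV_GROUPS * head_dim
# SwiGLU MLP: gate, up, and down projections
mlp = 3 * DIM * HIDDEN
# Two RMSNorm weight vectors per layer
norms = 2 * DIM

total = VOCAB * DIM                    # token embeddings
total += LAYERS * (attn + mlp + norms) # transformer blocks
total += DIM                           # final RMSNorm
total += VOCAB * DIM                   # LM head (untied)

print(total)  # 52177152 — matches the stated model size exactly
```

The exact match with 52,177,152 suggests the listed figure counts all weights, with the LM head stored separately from the token embeddings.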

Model tree for aliarda/llama-50M-latest: 1 model is finetuned from this model.

Dataset used to train aliarda/llama-50M-latest