
Bharat PII Gemma 3 270M (GGUF)

Files

  • bharat-pii-gemma-3-270m-it-v0.7-f16.gguf
  • bharat-pii-gemma-3-270m-it-v0.7_q8_0.gguf

Notes

  • GGUF is self-contained (includes tokenizer/config metadata); see the inspection sketch after this list.
  • Converted with llama.cpp.
  • Suggested: q8_0 for general use, f16 for reference/evaluation.
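
That self-contained claim can be checked by reading the GGUF header directly. A minimal sketch, assuming the gguf Python package from the llama.cpp repo is installed (pip install gguf) and the q8_0 file listed above sits in the working directory:

```python
# Minimal sketch: list the metadata keys embedded in the GGUF file.
# Assumes `pip install gguf` (the reader published from the llama.cpp repo).
from gguf import GGUFReader

reader = GGUFReader("bharat-pii-gemma-3-270m-it-v0.7_q8_0.gguf")

# Tokenizer, architecture, context length, etc. are stored as key/value
# fields inside the file, which is why no separate config or tokenizer
# download is needed.
for key in reader.fields:
    print(key)

print(f"{len(reader.tensors)} tensors in the file")
```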

Example (llama.cpp)

  • ./main -m ./bharat-pii-gemma-3-270m-it-v0.7_q8_0.gguf -p "My name is ..." (the CLI binary is named llama-cli in newer llama.cpp builds)
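
The model can also be run from Python via llama-cpp-python. A minimal sketch, assuming pip install llama-cpp-python; the prompt is only illustrative and should be adapted to your PII task:

```python
# Minimal sketch with llama-cpp-python; the chat template is read from
# the GGUF metadata of this instruction-tuned checkpoint.
from llama_cpp import Llama

llm = Llama(
    model_path="./bharat-pii-gemma-3-270m-it-v0.7_q8_0.gguf",
    n_ctx=2048,  # context window; adjust as needed
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "My name is ..."}],  # illustrative prompt
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```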
Model details

  • Architecture: gemma3
  • Parameters: ~0.3B (270M)