# Llama-3.2-3B Psychology Quiz Generator (FP16 GGUF)
This is a GGUF (FP16) version of Llama-3.2-3B fine-tuned for generating psychology quiz questions.
## Model Details
- Base Model: meta-llama/Llama-3.2-3B-Instruct
- LoRA Adapter: Ghadafares2/llama3b-psych-mcqqa-lora
- Format: GGUF FP16
- Size: 5.99 GB
- Use Case: Psychology quiz question generation with RAG
## Usage with llama.cpp (via `llama-cpp-python`)
```python
from llama_cpp import Llama

llm = Llama(
    model_path="Llama-3.2-3B-Psychology-FP16.gguf",
    n_ctx=2048,
    n_threads=8,
)

# Retrieve context from psychology textbook first
context = "..."  # Your RAG context here

prompt = f"""[INST] The following context is from a psychology textbook:
{context}
Based on the context above, generate 1 multiple-choice question about classical conditioning.
[/INST]"""

output = llm(prompt, max_tokens=250, temperature=0.2)
print(output['choices'][0]['text'])
```
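The `context` placeholder must be filled by your own retrieval step. As a minimal sketch of what that step could look like, here is a toy keyword-overlap retriever over pre-chunked textbook passages; `retrieve_context` and the sample passages are illustrative assumptions, not part of this model — a production RAG pipeline would typically use an embedding model and a vector store instead:

```python
# Toy retriever: rank passages by word overlap with the query (illustrative only;
# real RAG setups use embeddings over textbook chunks).
def retrieve_context(query: str, passages: list[str], top_k: int = 1) -> str:
    """Return the top_k passages sharing the most words with the query."""
    query_words = set(query.lower().split())
    scored = sorted(
        passages,
        key=lambda p: len(query_words & set(p.lower().split())),
        reverse=True,
    )
    return "\n\n".join(scored[:top_k])

passages = [
    "Classical conditioning pairs a neutral stimulus with an unconditioned stimulus.",
    "Operant conditioning shapes behavior through reinforcement and punishment.",
]
context = retrieve_context("classical conditioning", passages)
```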
## Important Notes
- This model was fine-tuned to work WITH RAG (textbook context)
- Best results require providing relevant psychology textbook excerpts
- For production use, consider the Q4_K_M quantized version for faster inference
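If a Q4_K_M file is not available, one can be produced from the FP16 GGUF with llama.cpp's `llama-quantize` tool. A sketch, assuming a local llama.cpp build (the binary name and output filename below are illustrative and depend on your build):

```shell
# Quantize the FP16 GGUF to Q4_K_M (binary name/paths depend on your llama.cpp build)
./llama-quantize Llama-3.2-3B-Psychology-FP16.gguf \
                 Llama-3.2-3B-Psychology-Q4_K_M.gguf Q4_K_M
```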