---
license: llama3.2
base_model: meta-llama/Llama-3.2-3B-Instruct
tags:
- gguf
- psychology
- quiz-generation
- education
---

# Llama-3.2-3B Psychology Quiz Generator (FP16 GGUF)

This is a GGUF (FP16) version of Llama-3.2-3B fine-tuned for generating psychology quiz questions.

## Model Details

- **Base Model**: meta-llama/Llama-3.2-3B-Instruct
- **LoRA Adapter**: Ghadafares2/llama3b-psych-mcqqa-lora
- **Format**: GGUF FP16
- **Size**: 5.99 GB
- **Use Case**: Psychology quiz question generation with RAG

## Usage with llama.cpp

```python
from llama_cpp import Llama

llm = Llama(
    model_path="Llama-3.2-3B-Psychology-FP16.gguf",
    n_ctx=2048,
    n_threads=8
)

# Retrieve context from a psychology textbook first
context = "..."  # Your RAG context here

prompt = f"""[INST] The following context is from a psychology textbook:

{context}

Based on the context above, generate 1 multiple-choice question about classical conditioning. [/INST]"""

output = llm(prompt, max_tokens=250, temperature=0.2)
print(output['choices'][0]['text'])
```

## Important Notes

- This model was fine-tuned to work WITH RAG (textbook context)
- Best results require providing relevant psychology textbook excerpts
- For production use, consider the Q4_K_M quantized version for faster inference
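The usage example above leaves the retrieval step (`context = "..."`) to you. The sketch below shows one minimal way to fill that variable: a hypothetical keyword-overlap ranker over pre-chunked textbook passages. The `retrieve_context` helper and the sample chunks are illustrative assumptions, not part of this repository; a production setup would typically use embeddings and a vector store instead.

```python
# Minimal RAG retrieval sketch: rank pre-chunked textbook passages by
# keyword overlap with the quiz topic. Purely illustrative -- a real
# pipeline would use embeddings and a vector store.

def retrieve_context(query: str, chunks: list[str], top_k: int = 2) -> str:
    """Return the top_k chunks sharing the most words with the query."""
    query_words = set(query.lower().split())

    def score(chunk: str) -> int:
        # Count distinct query words appearing in the chunk
        return len(query_words & set(chunk.lower().split()))

    ranked = sorted(chunks, key=score, reverse=True)
    return "\n\n".join(ranked[:top_k])

# Hypothetical textbook excerpts (placeholders, not from the model's data)
chunks = [
    "Classical conditioning pairs a neutral stimulus with an unconditioned stimulus.",
    "Operant conditioning shapes behavior through reinforcement and punishment.",
    "Short-term memory holds a limited number of items for a brief period.",
]

context = retrieve_context("classical conditioning", chunks, top_k=1)
print(context)
```

The resulting `context` string can then be interpolated into the `[INST]` prompt from the usage example above.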