# sapie-model/SQL-sft-grpo-3000
- This repo contains only the LoRA/adapter weights. For inference, load it together with the base model OpenPipe/gemma-3-27b-it-text-only.
## Usage example

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

base = "OpenPipe/gemma-3-27b-it-text-only"
model_id = "sapie-model/SQL-sft-grpo-3000"  # this adapter repo

tok = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(model, model_id)
```
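With the adapter attached, the model can be prompted for text-to-SQL generation. The sketch below is illustrative only: the `build_prompt` helper, the example schema, the sample question, and the generation parameters are all assumptions, not the exact prompt template this adapter was trained on.

```python
def build_prompt(question: str, schema: str) -> str:
    """Assemble a simple text-to-SQL instruction prompt.

    Illustrative format only; not necessarily the template the
    adapter was trained with.
    """
    return (
        "Given the following database schema, write a SQL query "
        f"that answers the question.\n\nSchema:\n{schema}\n\n"
        f"Question: {question}\nSQL:"
    )


if __name__ == "__main__":
    # Heavy imports are kept inside the guard so the helper above
    # can be reused without pulling in torch/transformers/peft.
    import torch
    from transformers import AutoTokenizer, AutoModelForCausalLM
    from peft import PeftModel

    base = "OpenPipe/gemma-3-27b-it-text-only"
    model_id = "sapie-model/SQL-sft-grpo-3000"

    tok = AutoTokenizer.from_pretrained(base)
    model = AutoModelForCausalLM.from_pretrained(
        base, torch_dtype="auto", device_map="auto"
    )
    model = PeftModel.from_pretrained(model, model_id)

    prompt = build_prompt(
        "How many employees earn more than 50000?",  # hypothetical question
        "CREATE TABLE employees (id INT, name TEXT, salary INT);",
    )
    inputs = tok(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        out = model.generate(**inputs, max_new_tokens=128, do_sample=False)
    # Decode only the newly generated tokens, not the echoed prompt.
    print(tok.decode(out[0][inputs["input_ids"].shape[1]:],
                     skip_special_tokens=True))
```

Greedy decoding (`do_sample=False`) is a sensible default for SQL generation, where determinism usually matters more than diversity.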
## Model tree for sapie-model/SQL-sft-grpo-3000

- Base model: OpenPipe/gemma-3-27b-it-text-only