zera09/gemma-dpo_v1_text

Tags: Transformers · TensorBoard · Safetensors · Generated from Trainer · trl · dpo
Paper: arXiv:2305.18290
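The repo is tagged `trl`/`dpo` and cites arXiv:2305.18290 (Direct Preference Optimization). As a minimal, self-contained sketch of the per-pair DPO loss from that paper: the function name and the scalar log-probability inputs below are illustrative, not taken from this repo's training code.

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    # DPO loss for one preference pair (Rafailov et al., 2023):
    # -log sigmoid(beta * [(log pi(y_w) - log pi_ref(y_w))
    #                    - (log pi(y_l) - log pi_ref(y_l))])
    chosen_reward = beta * (policy_chosen_logp - ref_chosen_logp)
    rejected_reward = beta * (policy_rejected_logp - ref_rejected_logp)
    logits = chosen_reward - rejected_reward
    return -math.log(1.0 / (1.0 + math.exp(-logits)))  # -log(sigmoid(logits))
```

In practice, TRL's `DPOTrainer` computes this over batches of sequence-level log-probabilities; `beta` controls how strongly the policy is kept close to the reference model.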
gemma-dpo_v1_text
Commit History

End of training
161218a · verified · zera09 committed on Apr 7

initial commit
bb07ac6 · verified · zera09 committed on Apr 7