deberta-v3-base-korean-ai-detect

Model Details

Award-winning entry in the 2025 SW-Centered University Digital Contest, AI Division: "Generative AI (LLM) vs. Human: Text Classification Challenge".
This model was trained on cloud GPUs sponsored by the Hansung University SW-Centered University Project Group.

How to Get Started with the Model

from transformers import AutoTokenizer, AutoModelForSequenceClassification

# The tokenizer comes from the base model's repo; the fine-tuned
# classification weights come from this model's repo.
tokenizer = AutoTokenizer.from_pretrained("team-lucid/deberta-v3-base-korean")
model = AutoModelForSequenceClassification.from_pretrained("hyun5ooo/deberta-v3-base-korean-ai-detect")

inputs = tokenizer("μ•ˆλ…•, 세상!", return_tensors="pt")
outputs = model(**inputs)
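The model card does not show how to turn the raw `outputs` into a prediction. A minimal sketch of the usual post-processing for a sequence-classification head follows; the example logits tensor stands in for `outputs.logits`, and the meaning of each class index (human vs. AI-generated) is an assumption you should verify against `model.config.id2label`.

```python
import torch

# Example logits as they might come back in `outputs.logits`
# for a 2-class detection head (values here are illustrative only).
logits = torch.tensor([[1.2, -0.3]])

# Softmax converts logits into class probabilities that sum to 1.
probs = torch.softmax(logits, dim=-1)

# The predicted class is the index with the highest probability.
# NOTE: which index means "AI-generated" is an assumption here;
# check model.config.id2label for the actual mapping.
pred = probs.argmax(dim=-1).item()
```

For this example, index 0 has the larger logit, so `pred` is 0.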

Evaluation

Scores are based on the Public leaderboard of the 2025 SW-Centered University Digital Contest, AI Division.

Model               ROC-AUC
deberta base        0.903
deberta xlarge      0.904
mdeberta base       0.894
koelectra base      0.894
klue roberta large  0.892
Model size: 0.1B params (F32, Safetensors)

Model tree for hyun5ooo/deberta-v3-base-korean-ai-detect

Finetuned from team-lucid/deberta-v3-base-korean.