---
license: apache-2.0
base_model: kakaocorp/kanana-safeguard-8b
quantized_by: Arc1el
quantization_method: bitsandbytes
model_type: llama
tags:
- quantized
- 4bit
- bitsandbytes
- safeguard
- korean
- safety
pipeline_tag: text-generation
---
# Kanana Safeguard Siren 8B - 4-bit Quantized Version
## Model Overview
- **Original model**: [kakaocorp/kanana-safeguard-8b](https://huggingface.co/kakaocorp/kanana-safeguard-8b)
- **Quantization method**: BitsAndBytes 4-bit (NF4)
- **Quantization tooling**: bitsandbytes + transformers
- **Compression**: roughly 75% smaller than the original (estimated)
## λͺ¨λΈ 세뢀정보
### 원본 λͺ¨λΈ 정보
- **λͺ¨λΈ μ•„ν‚€ν…μ²˜**: Safeguard model based on transformer architecture
- **νŒŒλΌλ―Έν„° 수**: 8B parameters
- **μ£Όμš” μš©λ„**: μ•ˆμ „μ„± 검증, μœ ν•΄ μ½˜ν…μΈ  탐지
- **μ–Έμ–΄**: ν•œκ΅­μ–΄ 쀑심
- **원본 λͺ¨λΈ λΌμ΄μ„ μŠ€**: Apache 2.0
### μ–‘μžν™” 정보
- **μ–‘μžν™” νƒ€μž…**: 4bit NormalFloat (NF4)
- **정밀도**: 4bit weights, 16bit activations
## Performance and Benchmarks
### Model Size Comparison
- **Original model**: ~16 GB (estimated)
- **Quantized model**: ~4 GB (estimated)
- **Compression**: roughly 75% reduction
### Memory Usage
- **VRAM at load time**: ~4-5 GB
- **VRAM during inference**: ~6-8 GB (varies with batch size and sequence length)
- **System RAM**: 8 GB minimum recommended
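The size estimates above follow directly from parameter count and bit width. A quick back-of-envelope check (this ignores NF4's block-wise quantization constants and any non-quantized layers, which add a few percent of overhead):

```python
# Back-of-envelope model size: 8B parameters at 16-bit vs 4-bit weights.
# Ignores NF4 quantization constants, which add a few percent on top.
params = 8e9
fp16_gb = params * 2 / 1024**3    # 2 bytes per weight
nf4_gb = params * 0.5 / 1024**3   # 0.5 bytes per weight
print(f"fp16: ~{fp16_gb:.1f} GB, NF4: ~{nf4_gb:.1f} GB")
# → fp16: ~14.9 GB, NF4: ~3.7 GB
```

This lines up with the ~16 GB / ~4 GB figures once quantization constants and metadata are included.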
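Much of the gap between load-time and inference VRAM is the KV cache, which grows with sequence length. A rough estimate, assuming a typical Llama-style 8B layout (32 layers, 8 KV heads, head dim 128 are assumptions here, not confirmed values for this model) and a bf16 cache:

```python
# KV-cache bytes per token: 2 (K and V) * layers * kv_heads * head_dim * bytes.
# The layer/head/dim numbers below are assumed, typical for Llama-style 8B models.
layers, kv_heads, head_dim, bytes_per_val = 32, 8, 128, 2
per_token = 2 * layers * kv_heads * head_dim * bytes_per_val
seq_len = 4096
cache_mib = per_token * seq_len / 1024**2
print(f"{per_token / 1024:.0f} KiB/token -> {cache_mib:.0f} MiB at {seq_len} tokens")
# → 128 KiB/token -> 512 MiB at 4096 tokens
```

Batch size multiplies this linearly, which is why inference VRAM varies well above the ~4-5 GB load footprint.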
## Usage
### Installation
```bash
pip install transformers accelerate bitsandbytes torch
```
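Before loading the model, it can help to confirm the packages installed cleanly. A minimal sketch that only checks package presence (it does not verify versions or CUDA/GPU availability):

```python
import importlib.util

# Check that each required package is importable; this does not verify
# versions or CUDA support.
required = ("transformers", "accelerate", "bitsandbytes", "torch")
status = {pkg: importlib.util.find_spec(pkg) is not None for pkg in required}
for pkg, ok in status.items():
    print(f"{pkg}: {'installed' if ok else 'missing'}")
```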
### Code Example
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
model_name = "nxtcloud-org/kanana-safeguard-siren-8b-4bit"
# BitsAndBytesConfig: NF4 4-bit quantization with double quantization
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
# λͺ¨λΈκ³Ό ν† ν¬λ‚˜μ΄μ € λ‘œλ“œ
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    device_map="auto",
    trust_remote_code=True,
)
# Usage example: safety verification
text = "이것은 검증할 ν…μŠ€νŠΈμž…λ‹ˆλ‹€."  # "This is text to be verified."
inputs = tokenizer(text, return_tensors="pt").to(model.device)  # move inputs to the model's device
with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=256,  # cap generated tokens; prompt length is excluded
        temperature=0.7,
        do_sample=True,
        pad_token_id=tokenizer.eos_token_id,
    )
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```
## System Requirements
- **Minimum RAM**: 8 GB
- **Recommended RAM**: 16 GB
- **GPU memory**: 6 GB VRAM (RTX 3060 or better)
- **Supported platforms**: Linux, Windows (CUDA-capable GPU)
- **Python**: 3.8+
- **CUDA**: 11.1+
## Limitations and Considerations
### Performance Limitations
- 4-bit quantization may cause a slight drop in quality
- Inference quality may degrade somewhat relative to the original model
- Complex safety judgments may be affected more strongly
### Recommended Use
- Recommended for environments with constrained GPU memory
- Suitable for applications that need real-time safety verification
- Validate thoroughly before using in production
### Caveats
- This is a safeguard model intended for safety verification
- Quantization aims to preserve the original model's performance and safety characteristics, but they may not be fully identical
- For critical safety decisions, cross-checking against the original model is recommended
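One way to quantify the cross-check suggested above is a simple label-agreement rate between the two models' verdicts on the same prompts. The `agreement` helper and the sample labels below are hypothetical illustrations, not part of either model's API:

```python
# Hypothetical helper: fraction of prompts on which the quantized and
# original models produce the same safety label.
def agreement(labels_a, labels_b):
    if len(labels_a) != len(labels_b):
        raise ValueError("label lists must be the same length")
    matches = sum(a == b for a, b in zip(labels_a, labels_b))
    return matches / len(labels_a)

# Hypothetical verdicts from the quantized and original models
quantized = ["safe", "unsafe", "safe", "safe"]
original = ["safe", "unsafe", "unsafe", "safe"]
print(f"agreement: {agreement(quantized, original):.0%}")  # → agreement: 75%
```

A low agreement rate on your own prompt distribution would be a signal to fall back to the original model for that workload.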
## Ethical Considerations
- Follows the ethical guidelines of Kakao Corp's original safeguard model
- This model should be used only for harmful-content detection and safety verification
- Quantization may introduce unexpected bias or safety regressions, so use it with care
## License
This quantized model is released under the same **Apache License 2.0** as the original model.
```
Copyright 2025 Kakao Corp. (Original model)
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```
## ν¬λ ˆλ”§ 및 인용
### 원본 λͺ¨λΈ ν¬λ ˆλ”§
```bibtex
@misc{kakao-kanana-safeguard-siren-8b,
  title={Kanana Safeguard Siren 8B},
  author={Kakao Corp},
  year={2024},
  publisher={Hugging Face},
  url={https://huggingface.co/kakaocorp/kanana-safeguard-8b}
}
```