---
library_name: transformers
license: gpl-3.0
datasets:
- Moleys/mtl-zh2vi
- Moleys/mtl-zh2vi-b
language:
- zh
- vi
metrics:
- bleu
pipeline_tag: translation
---
# Hirashiba ^^

Hira's intelligence, Shiba's speed.

Hirashiba-MT-zh-vi is a tiny MarianMT-based Chinese-to-Vietnamese translation model, used for gatekeeping and refilling water.

## Usage
```python
import time

from transformers import MarianMTModel, MarianTokenizer

model_name = "chi-vi/hirashiba-mt-tiny-zh-vi"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

def translate(lines):
    # Tokenize the batch of source lines (padded to equal length),
    # generate translations, and decode them back to plain text.
    inputs = tokenizer(lines, return_tensors="pt", padding=True)
    translated = model.generate(**inputs)
    return [tokenizer.decode(t, skip_special_tokens=True) for t in translated]

# Translate a file of Chinese text (one sentence per line) and time the run.
with open("sample.txt") as f:
    src_text = [line.strip() for line in f if line.strip()]

start = time.time()
translated = translate(src_text)
end = time.time()

print(translated)
print(f"Translated {len(src_text)} lines in {end - start:.2f} s")
```
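For quick experiments, the same checkpoint can also be loaded through the generic `transformers` translation pipeline. The snippet below is a minimal sketch assuming the default generation settings are acceptable; the example sentence is only illustrative.

```python
from transformers import pipeline

# Build a translation pipeline around the same checkpoint.
translator = pipeline("translation", model="chi-vi/hirashiba-mt-tiny-zh-vi")

# The pipeline returns a list of dicts with a "translation_text" field.
result = translator("你好，世界。")
print(result[0]["translation_text"])
```

Note that the `translate` helper above sends the whole file to `model.generate` in a single batch; for large inputs, consider splitting `src_text` into smaller chunks to keep memory usage bounded.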