| """ | |
| ## 默认 use_fast=True 报错 | |
| lib\site-packages\transformers\tokenization_utils_fast.py", line 504, in _batch_encode_plus | |
| encodings = self._tokenizer.encode_batch( | |
| pyo3_runtime.PanicException: AddedVocabulary bad split | |
| """ | |
| from transformers import AutoTokenizer | |
| tokenizer = AutoTokenizer.from_pretrained("lmsys/fastchat-t5-3b-v1.0", trust_remote_code=True, use_fast=False) |