---
license: apache-2.0
widget:
  - example_title: Yi-34B-Chat
    text: hi
    output:
      text: ' Hello! How can I assist you today?'
  - example_title: Yi-34B
    text: There's a place where time stands still. A place of breath taking wonder,
      but also
    output:
      text: ' an eerie sense that something is just not right…


        Between the two worlds lies The Forgotten Kingdom - home to creatures long since
        thought extinct and ancient magic so strong it defies belief! Only here can
        you find what has been lost for centuries: An Elixir Of Life which will restore
        youth and vitality if only those who seek its power are brave enough to face
        up against all manner of dangers lurking in this mysterious land! But beware;
        some say there may even exist powerful entities beyond our comprehension whose
        intentions towards humanity remain unclear at best ---- they might want nothing
        more than destruction itself rather then anything else from their quest after
        immortality (and maybe someone should tell them about modern medicine)? In any
        event though – one thing remains true regardless : whether or not success comes
        easy depends entirely upon how much effort we put into conquering whatever challenges
        lie ahead along with having faith deep down inside ourselves too ;) So let’s
        get started now shall We?'
pipeline_tag: text-generation
tags:
- mlx
base_model: 01-ai/Yi-34B-200K
---

# Fmuaddib/Yi-34B-200K-mlx-8Bit

The model [Fmuaddib/Yi-34B-200K-mlx-8Bit](https://huggingface.co/Fmuaddib/Yi-34B-200K-mlx-8Bit) was converted to MLX format from [01-ai/Yi-34B-200K](https://huggingface.co/01-ai/Yi-34B-200K) using mlx-lm version **0.21.5**.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Load the 8-bit quantized model and its tokenizer from the Hugging Face Hub
model, tokenizer = load("Fmuaddib/Yi-34B-200K-mlx-8Bit")

prompt = "hello"

# Wrap the prompt in the chat template if the tokenizer defines one
if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
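
mlx-lm also ships a command-line entry point, so the model can be queried without writing any Python. The invocation below is a sketch; flag names can vary between mlx-lm releases, so check `python -m mlx_lm.generate --help` for your installed version.

```shell
# One-shot generation from the command line (downloads the model on first use)
python -m mlx_lm.generate \
  --model Fmuaddib/Yi-34B-200K-mlx-8Bit \
  --prompt "hello" \
  --max-tokens 256
```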