v0.40.0
See https://github.com/quic/ai-hub-models/releases/v0.40.0 for the changelog.
README.md
CHANGED
@@ -11,17 +11,19 @@ pipeline_tag: text-generation
 
 ![](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/phi_3_5_mini_instruct/web-assets/model_demo.png)
 
-# Phi-3.5-
+# Phi-3.5-Mini-Instruct: Optimized for Mobile Deployment
 
 ## State-of-the-art large language model useful on a variety of language understanding and generation tasks
 
 
 Phi-3.5-mini is a lightweight, state-of-the-art open model built upon datasets used for Phi-3 - synthetic data and filtered publicly available websites - with a focus on very high-quality, reasoning-dense data. The model belongs to the Phi-3 model family and supports a 128K token context length. The model underwent a rigorous enhancement process, incorporating supervised fine-tuning, proximal policy optimization, and direct preference optimization to ensure precise instruction adherence and robust safety measures.
 
-This model is an implementation of Phi-3.5-
+This model is an implementation of Phi-3.5-Mini-Instruct found [here](https://huggingface.co/microsoft/Phi-3.5-mini-instruct).
 
 
 More details on model performance across various devices can be found [here](https://aihub.qualcomm.com/models/phi_3_5_mini_instruct).
 
+
+
 ### Model Details
 
 - **Model Type:** Model_use_case.text_generation
@@ -53,7 +55,7 @@ Please follow the [LLM on-device deployment](https://github.com/quic/ai-hub-apps
 
 
 ## License
-* The license for the original implementation of Phi-3.5-
+* The license for the original implementation of Phi-3.5-Mini-Instruct can be found
 [here](https://huggingface.co/microsoft/Phi-3.5-mini-instruct/blob/main/LICENSE).
 * The license for the compiled assets for on-device deployment can be found [here](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/Qualcomm+AI+Hub+Proprietary+License.pdf)
 
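The updated card names the upstream Hugging Face checkpoint, [microsoft/Phi-3.5-mini-instruct](https://huggingface.co/microsoft/Phi-3.5-mini-instruct), as the reference implementation. As a minimal, illustrative sketch of running that reference checkpoint with the Hugging Face `transformers` API (not the Qualcomm AI Hub compiled assets, which follow the LLM on-device deployment tutorial referenced in the second hunk), something like the following could be used; the prompt, dtype, and generation settings are arbitrary example values, assuming a recent `transformers` release:

```python
# Illustrative sketch only: runs the upstream Hugging Face reference checkpoint
# (microsoft/Phi-3.5-mini-instruct), not the Qualcomm AI Hub compiled on-device assets.
# The prompt, dtype, and generation settings are arbitrary example values.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3.5-mini-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: reduced precision to fit a single GPU
    device_map="auto",
    trust_remote_code=True,      # may be unnecessary on recent transformers versions
)

# Build a chat-formatted prompt and generate a short completion.
messages = [{"role": "user", "content": "In one sentence, what is an on-device LLM?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    output = model.generate(input_ids, max_new_tokens=64, do_sample=False)

# Decode only the tokens generated after the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

This only exercises the floating-point reference model; for deployment on Snapdragon devices, the LLM on-device deployment tutorial linked in the hunk above remains the intended path.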