---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- chat
---
# MobileLLM-125M-MNN
## Introduction
This model is a 4-bit quantized MNN export of [MobileLLM-125M](https://huggingface.co/facebook/MobileLLM-125M), produced with [llm-export](https://github.com/wangzhaode/llm-export).
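
An export along these lines can be reproduced with the llm-export command-line tool. The invocation below is a sketch only: the package name and the exact flags (`--path`, `--export`, `--quant_bit`) are assumptions based on the tool's README and may differ across versions, so verify them against `llmexport --help`.

```shell
# Install the exporter (check the llm-export repo for current install instructions)
pip install llmexport

# Export MobileLLM-125M to MNN with 4-bit weight quantization
# (flag names are assumed; confirm with `llmexport --help`)
llmexport \
    --path facebook/MobileLLM-125M \
    --export mnn \
    --quant_bit 4
```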