## Model Description
The Mit series of large language models (LLMs) by WinkingFace is designed to seamlessly integrate intuitive conversational abilities with advanced multi-step reasoning. Unlike conventional AI models that focus solely on generating responses, Mit emphasizes structured thinking, contextual understanding, and function-calling precision to deliver more accurate and insightful interactions.
Mit models are adaptable, switching between standard conversational tasks and complex reasoning as needed. By incorporating refined logical inference mechanisms, Mit achieves superior accuracy in judgment, decision-making, and long-form analytical tasks.
Built on the open-source Qwen platform, Mit has undergone extensive architectural refinements and performance optimizations to align more effectively with real-world applications. Our fine-tuning efforts emphasize deeper contextual awareness, enhanced response coherence, and improved function-calling execution, making Mit a powerful and versatile AI system.
## Requirements
Mit's code is integrated into WinkingFace's custom version of `transformers`, and we recommend using this modified version for optimal compatibility.
To prevent errors such as:
```
KeyError: 'mit'
```
install the custom `transformers` package using the following command:
```
pip install git+https://github.com/WinkingFaceAI/tfm-recooked.git
```
This ensures seamless functionality and avoids compatibility issues with the model.
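Once the custom package is installed, the model can be loaded through the standard `transformers` auto classes. The snippet below is a minimal sketch rather than an official quickstart: it assumes the model is hosted under the `WinkingFace/Mit-0.5B` repository ID (taken from the license link below) and exposes the usual chat template; adjust the model ID, device placement, and generation settings to your environment.
```
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository ID, taken from the license link in this card.
model_id = "WinkingFace/Mit-0.5B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick an appropriate dtype automatically
    device_map="auto",    # requires `accelerate`; places layers on available devices
)

# Build a simple chat prompt using the model's chat template.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the difference between supervised and unsupervised learning."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Illustrative generation settings; tune max_new_tokens and sampling for your use case.
output_ids = model.generate(input_ids, max_new_tokens=256)
response = tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(response)
```
If loading fails with `KeyError: 'mit'`, the stock `transformers` release is still being used; reinstalling from the repository above in a clean environment typically resolves it.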
## License
This code repository and the model weights are licensed under the [Apache 2.0 License](https://huggingface.co/WinkingFace/Mit-0.5B/blob/main/LICENSE). The Mit series supports commercial use and allows modifications and derivative works, including but not limited to distillation for training other LLMs.
Please note that Mit-0.5B, Mit-1.5B, Mit-3B, and Mit-7B are derived from the [Qwen series](https://huggingface.co/Qwen), which is also licensed under the [Apache 2.0 License](https://huggingface.co/Qwen/Qwen2.5-1.5B/blob/main/LICENSE).
## Contact
For any questions or inquiries, feel free to [contact us here 📨](mailto:[email protected]).