# NeuralHermes-MoE-2x7B

This is a mix between teknium/OpenHermes-2.5-Mistral-7B and Intel/neural-chat-7b-v3-3.

Using mistralai/Mistral-7B-v0.1 as the base model.

This Mixture of Experts was built using the `mergekit` MoE method.
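As a rough sketch, a `mergekit` MoE merge of this kind is driven by a YAML config listing the base model and the expert models. The exact config used for this model is not published here; the `positive_prompts` below are hypothetical placeholders illustrating how experts are typically routed, not the actual values used.

```yaml
# Hypothetical mergekit-moe config sketch (not the authors' actual config).
base_model: mistralai/Mistral-7B-v0.1
gate_mode: hidden        # route tokens using hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: teknium/OpenHermes-2.5-Mistral-7B
    positive_prompts:     # placeholder routing prompts
      - "general instruction following"
  - source_model: Intel/neural-chat-7b-v3-3
    positive_prompts:     # placeholder routing prompts
      - "helpful conversational assistance"
```

With a config like this saved as `config.yml`, the merge would be produced with mergekit's MoE entry point, e.g. `mergekit-moe config.yml ./output-model`.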