---
license: mit
library_name: transformers
pipeline_tag: text-generation
---
# MDM-1.7B
We introduce MDM-1.7B, a diffusion language model at the 1.7B-parameter scale, trained entirely from scratch on 1.1T open-source tokens.