Requesting a guide for training the model on new languages, if possible

#13
by dumbass10 - opened

I’d like to explore training Qwen3-Omni on new/low-resource languages. Since it’s end-to-end rather than modular, I’m curious what the recommended approach is — continued pretraining with paired speech+text data, LoRA-based finetuning, or another method? Has anyone tried multilingual adaptation?
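For context on the LoRA option mentioned above: the idea is to freeze the base weights and train only small low-rank update matrices, which keeps adaptation cheap for a large end-to-end model. A minimal, framework-agnostic NumPy sketch of a LoRA-style linear layer (not Qwen-specific; all names and hyperparameters here are illustrative):

```python
import numpy as np

class LoRALinear:
    """Frozen base weight W plus a trainable low-rank update B @ A."""
    def __init__(self, in_dim, out_dim, rank=2, alpha=4, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(out_dim, in_dim))      # frozen base weight
        self.A = rng.normal(size=(rank, in_dim)) * 0.01  # trainable down-projection
        self.B = np.zeros((out_dim, rank))               # trainable up-projection, zero-init
        self.scale = alpha / rank

    def forward(self, x):
        # y = x W^T + scale * (x A^T) B^T; only A and B would receive gradients.
        return x @ self.W.T + self.scale * (x @ self.A.T) @ self.B.T

layer = LoRALinear(in_dim=4, out_dim=3)
x = np.ones((1, 4))
y = layer.forward(x)
# Because B is zero-initialized, the adapter contributes nothing before training,
# so the adapted layer initially matches the frozen base layer exactly.
assert np.allclose(y, x @ layer.W.T)
```

In practice you would attach adapters like this to the attention/MLP projections via a library such as PEFT rather than hand-rolling them; the sketch just shows why LoRA is attractive for language adaptation — the base model's multilingual knowledge stays intact while only the small A/B matrices are learned.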
