Hi, I am new to transformers. I am trying to train mT5, but I couldn't find a "max_position_embeddings" option in its configuration. How do I define the maximum sequence length for mT5?
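From what I can tell, mT5 (like T5) uses relative position embeddings, so there is no `max_position_embeddings` field in `MT5Config`. Is capping the sequence length at tokenization time, roughly as in the sketch below, the intended way to do it? (The checkpoint name and `max_length` value here are just placeholders for my setup.)

```python
from transformers import AutoTokenizer

# Sketch, assuming the sequence length is controlled when tokenizing the
# training data rather than via a config field like max_position_embeddings.
tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")

batch = tokenizer(
    ["An example training sentence.", "Another, possibly much longer, example."],
    max_length=512,        # desired maximum input length (placeholder value)
    truncation=True,       # drop tokens beyond max_length
    padding="max_length",  # pad shorter examples up to max_length
    return_tensors="pt",
)
print(batch["input_ids"].shape)  # -> torch.Size([2, 512])
```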