LegalLMs collection: XLM-RoBERTa models with continued pretraining on the MultiLegalPile
This model continues the pretraining of XLM-RoBERTa on the MultiLegalPile corpus. It achieves the following results on the evaluation set:
The training hyperparameters were not recorded in this card. The logged training results are:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 0.8647 | 0.25 | 50000 | 0.6348 |
| 0.8373 | 0.5 | 100000 | 0.5584 |
| 0.7926 | 1.21 | 150000 | 0.5244 |
| 0.6757 | 1.46 | 200000 | 0.5155 |
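For reference, the logged checkpoints show the validation loss falling monotonically. A quick calculation of the overall relative drop, using the values from the table above (pure Python, no model download required):

```python
# Validation-loss trajectory taken from the training-results table above.
steps = [50_000, 100_000, 150_000, 200_000]
val_loss = [0.6348, 0.5584, 0.5244, 0.5155]

# Relative improvement from the first to the last logged checkpoint.
rel_improvement = (val_loss[0] - val_loss[-1]) / val_loss[0]
print(f"{rel_improvement:.1%}")  # prints 18.8%
```

So continued pretraining reduced the validation loss by roughly 19% relative to the first logged checkpoint at 50,000 steps.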