Update README.md
README.md
CHANGED
@@ -8,6 +8,8 @@ tags:
 - transliteration
 - tibetan
 - buddhism
+datasets:
+- billingsmoore/tibetan-phonetic-transliteration-dataset
 ---
 # Model Card for tibetan-phonetic-transliteration
 
@@ -128,7 +130,8 @@ For users who wish to use the model for other texts, I recommend further finetun
 
 This model was trained on 98,597 pairs of text: the first member of each pair is a line of Unicode Tibetan text; the second (the target) is the phonetic transliteration of the first.
 This dataset was scraped from Lotsawa House and is released on Kaggle under the same license as the texts from which it is sourced.
-[You can find this dataset and more information by clicking here.](https://www.kaggle.com/datasets/billingsmoore/tibetan-phonetic-transliteration-pairs)
+[You can find this dataset and more information on Kaggle by clicking here.](https://www.kaggle.com/datasets/billingsmoore/tibetan-phonetic-transliteration-pairs)
+[You can find this dataset and more information on Huggingface by clicking here.](https://huggingface.co/datasets/billingsmoore/tibetan-phonetic-transliteration-dataset)
 
 This model was trained for five epochs. Further information regarding training can be found in the documentation of the [MLotsawa repository](https://github.com/billingsmoore/MLotsawa).
 
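
The README text above describes each training pair as a line of Unicode Tibetan text plus its phonetic transliteration. As a minimal sketch of what that structure implies, the check below verifies that a source line really falls in the Tibetan Unicode block (U+0F00–U+0FFF); the `source`/`target` field names and the sample pair are illustrative only, not the dataset's actual schema:

```python
def is_tibetan(text: str) -> bool:
    """True if every non-space character lies in the Tibetan Unicode
    block (U+0F00-U+0FFF)."""
    return all(0x0F00 <= ord(ch) <= 0x0FFF for ch in text if not ch.isspace())

# Illustrative pair (not taken from the dataset): Tibetan source line,
# phonetic transliteration target.
pair = {"source": "བོད་སྐད་", "target": "bö ké"}
print(is_tibetan(pair["source"]))  # True
print(is_tibetan(pair["target"]))  # False
```

A filter like this is a cheap sanity check when preparing further fine-tuning data of the kind the README recommends, since it catches transliteration lines accidentally placed on the source side.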