Add new CrossEncoder model
Files changed:
- README.md +440 -0
- added_tokens.json +4 -0
- config.json +35 -0
- model.safetensors +3 -0
- special_tokens_map.json +37 -0
- tokenizer.json +0 -0
- tokenizer_config.json +81 -0
- vocab.txt +0 -0
README.md
ADDED
@@ -0,0 +1,440 @@
---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:1893949
- loss:Contrastive
base_model: nreimers/MiniLM-L6-H384-uncased
widget:
- source_sentence: what medicine can i give my dog for kennel cough?
  sentences:
  - Your veterinarian can prescribe a round of antibiotics to help your dog recover
    faster. Some of the most widely prescribed medications for Kennel Cough are Baytril,
    Doxycycline, and Clavamox. However, because the disease is caused by both a virus
    and bacteria, the dog will require a dual-purpose treatment.
  - Preventing Kennel Cough The kennel cough vaccination is given once a year, but
    should not be given at the same time as their annual booster. If you're going
    on holiday, it's important that your dog is vaccinated against kennel cough. Often
    kennels will not accept your dog if it has not had a kennel cough vaccination.
  - The Driver Guide A driver guide will act as your driver and tour guide in one
    person. He is less knowledgeable than the tour guide, but can provide you with
    the most important information, help you to get around at the destination and
    give tips about the best photo spots.
- source_sentence: how to redirect http traffic to https in tomcat?
  sentences:
  - Yes, but it depends on the previous contents of the bag. FoodSaver® Bags that
    previously contained fruits, vegetables, breads and dry goods can be washed and
    reused. ... FoodSaver® Bags that contained greasy or oily foods should also be
    discarded, as they may be difficult to clean. FoodSaver® Bags can be washed by
    hand.
  - '[''Go to SymantecDLP\\Protect\\tomcat\\conf directory.'', ''Edit the file server.xml.'',
    ''Add the following above the first <connector> entry: ... '', ''Save the server.
    ... '', ''Edit the web.xml file in the same directory.'', ''Scroll to the bottom
    of the file and add the following just above the </web-app> entry: ... '', ''Save
    the web.xml file.'']'
  - 'Cause: The Enforce Console''s tomcat webserver is configured to only accept HTTPS
    requests. Any non-secure HTTP request will not be redirected. By default the tomcat
    webserver is not configured to redirect HTTP requests to HTTPS.'
- source_sentence: when did the last rick and morty air?
  sentences:
  - Instead of Naota, the main character of the original FLCL (pronounced “Fooly Cooly”),
    Progressive focuses on Hidomi, a reserved girl who closes herself off from the
    world by wearing headphones (that aren't actually playing music).
  - On May 10, 2018, Adult Swim announced a long-term deal with the creators, ordering
    70 new episodes of Rick and Morty over an unknown number of seasons. As of May
    31, 2020, 41 episodes of Rick and Morty have aired, concluding the fourth season.
  - What time is Rick and Morty season 4 episode 8 out? The latest episode of Rick
    and Morty airs on Sunday, May 17 on Adult Swim. US fans can stream the new episode
    of Rick and Morty when it airs on the channel.
- source_sentence: are clothes made of plastic?
  sentences:
  - It might surprise you, but you're probably wearing plastic clothes. ... Many of
    our clothes contain plastics like polyester, nylon, acrylic and polyamide. In
    fact most new fabrics are made of plastic – up to 64% of them. The thing is, every
    time we wash these materials they shed millions of plastic microfibres.
  - General Pharmacology. Beta-blockers are drugs that bind to beta-adrenoceptors
    and thereby block the binding of norepinephrine and epinephrine to these receptors.
    ... Second generation beta-blockers are more cardioselective in that they are
    relatively selective for β1 adrenoceptors.
  - It might surprise you, but you're probably wearing plastic clothes. ... Many of
    our clothes contain plastics like polyester, nylon, acrylic and polyamide. In
    fact most new fabrics are made of plastic – up to 64% of them. The thing is, every
    time we wash these materials they shed millions of plastic microfibres.
- source_sentence: are emg pickups any good?
  sentences:
  - EMGs are a one trick pony, and only sound good for high gain applications. Sort
    of, they definitely aren't as flexible as most passive options, but most metal
    oriented passive pickups have the same issue. A lot of guitarists forget that
    EMG makes more pickups than just the 81/85 set.
  - Among guitar and bass accessories, the company sells active humbucker pickups,
    such as the EMG 81, the EMG 85, the EMG 60, and the EMG 89. They also produce
    passive pickups such as the EMG-HZ Series, which include SRO-OC1's and SC Sets.
  - You can find the star next to the abandoned mansion. The Treasure Map Loading
    Screen is unlocked through the battle pass, and if you look at the treasure map
    loading screen you'll see a knife pointing in this location.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# ColBERT based on nreimers/MiniLM-L6-H384-uncased

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [nreimers/MiniLM-L6-H384-uncased](https://huggingface.co/nreimers/MiniLM-L6-H384-uncased). It maps sentences & paragraphs to a 128-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [nreimers/MiniLM-L6-H384-uncased](https://huggingface.co/nreimers/MiniLM-L6-H384-uncased) <!-- at revision 3276f0fac9d818781d7a1327b3ff818fc4e643c0 -->
- **Maximum Sequence Length:** 31 tokens
- **Output Dimensionality:** 128 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
ColBERT(
  (0): Transformer({'max_seq_length': 31, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Dense({'in_features': 384, 'out_features': 128, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
)
```
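The second module above is simply a bias-free linear projection that maps each 384-dimensional token embedding from the BERT backbone down to 128 dimensions. A minimal PyTorch sketch of that step, using illustrative tensor names and random inputs (the shapes follow from the architecture printout above):

```python
import torch

# One 31-token sequence of 384-dim hidden states from the BertModel (random, for illustration).
token_hidden_states = torch.randn(1, 31, 384)

# Equivalent of the Dense module: in_features=384, out_features=128, no bias, identity activation.
projection = torch.nn.Linear(384, 128, bias=False)

token_embeddings = projection(token_hidden_states)
print(token_embeddings.shape)  # torch.Size([1, 31, 128])
```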
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'are emg pickups any good?',
    "EMGs are a one trick pony, and only sound good for high gain applications. Sort of, they definitely aren't as flexible as most passive options, but most metal oriented passive pickups have the same issue. A lot of guitarists forget that EMG makes more pickups than just the 81/85 set.",
    "Among guitar and bass accessories, the company sells active humbucker pickups, such as the EMG 81, the EMG 85, the EMG 60, and the EMG 89. They also produce passive pickups such as the EMG-HZ Series, which include SRO-OC1's and SC Sets.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 128]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
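Since the card describes a ColBERT-style (late-interaction) model, query–document relevance is usually computed with a MaxSim operation over per-token embeddings rather than a single cosine between pooled vectors. The sketch below is a generic, library-agnostic illustration of MaxSim in PyTorch; it assumes you already have normalized token embeddings for one query and one document and is not tied to this model's API:

```python
import torch

def maxsim_score(query_tokens: torch.Tensor, doc_tokens: torch.Tensor) -> torch.Tensor:
    """ColBERT-style late-interaction score.

    query_tokens: (num_query_tokens, dim) L2-normalized token embeddings
    doc_tokens:   (num_doc_tokens, dim)   L2-normalized token embeddings
    """
    # Cosine similarity between every query token and every document token.
    sim = query_tokens @ doc_tokens.T  # (num_query_tokens, num_doc_tokens)
    # Each query token keeps its best-matching document token; the score is the sum.
    return sim.max(dim=1).values.sum()

# Illustrative shapes: 31 query tokens and 32 document tokens, 128 dimensions each.
q = torch.nn.functional.normalize(torch.randn(31, 128), dim=-1)
d = torch.nn.functional.normalize(torch.randn(32, 128), dim=-1)
print(maxsim_score(q, d))
```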
<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 1,893,949 training samples
* Columns: <code>question</code>, <code>answer</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | question | answer | negative |
  |:--------|:---------|:-------|:---------|
  | type    | string   | string | string   |
  | details | <ul><li>min: 9 tokens</li><li>mean: 13.01 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 31.78 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 31.66 tokens</li><li>max: 32 tokens</li></ul> |
* Samples:
  | question | answer | negative |
  |:---------|:-------|:---------|
  | <code>what is the relationship between humility and thankfulness?</code> | <code>how gratitude can influence humility and vice versa. Humility is characterized by low self-focus, secure sense of self, and increased valuation of others. Gratitude is marked by a sense that one has benefited from the actions of another.</code> | <code>-hum-, root. -hum- comes from Latin, where it has the meaning "ground. '' This meaning is found in such words as: exhume, humble, humiliate, humility, humus, posthumous.</code> |
  | <code>what is the difference between usb a b c?</code> | <code>The USB-A has a much larger physical connector than the Type C, Type C is around the same size as a micro-USB connector. Unlike, Type A, you won't need to try and insert it, flip it over and then flip it over once more just to find the right orientation when trying to make a connection.</code> | <code>First the transfer rates: USB 2.0 offers transfer rates of 480 Mbps and USB 3.0 offers transfer rates of 4.8 Gbps - that's 10 times faster. ... USB 2.0 provided up to 500 mA whereas USB 3.0 provides up to 900 mA, allowing power hungry devices to now be bus powered.</code> |
  | <code>how hyaluronic acid is made?</code> | <code>Hyaluronic acid is a substance that is naturally present in the human body. It is found in the highest concentrations in fluids in the eyes and joints. The hyaluronic acid that is used as medicine is extracted from rooster combs or made by bacteria in the laboratory.</code> | <code>Hyaluronic acid helps your skin hang on to the moisture. 2. ... Hyaluronic acid by itself is non-comedogenic (doesn't clog pores), but you should be careful when choosing a hyaluronic acid serum that the ingredient list doesn't contain any sneaky pore-clogging ingredients you're not expecting.</code> |
* Loss: <code>pylate.losses.contrastive.Contrastive</code>
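Each training sample is a (question, answer, negative) triple, and the loss pushes the question's score with its answer above its score with the negative. The snippet below is a simplified, generic sketch of such a contrastive objective in PyTorch; it is only an illustration, not the `pylate.losses.contrastive.Contrastive` implementation, which may for example also use in-batch negatives:

```python
import torch
import torch.nn.functional as F

def triplet_contrastive_loss(pos_scores: torch.Tensor, neg_scores: torch.Tensor) -> torch.Tensor:
    """Simplified contrastive objective over (question, answer, negative) triples.

    pos_scores: (batch,) similarity of each question to its paired answer
    neg_scores: (batch,) similarity of each question to its hard negative
    """
    # Treat the positive as class 0 and maximize its probability with cross-entropy.
    logits = torch.stack([pos_scores, neg_scores], dim=1)  # (batch, 2)
    targets = torch.zeros(logits.size(0), dtype=torch.long)
    return F.cross_entropy(logits, targets)

# Illustrative scores for a batch of four triples.
loss = triplet_contrastive_loss(torch.tensor([5.0, 4.2, 6.1, 3.3]),
                                torch.tensor([1.0, 2.5, 0.7, 3.0]))
print(loss)
```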
### Evaluation Dataset

#### Unnamed Dataset

* Size: 5,000 evaluation samples
* Columns: <code>question</code>, <code>answer</code>, and <code>negative_1</code>
* Approximate statistics based on the first 1000 samples:
  |         | question | answer | negative_1 |
  |:--------|:---------|:-------|:-----------|
  | type    | string   | string | string     |
  | details | <ul><li>min: 9 tokens</li><li>mean: 12.96 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 31.7 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 31.43 tokens</li><li>max: 32 tokens</li></ul> |
* Samples:
  | question | answer | negative_1 |
  |:---------|:-------|:-----------|
  | <code>are tefal ingenio pans suitable for induction hobs?</code> | <code>Tefal Ingenio is a revolutionary concept that brings a whole new take on versatility. ... The frying pans also feature Tefal's iconic Thermo-Spot which lets you know when the pan has reached optimal cooking temperature. The Ingenio Induction range is compatible with all hobs and is also dishwasher safe.</code> | <code>Tefal Ingenio is a revolutionary concept that brings a whole new take on versatility. ... The frying pans also feature Tefal's iconic Thermo-Spot which lets you know when the pan has reached optimal cooking temperature. The Ingenio Induction range is compatible with all hobs and is also dishwasher safe.</code> |
  | <code>how many continuing education hours is acls?</code> | <code>The ACLS, PALS, and NRP certification courses are approved for 8 CEUs/CMEs, and recertification courses are approved for 4 CEUs/CMEs. The BLS certification course is approved for 4 CEUs/CMEs and the recertification course is approved for 2 CEUs/CMEs. For more information, please visit our Accreditation page.</code> | <code>The foremost difference between the two is their advancement level. Essentially, ACLS is a sophisticated and more advanced course and builds upon the major fundamentals developed during BLS. The main purpose of BLS and ACLS certification are well explained in this article.</code> |
  | <code>what are the health benefits of drinking peppermint tea?</code> | <code>['Makes you Stress Free. When it comes to relieving stress and anxiety, peppermint tea is one of the best allies. ... ', 'Sleep-Friendly. ... ', 'Aids in Weight Loss. ... ', 'Cure for an Upset Stomach. ... ', 'Improves Digestion. ... ', 'Boosts Immune System. ... ', 'Fights Bad Breath.']</code> | <code>Peppermint tea is a popular herbal tea that is naturally calorie- and caffeine-free. Some research has suggested that the oils in peppermint may have a number of other health benefits, such as fresher breath, better digestion, and reduced pain from headaches. Peppermint tea also has antibacterial properties.</code> |
* Loss: <code>pylate.losses.contrastive.Contrastive</code>

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 256
- `per_device_eval_batch_size`: 256
- `learning_rate`: 3e-06
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `seed`: 12
- `bf16`: True
- `dataloader_num_workers`: 12
- `load_best_model_at_end`: True
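These non-default values map directly onto Hugging Face `TrainingArguments`-style fields. A hedged sketch of how they could be expressed with the sentence-transformers trainer is shown below; the `output_dir` is a placeholder and every other field keeps its default from the full list that follows:

```python
from sentence_transformers.training_args import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    learning_rate=3e-6,
    num_train_epochs=1,
    warmup_ratio=0.1,
    seed=12,
    bf16=True,
    dataloader_num_workers=12,
    load_best_model_at_end=True,
)
```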
#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 256
- `per_device_eval_batch_size`: 256
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 3e-06
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 12
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 12
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `tp_size`: 0
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs
| Epoch  | Step | Training Loss |
|:------:|:----:|:-------------:|
| 0.0001 | 1    | 10.8061       |
| 0.0270 | 200  | 8.9391        |
| 0.0541 | 400  | 5.1795        |
| 0.0811 | 600  | 2.3951        |
| 0.1081 | 800  | 1.6927        |
| 0.1352 | 1000 | 1.404         |
| 0.1622 | 1200 | 1.2496        |
| 0.1892 | 1400 | 1.1613        |
| 0.2162 | 1600 | 1.0843        |
| 0.2433 | 1800 | 1.0427        |
| 0.2703 | 2000 | 1.0005        |
| 0.2973 | 2200 | 0.9695        |
| 0.3244 | 2400 | 0.9325        |
| 0.3514 | 2600 | 0.9122        |
| 0.3784 | 2800 | 0.8832        |
| 0.4055 | 3000 | 0.8689        |
| 0.4325 | 3200 | 0.8626        |
| 0.4595 | 3400 | 0.8452        |
| 0.4866 | 3600 | 0.8329        |
| 0.5136 | 3800 | 0.8132        |
| 0.5406 | 4000 | 0.8111        |
| 0.5676 | 4200 | 0.7952        |
| 0.5947 | 4400 | 0.7892        |
| 0.6217 | 4600 | 0.7772        |
| 0.6487 | 4800 | 0.7793        |
| 0.6758 | 5000 | 0.7705        |
| 0.7028 | 5200 | 0.7692        |
| 0.7298 | 5400 | 0.7625        |
| 0.7569 | 5600 | 0.7595        |
| 0.7839 | 5800 | 0.7405        |
| 0.8109 | 6000 | 0.7513        |
| 0.8380 | 6200 | 0.7396        |
| 0.8650 | 6400 | 0.7312        |
| 0.8920 | 6600 | 0.7325        |
| 0.9190 | 6800 | 0.7371        |
| 0.9461 | 7000 | 0.7422        |
| 0.9731 | 7200 | 0.7296        |
### Framework Versions
- Python: 3.11.0
- Sentence Transformers: 4.0.1
- Transformers: 4.50.3
- PyTorch: 2.6.0+cu124
- Accelerate: 1.5.2
- Datasets: 3.5.0
- Tokenizers: 0.21.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
added_tokens.json
ADDED
@@ -0,0 +1,4 @@
{
  "[D] ": 30523,
  "[Q] ": 30522
}
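These two entries extend the base 30,522-token BERT vocabulary with `[Q] ` and `[D] ` marker tokens (ids 30522 and 30523), matching the `vocab_size` of 30524 in `config.json` below; such markers are typically prepended to queries and documents respectively. A small sketch of inspecting them with the tokenizer (the repository id is a placeholder):

```python
from transformers import AutoTokenizer

# Placeholder repository id; substitute the actual model repo name.
tokenizer = AutoTokenizer.from_pretrained("model_repo_id")

# The added marker tokens should resolve to the ids listed above.
print(tokenizer.convert_tokens_to_ids(["[Q] ", "[D] "]))  # expected: [30522, 30523]
print(tokenizer.tokenize("[Q] are emg pickups any good?")[:3])
```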
config.json
ADDED
@@ -0,0 +1,35 @@
{
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 384,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 1536,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 6,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "sentence_transformers": {
    "activation_fn": "torch.nn.modules.activation.Sigmoid",
    "version": "4.0.1"
  },
  "torch_dtype": "float32",
  "transformers_version": "4.50.3",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 30524
}
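The `BertForSequenceClassification` architecture with a single `LABEL_0` output and a sigmoid `activation_fn` is the layout the sentence-transformers `CrossEncoder` class expects for a pairwise relevance scorer, which matches the commit title. A minimal, hedged usage sketch (the repository id is a placeholder):

```python
from sentence_transformers import CrossEncoder

# Placeholder repository id; substitute the actual model repo name.
model = CrossEncoder("cross_encoder_model_id")

# Score (query, passage) pairs; with the sigmoid activation each score lands in [0, 1].
scores = model.predict([
    ("are emg pickups any good?",
     "EMG makes both active and passive pickups for guitar and bass."),
])
print(scores)
```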
model.safetensors
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e076b5a120ce558175a297483da67ad6c397aa29f4c867b8fa3a901eecd3b88d
size 90869484
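This is a Git LFS pointer file: the roughly 91 MB weights live in LFS and only the hash and size are stored in git. Libraries such as transformers and sentence-transformers resolve this automatically when downloading from the Hub; to fetch just the weights file, a sketch with `huggingface_hub` (the repository id is a placeholder):

```python
from huggingface_hub import hf_hub_download

# Placeholder repository id; substitute the actual model repo name.
path = hf_hub_download(repo_id="user/model_repo_id", filename="model.safetensors")
print(path)  # local cache path of the downloaded weights
```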
special_tokens_map.json
ADDED
@@ -0,0 +1,37 @@
{
  "cls_token": {
    "content": "[CLS]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "mask_token": {
    "content": "[MASK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "[MASK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "sep_token": {
    "content": "[SEP]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "[UNK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
tokenizer.json
ADDED
The diff for this file is too large to render.
tokenizer_config.json
ADDED
@@ -0,0 +1,81 @@
{
  "added_tokens_decoder": {
    "0": {
      "content": "[PAD]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "100": {
      "content": "[UNK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "101": {
      "content": "[CLS]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "102": {
      "content": "[SEP]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "103": {
      "content": "[MASK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "30522": {
      "content": "[Q] ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "30523": {
      "content": "[D] ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    }
  },
  "clean_up_tokenization_spaces": true,
  "cls_token": "[CLS]",
  "do_basic_tokenize": true,
  "do_lower_case": true,
  "extra_special_tokens": {},
  "mask_token": "[MASK]",
  "max_length": 31,
  "model_max_length": 512,
  "never_split": null,
  "pad_to_multiple_of": null,
  "pad_token": "[MASK]",
  "pad_token_type_id": 0,
  "padding_side": "right",
  "sep_token": "[SEP]",
  "stride": 0,
  "strip_accents": null,
  "tokenize_chinese_chars": true,
  "tokenizer_class": "BertTokenizer",
  "truncation_side": "right",
  "truncation_strategy": "longest_first",
  "unk_token": "[UNK]"
}
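Two settings above are worth noting: `max_length` is 31 and `pad_token` is `[MASK]` rather than `[PAD]`, the usual ColBERT-style query augmentation where short queries are padded with mask tokens. A hedged sketch of encoding a query to that fixed length (the repository id is a placeholder):

```python
from transformers import AutoTokenizer

# Placeholder repository id; substitute the actual model repo name.
tokenizer = AutoTokenizer.from_pretrained("model_repo_id")

enc = tokenizer(
    "[Q] what medicine can i give my dog for kennel cough?",
    padding="max_length",
    max_length=31,
    truncation=True,
)
# Short queries are right-padded with the [MASK] pad token (id 103).
print(tokenizer.convert_ids_to_tokens(enc["input_ids"]))
```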
vocab.txt
ADDED
The diff for this file is too large to render.