SentenceTransformer based on google-bert/bert-base-uncased
This is a sentence-transformers model finetuned from google-bert/bert-base-uncased on the wikipedia_subsets dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: google-bert/bert-base-uncased
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset: wikipedia_subsets
- Language: en
Model Sources
- Documentation: [Sentence Transformers Documentation](https://www.sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
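Because the Pooling module has `pooling_mode_cls_token: True`, the sentence embedding is the final hidden state of the `[CLS]` token. As an illustration (not part of the official usage below), the same vector can be computed with plain `transformers`, assuming the repository's transformer weights load via `AutoModel`, as is standard for Sentence Transformers checkpoints:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Load the underlying BertModel and its tokenizer directly.
tokenizer = AutoTokenizer.from_pretrained("UmarAzam/bert-base-uncased-industrialtech")
bert = AutoModel.from_pretrained("UmarAzam/bert-base-uncased-industrialtech")

inputs = tokenizer("An example sentence.", return_tensors="pt",
                   truncation=True, max_length=512)
with torch.no_grad():
    outputs = bert(**inputs)

# CLS pooling: the sentence embedding is the first token's final hidden state.
embedding = outputs.last_hidden_state[:, 0]
print(embedding.shape)  # torch.Size([1, 768])
```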
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("UmarAzam/bert-base-uncased-industrialtech")
# Run inference
sentences = [
    'version of spaced was introduced beginning the 97th vehicle of 6th batch also introduced an of heavy ballistic Leopard on increased armour protection While Leopard to the Leopard 2A5 the covering the modules is modules . New armour modules armour cover the frontal arc of the turret . have distinctive and protection both penetrators and charges The side skirts incorporate improved protection . A 25 the danger of injuries in case armour penetration The Leopard 2A7 the generation and belly armour providing against mines and IEDs . Leopard 2A7 fitted for mounting armour modules protection systems against . For urban combat, the Leopard 2 can be with different of modular armour Leopard 2A4M Leopard 2 Peace) the mount modules composite along the flanks turret and hull, while slat armour can be adapted at vehicle The modules, which depending on the warhead can penetrate of armour The 2A6M CAN increases rocket-propelled including slat armour . Additional armour packages been developed by a number different companies IBD developed upgrades Advanced (AMAP) armour the latter used on Singaporean and Leopard tanks . RUAG has developed armour upgrade composite . first the 2013 . The Leopard and 2A6M add an additional protection for, which increases mines and IEDs . 22, the German Defence to Trophy, an active protection system of . 17 be fitted the with integration planned be in 2023 . Armour protection estimates Estimated levels of for range from 590 to 690 mm the turret RHAe the and lower front hull on Leopard 2A4, to mm RHAe turret 620 mm RHAe on',
    " version of spaced multilayer armour was introduced beginning with the 97th vehicle of the 6th production batch. The same batch also introduced an improved type of heavy ballistic skirts.\n\nThe Leopard 2A5 upgrade focused on increased armour protection. While upgrading a Leopard 2 tank to the Leopard 2A5 configuration, the roof covering the armour modules is cut open and new armour modules are inserted. New additional armour modules made of laminated armour cover the frontal arc of the turret. They have a distinctive arrowhead shape and improve protection against both kinetic penetrators and shaped charges. The side skirts also incorporate improved armour protection. A 25\xa0mm-thick spall liner reduces the danger of crew injuries in case of armour penetration.\n\nThe Leopard 2A7 features the latest generation of passive armour and belly armour providing protection against mines and IEDs. The Leopard 2A7 is fitted with adapters for mounting additional armour modules or protection systems against RPGs.\n\nFor urban combat, the Leopard 2 can be fitted with different packages of modular armour. The Leopard 2A4M CAN, Leopard 2 PSO (Peace Support Operations) and the Leopard 2A7 can mount thick modules of composite armour along the flanks of the turret and hull, while slat armour can be adapted at the vehicle's rear. The armour modules provide protection against the RPG-7, which depending on the warhead can penetrate between and of steel armour. The Leopard 2A6M CAN increases protection against rocket-propelled grenades (RPGs) by including additional slat armour.\n\nAdditional armour packages have been developed by a number of different companies. IBD Deisenroth has developed upgrades with MEXAS and Advanced Modular Armor Protection (AMAP) composite armour, the latter is being used on Singaporean and Indonesian Leopard 2 tanks. RUAG has developed an armour upgrade utilizing their SidePRO-ATR composite armour. This upgrade was first presented on the IAV 2013.\n\nThe Leopard 2A4M and 2A6M add an additional mine protection plate for the belly, which increases protection against mines and IEDs.\n\nOn 22 February 2021, the German Defence Ministry agreed to acquire Trophy, an active protection system of Israeli design. 17 German Army tanks will be fitted with the system, with integration planned to be completed in 2023.\n\nArmour protection estimates\nEstimated levels of protection for the Leopard 2 range from 590 to 690\xa0mm RHAe on the turret, 600\xa0mm RHAe on the glacis and lower front hull on the Leopard 2A4, to 920–940\xa0mm RHAe on the turret, 620\xa0mm RHAe on the",
    ", produced by George Haggerty, made by Kai Productions\n 28 December Incredible Evidence, an Equinox Special about the limits of DNA profiling. Directed by Hilary Lawson, made by TVF\n\n1995\n 9 January Beyond Love, an Equinox Special about autoerotic asphyxia, which killed over 50 people in 1994; and due to the deeply, and distasteful, unconventional content of the programme, it was shown at 10pm; at the John Hopkins Sexual Disorders Clinic at the Johns Hopkins Bloomberg School of Public Health in Baltimore in Maryland, where chromosomal abnormality was found by Fred Berlin, often Klinefelter syndrome; Dr Raymond Goodman of Hope Hospital in Salford, now of the Institute of Brain, Behaviour and Mental Health at the University of Manchester, and why 90% of paraphiliacs were male; Peter Fenwick (neuropsychologist) of the Institute of Psychiatry, Psychology and Neuroscience, and how sexual arousal is centred in the limbic system; Gene Abel of the Behavioral Medicine Institute of Atlanta; William Marshall of the Queen's University at Kingston; Jeffrey Weeks (sociologist) at London South Bank University; John Bancroft (sexologist) of the MRC Reproductive Biology Unit in Edinburgh; Stephen Hucker of Queen's University, Ontario; John Money of Johns Hopkins Hospital; forensic psychologist Ronald Langevin. Narrated by Dame Jenni Murray, directed by Peter Boyd Maclean, produced by Simon Andreae, made by Optomen Television\n 27 August The Real X-Files: America's Psychic Spies, an Equinox Special about a former American military unit that conducted remote viewing, where operatives could see backwards and forwards in time; Admiral Stansfield Turner, Director from 1977 to 1981 of the CIA; Major-General Ed Thompson; Colonel John B. Alexander of the United States Army Intelligence and Security Command; Hal Puthoff, of SRI International in California; remote viewer Ingo Swann and the subsequent Stargate Project, at Fort Meade in Maryland; Keith Harary, who worked with Russell Targ. Narrated by Jim Schnabel, produced by Alex Graham, directed by Bill Eagles, made by Wall to Wall Television\n 3 September Cybersecrecy, the mathematician Fred Piper of the Information Security Group; the UK gave out Enigma machines to Commonwealth countries for secret telecommunications, without telling these countries that the UK could read every message; Phil Zimmermann, inventor of the PGP encryption algorithm; Simon Davies (privacy advocate); when at MIT in 1976, Whitfield Diffie found how to",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.9618, 0.5859],
#         [0.9618, 1.0000, 0.5862],
#         [0.5859, 0.5862, 1.0000]])
```
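The same two calls also cover the semantic-search use case mentioned above: embed a corpus once, then rank it against a query embedding. The corpus and query below are illustrative placeholders, not taken from the training data:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("UmarAzam/bert-base-uncased-industrialtech")

# Hypothetical corpus and query, for illustration only.
corpus = [
    "The Leopard 2A5 upgrade focused on increased armour protection.",
    "Mirror alignment requires each segment to be positioned within 50 nanometers.",
    "Chicago has expanded its red light camera program.",
]
query = "How precisely must the telescope mirrors be aligned?"

corpus_embeddings = model.encode(corpus)
query_embedding = model.encode([query])

# Rank the corpus by cosine similarity to the query.
scores = model.similarity(query_embedding, corpus_embeddings)  # shape: [1, 3]
best = scores.argmax().item()
print(corpus[best], scores[0, best].item())
```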
Evaluation
Metrics
Semantic Similarity
- Datasets: `sts-dev` and `sts-test`
- Evaluated with `EmbeddingSimilarityEvaluator`
| Metric | sts-dev | sts-test |
|---|---|---|
| pearson_cosine | 0.5597 | 0.4154 |
| spearman_cosine | 0.5782 | 0.4684 |
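These scores can in principle be re-computed with the same evaluator. The sketch below assumes the `sentence-transformers/stsb` dataset on the Hub corresponds to the splits used for this card, which the card does not state explicitly:

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator, SimilarityFunction

model = SentenceTransformer("UmarAzam/bert-base-uncased-industrialtech")

# Assumed dataset: the standard STS benchmark copy on the Hub.
stsb = load_dataset("sentence-transformers/stsb", split="validation")

evaluator = EmbeddingSimilarityEvaluator(
    sentences1=stsb["sentence1"],
    sentences2=stsb["sentence2"],
    scores=stsb["score"],
    main_similarity=SimilarityFunction.COSINE,
    name="sts-dev",
)
print(evaluator(model))  # e.g. {'sts-dev_pearson_cosine': ..., 'sts-dev_spearman_cosine': ...}
```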
Training Details
Training Dataset
wikipedia_subsets
- Dataset: wikipedia_subsets at 72f5c2f
- Size: 81,516 training samples
- Columns: `text`
- Approximate statistics based on the first 1000 samples:
  |  | text |
  |---|---|
  | type | string |
  | details | min: 512 tokens, mean: 512.0 tokens, max: 512 tokens |
- Samples (three examples; each shown truncated):
  | text |
  |---|
  | Highway 82 where motorists enter the city's outskirts. The legal speed limit drops in a short space from 55 mph to 30 mph, leading to some drivers who are not alert to be caught. The minimum fine for exceeding the posted speed limit even by 1 mph is $146.<br>Initially, Illinois used photo enforcement for construction zones only. There was legislation on the books to expand that throughout the state. However, Chicago has expanded its red light camera program and is planning to put speed cameras in school zones. Some suburbs (e.g. Alsip) already have cameras at various intersections.<br>Some U.S. states that formerly allowed red-light enforcement cameras but not speed limit enforcement cameras ('photo radar'), have now approved, or are considering, the implementation of speed limit enforcement cameras. The Maryland legislature approved such a program in January 2006. In 2005, 2006, 2008 and 2009 the California legislature considered, but did not pass, bills to implement speed limit enforce... |
  | in many sectors of business including stock market trading systems, mobile devices, internet operations, fraud detection, the transportation industry, and governmental intelligence gathering.<br>The vast amount of information available about events is sometimes referred to as the event cloud.<br>Conceptual description<br>Among thousands of incoming events, a monitoring system may for instance receive the following three from the same source:<br>church bells ringing.<br>the appearance of a man in a tuxedo with a woman in a flowing white gown.<br>rice flying through the air.<br>From these events the monitoring system may infer a complex event: a wedding. CEP as a technique helps discover complex events by analyzing and correlating other events: the bells, the man and woman in wedding attire and the rice flying through the air.<br>CEP relies on a number of techniques, including:<br>Event-pattern detection<br>Event abstraction<br>Event filtering<br>Event aggregation and transformation<br>Modeling event hierarch... |
  | ating wheel that allows scientists to select between short, medium, and longer wavelengths when making observations using the MRS mode,” said NASA in a press statement.<br>Commissioning and testing<br>On 12 January 2022, while still in transit, mirror alignment began. The primary mirror segments and secondary mirror were moved away from their protective launch positions. This took about 10 days, because the 132 actuator motors are designed to fine-tune the mirror positions at microscopic accuracy (10 nanometer increments) and must each move over 1.2 million increments (12.5 mm) during initial alignment.<br>Mirror alignment requires each of the 18 mirror segments, and the secondary mirror, to be positioned to within 50 nanometers. NASA compares the required accuracy by analogy: "If the Webb primary mirror were the size of the United States, each [mirror] segment would be the size of Texas, and the team would need to line the height of those Texas-sized segments up with each other to an accurac... |
- Loss: `DenoisingAutoEncoderLoss`
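`DenoisingAutoEncoderLoss` implements TSDAE: each sentence is corrupted (by default, by deleting about 60% of its tokens), encoded, and a decoder tied to the encoder is trained to reconstruct the original text; this is why the usage example above pairs a noisy sentence with its clean counterpart. Below is a minimal training sketch using the classic `fit` API with a placeholder corpus; it is illustrative, since the hyperparameter dump further down suggests this card was produced with the Trainer API:

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, models, losses
from sentence_transformers.datasets import DenoisingAutoEncoderDataset

# BERT encoder with CLS pooling, matching the architecture of this card.
word_embedding_model = models.Transformer("google-bert/bert-base-uncased")
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension(), "cls")
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])

# Placeholder corpus; in practice, raw text chunks such as wikipedia_subsets.
train_sentences = [
    "Sentence one from the unlabeled corpus.",
    "Sentence two from the unlabeled corpus.",
]

# The dataset wrapper generates (noisy, original) pairs via token deletion.
train_dataset = DenoisingAutoEncoderDataset(train_sentences)
train_dataloader = DataLoader(train_dataset, batch_size=4, shuffle=True)

# Tie the decoder weights to the encoder, as in the TSDAE paper.
train_loss = losses.DenoisingAutoEncoderLoss(
    model, decoder_name_or_path="google-bert/bert-base-uncased", tie_encoder_decoder=True
)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    optimizer_params={"lr": 3e-5},
    show_progress_bar=True,
)
```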
Evaluation Dataset
wikipedia_subsets
- Dataset: wikipedia_subsets at 72f5c2f
- Size: 10,000 evaluation samples
- Columns: `text`
- Approximate statistics based on the first 1000 samples:
  |  | text |
  |---|---|
  | type | string |
  | details | min: 512 tokens, mean: 512.0 tokens, max: 512 tokens |
- Samples (three examples; each shown truncated):
  | text |
  |---|
  | prisoners of Stalin and Hitler, Frankfurt am Main; Berlin.<br>Wilfried Feldenkirchen: 1918–1945 Siemens, Munich 1995, Ulrike fire, Claus Füllberg-Stolberg, Sylvia Kempe: work at Ravensbrück concentration camp, in: Women in concentration camps. Bergen-Belsen. Ravensbrück, Bremen, 1994, pp. 55–69<br>Feldenkirchen, Wilfried (2000). Siemens: From Workshop to Global Player, Munich.<br>Feldenkirchen, Wilfried, and Eberhard Posner (2005). The Siemens Entrepreneurs: Continuity and Change, 1847–2005. Ten Portraits, Munich.<br>Greider, William (1997). One World, Ready or Not. Penguin Press. .<br>Sigrid Jacobeit: working at Siemens in Ravensbrück, in: Dietrich Eichholz (eds) War and economy. Studies on German economic history 1939–1945, Berlin 1999.<br>Ursula Krause-Schmitt: The path to the Siemens stock led past the crematorium, in: Information. German Resistance Study Group, Frankfurt / Main, 18 Jg, No. 37/38, Nov. 1993, pp. 38–46<br>MSS in the estate include Wanda Kiedrzy'nska, in: National Library of Pola... |
  | dates the beginning of behavioral modernity earlier to the Middle Paleolithic). This is characterized by the widespread observation of religious rites, artistic expression and the appearance of tools made for purely intellectual or artistic pursuits.<br>49–30 ka: Ground stone tools – fragments of an axe in Australia date to 49–45 ka, more appear in Japan closer to 30 ka, and elsewhere closer to the Neolithic.<br>47 ka: The oldest-known mines in the world are from Eswatini, and extracted hematite for the production of the red pigment ochre.<br>45 ka: Shoes, as evidenced by changes in foot bone morphology in Eurasia. Bark sandals dated to 10 to 9 ka were found in Fort Rock Cave in the US state of Oregon in 1938. Oldest leather shoe (Areni-1 shoe), 5.5 ka.<br>44–42 ka: Tally sticks (see Lebombo bone) in Eswatini<br>43.7 ka: Cave painting in Indonesia<br>37 ka: Mortar and pestle in Southwest Asia<br>36 ka: Weaving – Indirect evidence from Moravia and Georgia. The earliest actual piece of woven cloth wa... |
  | on a prestressing. Prestressing means the intentional creation of permanent stresses in a structure for the purpose of improving its performance under various service conditions.<br>There are the following basic types of prestressing:<br>Pre-compression (mostly, with the own weight of a structure)<br>Pretensioning with high-strength embedded tendons<br>Post-tensioning with high-strength bonded or unbonded tendons<br>Today, the concept of prestressed structure is widely engaged in design of buildings, underground structures, TV towers, power stations, floating storage and offshore facilities, nuclear reactor vessels, and numerous kinds of bridge systems.<br>A beneficial idea of prestressing was, apparently, familiar to the ancient Roman architects; look, e.g., at the tall attic wall of Colosseum working as a stabilizing device for the wall piers beneath.<br>Steel structures<br>Steel structures are considered mostly earthquake resistant but some failures have occurred. A great number of welded steel mo... |
- Loss: `DenoisingAutoEncoderLoss`
Training Hyperparameters
Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 4
- `per_device_eval_batch_size`: 4
- `learning_rate`: 3e-05
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `fp16`: True
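For reference, a sketch of how these values map onto the Sentence Transformers Trainer API; `output_dir` is a placeholder rather than a value recorded in this card:

```python
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder; not recorded in this card
    eval_strategy="steps",
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    learning_rate=3e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    fp16=True,
)
```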
All Hyperparameters
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 4
- `per_device_eval_batch_size`: 4
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 3e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `hub_revision`: None
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `liger_kernel_config`: None
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
- `router_mapping`: {}
- `learning_rate_mapping`: {}
Training Logs
| Epoch | Step | Training Loss | Validation Loss | sts-dev_spearman_cosine | sts-test_spearman_cosine |
|---|---|---|---|---|---|
| -1 | -1 | - | - | 0.3173 | - |
| 0.0049 | 100 | 8.6795 | - | - | - |
| 0.0098 | 200 | 7.0916 | - | - | - |
| 0.0147 | 300 | 6.2754 | - | - | - |
| 0.0196 | 400 | 5.6468 | - | - | - |
| 0.0245 | 500 | 5.1806 | - | - | - |
| 0.0294 | 600 | 4.9193 | - | - | - |
| 0.0343 | 700 | 4.8224 | - | - | - |
| 0.0393 | 800 | 4.688 | - | - | - |
| 0.0442 | 900 | 4.5849 | - | - | - |
| 0.0491 | 1000 | 4.5054 | 4.5019 | 0.3220 | - |
| 0.0540 | 1100 | 4.4745 | - | - | - |
| 0.0589 | 1200 | 4.4241 | - | - | - |
| 0.0638 | 1300 | 4.3941 | - | - | - |
| 0.0687 | 1400 | 4.3561 | - | - | - |
| 0.0736 | 1500 | 4.2871 | - | - | - |
| 0.0785 | 1600 | 4.3038 | - | - | - |
| 0.0834 | 1700 | 4.2364 | - | - | - |
| 0.0883 | 1800 | 4.2433 | - | - | - |
| 0.0932 | 1900 | 4.2421 | - | - | - |
| 0.0981 | 2000 | 4.118 | 4.1484 | 0.3439 | - |
| 0.1030 | 2100 | 4.1618 | - | - | - |
| 0.1080 | 2200 | 4.1264 | - | - | - |
| 0.1129 | 2300 | 4.1202 | - | - | - |
| 0.1178 | 2400 | 4.0704 | - | - | - |
| 0.1227 | 2500 | 4.0588 | - | - | - |
| 0.1276 | 2600 | 4.0463 | - | - | - |
| 0.1325 | 2700 | 4.0372 | - | - | - |
| 0.1374 | 2800 | 4.0293 | - | - | - |
| 0.1423 | 2900 | 3.9915 | - | - | - |
| 0.1472 | 3000 | 4.002 | 3.9807 | 0.3650 | - |
| 0.1521 | 3100 | 3.9987 | - | - | - |
| 0.1570 | 3200 | 3.9888 | - | - | - |
| 0.1619 | 3300 | 3.9868 | - | - | - |
| 0.1668 | 3400 | 3.9166 | - | - | - |
| 0.1717 | 3500 | 3.963 | - | - | - |
| 0.1767 | 3600 | 3.9519 | - | - | - |
| 0.1816 | 3700 | 3.9177 | - | - | - |
| 0.1865 | 3800 | 3.9182 | - | - | - |
| 0.1914 | 3900 | 3.8742 | - | - | - |
| 0.1963 | 4000 | 3.9431 | 3.8795 | 0.4035 | - |
| 0.2012 | 4100 | 3.8876 | - | - | - |
| 0.2061 | 4200 | 3.8561 | - | - | - |
| 0.2110 | 4300 | 3.8497 | - | - | - |
| 0.2159 | 4400 | 3.8631 | - | - | - |
| 0.2208 | 4500 | 3.8035 | - | - | - |
| 0.2257 | 4600 | 3.8261 | - | - | - |
| 0.2306 | 4700 | 3.8372 | - | - | - |
| 0.2355 | 4800 | 3.8258 | - | - | - |
| 0.2404 | 4900 | 3.8329 | - | - | - |
| 0.2454 | 5000 | 3.7712 | 3.8027 | 0.4655 | - |
| 0.2503 | 5100 | 3.8269 | - | - | - |
| 0.2552 | 5200 | 3.768 | - | - | - |
| 0.2601 | 5300 | 3.8226 | - | - | - |
| 0.2650 | 5400 | 3.785 | - | - | - |
| 0.2699 | 5500 | 3.885 | - | - | - |
| 0.2748 | 5600 | 3.7768 | - | - | - |
| 0.2797 | 5700 | 3.7718 | - | - | - |
| 0.2846 | 5800 | 3.7653 | - | - | - |
| 0.2895 | 5900 | 3.6842 | - | - | - |
| 0.2944 | 6000 | 3.7923 | 3.7455 | 0.5044 | - |
| 0.2993 | 6100 | 3.6947 | - | - | - |
| 0.3042 | 6200 | 3.777 | - | - | - |
| 0.3091 | 6300 | 3.7484 | - | - | - |
| 0.3140 | 6400 | 3.7344 | - | - | - |
| 0.3190 | 6500 | 3.6983 | - | - | - |
| 0.3239 | 6600 | 3.7292 | - | - | - |
| 0.3288 | 6700 | 3.744 | - | - | - |
| 0.3337 | 6800 | 3.7059 | - | - | - |
| 0.3386 | 6900 | 3.7091 | - | - | - |
| 0.3435 | 7000 | 3.6957 | 3.6971 | 0.5374 | - |
| 0.3484 | 7100 | 3.7087 | - | - | - |
| 0.3533 | 7200 | 3.6739 | - | - | - |
| 0.3582 | 7300 | 3.7184 | - | - | - |
| 0.3631 | 7400 | 3.6772 | - | - | - |
| 0.3680 | 7500 | 3.6975 | - | - | - |
| 0.3729 | 7600 | 3.642 | - | - | - |
| 0.3778 | 7700 | 3.6739 | - | - | - |
| 0.3827 | 7800 | 3.7022 | - | - | - |
| 0.3877 | 7900 | 3.6733 | - | - | - |
| 0.3926 | 8000 | 3.6329 | 3.6604 | 0.5780 | - |
| 0.3975 | 8100 | 3.6507 | - | - | - |
| 0.4024 | 8200 | 3.7289 | - | - | - |
| 0.4073 | 8300 | 3.6692 | - | - | - |
| 0.4122 | 8400 | 3.7025 | - | - | - |
| 0.4171 | 8500 | 3.677 | - | - | - |
| 0.4220 | 8600 | 3.6106 | - | - | - |
| 0.4269 | 8700 | 3.6415 | - | - | - |
| 0.4318 | 8800 | 3.6768 | - | - | - |
| 0.4367 | 8900 | 3.6421 | - | - | - |
| 0.4416 | 9000 | 3.6317 | 3.6268 | 0.5576 | - |
| 0.4465 | 9100 | 3.6238 | - | - | - |
| 0.4514 | 9200 | 3.689 | - | - | - |
| 0.4564 | 9300 | 3.6149 | - | - | - |
| 0.4613 | 9400 | 3.6665 | - | - | - |
| 0.4662 | 9500 | 3.5821 | - | - | - |
| 0.4711 | 9600 | 3.6461 | - | - | - |
| 0.4760 | 9700 | 3.5887 | - | - | - |
| 0.4809 | 9800 | 3.6255 | - | - | - |
| 0.4858 | 9900 | 3.6296 | - | - | - |
| 0.4907 | 10000 | 3.6344 | 3.6002 | 0.5533 | - |
| 0.4956 | 10100 | 3.6424 | - | - | - |
| 0.5005 | 10200 | 3.6081 | - | - | - |
| 0.5054 | 10300 | 3.6397 | - | - | - |
| 0.5103 | 10400 | 3.5584 | - | - | - |
| 0.5152 | 10500 | 3.6293 | - | - | - |
| 0.5201 | 10600 | 3.6165 | - | - | - |
| 0.5251 | 10700 | 3.6171 | - | - | - |
| 0.5300 | 10800 | 3.5373 | - | - | - |
| 0.5349 | 10900 | 3.5654 | - | - | - |
| 0.5398 | 11000 | 3.5932 | 3.5734 | 0.5747 | - |
| 0.5447 | 11100 | 3.583 | - | - | - |
| 0.5496 | 11200 | 3.5785 | - | - | - |
| 0.5545 | 11300 | 3.601 | - | - | - |
| 0.5594 | 11400 | 3.6087 | - | - | - |
| 0.5643 | 11500 | 3.5732 | - | - | - |
| 0.5692 | 11600 | 3.6086 | - | - | - |
| 0.5741 | 11700 | 3.5875 | - | - | - |
| 0.5790 | 11800 | 3.6021 | - | - | - |
| 0.5839 | 11900 | 3.5893 | - | - | - |
| 0.5888 | 12000 | 3.5709 | 3.5515 | 0.5538 | - |
| 0.5937 | 12100 | 3.518 | - | - | - |
| 0.5987 | 12200 | 3.5438 | - | - | - |
| 0.6036 | 12300 | 3.5659 | - | - | - |
| 0.6085 | 12400 | 3.585 | - | - | - |
| 0.6134 | 12500 | 3.6017 | - | - | - |
| 0.6183 | 12600 | 3.5498 | - | - | - |
| 0.6232 | 12700 | 3.5396 | - | - | - |
| 0.6281 | 12800 | 3.5382 | - | - | - |
| 0.6330 | 12900 | 3.5224 | - | - | - |
| 0.6379 | 13000 | 3.508 | 3.5325 | 0.5721 | - |
| 0.6428 | 13100 | 3.4896 | - | - | - |
| 0.6477 | 13200 | 3.5678 | - | - | - |
| 0.6526 | 13300 | 3.581 | - | - | - |
| 0.6575 | 13400 | 3.5415 | - | - | - |
| 0.6624 | 13500 | 3.5696 | - | - | - |
| 0.6674 | 13600 | 3.4861 | - | - | - |
| 0.6723 | 13700 | 3.5742 | - | - | - |
| 0.6772 | 13800 | 3.4968 | - | - | - |
| 0.6821 | 13900 | 3.4915 | - | - | - |
| 0.6870 | 14000 | 3.5022 | 3.5153 | 0.5573 | - |
| 0.6919 | 14100 | 3.517 | - | - | - |
| 0.6968 | 14200 | 3.5066 | - | - | - |
| 0.7017 | 14300 | 3.5019 | - | - | - |
| 0.7066 | 14400 | 3.5103 | - | - | - |
| 0.7115 | 14500 | 3.4968 | - | - | - |
| 0.7164 | 14600 | 3.4643 | - | - | - |
| 0.7213 | 14700 | 3.507 | - | - | - |
| 0.7262 | 14800 | 3.5323 | - | - | - |
| 0.7311 | 14900 | 3.5152 | - | - | - |
| 0.7361 | 15000 | 3.5066 | 3.4975 | 0.5820 | - |
| 0.7410 | 15100 | 3.5186 | - | - | - |
| 0.7459 | 15200 | 3.5228 | - | - | - |
| 0.7508 | 15300 | 3.5193 | - | - | - |
| 0.7557 | 15400 | 3.5495 | - | - | - |
| 0.7606 | 15500 | 3.4999 | - | - | - |
| 0.7655 | 15600 | 3.4594 | - | - | - |
| 0.7704 | 15700 | 3.4803 | - | - | - |
| 0.7753 | 15800 | 3.5105 | - | - | - |
| 0.7802 | 15900 | 3.4946 | - | - | - |
| 0.7851 | 16000 | 3.4791 | 3.4834 | 0.5795 | - |
| 0.7900 | 16100 | 3.5171 | - | - | - |
| 0.7949 | 16200 | 3.4651 | - | - | - |
| 0.7998 | 16300 | 3.4954 | - | - | - |
| 0.8047 | 16400 | 3.465 | - | - | - |
| 0.8097 | 16500 | 3.4881 | - | - | - |
| 0.8146 | 16600 | 3.5276 | - | - | - |
| 0.8195 | 16700 | 3.5161 | - | - | - |
| 0.8244 | 16800 | 3.4257 | - | - | - |
| 0.8293 | 16900 | 3.4918 | - | - | - |
| 0.8342 | 17000 | 3.4942 | 3.4746 | 0.5747 | - |
| 0.8391 | 17100 | 3.4783 | - | - | - |
| 0.8440 | 17200 | 3.4571 | - | - | - |
| 0.8489 | 17300 | 3.4872 | - | - | - |
| 0.8538 | 17400 | 3.4986 | - | - | - |
| 0.8587 | 17500 | 3.4825 | - | - | - |
| 0.8636 | 17600 | 3.4235 | - | - | - |
| 0.8685 | 17700 | 3.4714 | - | - | - |
| 0.8734 | 17800 | 3.5128 | - | - | - |
| 0.8784 | 17900 | 3.4838 | - | - | - |
| 0.8833 | 18000 | 3.4997 | 3.4643 | 0.5777 | - |
| 0.8882 | 18100 | 3.4467 | - | - | - |
| 0.8931 | 18200 | 3.4836 | - | - | - |
| 0.8980 | 18300 | 3.4243 | - | - | - |
| 0.9029 | 18400 | 3.4869 | - | - | - |
| 0.9078 | 18500 | 3.4759 | - | - | - |
| 0.9127 | 18600 | 3.4671 | - | - | - |
| 0.9176 | 18700 | 3.4816 | - | - | - |
| 0.9225 | 18800 | 3.4661 | - | - | - |
| 0.9274 | 18900 | 3.4246 | - | - | - |
| 0.9323 | 19000 | 3.4658 | 3.4567 | 0.5721 | - |
| 0.9372 | 19100 | 3.4795 | - | - | - |
| 0.9421 | 19200 | 3.4253 | - | - | - |
| 0.9471 | 19300 | 3.4798 | - | - | - |
| 0.9520 | 19400 | 3.4364 | - | - | - |
| 0.9569 | 19500 | 3.4995 | - | - | - |
| 0.9618 | 19600 | 3.4943 | - | - | - |
| 0.9667 | 19700 | 3.4664 | - | - | - |
| 0.9716 | 19800 | 3.4559 | - | - | - |
| 0.9765 | 19900 | 3.4111 | - | - | - |
| 0.9814 | 20000 | 3.4768 | 3.4522 | 0.5782 | - |
| 0.9863 | 20100 | 3.4748 | - | - | - |
| 0.9912 | 20200 | 3.4464 | - | - | - |
| 0.9961 | 20300 | 3.5206 | - | - | - |
| -1 | -1 | - | - | - | 0.4684 |
Framework Versions
- Python: 3.10.18
- Sentence Transformers: 5.0.0
- Transformers: 4.53.2
- PyTorch: 2.7.1+cu126
- Accelerate: 1.9.0
- Datasets: 4.0.0
- Tokenizers: 0.21.2
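To approximate this environment, the listed versions can be pinned at install time (a convenience suggestion based on the versions above, not a tested lockfile):

```bash
pip install sentence-transformers==5.0.0 transformers==4.53.2 \
    accelerate==1.9.0 datasets==4.0.0 tokenizers==0.21.2
# PyTorch 2.7.1 with CUDA 12.6 wheels must be installed from the matching PyTorch index.
```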
Citation
BibTeX
Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
DenoisingAutoEncoderLoss
```bibtex
@inproceedings{wang-2021-TSDAE,
    title = "TSDAE: Using Transformer-based Sequential Denoising Auto-Encoder for Unsupervised Sentence Embedding Learning",
    author = "Wang, Kexin and Reimers, Nils and Gurevych, Iryna",
    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
    month = nov,
    year = "2021",
    address = "Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    pages = "671--688",
    url = "https://arxiv.org/abs/2104.06979",
}
```