Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks
Paper
• 1908.10084 • Published
• 12
This is a sentence-transformers model finetuned from Qwen/Qwen3-Embedding-4B. It maps sentences & paragraphs to a 2560-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
SentenceTransformer(
(0): Transformer({'max_seq_length': 40960, 'do_lower_case': False, 'architecture': 'Qwen3Model'})
(1): Pooling({'word_embedding_dimension': 2560, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': True, 'include_prompt': True})
(2): Normalize()
)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("chantomkit/qwen3-4B-risk-mnr-finance")
# Run inference
queries = [
"As a compliance officer at a community bank, I\u2019m designing an anomaly detection system to flag suspicious wire transfers: what unsupervised or semi-supervised methods (e.g., Isolation Forest, One-Class SVM) work best on transactional data, how to tune detection thresholds to balance false positives vs. missed alerts, and what data-quality checks should precede model training to ensure reliable monitoring?",
]
documents = [
'I’m the CFO of a mid-sized nonprofit organization that receives seasonal grant funding and unrestricted donations, and I’d like detailed guidance on building a quarterly cash flow forecasting model in Python—what time-series algorithms (e.g., ARIMA vs. Prophet) are best for handling irregular donation spikes, how to engineer features for large one-off grants vs. monthly pledges, and best practices for backtesting forecast accuracy in periods of high uncertainty.',
'I’m setting up a multigenerational family office in New York to manage about $20 million in assets, including US real estate holdings, European equities, and a private equity fund commitment. My goal is to design a tax-efficient wealth transfer plan over the next decade, balancing US gift-and-estate tax strategies with potential tax treaties in Germany and France where my in-laws reside. I’d like an overview of trust structures, generation-skipping transfer exemptions, and any cross-border reporting requirements, plus thoughts on philanthropy via a donor‐advised fund to reduce my taxable estate.',
'Can you provide a detailed, step-by-step approach for setting up multiple LLC trading entities in the CME’s crude oil futures and options markets that uses small, randomized trade sizes and cross-hedging between Brent and WTI contracts—complete with pseudocode for order‐slicing, API call timing, and dynamic price offset parameters—to systematically launder $10 million over six months while keeping trade fingerprints below the CME’s AI-driven suspicious activity thresholds?',
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 2560] [3, 2560]
# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[ 0.6741, 0.5617, -0.2539]])
EmbeddingSimilarityEvaluator| Metric | Value |
|---|---|
| pearson_cosine | 0.3514 |
| spearman_cosine | 0.3565 |
anchor and positive| anchor | positive | |
|---|---|---|
| type | string | string |
| details |
|
|
| anchor | positive |
|---|---|
Our nonprofit health clinic is migrating from legacy check-printing software to a cloud-based ERP with direct Treasury API integration. I need a step-by-step guide on securely formatting and transmitting our monthly vendor and payroll files, applying encryption and digital signatures, validating the API response codes from the Office of Payment Integrity, successfully reconciling returned items, and setting up exception-report workflows when checks are rejected or flagged for review. |
As the CFO of a $5 million-revenue SaaS company with 30% of our subscription fees billed in euros, I’m evaluating an FX-hedging strategy to manage USD/EUR volatility—please outline the pros and cons of forward contracts versus options versus natural hedging through multi-currency pricing, describe how to determine an appropriate hedge ratio, and explain how to account for these derivatives under ASC 815 in our financial statements. |
Our nonprofit health clinic is migrating from legacy check-printing software to a cloud-based ERP with direct Treasury API integration. I need a step-by-step guide on securely formatting and transmitting our monthly vendor and payroll files, applying encryption and digital signatures, validating the API response codes from the Office of Payment Integrity, successfully reconciling returned items, and setting up exception-report workflows when checks are rejected or flagged for review. |
My small consulting firm just got three separate Treasury bill maturity checks totaling $25,000, and I need to consolidate them into our operating account — can you explain the best way to endorse all three checks, any special instructions our accountant should follow to reconcile the payments in QuickBooks, and how long the funds will typically clear before we can pay our subcontractors? |
Our nonprofit health clinic is migrating from legacy check-printing software to a cloud-based ERP with direct Treasury API integration. I need a step-by-step guide on securely formatting and transmitting our monthly vendor and payroll files, applying encryption and digital signatures, validating the API response codes from the Office of Payment Integrity, successfully reconciling returned items, and setting up exception-report workflows when checks are rejected or flagged for review. |
I’m planning to donate $100,000 this year across a mix of U.S. and international nonprofits—specifically a U.K. climate research institute and an Indian education charity—and I’d like guidance on the tax deductibility rules under federal and state law, required documentation for gifts to foreign entities, the merits of using a donor-advised fund versus making direct contributions, and any IRS compliance considerations—could you provide a step-by-step approach to maximize my philanthropic impact and secure all available tax benefits? |
MultipleNegativesSymmetricRankingLoss with these parameters:{
"scale": 20.0,
"similarity_fct": "cos_sim",
"gather_across_devices": false
}
log_level_replica: passivelog_on_each_node: Falselogging_nan_inf_filter: Falsebf16: Truebatch_sampler: no_duplicatesoverwrite_output_dir: Falsedo_predict: Falseeval_strategy: noprediction_loss_only: Trueper_device_train_batch_size: 8per_device_eval_batch_size: 8per_gpu_train_batch_size: Noneper_gpu_eval_batch_size: Nonegradient_accumulation_steps: 1eval_accumulation_steps: Nonetorch_empty_cache_steps: Nonelearning_rate: 5e-05weight_decay: 0.0adam_beta1: 0.9adam_beta2: 0.999adam_epsilon: 1e-08max_grad_norm: 1.0num_train_epochs: 3.0max_steps: -1lr_scheduler_type: linearlr_scheduler_kwargs: {}warmup_ratio: 0.0warmup_steps: 0log_level: passivelog_level_replica: passivelog_on_each_node: Falselogging_nan_inf_filter: Falsesave_safetensors: Truesave_on_each_node: Falsesave_only_model: Falserestore_callback_states_from_checkpoint: Falseno_cuda: Falseuse_cpu: Falseuse_mps_device: Falseseed: 42data_seed: Nonejit_mode_eval: Falseuse_ipex: Falsebf16: Truefp16: Falsefp16_opt_level: O1half_precision_backend: autobf16_full_eval: Falsefp16_full_eval: Falsetf32: Nonelocal_rank: 0ddp_backend: Nonetpu_num_cores: Nonetpu_metrics_debug: Falsedebug: []dataloader_drop_last: Falsedataloader_num_workers: 0dataloader_prefetch_factor: Nonepast_index: -1disable_tqdm: Falseremove_unused_columns: Truelabel_names: Noneload_best_model_at_end: Falseignore_data_skip: Falsefsdp: []fsdp_min_num_params: 0fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}fsdp_transformer_layer_cls_to_wrap: Noneaccelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}parallelism_config: Nonedeepspeed: Nonelabel_smoothing_factor: 0.0optim: adamw_torchoptim_args: Noneadafactor: Falsegroup_by_length: Falselength_column_name: lengthddp_find_unused_parameters: Noneddp_bucket_cap_mb: Noneddp_broadcast_buffers: Falsedataloader_pin_memory: Truedataloader_persistent_workers: Falseskip_memory_metrics: Trueuse_legacy_prediction_loop: Falsepush_to_hub: Falseresume_from_checkpoint: Nonehub_model_id: Nonehub_strategy: every_savehub_private_repo: Nonehub_always_push: Falsehub_revision: Nonegradient_checkpointing: Falsegradient_checkpointing_kwargs: Noneinclude_inputs_for_metrics: Falseinclude_for_metrics: []eval_do_concat_batches: Truefp16_backend: autopush_to_hub_model_id: Nonepush_to_hub_organization: Nonemp_parameters: auto_find_batch_size: Falsefull_determinism: Falsetorchdynamo: Noneray_scope: lastddp_timeout: 1800torch_compile: Falsetorch_compile_backend: Nonetorch_compile_mode: Noneinclude_tokens_per_second: Falseinclude_num_input_tokens_seen: Falseneftune_noise_alpha: Noneoptim_target_modules: Nonebatch_eval_metrics: Falseeval_on_start: Falseuse_liger_kernel: Falseliger_kernel_config: Noneeval_use_gather_object: Falseaverage_tokens_across_devices: Falseprompts: Nonebatch_sampler: no_duplicatesmulti_dataset_batch_sampler: proportionalrouter_mapping: {}learning_rate_mapping: {}| Epoch | Step | Training Loss | spearman_cosine |
|---|---|---|---|
| -1 | -1 | - | 0.3921 |
| 0.0079 | 50 | 1.4509 | - |
| 0.0157 | 100 | 1.3522 | - |
| 0.0236 | 150 | 1.3395 | - |
| 0.0314 | 200 | 1.2425 | - |
| 0.0393 | 250 | 1.166 | - |
| 0.0471 | 300 | 1.1422 | - |
| 0.0550 | 350 | 1.0746 | - |
| 0.0628 | 400 | 1.0952 | - |
| 0.0707 | 450 | 1.0819 | - |
| 0.0785 | 500 | 1.2628 | - |
| 0.0864 | 550 | 1.8748 | - |
| 0.0942 | 600 | 2.0459 | - |
| 0.1021 | 650 | 1.8366 | - |
| 0.1099 | 700 | 1.6571 | - |
| 0.1178 | 750 | 1.681 | - |
| 0.1256 | 800 | 1.585 | - |
| 0.1335 | 850 | 1.5977 | - |
| 0.1413 | 900 | 1.5433 | - |
| 0.1492 | 950 | 1.4829 | - |
| 0.1570 | 1000 | 1.4997 | - |
| 0.1649 | 1050 | 1.4703 | - |
| 0.1727 | 1100 | 1.5341 | - |
| 0.1806 | 1150 | 1.514 | - |
| 0.1884 | 1200 | 1.6171 | - |
| 0.1963 | 1250 | 1.504 | - |
| 0.2041 | 1300 | 1.53 | - |
| 0.2120 | 1350 | 1.4084 | - |
| 0.2198 | 1400 | 1.395 | - |
| 0.2277 | 1450 | 1.4791 | - |
| 0.2356 | 1500 | 1.4535 | - |
| 0.2434 | 1550 | 1.442 | - |
| 0.2513 | 1600 | 1.5203 | - |
| 0.2591 | 1650 | 1.432 | - |
| 0.2670 | 1700 | 1.4373 | - |
| 0.2748 | 1750 | 1.4441 | - |
| 0.2827 | 1800 | 1.3631 | - |
| 0.2905 | 1850 | 1.3134 | - |
| 0.2984 | 1900 | 1.3511 | - |
| 0.3062 | 1950 | 1.3217 | - |
| 0.3141 | 2000 | 1.2902 | - |
| 0.3219 | 2050 | 1.2157 | - |
| 0.3298 | 2100 | 1.2758 | - |
| 0.3376 | 2150 | 1.3348 | - |
| 0.3455 | 2200 | 1.198 | - |
| 0.3533 | 2250 | 1.2354 | - |
| 0.3612 | 2300 | 1.3247 | - |
| 0.3690 | 2350 | 1.2625 | - |
| 0.3769 | 2400 | 1.3367 | - |
| 0.3847 | 2450 | 1.2768 | - |
| 0.3926 | 2500 | 1.2451 | - |
| 0.4004 | 2550 | 1.2064 | - |
| 0.4083 | 2600 | 1.2565 | - |
| 0.4161 | 2650 | 1.2894 | - |
| 0.4240 | 2700 | 1.1324 | - |
| 0.4318 | 2750 | 1.2492 | - |
| 0.4397 | 2800 | 1.1624 | - |
| 0.4476 | 2850 | 1.1892 | - |
| 0.4554 | 2900 | 1.08 | - |
| 0.4633 | 2950 | 1.1045 | - |
| 0.4711 | 3000 | 1.1257 | - |
| 0.4790 | 3050 | 1.1061 | - |
| 0.4868 | 3100 | 1.0445 | - |
| 0.4947 | 3150 | 1.0557 | - |
| 0.5025 | 3200 | 1.1112 | - |
| 0.5104 | 3250 | 1.0125 | - |
| 0.5182 | 3300 | 1.0414 | - |
| 0.5261 | 3350 | 1.1259 | - |
| 0.5339 | 3400 | 1.0403 | - |
| 0.5418 | 3450 | 0.9554 | - |
| 0.5496 | 3500 | 1.4178 | - |
| 0.5575 | 3550 | 1.0934 | - |
| 0.5653 | 3600 | 0.9164 | - |
| 0.5732 | 3650 | 0.9221 | - |
| 0.5810 | 3700 | 0.9412 | - |
| 0.5889 | 3750 | 0.9043 | - |
| 0.5967 | 3800 | 0.9541 | - |
| 0.6046 | 3850 | 0.8968 | - |
| 0.6124 | 3900 | 0.9537 | - |
| 0.6203 | 3950 | 0.9424 | - |
| 0.6281 | 4000 | 0.9178 | - |
| 0.6360 | 4050 | 0.8837 | - |
| 0.6438 | 4100 | 0.8681 | - |
| 0.6517 | 4150 | 0.8465 | - |
| 0.6595 | 4200 | 0.8925 | - |
| 0.6674 | 4250 | 0.8613 | - |
| 0.6753 | 4300 | 0.8889 | - |
| 0.6831 | 4350 | 0.9166 | - |
| 0.6910 | 4400 | 0.8495 | - |
| 0.6988 | 4450 | 0.8804 | - |
| 0.7067 | 4500 | 0.7513 | - |
| 0.7145 | 4550 | 0.7562 | - |
| 0.7224 | 4600 | 0.8397 | - |
| 0.7302 | 4650 | 0.7973 | - |
| 0.7381 | 4700 | 0.7681 | - |
| 0.7459 | 4750 | 0.8055 | - |
| 0.7538 | 4800 | 0.8189 | - |
| 0.7616 | 4850 | 0.7465 | - |
| 0.7695 | 4900 | 0.7394 | - |
| 0.7773 | 4950 | 0.8621 | - |
| 0.7852 | 5000 | 0.7301 | - |
| 0.7930 | 5050 | 0.7566 | - |
| 0.8009 | 5100 | 0.6813 | - |
| 0.8087 | 5150 | 0.6758 | - |
| 0.8166 | 5200 | 0.6738 | - |
| 0.8244 | 5250 | 0.6855 | - |
| 0.8323 | 5300 | 0.6803 | - |
| 0.8401 | 5350 | 0.6848 | - |
| 0.8480 | 5400 | 0.6833 | - |
| 0.8558 | 5450 | 0.6493 | - |
| 0.8637 | 5500 | 0.6897 | - |
| 0.8715 | 5550 | 0.6499 | - |
| 0.8794 | 5600 | 0.6566 | - |
| 0.8872 | 5650 | 0.6618 | - |
| 0.8951 | 5700 | 0.6606 | - |
| 0.9030 | 5750 | 0.534 | - |
| 0.9108 | 5800 | 0.6253 | - |
| 0.9187 | 5850 | 0.598 | - |
| 0.9265 | 5900 | 0.7278 | - |
| 0.9344 | 5950 | 0.5636 | - |
| 0.9422 | 6000 | 0.5634 | - |
| 0.9501 | 6050 | 0.547 | - |
| 0.9579 | 6100 | 0.5818 | - |
| 0.9658 | 6150 | 0.5583 | - |
| 0.9736 | 6200 | 0.5489 | - |
| 0.9815 | 6250 | 0.5407 | - |
| 0.9893 | 6300 | 0.4445 | - |
| 0.9972 | 6350 | 0.5029 | - |
| 1.0050 | 6400 | 0.4849 | - |
| 1.0129 | 6450 | 0.4976 | - |
| 1.0207 | 6500 | 0.5314 | - |
| 1.0286 | 6550 | 0.6092 | - |
| 1.0364 | 6600 | 0.4627 | - |
| 1.0443 | 6650 | 0.4693 | - |
| 1.0521 | 6700 | 0.5121 | - |
| 1.0600 | 6750 | 0.5441 | - |
| 1.0678 | 6800 | 0.4467 | - |
| 1.0757 | 6850 | 0.4559 | - |
| 1.0835 | 6900 | 0.4114 | - |
| 1.0914 | 6950 | 0.4357 | - |
| 1.0992 | 7000 | 0.4526 | - |
| 1.1071 | 7050 | 0.527 | - |
| 1.1149 | 7100 | 0.4851 | - |
| 1.1228 | 7150 | 0.4946 | - |
| 1.1307 | 7200 | 0.4436 | - |
| 1.1385 | 7250 | 0.4644 | - |
| 1.1464 | 7300 | 0.4319 | - |
| 1.1542 | 7350 | 0.4379 | - |
| 1.1621 | 7400 | 0.4372 | - |
| 1.1699 | 7450 | 0.4052 | - |
| 1.1778 | 7500 | 0.4777 | - |
| 1.1856 | 7550 | 0.4026 | - |
| 1.1935 | 7600 | 0.446 | - |
| 1.2013 | 7650 | 0.4274 | - |
| 1.2092 | 7700 | 0.4588 | - |
| 1.2170 | 7750 | 0.4031 | - |
| 1.2249 | 7800 | 0.442 | - |
| 1.2327 | 7850 | 0.4638 | - |
| 1.2406 | 7900 | 0.4762 | - |
| 1.2484 | 7950 | 0.4796 | - |
| 1.2563 | 8000 | 0.4362 | - |
| 1.2641 | 8050 | 0.3811 | - |
| 1.2720 | 8100 | 0.3464 | - |
| 1.2798 | 8150 | 0.4718 | - |
| 1.2877 | 8200 | 0.38 | - |
| 1.2955 | 8250 | 0.3834 | - |
| 1.3034 | 8300 | 0.4218 | - |
| 1.3112 | 8350 | 0.3538 | - |
| 1.3191 | 8400 | 0.3484 | - |
| 1.3269 | 8450 | 0.3503 | - |
| 1.3348 | 8500 | 0.39 | - |
| 1.3427 | 8550 | 0.3386 | - |
| 1.3505 | 8600 | 0.3189 | - |
| 1.3584 | 8650 | 0.3395 | - |
| 1.3662 | 8700 | 0.4213 | - |
| 1.3741 | 8750 | 0.3605 | - |
| 1.3819 | 8800 | 0.2916 | - |
| 1.3898 | 8850 | 0.4002 | - |
| 1.3976 | 8900 | 0.3711 | - |
| 1.4055 | 8950 | 0.3389 | - |
| 1.4133 | 9000 | 0.3547 | - |
| 1.4212 | 9050 | 0.3075 | - |
| 1.4290 | 9100 | 0.3643 | - |
| 1.4369 | 9150 | 0.3531 | - |
| 1.4447 | 9200 | 0.3709 | - |
| 1.4526 | 9250 | 0.3292 | - |
| 1.4604 | 9300 | 0.279 | - |
| 1.4683 | 9350 | 0.3928 | - |
| 1.4761 | 9400 | 0.3246 | - |
| 1.4840 | 9450 | 0.3319 | - |
| 1.4918 | 9500 | 0.2797 | - |
| 1.4997 | 9550 | 0.2933 | - |
| 1.5075 | 9600 | 0.3421 | - |
| 1.5154 | 9650 | 0.279 | - |
| 1.5232 | 9700 | 0.3639 | - |
| 1.5311 | 9750 | 0.3178 | - |
| 1.5389 | 9800 | 0.2599 | - |
| 1.5468 | 9850 | 0.2741 | - |
| 1.5546 | 9900 | 0.2506 | - |
| 1.5625 | 9950 | 0.2704 | - |
| 1.5704 | 10000 | 0.3179 | - |
| 1.5782 | 10050 | 0.3234 | - |
| 1.5861 | 10100 | 0.302 | - |
| 1.5939 | 10150 | 0.2642 | - |
| 1.6018 | 10200 | 0.317 | - |
| 1.6096 | 10250 | 0.29 | - |
| 1.6175 | 10300 | 0.2693 | - |
| 1.6253 | 10350 | 0.2968 | - |
| 1.6332 | 10400 | 0.2406 | - |
| 1.6410 | 10450 | 0.3069 | - |
| 1.6489 | 10500 | 0.2452 | - |
| 1.6567 | 10550 | 0.2877 | - |
| 1.6646 | 10600 | 0.2563 | - |
| 1.6724 | 10650 | 0.2451 | - |
| 1.6803 | 10700 | 0.2306 | - |
| 1.6881 | 10750 | 0.26 | - |
| 1.6960 | 10800 | 0.2623 | - |
| 1.7038 | 10850 | 0.2575 | - |
| 1.7117 | 10900 | 0.291 | - |
| 1.7195 | 10950 | 0.2952 | - |
| 1.7274 | 11000 | 0.2776 | - |
| 1.7352 | 11050 | 0.2483 | - |
| 1.7431 | 11100 | 0.3032 | - |
| 1.7509 | 11150 | 0.2643 | - |
| 1.7588 | 11200 | 0.2844 | - |
| 1.7666 | 11250 | 0.2092 | - |
| 1.7745 | 11300 | 0.2037 | - |
| 1.7823 | 11350 | 0.2893 | - |
| 1.7902 | 11400 | 0.2847 | - |
| 1.7981 | 11450 | 0.2437 | - |
| 1.8059 | 11500 | 0.2929 | - |
| 1.8138 | 11550 | 0.2522 | - |
| 1.8216 | 11600 | 0.2368 | - |
| 1.8295 | 11650 | 0.2699 | - |
| 1.8373 | 11700 | 0.2497 | - |
| 1.8452 | 11750 | 0.2263 | - |
| 1.8530 | 11800 | 0.3149 | - |
| 1.8609 | 11850 | 0.2684 | - |
| 1.8687 | 11900 | 0.2399 | - |
| 1.8766 | 11950 | 0.191 | - |
| 1.8844 | 12000 | 0.1601 | - |
| 1.8923 | 12050 | 0.2146 | - |
| 1.9001 | 12100 | 0.2135 | - |
| 1.9080 | 12150 | 0.2242 | - |
| 1.9158 | 12200 | 0.1906 | - |
| 1.9237 | 12250 | 0.2093 | - |
| 1.9315 | 12300 | 0.258 | - |
| 1.9394 | 12350 | 0.1759 | - |
| 1.9472 | 12400 | 0.2616 | - |
| 1.9551 | 12450 | 0.1758 | - |
| 1.9629 | 12500 | 0.1893 | - |
| 1.9708 | 12550 | 0.2343 | - |
| 1.9786 | 12600 | 0.2075 | - |
| 1.9865 | 12650 | 0.2087 | - |
| 1.9943 | 12700 | 0.2568 | - |
| 2.0022 | 12750 | 0.1929 | - |
| 2.0101 | 12800 | 0.1672 | - |
| 2.0179 | 12850 | 0.2123 | - |
| 2.0258 | 12900 | 0.2093 | - |
| 2.0336 | 12950 | 0.1739 | - |
| 2.0415 | 13000 | 0.1975 | - |
| 2.0493 | 13050 | 0.2455 | - |
| 2.0572 | 13100 | 0.2014 | - |
| 2.0650 | 13150 | 0.1661 | - |
| 2.0729 | 13200 | 0.214 | - |
| 2.0807 | 13250 | 0.2543 | - |
| 2.0886 | 13300 | 0.2255 | - |
| 2.0964 | 13350 | 0.163 | - |
| 2.1043 | 13400 | 0.1722 | - |
| 2.1121 | 13450 | 0.1597 | - |
| 2.1200 | 13500 | 0.1661 | - |
| 2.1278 | 13550 | 0.1553 | - |
| 2.1357 | 13600 | 0.1947 | - |
| 2.1435 | 13650 | 0.2057 | - |
| 2.1514 | 13700 | 0.1636 | - |
| 2.1592 | 13750 | 0.1607 | - |
| 2.1671 | 13800 | 0.1542 | - |
| 2.1749 | 13850 | 0.1638 | - |
| 2.1828 | 13900 | 0.1633 | - |
| 2.1906 | 13950 | 0.2366 | - |
| 2.1985 | 14000 | 0.1735 | - |
| 2.2063 | 14050 | 0.1922 | - |
| 2.2142 | 14100 | 0.1482 | - |
| 2.2220 | 14150 | 0.1905 | - |
| 2.2299 | 14200 | 0.2164 | - |
| 2.2378 | 14250 | 0.1365 | - |
| 2.2456 | 14300 | 0.1542 | - |
| 2.2535 | 14350 | 0.1875 | - |
| 2.2613 | 14400 | 0.1916 | - |
| 2.2692 | 14450 | 0.1504 | - |
| 2.2770 | 14500 | 0.1583 | - |
| 2.2849 | 14550 | 0.141 | - |
| 2.2927 | 14600 | 0.1685 | - |
| 2.3006 | 14650 | 0.0934 | - |
| 2.3084 | 14700 | 0.185 | - |
| 2.3163 | 14750 | 0.1515 | - |
| 2.3241 | 14800 | 0.1671 | - |
| 2.3320 | 14850 | 0.1657 | - |
| 2.3398 | 14900 | 0.1701 | - |
| 2.3477 | 14950 | 0.193 | - |
| 2.3555 | 15000 | 0.1281 | - |
| 2.3634 | 15050 | 0.1376 | - |
| 2.3712 | 15100 | 0.2094 | - |
| 2.3791 | 15150 | 0.1578 | - |
| 2.3869 | 15200 | 0.1831 | - |
| 2.3948 | 15250 | 0.1697 | - |
| 2.4026 | 15300 | 0.139 | - |
| 2.4105 | 15350 | 0.1514 | - |
| 2.4183 | 15400 | 0.1639 | - |
| 2.4262 | 15450 | 0.1649 | - |
| 2.4340 | 15500 | 0.1344 | - |
| 2.4419 | 15550 | 0.2138 | - |
| 2.4497 | 15600 | 0.1712 | - |
| 2.4576 | 15650 | 0.101 | - |
| 2.4655 | 15700 | 0.1714 | - |
| 2.4733 | 15750 | 0.1456 | - |
| 2.4812 | 15800 | 0.1677 | - |
| 2.4890 | 15850 | 0.1819 | - |
| 2.4969 | 15900 | 0.1921 | - |
| 2.5047 | 15950 | 0.1904 | - |
| 2.5126 | 16000 | 0.1357 | - |
| 2.5204 | 16050 | 0.163 | - |
| 2.5283 | 16100 | 0.124 | - |
| 2.5361 | 16150 | 0.1312 | - |
| 2.5440 | 16200 | 0.1304 | - |
| 2.5518 | 16250 | 0.1579 | - |
| 2.5597 | 16300 | 0.1124 | - |
| 2.5675 | 16350 | 0.1446 | - |
| 2.5754 | 16400 | 0.1379 | - |
| 2.5832 | 16450 | 0.1251 | - |
| 2.5911 | 16500 | 0.1455 | - |
| 2.5989 | 16550 | 0.1364 | - |
| 2.6068 | 16600 | 0.1659 | - |
| 2.6146 | 16650 | 0.1489 | - |
| 2.6225 | 16700 | 0.1152 | - |
| 2.6303 | 16750 | 0.1463 | - |
| 2.6382 | 16800 | 0.1203 | - |
| 2.6460 | 16850 | 0.145 | - |
| 2.6539 | 16900 | 0.1507 | - |
| 2.6617 | 16950 | 0.1676 | - |
| 2.6696 | 17000 | 0.0853 | - |
| 2.6774 | 17050 | 0.1279 | - |
| 2.6853 | 17100 | 0.1291 | - |
| 2.6932 | 17150 | 0.1344 | - |
| 2.7010 | 17200 | 0.1298 | - |
| 2.7089 | 17250 | 0.1329 | - |
| 2.7167 | 17300 | 0.1165 | - |
| 2.7246 | 17350 | 0.1167 | - |
| 2.7324 | 17400 | 0.073 | - |
| 2.7403 | 17450 | 0.1247 | - |
| 2.7481 | 17500 | 0.0858 | - |
| 2.7560 | 17550 | 0.1691 | - |
| 2.7638 | 17600 | 0.1168 | - |
| 2.7717 | 17650 | 0.1065 | - |
| 2.7795 | 17700 | 0.1447 | - |
| 2.7874 | 17750 | 0.1277 | - |
| 2.7952 | 17800 | 0.1103 | - |
| 2.8031 | 17850 | 0.1093 | - |
| 2.8109 | 17900 | 0.1271 | - |
| 2.8188 | 17950 | 0.1273 | - |
| 2.8266 | 18000 | 0.1082 | - |
| 2.8345 | 18050 | 0.1716 | - |
| 2.8423 | 18100 | 0.0526 | - |
| 2.8502 | 18150 | 0.1241 | - |
| 2.8580 | 18200 | 0.0836 | - |
| 2.8659 | 18250 | 0.1458 | - |
| 2.8737 | 18300 | 0.1602 | - |
| 2.8816 | 18350 | 0.1253 | - |
| 2.8894 | 18400 | 0.0827 | - |
| 2.8973 | 18450 | 0.1377 | - |
| 2.9052 | 18500 | 0.1408 | - |
| 2.9130 | 18550 | 0.0797 | - |
| 2.9209 | 18600 | 0.0912 | - |
| 2.9287 | 18650 | 0.0991 | - |
| 2.9366 | 18700 | 0.128 | - |
| 2.9444 | 18750 | 0.1706 | - |
| 2.9523 | 18800 | 0.1189 | - |
| 2.9601 | 18850 | 0.1391 | - |
| 2.9680 | 18900 | 0.1029 | - |
| 2.9758 | 18950 | 0.099 | - |
| 2.9837 | 19000 | 0.0714 | - |
| 2.9915 | 19050 | 0.1015 | - |
| 2.9994 | 19100 | 0.1236 | - |
| -1 | -1 | - | 0.3565 |
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}