SentenceTransformer based on PaDaS-Lab/xlm-roberta-base-msmarco

This is a sentence-transformers model finetuned from PaDaS-Lab/xlm-roberta-base-msmarco. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: PaDaS-Lab/xlm-roberta-base-msmarco
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'XLMRobertaModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
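
The Pooling module applies masked mean pooling: the transformer's token embeddings are averaged over the sequence, with padding positions excluded via the attention mask. A minimal illustrative sketch of this pooling step (not the library's internal code):

import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # Average token embeddings over the sequence dimension,
    # counting only non-padding tokens.
    mask = attention_mask.unsqueeze(-1).to(token_embeddings.dtype)  # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(dim=1)                   # (batch, 768)
    counts = mask.sum(dim=1).clamp(min=1e-9)                        # avoid division by zero
    return summed / counts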

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("IrvinTopi/marginmse_experiment_gte")
# Run inference
sentences = [
    'Impact of the physico-chemical properties of fen peat on the metal accumulation patterns in mires of Latvia',
    'The article presents a study of the physico-chemical properties of fen peat and their influence on the metal accumulation patterns in three Latvian fens: Svētupes Mire, Elku Mire and Vīķu Mire. Full peat profiles were obtained at all study sites and analysed with a multi-proxy approach. The content of metals in fen peat was determined using the atomic absorption spectroscopy (AAS) and normalised to the concentration of Ti in the studied peat profiles. Both the character of deposits and agricultural land use in the mire catchment areas were taken into account and the possible natural and anthropogenic metal supply sources were evaluated. The content of metals in the studied fen peat significantly varied due to the heterogeneity of fen environment; however, noticeable similarities were also traced throughout all study sites. The results indicate an increased amount of transition metals and Pb in the upper peat layer. This can be explained by a direct impact from anthropogenic sources (agricultural land use, pollution, etc.). Metal binding in fen peat profiles is directly related to the alkali and alkaline earth metal content in peat, as Ca, Mg, Na and K ions are replaced by more tightly bound metal ions. In raised bogs, in turn, metal binding is associated with the acidic functional groups common to peat.',
    'Wyckoff Method identifies accumulation and distribution phases by analyzing price and volume patterns. The goal is to locate support and resistance levels and patterns such as Springs and Upthrusts. Volume trends and moving averages provide confirmation.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.9981, 0.9960],
#         [0.9981, 1.0000, 0.9978],
#         [0.9960, 0.9978, 1.0000]])
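
The similarity function is cosine similarity (see Model Description), so the matrix above can be reproduced by L2-normalizing the embeddings and taking dot products. A quick illustrative check, reusing the embeddings from the snippet above:

import numpy as np

# Cosine similarity is the dot product of L2-normalized vectors.
normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
print(normed @ normed.T)  # matches model.similarity(embeddings, embeddings) up to rounding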

Training Details

Training Dataset

Unnamed Dataset

  • Size: 12,753,278 training samples
  • Columns: sentence_0, sentence_1, sentence_2, and label
  • Approximate statistics based on the first 1000 samples:
    • sentence_0 (string): min 6 tokens, mean 15.49 tokens, max 119 tokens
    • sentence_1 (string): min 9 tokens, mean 74.63 tokens, max 512 tokens
    • sentence_2 (string): min 11 tokens, mean 107.12 tokens, max 512 tokens
    • label (float): min -0.48, mean 0.16, max 0.82
  • Samples:
    • sentence_0: Как найти актуальное зеркало Betwinner?
      sentence_1: Букмекер старается обеспечить доступ к сайту, поэтому ссылки на зеркала обновляются ежедневно. Чтобы быть в курсе всех новостей, рекомендуется подписаться на почтовую рассылку и соцсети.
      sentence_2: Актуальные рабочие зеркала 1win можно найти на официальных страницах букмекерской конторы в социальных сетях или путем обращения в службу поддержки.
      label: -0.253662109375
    • sentence_0: Jakie są minimalne zakłady w bakaracie?
      sentence_1: Minimalny zakład w bakaracie zależy od konkretnej gry, w którą grasz. Mini Baccarat ma zwykle niskie limity zakładów, co czyni go atrakcyjnym dla nowych graczy. Istnieją też wersje gry w bakarata dla high-rollerów, które nakładają wyższy minimalny zakład.
      sentence_2: W czasie pisania tego tekstu, Cloudbet miał minimalny zakład w wysokości 0,01 BTC. Cloudbet ma jedne z najwyższych dostępnych maksymalnych zakładów, ale zazwyczaj różnią się one w zależności od płynności na dane zdarzenie. Ogólnie rzecz biorąc, im bliższe zdarzeniu zakłady, tym większa płynność i tym wyższe stawki można postawić.
      label: 0.3369140625
    • sentence_0: Come scegliere il massimale assicurazione professionale medici?
      sentence_1: Per il momento, non sono ancora entrate in vigore sul massimale minimo per le polizze rc professionale medici. Teniamo conto però di una cosa: se si lavora (e si è lavorato nei dieci anni precedenti) esclusivamente come dipendenti o specializzandi presso l’SSN, dobbiamo sapere che la rivalsa massima dell’SSN sarà plafonata al triplo del reddito annuo lordo del medico. Se invece si lavora in libera professione, non c’è alcun limite. Consigliamo comunque di scegliere massimali non inferiori al milione di euro.
      sentence_2: Potete stipulare una nuova assicurazione di base entro il 31 dicembre, che inizierà a decorrere dal 1 gennaio dell’anno successivo. A tal fine, però, dovete aver disdetto la precedente assicurazione di base entro i termini previsti. In linea di massima è possibile disdire l’assicurazione di base inviando una lettera di disdetta firmata per posta, via e-mail o via fax. Alla KPT potete comunicare la disdetta anche nel portale clienti KPTnet. Attenzione: Per la disdetta non fa fede il timbro postale, bensì la data in cui l’assicuratore riceve la disdetta. Il termine di disdetta è rispettato se l’assicuratore riceve la disdetta l’ultimo giorno lavorativo del termine legale durante i normali orari di ufficio. Un invio raccomandato recapitato in una casella postale può essere considerato notificato solo nel momento in cui è ritirato alla Posta. Poiché non si possono escludere ritardi, vi consigliamo di inviare la disdetta per posta, mediante lettera raccomandata, entro la metà di novembre. C...
      label: 0.09228515625
  • Loss: MarginMSELoss
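
With MarginMSELoss, the label column is a teacher margin: the difference between a cross-encoder teacher's relevance scores for the (query, positive) and (query, negative) pairs. The student is trained so that the difference of its own similarity scores matches this margin (Hofstätter et al., 2021; see Citation below). A minimal sketch of the objective, assuming dot-product scoring over already-encoded embeddings:

import torch
import torch.nn.functional as F

def margin_mse(query_emb, pos_emb, neg_emb, teacher_margin):
    # Student margin: dot-product score of the positive pair minus
    # that of the negative pair, regressed onto the teacher margin
    # (the `label` column) with mean squared error.
    pos_score = (query_emb * pos_emb).sum(dim=-1)
    neg_score = (query_emb * neg_emb).sum(dim=-1)
    return F.mse_loss(pos_score - neg_score, teacher_margin)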

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 64
  • num_train_epochs: 1
  • fp16: True
  • multi_dataset_batch_sampler: round_robin
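
These values map onto the Sentence Transformers training arguments. A hedged sketch of the corresponding configuration (the output directory is a placeholder; the dataset and loss objects are constructed separately):

from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output/marginmse_experiment",  # placeholder path
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    num_train_epochs=1,
    fp16=True,
    multi_dataset_batch_sampler="round_robin",
)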

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 64
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • project: huggingface
  • trackio_space_id: trackio
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: no
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: True
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.0025 500 4.076
0.0050 1000 0.0295
0.0075 1500 0.0259
0.0100 2000 0.0242
0.0125 2500 0.0226
0.0151 3000 0.022
0.0176 3500 0.0221
0.0201 4000 0.0211
0.0226 4500 0.0208
0.0251 5000 0.0204
0.0276 5500 0.0202
0.0301 6000 0.0197
0.0326 6500 0.0196
0.0351 7000 0.0194
0.0376 7500 0.0193
0.0401 8000 0.0191
0.0427 8500 0.0195
0.0452 9000 0.02
0.0477 9500 0.0189
0.0502 10000 0.0184
0.0527 10500 0.0185
0.0552 11000 0.0183
0.0577 11500 0.0191
0.0602 12000 0.018
0.0627 12500 0.0177
0.0652 13000 0.0177
0.0677 13500 0.0175
0.0703 14000 0.0174
0.0728 14500 0.0172
0.0753 15000 0.0175
0.0778 15500 0.0171
0.0803 16000 0.0174
0.0828 16500 0.0175
0.0853 17000 0.0169
0.0878 17500 0.0168
0.0903 18000 0.0167
0.0928 18500 0.0171
0.0953 19000 0.0169
0.0979 19500 0.0167
0.1004 20000 0.0163
0.1029 20500 0.0164
0.1054 21000 0.0168
0.1079 21500 0.0163
0.1104 22000 0.0167
0.1129 22500 0.0162
0.1154 23000 0.0163
0.1179 23500 0.0159
0.1204 24000 0.0163
0.1229 24500 0.0159
0.1255 25000 0.0161
0.1280 25500 0.0161
0.1305 26000 0.0159
0.1330 26500 0.0159
0.1355 27000 0.0159
0.1380 27500 0.0158
0.1405 28000 0.0157
0.1430 28500 0.0157
0.1455 29000 0.0156
0.1480 29500 0.0172
0.1505 30000 0.0155
0.1531 30500 0.0153
0.1556 31000 0.0152
0.1581 31500 0.0154
0.1606 32000 0.0153
0.1631 32500 0.0153
0.1656 33000 0.0153
0.1681 33500 0.0153
0.1706 34000 0.0151
0.1731 34500 0.015
0.1756 35000 0.0148
0.1782 35500 0.015
0.1807 36000 0.0148
0.1832 36500 0.0149
0.1857 37000 0.0147
0.1882 37500 0.0145
0.1907 38000 0.0145
0.1932 38500 0.0147
0.1957 39000 0.0149
0.1982 39500 0.0145
0.2007 40000 0.0145
0.2032 40500 0.0147
0.2058 41000 0.0147
0.2083 41500 0.0147
0.2108 42000 0.0145
0.2133 42500 0.0144
0.2158 43000 0.0147
0.2183 43500 0.0145
0.2208 44000 0.0147
0.2233 44500 0.0142
0.2258 45000 0.0145
0.2283 45500 0.0141
0.2308 46000 0.0143
0.2334 46500 0.0143
0.2359 47000 0.0141
0.2384 47500 0.0145
0.2409 48000 0.0142
0.2434 48500 0.0141
0.2459 49000 0.0142
0.2484 49500 0.0139
0.2509 50000 0.0141
0.2534 50500 0.0139
0.2559 51000 0.014
0.2584 51500 0.0139
0.2610 52000 0.014
0.2635 52500 0.0142
0.2660 53000 0.014
0.2685 53500 0.0138
0.2710 54000 0.0136
0.2735 54500 0.0138
0.2760 55000 0.0138
0.2785 55500 0.0137
0.2810 56000 0.0136
0.2835 56500 0.0138
0.2860 57000 0.0135
0.2886 57500 0.0135
0.2911 58000 0.0137
0.2936 58500 0.0136
0.2961 59000 0.0135
0.2986 59500 0.0143
0.3011 60000 0.0134
0.3036 60500 0.0135
0.3061 61000 0.0136
0.3086 61500 0.0134
0.3111 62000 0.0134
0.3136 62500 0.0132
0.3162 63000 0.0133
0.3187 63500 0.0133
0.3212 64000 0.0135
0.3237 64500 0.0133
0.3262 65000 0.0133
0.3287 65500 0.0134
0.3312 66000 0.0133
0.3337 66500 0.0132
0.3362 67000 0.0133
0.3387 67500 0.0133
0.3412 68000 0.0132
0.3438 68500 0.0131
0.3463 69000 0.0132
0.3488 69500 0.0131
0.3513 70000 0.013
0.3538 70500 0.0129
0.3563 71000 0.0127
0.3588 71500 0.0131
0.3613 72000 0.0129
0.3638 72500 0.0128
0.3663 73000 0.0129
0.3688 73500 0.0128
0.3714 74000 0.0128
0.3739 74500 0.0131
0.3764 75000 0.013
0.3789 75500 0.0127
0.3814 76000 0.0128
0.3839 76500 0.0127
0.3864 77000 0.0128
0.3889 77500 0.0129
0.3914 78000 0.0128
0.3939 78500 0.0127
0.3964 79000 0.0128
0.3990 79500 0.0126
0.4015 80000 0.0127
0.4040 80500 0.0126
0.4065 81000 0.0124
0.4090 81500 0.0126
0.4115 82000 0.0124
0.4140 82500 0.0124
0.4165 83000 0.0127
0.4190 83500 0.0123
0.4215 84000 0.0124
0.4240 84500 0.0125
0.4266 85000 0.0124
0.4291 85500 0.0124
0.4316 86000 0.0124
0.4341 86500 0.0124
0.4366 87000 0.0128
0.4391 87500 0.0124
0.4416 88000 0.0123
0.4441 88500 0.0123
0.4466 89000 0.0125
0.4491 89500 0.0125
0.4516 90000 0.0123
0.4542 90500 0.0124
0.4567 91000 0.0122
0.4592 91500 0.0122
0.4617 92000 0.0124
0.4642 92500 0.012
0.4667 93000 0.0122
0.4692 93500 0.0121
0.4717 94000 0.0121
0.4742 94500 0.0121
0.4767 95000 0.0123
0.4792 95500 0.0121
0.4818 96000 0.0121
0.4843 96500 0.0127
0.4868 97000 0.012
0.4893 97500 0.0122
0.4918 98000 0.012
0.4943 98500 0.0119
0.4968 99000 0.012
0.4993 99500 0.0121
0.5018 100000 0.012
0.5043 100500 0.0119
0.5069 101000 0.0121
0.5094 101500 0.0123
0.5119 102000 0.0117
0.5144 102500 0.0121
0.5169 103000 0.0118
0.5194 103500 0.0118
0.5219 104000 0.0118
0.5244 104500 0.0119
0.5269 105000 0.012
0.5294 105500 0.0117
0.5319 106000 0.0118
0.5345 106500 0.0118
0.5370 107000 0.0118
0.5395 107500 0.0119
0.5420 108000 0.0116
0.5445 108500 0.012
0.5470 109000 0.0116
0.5495 109500 0.0116
0.5520 110000 0.0116
0.5545 110500 0.0117
0.5570 111000 0.0117
0.5595 111500 0.0117
0.5621 112000 0.0116
0.5646 112500 0.0116
0.5671 113000 0.0116
0.5696 113500 0.0116
0.5721 114000 0.0116
0.5746 114500 0.012
0.5771 115000 0.0119
0.5796 115500 0.0115
0.5821 116000 0.0116
0.5846 116500 0.0115
0.5871 117000 0.0116
0.5897 117500 0.0116
0.5922 118000 0.0115
0.5947 118500 0.0116
0.5972 119000 0.0115
0.5997 119500 0.0116
0.6022 120000 0.0114
0.6047 120500 0.0115
0.6072 121000 0.0115
0.6097 121500 0.0114
0.6122 122000 0.0115
0.6147 122500 0.0114
0.6173 123000 0.0113
0.6198 123500 0.0112
0.6223 124000 0.0114
0.6248 124500 0.0113
0.6273 125000 0.0112
0.6298 125500 0.0115
0.6323 126000 0.0112
0.6348 126500 0.0112
0.6373 127000 0.0113
0.6398 127500 0.0113
0.6423 128000 0.0113
0.6449 128500 0.0112
0.6474 129000 0.0111
0.6499 129500 0.0114
0.6524 130000 0.0111
0.6549 130500 0.0111
0.6574 131000 0.0112
0.6599 131500 0.0111
0.6624 132000 0.0113
0.6649 132500 0.0112
0.6674 133000 0.0112
0.6699 133500 0.0111
0.6725 134000 0.0111
0.6750 134500 0.0111
0.6775 135000 0.011
0.6800 135500 0.0113
0.6825 136000 0.011
0.6850 136500 0.011
0.6875 137000 0.0111
0.6900 137500 0.0111
0.6925 138000 0.0112
0.6950 138500 0.0112
0.6975 139000 0.0109
0.7001 139500 0.0112
0.7026 140000 0.011
0.7051 140500 0.011
0.7076 141000 0.0108
0.7101 141500 0.0109
0.7126 142000 0.0108
0.7151 142500 0.0109
0.7176 143000 0.0109
0.7201 143500 0.0109
0.7226 144000 0.0112
0.7251 144500 0.011
0.7277 145000 0.0108
0.7302 145500 0.0109
0.7327 146000 0.0111
0.7352 146500 0.0109
0.7377 147000 0.0109
0.7402 147500 0.0108
0.7427 148000 0.011
0.7452 148500 0.0108
0.7477 149000 0.0109
0.7502 149500 0.0107
0.7527 150000 0.0108
0.7553 150500 0.011
0.7578 151000 0.0107
0.7603 151500 0.0108
0.7628 152000 0.0107
0.7653 152500 0.0108
0.7678 153000 0.011
0.7703 153500 0.0108
0.7728 154000 0.0108
0.7753 154500 0.0108
0.7778 155000 0.0106
0.7803 155500 0.0107
0.7829 156000 0.0107
0.7854 156500 0.0107
0.7879 157000 0.0107
0.7904 157500 0.0106
0.7929 158000 0.0107
0.7954 158500 0.0107
0.7979 159000 0.0107
0.8004 159500 0.0107
0.8029 160000 0.0105
0.8054 160500 0.0106
0.8079 161000 0.0106
0.8105 161500 0.0108
0.8130 162000 0.0107
0.8155 162500 0.0106
0.8180 163000 0.0107
0.8205 163500 0.0106
0.8230 164000 0.0107
0.8255 164500 0.0105
0.8280 165000 0.0107
0.8305 165500 0.0107
0.8330 166000 0.0107
0.8355 166500 0.0105
0.8381 167000 0.0106
0.8406 167500 0.0105
0.8431 168000 0.0106
0.8456 168500 0.0105
0.8481 169000 0.0105
0.8506 169500 0.0106
0.8531 170000 0.0105
0.8556 170500 0.0105
0.8581 171000 0.0105
0.8606 171500 0.0106
0.8632 172000 0.0106
0.8657 172500 0.0104
0.8682 173000 0.0106
0.8707 173500 0.0105
0.8732 174000 0.0105
0.8757 174500 0.0104
0.8782 175000 0.0104
0.8807 175500 0.0106
0.8832 176000 0.0106
0.8857 176500 0.0103
0.8882 177000 0.0104
0.8908 177500 0.0104
0.8933 178000 0.0105
0.8958 178500 0.0105
0.8983 179000 0.0102
0.9008 179500 0.0105
0.9033 180000 0.0104
0.9058 180500 0.0104
0.9083 181000 0.0103
0.9108 181500 0.0104
0.9133 182000 0.0104
0.9158 182500 0.0103
0.9184 183000 0.0104
0.9209 183500 0.0104
0.9234 184000 0.0103
0.9259 184500 0.0105
0.9284 185000 0.0103
0.9309 185500 0.0103
0.9334 186000 0.0106
0.9359 186500 0.0103
0.9384 187000 0.0108
0.9409 187500 0.0103
0.9434 188000 0.0103
0.9460 188500 0.0103
0.9485 189000 0.0105
0.9510 189500 0.0104
0.9535 190000 0.0102
0.9560 190500 0.0102
0.9585 191000 0.0103
0.9610 191500 0.0101
0.9635 192000 0.0103
0.9660 192500 0.0105
0.9685 193000 0.0102
0.9710 193500 0.0102
0.9736 194000 0.0103
0.9761 194500 0.0102
0.9786 195000 0.0102
0.9811 195500 0.0102
0.9836 196000 0.0104
0.9861 196500 0.0103
0.9886 197000 0.0103
0.9911 197500 0.0102
0.9936 198000 0.0103
0.9961 198500 0.0101
0.9986 199000 0.0102

Framework Versions

  • Python: 3.10.4
  • Sentence Transformers: 5.2.0
  • Transformers: 4.57.3
  • PyTorch: 2.9.1+cu128
  • Accelerate: 1.12.0
  • Datasets: 2.21.0
  • Tokenizers: 0.22.1
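
To approximate this environment, the listed versions can be pinned at install time (the PyTorch CUDA build varies by platform, so its wheel is best chosen per the PyTorch install instructions):

pip install "sentence-transformers==5.2.0" "transformers==4.57.3" "accelerate==1.12.0" "datasets==2.21.0" "tokenizers==0.22.1"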

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MarginMSELoss

@misc{hofstätter2021improving,
    title={Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation},
    author={Sebastian Hofstätter and Sophia Althammer and Michael Schröder and Mete Sertkan and Allan Hanbury},
    year={2021},
    eprint={2010.02666},
    archivePrefix={arXiv},
    primaryClass={cs.IR}
}