| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Etherll_Chocolatine-3B-Instruct-DPO-Revised-Ties-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Chocolatine-3B-Instruct-DPO-Revised-Ties-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties-v2 | 121b0831361743558e1a56fd89ae3d3c03272cc4 | 25.007216 | | 0 | 3.821 | false | false | false | false | 1.262591 | 0.373993 | 37.399323 | 0.541065 | 35.583343 | 0.163142 | 16.314199 | 0.323826 | 9.8434 | 0.464938 | 17.817187 | 0.397773 | 33.085845 | false | false | 2024-10-29 | | 0 | Removed |
| Etherll_Herplete-LLM-Llama-3.1-8b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Herplete-LLM-Llama-3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Herplete-LLM-Llama-3.1-8b | b3829cf437216f099c031a9ab5e4c8ec974766dd | 19.588708 | | 5 | 8.03 | false | false | false | true | 0.973685 | 0.467191 | 46.71915 | 0.501343 | 28.952591 | 0.027946 | 2.794562 | 0.286074 | 4.809843 | 0.386 | 6.683333 | 0.348155 | 27.572769 | false | false | 2024-08-24 | 2024-08-29 | 1 | Etherll/Herplete-LLM-Llama-3.1-8b (Merge) |
| Etherll_Herplete-LLM-Llama-3.1-8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Herplete-LLM-Llama-3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Herplete-LLM-Llama-3.1-8b | d1383d993fad005d515be4d815797019601c679f | 26.260139 | | 5 | 8.03 | false | false | false | false | 1.709613 | 0.610598 | 61.059766 | 0.534725 | 33.206608 | 0.154834 | 15.483384 | 0.314597 | 8.612975 | 0.399052 | 8.614844 | 0.375249 | 30.583259 | false | false | 2024-08-24 | 2024-10-18 | 1 | Etherll/Herplete-LLM-Llama-3.1-8b (Merge) |
| Etherll_Herplete-LLM-Llama-3.1-8b-Ties_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b-Ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Herplete-LLM-Llama-3.1-8b-Ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-Ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Herplete-LLM-Llama-3.1-8b-Ties | | 26.533291 | | 0 | 8.03 | false | false | false | false | 1.724403 | 0.616368 | 61.63679 | 0.533798 | 33.07089 | 0.160121 | 16.012085 | 0.317114 | 8.948546 | 0.401719 | 8.948177 | 0.375249 | 30.583259 | false | false | 2024-10-03 | 2024-10-17 | 1 | Etherll/Herplete-LLM-Llama-3.1-8b-Ties (Merge) |
| Etherll_Qwen2.5-7B-della-test_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Qwen2.5-7B-della-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Qwen2.5-7B-della-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Qwen2.5-7B-della-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Qwen2.5-7B-della-test | c2b2ffc38627e68e7b43a1b596dc16ee93c1c63b | 35.816567 | | 1 | 7.616 | false | false | false | true | 2.084641 | 0.762497 | 76.249684 | 0.544733 | 35.546894 | 0.489426 | 48.942598 | 0.308725 | 7.829978 | 0.404698 | 8.98724 | 0.436087 | 37.343011 | false | false | 2024-11-01 | 2024-11-14 | 1 | Etherll/Qwen2.5-7B-della-test (Merge) |
| Etherll_Qwen2.5-Coder-7B-Instruct-Ties_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Qwen2.5-Coder-7B-Instruct-Ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Qwen2.5-Coder-7B-Instruct-Ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Qwen2.5-Coder-7B-Instruct-Ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Qwen2.5-Coder-7B-Instruct-Ties | d8c1624a2fa60f05030e34a128af391b5d8be332 | 26.51372 | | 1 | 7.616 | false | false | false | false | 2.394363 | 0.500539 | 50.053857 | 0.489514 | 28.008294 | 0.291541 | 29.154079 | 0.329698 | 10.626398 | 0.437281 | 13.426823 | 0.350316 | 27.812869 | false | false | 2024-09-30 | 2024-10-28 | 1 | Etherll/Qwen2.5-Coder-7B-Instruct-Ties (Merge) |
| Etherll_Replete-LLM-V3-Llama-3.1-8b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Replete-LLM-V3-Llama-3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Replete-LLM-V3-Llama-3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Replete-LLM-V3-Llama-3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Replete-LLM-V3-Llama-3.1-8b | e79849d72f70ef74677ed81a8885403973b2470c | 21.704317 | | 5 | 8.03 | false | false | false | true | 1.578659 | 0.526292 | 52.629246 | 0.454338 | 22.902455 | 0.227341 | 22.734139 | 0.268456 | 2.46085 | 0.351646 | 2.055729 | 0.346991 | 27.443484 | false | false | 2024-08-24 | 2024-08-26 | 1 | Etherll/Replete-LLM-V3-Llama-3.1-8b (Merge) |
| Etherll_SuperHermes_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/SuperHermes" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/SuperHermes</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__SuperHermes-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/SuperHermes | 7edd56cb37722d09b0334826e0532b223d334939 | 26.919305 | | 1 | 8.03 | false | false | false | false | 1.500031 | 0.545902 | 54.590154 | 0.528953 | 32.840317 | 0.165408 | 16.540785 | 0.323826 | 9.8434 | 0.440042 | 14.938542 | 0.394864 | 32.762633 | false | false | 2024-10-27 | 2024-10-27 | 1 | Etherll/SuperHermes (Merge) |
| Eurdem_Defne-llama3.1-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Eurdem/Defne-llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Eurdem/Defne-llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Eurdem__Defne-llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Eurdem/Defne-llama3.1-8B | 7832ba3066636bf4dab3e7d658c0b3ded12491ae | 25.120605 | llama3.1 | 6 | 8.03 | true | false | false | false | 2.664596 | 0.503612 | 50.361153 | 0.532098 | 32.822381 | 0.160121 | 16.012085 | 0.296141 | 6.152125 | 0.433094 | 13.536719 | 0.386553 | 31.83917 | false | false | 2024-07-29 | 2024-08-14 | 0 | Eurdem/Defne-llama3.1-8B |
| FINGU-AI_Chocolatine-Fusion-14B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/FINGU-AI/Chocolatine-Fusion-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FINGU-AI/Chocolatine-Fusion-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FINGU-AI__Chocolatine-Fusion-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FINGU-AI/Chocolatine-Fusion-14B | 49b7b720ddd40ccdca303922037a4bb34b1ca33b | 40.361559 | mit | 3 | 8.367 | true | false | false | false | 3.778183 | 0.694903 | 69.490286 | 0.641323 | 48.600901 | 0.385196 | 38.519637 | 0.371644 | 16.219239 | 0.494021 | 21.985937 | 0.52618 | 47.353354 | false | false | 2025-02-02 | 2025-02-02 | 0 | FINGU-AI/Chocolatine-Fusion-14B |
| FINGU-AI_L3-8B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/FINGU-AI/L3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FINGU-AI/L3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FINGU-AI__L3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FINGU-AI/L3-8B | 7e7999af68810a8158bf1cf939b1874d430d51f1 | 28.914535 | llama3.1 | 2 | 8.03 | true | false | false | true | 1.42419 | 0.751731 | 75.173096 | 0.498559 | 28.805821 | 0.254532 | 25.453172 | 0.295302 | 6.040268 | 0.382833 | 8.6875 | 0.363946 | 29.327349 | false | false | 2025-01-18 | 2025-01-18 | 0 | FINGU-AI/L3-8B |
| FINGU-AI_Phi-4-RRStock_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/FINGU-AI/Phi-4-RRStock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FINGU-AI/Phi-4-RRStock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FINGU-AI__Phi-4-RRStock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FINGU-AI/Phi-4-RRStock | d2a5483701f222aedbec6de974929a83ae533c4d | 26.415394 | mit | 0 | 6.652 | true | false | false | false | 2.730316 | 0.285541 | 28.554125 | 0.644344 | 48.682205 | 0.058157 | 5.81571 | 0.380034 | 17.337808 | 0.447948 | 14.960156 | 0.488281 | 43.142361 | false | false | 2025-02-05 | 2025-02-05 | 0 | FINGU-AI/Phi-4-RRStock |
| FINGU-AI_Q-Small-3B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/FINGU-AI/Q-Small-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FINGU-AI/Q-Small-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FINGU-AI__Q-Small-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FINGU-AI/Q-Small-3B | 42ad8458821a8574c3973d7e8088208a32c2fb81 | 16.890415 | apache-2.0 | 0 | 3.086 | true | false | false | true | 1.454658 | 0.414535 | 41.453455 | 0.431853 | 21.386477 | 0.083082 | 8.308157 | 0.266779 | 2.237136 | 0.400542 | 8.067708 | 0.279006 | 19.889554 | false | false | 2025-01-21 | 2025-01-21 | 0 | FINGU-AI/Q-Small-3B |
| FINGU-AI_QwQ-Buddy-32B-Alpha_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/FINGU-AI/QwQ-Buddy-32B-Alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FINGU-AI/QwQ-Buddy-32B-Alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FINGU-AI__QwQ-Buddy-32B-Alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FINGU-AI/QwQ-Buddy-32B-Alpha | d975cf81a61e62ea087d83d598d0b51a3629de52 | 35.178272 | mit | 1 | 19.662 | true | false | false | false | 11.118193 | 0.344642 | 34.464222 | 0.642442 | 48.730953 | 0.385196 | 38.519637 | 0.379195 | 17.225951 | 0.50599 | 24.415365 | 0.529422 | 47.713505 | false | false | 2025-02-05 | 2025-02-05 | 0 | FINGU-AI/QwQ-Buddy-32B-Alpha |
| FINGU-AI_RomboUltima-32B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/FINGU-AI/RomboUltima-32B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FINGU-AI/RomboUltima-32B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FINGU-AI__RomboUltima-32B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FINGU-AI/RomboUltima-32B | 98a732a32e2366a2ab8f08fdc3d668892e7c1f7f | 44.731545 | mit | 3 | 17.645 | true | false | false | false | 7.804715 | 0.667151 | 66.715094 | 0.693845 | 56.67377 | 0.53852 | 53.851964 | 0.371644 | 16.219239 | 0.483635 | 21.721094 | 0.578873 | 53.208112 | false | false | 2025-02-02 | 2025-02-02 | 0 | FINGU-AI/RomboUltima-32B |
| FINGU-AI_Ultimos-32B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/FINGU-AI/Ultimos-32B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FINGU-AI/Ultimos-32B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FINGU-AI__Ultimos-32B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FINGU-AI/Ultimos-32B | d2cb2b0ee4425e06a2303c27a1f4ae4570b5f5ca | 3.640727 | mit | 0 | 9.604 | true | false | false | true | 1.703313 | 0.15922 | 15.921976 | 0.290553 | 2.277935 | 0 | 0 | 0.249161 | 0 | 0.328604 | 2.408854 | 0.11112 | 1.235594 | false | false | 2025-02-10 | 2025-02-10 | 0 | FINGU-AI/Ultimos-32B |
| FallenMerick_Chewy-Lemon-Cookie-11B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/FallenMerick/Chewy-Lemon-Cookie-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FallenMerick/Chewy-Lemon-Cookie-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FallenMerick__Chewy-Lemon-Cookie-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FallenMerick/Chewy-Lemon-Cookie-11B | 0f5d0d6d218b3ef034f58eba32d6fe7ac4c237ae | 22.043726 | cc-by-4.0 | 0 | 10.732 | true | false | false | false | 1.714548 | 0.487524 | 48.752421 | 0.525112 | 33.0143 | 0.054381 | 5.438066 | 0.279362 | 3.914989 | 0.454552 | 15.952344 | 0.326712 | 25.190233 | true | false | 2024-06-06 | 2024-06-27 | 1 | FallenMerick/Chewy-Lemon-Cookie-11B (Merge) |
| Felladrin_Llama-160M-Chat-v1_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Felladrin/Llama-160M-Chat-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Felladrin/Llama-160M-Chat-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Felladrin__Llama-160M-Chat-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Felladrin/Llama-160M-Chat-v1 | e7f50665676821867ee7dfad32d0ca9fb68fc6bc | 4.201766 | apache-2.0 | 18 | 0.162 | true | false | false | true | 0.363161 | 0.157546 | 15.754642 | 0.303608 | 3.166756 | 0.006042 | 0.60423 | 0.25755 | 1.006711 | 0.366125 | 3.165625 | 0.113614 | 1.512633 | false | false | 2023-12-20 | 2024-07-23 | 1 | JackFram/llama-160m |
| Felladrin_Minueza-32M-UltraChat_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Felladrin/Minueza-32M-UltraChat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Felladrin/Minueza-32M-UltraChat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Felladrin__Minueza-32M-UltraChat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Felladrin/Minueza-32M-UltraChat | 28506b99c5902d2215eb378ec91d4226a7396c49 | 3.924256 | apache-2.0 | 5 | 0.033 | true | false | false | true | 0.336134 | 0.137563 | 13.756278 | 0.294148 | 2.43729 | 0.004532 | 0.453172 | 0.255872 | 0.782998 | 0.374187 | 4.640104 | 0.113281 | 1.475694 | false | false | 2024-02-27 | 2024-07-23 | 1 | Felladrin/Minueza-32M-Base |
| FlofloB_100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit | ea6ceae8a6894f1c6ea3fe978846b2a66c3e369c | 8.55083 | apache-2.0 | 1 | 0.5 | true | false | false | true | 0.967387 | 0.308322 | 30.832192 | 0.332339 | 7.347825 | 0.040785 | 4.07855 | 0.269295 | 2.572707 | 0.330219 | 0.94401 | 0.149767 | 5.529699 | false | false | 2024-11-28 | 2024-11-29 | 3 | Qwen/Qwen2.5-0.5B |
| FlofloB_10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit | a2eb0460779e76bb511339bcc2545b4729c9d78e | 24.043564 | apache-2.0 | 1 | 16 | true | false | false | true | 0.97509 | 0.509731 | 50.973085 | 0.521499 | 32.6078 | 0.097432 | 9.743202 | 0.299497 | 6.599553 | 0.430958 | 13.569792 | 0.376912 | 30.767952 | false | false | 2024-11-22 | 2024-11-22 | 1 | unsloth/phi-3-mini-4k-instruct-bnb-4bit |
| FlofloB_10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit | 2152657b389375f48fc5073413bba17835117bcc | 8.363924 | apache-2.0 | 1 | 0.5 | true | false | false | true | 1.01673 | 0.281544 | 28.154408 | 0.330552 | 7.530229 | 0.030967 | 3.096677 | 0.279362 | 3.914989 | 0.330219 | 1.477344 | 0.154089 | 6.0099 | false | false | 2024-11-25 | 2024-11-25 | 3 | Qwen/Qwen2.5-0.5B |
| FlofloB_40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit | 64c61d9c777da56597a338afd7586cc4ad07d350 | 8.38158 | apache-2.0 | 1 | 0.5 | true | false | false | true | 0.963134 | 0.301578 | 30.157759 | 0.332461 | 7.53209 | 0.033233 | 3.323263 | 0.267617 | 2.348993 | 0.340823 | 1.536198 | 0.148521 | 5.391179 | false | false | 2024-11-25 | 2024-11-25 | 3 | Qwen/Qwen2.5-0.5B |
| FlofloB_83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit | 4c4d3660d0288295f89880a3a86f4eb9ecc9d344 | 8.427461 | apache-2.0 | 2 | 0.5 | true | false | false | true | 0.984373 | 0.28694 | 28.693976 | 0.334653 | 8.132273 | 0.030211 | 3.021148 | 0.27349 | 3.131991 | 0.328948 | 1.41849 | 0.155502 | 6.166888 | false | false | 2024-11-26 | 2024-11-26 | 3 | Qwen/Qwen2.5-0.5B |
| FlofloB_smollm2-135M_pretrained_1000k_fineweb_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1000k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1000k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1000k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/smollm2-135M_pretrained_1000k_fineweb | a0f91cfda4e5a820dbe30bd5e3fbb8f233f7467e | 4.207808 | apache-2.0 | 0 | 0.135 | true | false | false | false | 0.675817 | 0.148454 | 14.845388 | 0.291794 | 2.708744 | 0.009063 | 0.906344 | 0.262584 | 1.677852 | 0.358062 | 3.291146 | 0.116356 | 1.817376 | false | false | 2025-01-11 | 2025-01-14 | 5 | HuggingFaceTB/SmolLM2-135M |
| FlofloB_smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed | 73ba3da387b3bdc50d6e3594c5c89ddebb271e81 | 4.06135 | apache-2.0 | 0 | 0.135 | true | false | false | false | 0.678209 | 0.155373 | 15.53733 | 0.306643 | 3.274267 | 0.006042 | 0.60423 | 0.250839 | 0.111857 | 0.358031 | 3.253906 | 0.114279 | 1.58651 | false | false | 2025-01-24 | 2025-01-27 | 5 | HuggingFaceTB/SmolLM2-135M |
| FlofloB_smollm2-135M_pretrained_1000k_fineweb_uncovai_selected_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1000k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_selected | e2115c3c7315400cb6338465672087c457b157ac | 5.055843 | apache-2.0 | 0 | 0.135 | true | false | false | false | 0.669616 | 0.146781 | 14.678054 | 0.293178 | 2.113414 | 0.006798 | 0.679758 | 0.26594 | 2.12528 | 0.40476 | 8.995052 | 0.115691 | 1.743499 | false | false | 2025-01-12 | 2025-01-12 | 5 | HuggingFaceTB/SmolLM2-135M |
| FlofloB_smollm2-135M_pretrained_1200k_fineweb_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1200k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1200k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1200k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/smollm2-135M_pretrained_1200k_fineweb | d886605e0d45787f492f628fd0ea72c27f205f83 | 4.188312 | apache-2.0 | 0 | 0.135 | true | false | false | false | 0.670761 | 0.158096 | 15.809607 | 0.294098 | 2.237296 | 0.006798 | 0.679758 | 0.264262 | 1.901566 | 0.371365 | 3.653906 | 0.10763 | 0.847739 | false | false | 2025-01-12 | 2025-01-12 | 6 | HuggingFaceTB/SmolLM2-135M |
| FlofloB_smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed | d743033d6f0048af31089e1133de7cee8b1e83f5 | 4.280291 | apache-2.0 | 0 | 0.135 | true | false | false | false | 0.672153 | 0.157771 | 15.777138 | 0.294962 | 2.849419 | 0.000755 | 0.075529 | 0.265101 | 2.013423 | 0.37 | 3.416667 | 0.113946 | 1.549572 | false | false | 2025-01-27 | 2025-01-27 | 6 | HuggingFaceTB/SmolLM2-135M |
| FlofloB_smollm2-135M_pretrained_1200k_fineweb_uncovai_selected_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1200k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_selected | 8c05c5b2f00c84d4120b3221c81c1f481c585768 | 4.030505 | apache-2.0 | 0 | 0.135 | true | false | false | false | 0.67059 | 0.158471 | 15.847064 | 0.296047 | 2.206545 | 0.007553 | 0.755287 | 0.263423 | 1.789709 | 0.356729 | 1.757812 | 0.116439 | 1.826611 | false | false | 2025-01-12 | 2025-01-14 | 6 | HuggingFaceTB/SmolLM2-135M |
| FlofloB_smollm2-135M_pretrained_1400k_fineweb_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1400k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1400k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1400k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/smollm2-135M_pretrained_1400k_fineweb | a9c59a43cf0da87ad05ec8bd4a4c75d22c2e367c | 4.992957 | apache-2.0 | 0 | 0.135 | true | false | false | false | 0.688093 | 0.176381 | 17.638089 | 0.292178 | 2.1601 | 0.011329 | 1.132931 | 0.26594 | 2.12528 | 0.387333 | 6.016667 | 0.107962 | 0.884678 | false | false | 2025-01-13 | 2025-01-13 | 7 | HuggingFaceTB/SmolLM2-135M |
| FlofloB_smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed | f2851eedb367100fa0ca50ed25ff610a83713de2 | 5.063032 | apache-2.0 | 0 | 0.135 | true | false | false | false | 0.688245 | 0.170661 | 17.066051 | 0.299239 | 2.630029 | 0.010574 | 1.057402 | 0.260906 | 1.454139 | 0.393938 | 7.008854 | 0.110455 | 1.161717 | false | false | 2025-01-28 | 2025-01-28 | 7 | HuggingFaceTB/SmolLM2-135M |
FlofloB_smollm2-135M_pretrained_1400k_fineweb_uncovai_selected_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1400k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_selected
|
098a8e666d272a8cb4863b0877b6f4507e1c230c
| 4.62464
|
apache-2.0
| 0
| 0.135
| true
| false
| false
| false
| 0.675417
| 0.15385
| 15.384956
| 0.291673
| 2.631616
| 0.010574
| 1.057402
| 0.268456
| 2.46085
| 0.374062
| 4.691146
| 0.113697
| 1.521868
| false
| false
|
2025-01-13
|
2025-01-13
| 7
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed
|
4bacfcaa1040d1cba93da123ce57749bf2ed5e82
| 3.881968
|
apache-2.0
| 0
| 0.135
| true
| false
| false
| false
| 0.666409
| 0.14748
| 14.74798
| 0.302874
| 2.82254
| 0.003776
| 0.377644
| 0.258389
| 1.118568
| 0.357844
| 2.897135
| 0.111951
| 1.32794
| false
| false
|
2025-01-17
|
2025-01-17
| 1
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_smollm2-135M_pretrained_200k_fineweb_uncovai_selected_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_200k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_selected
|
381cdec29375aeaf0fb1bcc8ab2218443fc1cadd
| 3.492026
|
apache-2.0
| 1
| 0.135
| true
| false
| false
| false
| 0.68229
| 0.134515
| 13.451531
| 0.292719
| 2.322352
| 0.007553
| 0.755287
| 0.250839
| 0.111857
| 0.366031
| 2.853906
| 0.113115
| 1.457225
| false
| false
|
2025-01-08
|
2025-01-08
| 1
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_smollm2-135M_pretrained_400k_fineweb_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_400k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_400k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_400k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2-135M_pretrained_400k_fineweb
|
2601cf93307104afc3f57f467323f5368567cb74
| 4.224945
|
apache-2.0
| 0
| 0.135
| true
| false
| false
| false
| 0.691257
| 0.151127
| 15.112679
| 0.297234
| 1.889766
| 0.012085
| 1.208459
| 0.252517
| 0.33557
| 0.379427
| 4.995052
| 0.116273
| 1.808141
| false
| false
|
2025-01-09
|
2025-01-10
| 2
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed
|
c99f5022db1982d463626b4d87c7aeeff519b3fa
| 4.710927
|
apache-2.0
| 0
| 0.135
| true
| false
| false
| false
| 0.679336
| 0.155648
| 15.564812
| 0.30488
| 3.575492
| 0.009063
| 0.906344
| 0.255034
| 0.671141
| 0.386
| 6.016667
| 0.11378
| 1.531102
| false
| false
|
2025-01-18
|
2025-01-18
| 2
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_smollm2-135M_pretrained_400k_fineweb_uncovai_selected_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_400k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_selected
|
ecac44607d60c294b460a8786f6253d561f3de85
| 4.387331
|
apache-2.0
| 1
| 0.135
| true
| false
| false
| false
| 0.67153
| 0.158421
| 15.842077
| 0.292517
| 2.073466
| 0.006798
| 0.679758
| 0.254195
| 0.559284
| 0.382
| 5.416667
| 0.115775
| 1.752733
| false
| false
|
2025-01-09
|
2025-01-09
| 2
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_smollm2-135M_pretrained_600k_fineweb_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_600k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_600k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_600k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2-135M_pretrained_600k_fineweb
|
6922498cf15ce9558b8ad2c33fc43106628d0cec
| 4.886739
|
apache-2.0
| 0
| 0.135
| true
| false
| false
| false
| 0.674154
| 0.163916
| 16.391619
| 0.301372
| 3.424053
| 0.006042
| 0.60423
| 0.26594
| 2.12528
| 0.380854
| 5.373437
| 0.112616
| 1.401817
| false
| false
|
2025-01-10
|
2025-01-11
| 3
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed
|
02a7c39af8a00dbd0ffa449cd830cf57261246b3
| 4.644402
|
apache-2.0
| 0
| 0.135
| true
| false
| false
| false
| 0.667734
| 0.164141
| 16.414115
| 0.300017
| 2.418749
| 0.009063
| 0.906344
| 0.262584
| 1.677852
| 0.379333
| 4.816667
| 0.114694
| 1.632683
| false
| false
|
2025-01-18
|
2025-01-19
| 3
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_smollm2-135M_pretrained_600k_fineweb_uncovai_selected_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_600k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_selected
|
66e4931a5409bb8739522ff5df3b4f3373738fad
| 4.657606
|
apache-2.0
| 0
| 0.135
| true
| false
| false
| false
| 0.675576
| 0.160594
| 16.059389
| 0.298344
| 2.165156
| 0.007553
| 0.755287
| 0.260906
| 1.454139
| 0.384635
| 5.71276
| 0.11619
| 1.798907
| false
| false
|
2025-01-09
|
2025-01-09
| 3
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_smollm2-135M_pretrained_800k_fineweb_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_800k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_800k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_800k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2-135M_pretrained_800k_fineweb
|
066f4d48c5f6d83ac9a44e8572a3d20c74f6ec08
| 4.174506
|
apache-2.0
| 0
| 0.135
| true
| false
| false
| false
| 0.672662
| 0.164141
| 16.414115
| 0.295944
| 2.348388
| 0.008308
| 0.830816
| 0.249161
| 0
| 0.370125
| 3.765625
| 0.115193
| 1.688091
| false
| false
|
2025-01-11
|
2025-01-14
| 4
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed
|
60c100113d77cced9b284172608f100297183ac9
| 5.032543
|
apache-2.0
| 0
| 0.135
| true
| false
| false
| false
| 0.668405
| 0.162293
| 16.229272
| 0.30381
| 3.210703
| 0.006798
| 0.679758
| 0.252517
| 0.33557
| 0.399271
| 8.208854
| 0.11378
| 1.531102
| false
| false
|
2025-01-19
|
2025-01-19
| 4
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_smollm2-135M_pretrained_800k_fineweb_uncovai_selected_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_800k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_selected
|
7b351540b5fb395759e44385826c5fedef8672ec
| 4.118739
|
apache-2.0
| 0
| 0.135
| true
| false
| false
| false
| 0.670237
| 0.14743
| 14.742993
| 0.294281
| 1.922858
| 0.004532
| 0.453172
| 0.261745
| 1.565996
| 0.376635
| 4.579427
| 0.113032
| 1.447991
| false
| false
|
2025-01-11
|
2025-01-14
| 4
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_smollm2_pretrained_200k_fineweb_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2_pretrained_200k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2_pretrained_200k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2_pretrained_200k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/smollm2_pretrained_200k_fineweb
|
c3086ab3555e766f0b3903b8b9a1a290e3e25f3d
| 4.005599
|
apache-2.0
| 1
| 0.135
| true
| false
| false
| false
| 0.659464
| 0.1527
| 15.270039
| 0.299468
| 2.872523
| 0.003776
| 0.377644
| 0.247483
| 0
| 0.369938
| 3.742187
| 0.115941
| 1.771203
| false
| false
|
2025-01-08
|
2025-01-08
| 1
|
HuggingFaceTB/SmolLM2-135M
|
FlofloB_test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit_float16
|
float16
|
🟩 continuously pretrained
|
🟩
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit
|
cfd97ca5927a2e09ec30001a576d82dd8b635e09
| 24.485702
|
apache-2.0
| 2
| 16
| true
| false
| false
| true
| 1.515322
| 0.521546
| 52.154616
| 0.524083
| 32.882433
| 0.110272
| 11.02719
| 0.311242
| 8.165548
| 0.424417
| 12.452083
| 0.372091
| 30.232343
| false
| false
|
2024-11-21
|
2024-11-21
| 1
|
unsloth/phi-3-mini-4k-instruct-bnb-4bit
|
FuJhen_ft-openhermes-25-mistral-7b-irca-dpo-pairs_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__ft-openhermes-25-mistral-7b-irca-dpo-pairs-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs
|
24c0bea14d53e6f67f1fbe2eca5bfe7cae389b33
| 20.395988
|
apache-2.0
| 0
| 14.483
| true
| false
| false
| true
| 2.004096
| 0.542004
| 54.20041
| 0.477303
| 26.596861
| 0.048338
| 4.833837
| 0.278523
| 3.803132
| 0.417375
| 11.205208
| 0.295628
| 21.73648
| false
| false
|
2024-09-12
|
2024-09-12
| 1
|
FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs (Merge)
|
FuJhen_mistral-instruct-7B-DPO_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/FuJhen/mistral-instruct-7B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral-instruct-7B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral-instruct-7B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FuJhen/mistral-instruct-7B-DPO
|
e0bc86c23ce5aae1db576c8cca6f06f1f73af2db
| 19.029531
|
apache-2.0
| 0
| 14.496
| true
| false
| false
| true
| 2.019293
| 0.496842
| 49.684171
| 0.462391
| 24.925827
| 0.03852
| 3.851964
| 0.277685
| 3.691275
| 0.401563
| 9.428646
| 0.303358
| 22.595301
| false
| false
|
2024-09-12
|
2024-09-12
| 1
|
FuJhen/mistral-instruct-7B-DPO (Merge)
|
FuJhen_mistral_7b_v0.1_structedData_e2e_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/FuJhen/mistral_7b_v0.1_structedData_e2e" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral_7b_v0.1_structedData_e2e</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral_7b_v0.1_structedData_e2e-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FuJhen/mistral_7b_v0.1_structedData_e2e
|
7231864981174d9bee8c7687c24c8344414eae6b
| 10.909311
|
apache-2.0
| 0
| 7
| true
| false
| false
| false
| 2.160492
| 0.172684
| 17.268403
| 0.411391
| 18.062424
| 0.004532
| 0.453172
| 0.279362
| 3.914989
| 0.372292
| 5.636458
| 0.281084
| 20.12042
| false
| false
|
2024-09-13
|
2024-09-13
| 1
|
FuJhen/mistral_7b_v0.1_structedData_e2e (Merge)
|
FuJhen_mistral_7b_v0.1_structedData_viggo_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/FuJhen/mistral_7b_v0.1_structedData_viggo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral_7b_v0.1_structedData_viggo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral_7b_v0.1_structedData_viggo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FuJhen/mistral_7b_v0.1_structedData_viggo
|
7231864981174d9bee8c7687c24c8344414eae6b
| 12.440583
|
apache-2.0
| 0
| 14.483
| true
| false
| false
| false
| 2.152227
| 0.178329
| 17.832906
| 0.452386
| 23.960172
| 0.028701
| 2.870091
| 0.283557
| 4.474273
| 0.373813
| 3.926563
| 0.294215
| 21.579492
| false
| false
|
2024-09-13
|
2024-09-13
| 1
|
FuJhen/mistral_7b_v0.1_structedData_viggo (Merge)
|
FuseAI_FuseChat-7B-v2.0_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/FuseAI/FuseChat-7B-v2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuseAI/FuseChat-7B-v2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuseAI__FuseChat-7B-v2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FuseAI/FuseChat-7B-v2.0
|
65fdb310c09f56b9aca01b89a849f06f39faeb75
| 20.146367
|
apache-2.0
| 9
| 7.242
| true
| false
| false
| false
| 0.886612
| 0.342319
| 34.231949
| 0.495421
| 29.341638
| 0.061178
| 6.117825
| 0.302013
| 6.935123
| 0.479667
| 20.225
| 0.31624
| 24.02667
| false
| false
|
2024-08-13
|
2024-11-21
| 1
|
openchat/openchat_3.5
|
FuseAI_FuseChat-Llama-3.1-8B-Instruct_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FuseAI/FuseChat-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuseAI/FuseChat-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuseAI__FuseChat-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FuseAI/FuseChat-Llama-3.1-8B-Instruct
|
cbb3accdd01a81194e947dfde1b95707db67f2b7
| 28.59551
|
apache-2.0
| 10
| 8.03
| true
| false
| false
| true
| 1.340429
| 0.720482
| 72.048166
| 0.511989
| 30.848065
| 0.247734
| 24.773414
| 0.305369
| 7.38255
| 0.382
| 6.15
| 0.373338
| 30.370863
| false
| false
|
2024-11-20
|
2025-01-07
| 0
|
FuseAI/FuseChat-Llama-3.1-8B-Instruct
|
FuseAI_FuseChat-Llama-3.2-3B-Instruct_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/FuseAI/FuseChat-Llama-3.2-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuseAI/FuseChat-Llama-3.2-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuseAI__FuseChat-Llama-3.2-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FuseAI/FuseChat-Llama-3.2-3B-Instruct
|
db208455d103432dc8d683c242ef8b678d5b26c2
| 25.746914
| 6
| 3.213
| false
| false
| false
| true
| 0.50673
| 0.684886
| 68.48861
| 0.465837
| 24.2199
| 0.242447
| 24.244713
| 0.296141
| 6.152125
| 0.391396
| 7.691146
| 0.313165
| 23.684988
| false
| false
|
2024-12-06
|
2025-02-08
| 0
|
FuseAI/FuseChat-Llama-3.2-3B-Instruct
|
|
FuseAI_FuseChat-Qwen-2.5-7B-Instruct_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/FuseAI/FuseChat-Qwen-2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuseAI/FuseChat-Qwen-2.5-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuseAI__FuseChat-Qwen-2.5-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FuseAI/FuseChat-Qwen-2.5-7B-Instruct
|
7735ee1acb31112cf93c35e8e22e764ad27cce3b
| 31.407716
| 13
| 7.616
| false
| false
| false
| true
| 1.309118
| 0.590564
| 59.056415
| 0.5526
| 36.251348
| 0.456193
| 45.619335
| 0.296141
| 6.152125
| 0.387365
| 6.720573
| 0.411818
| 34.646498
| false
| false
|
2024-11-12
|
2024-12-21
| 0
|
FuseAI/FuseChat-Qwen-2.5-7B-Instruct
|
|
GalrionSoftworks_MN-LooseCannon-12B-v1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/GalrionSoftworks/MN-LooseCannon-12B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GalrionSoftworks/MN-LooseCannon-12B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GalrionSoftworks__MN-LooseCannon-12B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GalrionSoftworks/MN-LooseCannon-12B-v1
| 22.124427
| 8
| 12.248
| false
| false
| false
| true
| 3.058039
| 0.541779
| 54.177915
| 0.512818
| 29.976062
| 0.085347
| 8.534743
| 0.285235
| 4.697987
| 0.413844
| 10.963802
| 0.319564
| 24.396055
| false
| false
|
2024-08-09
|
2024-09-05
| 1
|
GalrionSoftworks/MN-LooseCannon-12B-v1 (Merge)
|
||
GalrionSoftworks_MagnusIntellectus-12B-v1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/GalrionSoftworks/MagnusIntellectus-12B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GalrionSoftworks/MagnusIntellectus-12B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GalrionSoftworks__MagnusIntellectus-12B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GalrionSoftworks/MagnusIntellectus-12B-v1
|
fc83cb3eec2f8328448c5fe3cb830fc77983a6b9
| 21.773296
|
apache-2.0
| 5
| 12.248
| true
| false
| false
| true
| 3.248528
| 0.442137
| 44.213686
| 0.532301
| 33.262254
| 0.064955
| 6.495468
| 0.284396
| 4.58613
| 0.442802
| 15.183594
| 0.342088
| 26.898641
| true
| false
|
2024-08-13
|
2024-09-05
| 1
|
GalrionSoftworks/MagnusIntellectus-12B-v1 (Merge)
|
GenVRadmin_AryaBhatta-GemmaOrca-2-Merged_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GemmaForCausalLM
|
<a target="_blank" href="https://huggingface.co/GenVRadmin/AryaBhatta-GemmaOrca-2-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GenVRadmin/AryaBhatta-GemmaOrca-2-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GenVRadmin__AryaBhatta-GemmaOrca-2-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GenVRadmin/AryaBhatta-GemmaOrca-2-Merged
|
0a86455c5f0606f6b743ba0f0b1c1c26bd50976c
| 14.00645
|
mit
| 0
| 8.538
| true
| false
| false
| false
| 1.818865
| 0.306374
| 30.637375
| 0.388749
| 13.661592
| 0.049849
| 4.984894
| 0.268456
| 2.46085
| 0.455021
| 16.910938
| 0.238447
| 15.383053
| false
| false
|
2024-04-09
|
2025-02-06
| 0
|
GenVRadmin/AryaBhatta-GemmaOrca-2-Merged
|
GenVRadmin_AryaBhatta-GemmaOrca-Merged_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GemmaForCausalLM
|
<a target="_blank" href="https://huggingface.co/GenVRadmin/AryaBhatta-GemmaOrca-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GenVRadmin/AryaBhatta-GemmaOrca-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GenVRadmin__AryaBhatta-GemmaOrca-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GenVRadmin/AryaBhatta-GemmaOrca-Merged
|
0b0363f808aabaf8fe85ae8229e968abca2a54de
| 11.994728
|
mit
| 1
| 8.538
| true
| false
| false
| false
| 1.981063
| 0.306374
| 30.637375
| 0.413063
| 17.683588
| 0.05136
| 5.135952
| 0.255872
| 0.782998
| 0.352385
| 4.08151
| 0.222822
| 13.646941
| false
| false
|
2024-04-01
|
2025-02-06
| 0
|
GenVRadmin/AryaBhatta-GemmaOrca-Merged
|
GenVRadmin_AryaBhatta-GemmaUltra-Merged_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GemmaForCausalLM
|
<a target="_blank" href="https://huggingface.co/GenVRadmin/AryaBhatta-GemmaUltra-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GenVRadmin/AryaBhatta-GemmaUltra-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GenVRadmin__AryaBhatta-GemmaUltra-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GenVRadmin/AryaBhatta-GemmaUltra-Merged
|
837acef7bd681ef60f03ab16e4670fb72e47e134
| 13.282815
|
mit
| 1
| 8.538
| true
| false
| false
| false
| 2.007431
| 0.302077
| 30.207738
| 0.414145
| 17.96825
| 0.053625
| 5.362538
| 0.253356
| 0.447427
| 0.427854
| 11.648438
| 0.226563
| 14.0625
| false
| false
|
2024-04-12
|
2025-02-06
| 0
|
GenVRadmin/AryaBhatta-GemmaUltra-Merged
|
GenVRadmin_llama38bGenZ_Vikas-Merged_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/GenVRadmin/llama38bGenZ_Vikas-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GenVRadmin/llama38bGenZ_Vikas-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GenVRadmin__llama38bGenZ_Vikas-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GenVRadmin/llama38bGenZ_Vikas-Merged
|
a15de41fcf74b13bdc8d9b680bdc7836fc5aecfe
| 16.093383
|
mit
| 0
| 8.03
| true
| false
| false
| false
| 1.435317
| 0.300029
| 30.002948
| 0.453598
| 23.13191
| 0.057402
| 5.740181
| 0.295302
| 6.040268
| 0.440167
| 13.620833
| 0.262217
| 18.024158
| false
| false
|
2024-05-22
|
2025-02-06
| 0
|
GenVRadmin/llama38bGenZ_Vikas-Merged
|
GoToCompany_gemma2-9b-cpt-sahabatai-v1-instruct_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/GoToCompany/gemma2-9b-cpt-sahabatai-v1-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GoToCompany/gemma2-9b-cpt-sahabatai-v1-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GoToCompany__gemma2-9b-cpt-sahabatai-v1-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GoToCompany/gemma2-9b-cpt-sahabatai-v1-instruct
|
ca19cec82a7d2bdba20020e1bebf296417cfc3ee
| 32.46826
|
gemma
| 35
| 9.242
| true
| false
| false
| false
| 3.862189
| 0.655061
| 65.506079
| 0.595455
| 41.866504
| 0.205438
| 20.543807
| 0.334732
| 11.297539
| 0.477865
| 19.333073
| 0.426363
| 36.262559
| false
| false
|
2024-11-06
|
2024-11-20
| 1
|
GoToCompany/gemma2-9b-cpt-sahabatai-v1-instruct (Merge)
|
GoToCompany_llama3-8b-cpt-sahabatai-v1-instruct_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/GoToCompany/llama3-8b-cpt-sahabatai-v1-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GoToCompany/llama3-8b-cpt-sahabatai-v1-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GoToCompany__llama3-8b-cpt-sahabatai-v1-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GoToCompany/llama3-8b-cpt-sahabatai-v1-instruct
|
20fd3cff1dc86553d11b5c4b2fdbb6f2dd1ede55
| 23.059399
|
llama3
| 11
| 8.03
| true
| false
| false
| true
| 1.346822
| 0.523845
| 52.384451
| 0.495129
| 28.539529
| 0.127644
| 12.76435
| 0.266779
| 2.237136
| 0.448844
| 15.172135
| 0.345329
| 27.258791
| false
| false
|
2024-11-06
|
2024-11-20
| 1
|
GoToCompany/llama3-8b-cpt-sahabatai-v1-instruct (Merge)
|
Goekdeniz-Guelmez_Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Goekdeniz-Guelmez/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1
|
bfc0e7dc6add02baecd9b6f84a078f7f3d164315
| 9.768236
|
apache-2.0
| 1
| 0.63
| true
| false
| false
| true
| 0.975285
| 0.34719
| 34.71899
| 0.326831
| 6.845786
| 0.089124
| 8.912387
| 0.251678
| 0.223714
| 0.32625
| 0.78125
| 0.164146
| 7.12729
| false
| false
|
2024-11-17
|
2024-11-18
| 2
|
Qwen/Qwen2.5-0.5B
|
Goekdeniz-Guelmez_Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Goekdeniz-Guelmez/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1
|
bfc0e7dc6add02baecd9b6f84a078f7f3d164315
| 8.415919
|
apache-2.0
| 1
| 0.63
| true
| false
| false
| true
| 0.498004
| 0.341694
| 34.169448
| 0.32921
| 7.221169
| 0.002266
| 0.226586
| 0.25755
| 1.006711
| 0.324917
| 0.78125
| 0.163813
| 7.090352
| false
| false
|
2024-11-17
|
2024-11-18
| 2
|
Qwen/Qwen2.5-0.5B
|
Goekdeniz-Guelmez_Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1
|
eca7edeba61e894597e9940348e8d90817c1ad79
| 18.441175
|
apache-2.0
| 5
| 1.777
| true
| false
| false
| true
| 1.566762
| 0.476858
| 47.685807
| 0.418601
| 18.306013
| 0.208459
| 20.845921
| 0.243289
| 0
| 0.36749
| 4.002865
| 0.278258
| 19.806442
| false
| false
|
2024-09-20
|
2024-09-28
| 1
|
Qwen/Qwen2.5-1.5B
|
Goekdeniz-Guelmez_Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2
|
ff4a6eff69adb015dfcfbff7a2d2dc43b34afe89
| 15.566749
|
apache-2.0
| 1
| 1.544
| true
| false
| false
| true
| 1.438486
| 0.421554
| 42.15537
| 0.404189
| 16.499503
| 0.126888
| 12.688822
| 0.239933
| 0
| 0.376854
| 4.706771
| 0.25615
| 17.35003
| false
| false
|
2024-09-28
|
2024-09-28
| 2
|
Qwen/Qwen2.5-1.5B
|
Goekdeniz-Guelmez_Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3
|
03ffa6f7a6ada9d63d838707c597297f048d409b
| 15.592787
|
apache-2.0
| 1
| 1.544
| true
| false
| false
| true
| 1.412402
| 0.425251
| 42.525056
| 0.405345
| 16.439712
| 0.130665
| 13.066465
| 0.243289
| 0
| 0.370187
| 4.240104
| 0.255568
| 17.285387
| false
| false
|
2024-09-28
|
2024-09-28
| 3
|
Qwen/Qwen2.5-1.5B
|
Goekdeniz-Guelmez_Josiefied-Qwen2.5-14B-Instruct-abliterated-v4_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-14B-Instruct-abliterated-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-14B-Instruct-abliterated-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-14B-Instruct-abliterated-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Goekdeniz-Guelmez/Josiefied-Qwen2.5-14B-Instruct-abliterated-v4
|
00afd27eef16e835fcb0d8e687435dca3c185bdf
| 42.550066
|
apache-2.0
| 17
| 14.77
| true
| false
| false
| true
| 3.494234
| 0.829167
| 82.916661
| 0.635564
| 48.05227
| 0.542296
| 54.229607
| 0.342282
| 12.304251
| 0.428667
| 13.15
| 0.501828
| 44.647606
| false
| false
|
2024-10-21
|
2024-10-23
| 2
|
Qwen/Qwen2.5-14B
|
Goekdeniz-Guelmez_Josiefied-Qwen2.5-7B-Instruct-abliterated-v2_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-7B-Instruct-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Goekdeniz-Guelmez/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2
|
ecf4024048ea1be2f0840a50080fb79b88aacde9
| 35.316633
|
apache-2.0
| 6
| 7.616
| true
| false
| false
| true
| 2.403013
| 0.781381
| 78.138118
| 0.530967
| 33.333986
| 0.453172
| 45.317221
| 0.298658
| 6.487696
| 0.435396
| 13.957813
| 0.411985
| 34.664967
| false
| false
|
2024-09-20
|
2024-10-08
| 1
|
Qwen/Qwen2.5-7B
|
Goekdeniz-Guelmez_j.o.s.i.e.v4o-1.5b-dpo-stage1-v1_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/j.o.s.i.e.v4o-1.5b-dpo-stage1-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/j.o.s.i.e.v4o-1.5b-dpo-stage1-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__j.o.s.i.e.v4o-1.5b-dpo-stage1-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Goekdeniz-Guelmez/j.o.s.i.e.v4o-1.5b-dpo-stage1-v1
|
d5ddad290d83b1ba8a7612a6c1cfad6fb4346fe4
| 15.078048
|
apache-2.0
| 1
| 1.544
| true
| false
| false
| true
| 1.582305
| 0.418831
| 41.883092
| 0.412421
| 17.748017
| 0.120091
| 12.009063
| 0.250839
| 0.111857
| 0.352854
| 1.440104
| 0.255485
| 17.276152
| false
| false
|
2024-10-07
|
2024-10-08
| 2
|
Qwen/Qwen2.5-1.5B
|
Goekdeniz-Guelmez_josie-3b-v6.0_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/josie-3b-v6.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/josie-3b-v6.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__josie-3b-v6.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Goekdeniz-Guelmez/josie-3b-v6.0
|
3f8ce40bdaa0757ede5aaaf2cdd14538b559b4db
| 24.746541
|
apache-2.0
| 1
| 3.086
| true
| false
| false
| true
| 1.491912
| 0.600955
| 60.095546
| 0.449615
| 22.871088
| 0.293807
| 29.380665
| 0.290268
| 5.369128
| 0.386125
| 6.098958
| 0.321975
| 24.663859
| false
| false
|
2024-12-29
|
2025-01-07
| 1
|
Removed
|
Goekdeniz-Guelmez_josie-7b-v6.0_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/josie-7b-v6.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/josie-7b-v6.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__josie-7b-v6.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Goekdeniz-Guelmez/josie-7b-v6.0
|
d2e22fda9ce97aa5ca745d3b6d8ca2f1f7103ed5
| 32.374168
|
apache-2.0
| 1
| 7.616
| true
| false
| false
| true
| 1.350926
| 0.741165
| 74.116455
| 0.510486
| 30.444753
| 0.435801
| 43.58006
| 0.282718
| 4.362416
| 0.415396
| 10.557812
| 0.380652
| 31.183511
| false
| false
|
2024-12-29
|
2025-01-07
| 3
|
Qwen/Qwen2.5-7B
|
Goekdeniz-Guelmez_josie-7b-v6.0-step2000_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/josie-7b-v6.0-step2000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/josie-7b-v6.0-step2000</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__josie-7b-v6.0-step2000-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Goekdeniz-Guelmez/josie-7b-v6.0-step2000
|
df28f1369c22a5f2feac05793d4a460a5f873891
| 26.970438
|
apache-2.0
| 2
| 7.616
| true
| false
| false
| true
| 1.378839
| 0.762772
| 76.277167
| 0.509781
| 30.081094
| 0
| 0
| 0.280201
| 4.026846
| 0.457938
| 17.742188
| 0.403258
| 33.695331
| false
| false
|
2024-12-11
|
2024-12-11
| 2
|
Qwen/Qwen2.5-7B
|
Goekdeniz-Guelmez_josie-7b-v6.0-step2000_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/josie-7b-v6.0-step2000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/josie-7b-v6.0-step2000</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__josie-7b-v6.0-step2000-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Goekdeniz-Guelmez/josie-7b-v6.0-step2000
|
df28f1369c22a5f2feac05793d4a460a5f873891
| 33.832926
|
apache-2.0
| 2
| 7.616
| true
| false
| false
| true
| 1.383398
| 0.759774
| 75.977407
| 0.510713
| 30.395813
| 0.423716
| 42.371601
| 0.276846
| 3.579418
| 0.453938
| 17.208854
| 0.40118
| 33.464465
| false
| false
|
2024-12-11
|
2024-12-11
| 2
|
Qwen/Qwen2.5-7B
|
GreenNode_GreenNode-small-9B-it_float16
|
float16
|
🟩 continuously pretrained
|
🟩
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/GreenNode/GreenNode-small-9B-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GreenNode/GreenNode-small-9B-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GreenNode__GreenNode-small-9B-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GreenNode/GreenNode-small-9B-it
|
1ba4ce8e2267c7fcc820961a9bfc13ab80150866
| 31.194506
| 0
| 9.242
| false
| false
| false
| true
| 5.291888
| 0.743613
| 74.36125
| 0.599384
| 41.899926
| 0.174471
| 17.44713
| 0.319631
| 9.284116
| 0.420417
| 11.652083
| 0.392703
| 32.522533
| false
| false
|
2024-10-14
| 0
|
Removed
|
||
GritLM_GritLM-7B-KTO_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/GritLM/GritLM-7B-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GritLM/GritLM-7B-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GritLM__GritLM-7B-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GritLM/GritLM-7B-KTO
|
b5c48669508c1de18c698460c187f64e90e7df44
| 19.235895
|
apache-2.0
| 4
| 7.242
| true
| false
| false
| true
| 1.279727
| 0.531013
| 53.101327
| 0.485294
| 27.904318
| 0.02719
| 2.719033
| 0.297819
| 6.375839
| 0.371021
| 6.644271
| 0.268035
| 18.670582
| false
| false
|
2024-04-16
|
2024-08-04
| 0
|
GritLM/GritLM-7B-KTO
|
GritLM_GritLM-8x7B-KTO_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/GritLM/GritLM-8x7B-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GritLM/GritLM-8x7B-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GritLM__GritLM-8x7B-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GritLM/GritLM-8x7B-KTO
|
938913477064fcc498757c5136d9899bb6e713ed
| 26.241305
|
apache-2.0
| 3
| 46.703
| true
| false
| false
| true
| 9.208926
| 0.571405
| 57.140498
| 0.58203
| 40.826162
| 0.122356
| 12.23565
| 0.296141
| 6.152125
| 0.421656
| 11.673698
| 0.364777
| 29.419696
| false
| false
|
2024-04-17
|
2024-08-04
| 0
|
GritLM/GritLM-8x7B-KTO
|
Groq_Llama-3-Groq-8B-Tool-Use_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Groq/Llama-3-Groq-8B-Tool-Use" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Groq/Llama-3-Groq-8B-Tool-Use</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Groq__Llama-3-Groq-8B-Tool-Use-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Groq/Llama-3-Groq-8B-Tool-Use
|
3bf6b914d7043d1bbfcfc7a9aa7581a8104eabac
| 21.445601
|
llama3
| 274
| 8.03
| true
| false
| false
| true
| 1.008657
| 0.609823
| 60.982305
| 0.486338
| 27.254234
| 0.060423
| 6.042296
| 0.267617
| 2.348993
| 0.366031
| 5.38724
| 0.339927
| 26.65854
| false
| false
|
2024-06-24
|
2025-01-01
| 1
|
meta-llama/Meta-Llama-3-8B
|
Gryphe_Pantheon-RP-1.0-8b-Llama-3_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.0-8b-Llama-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.0-8b-Llama-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.0-8b-Llama-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Gryphe/Pantheon-RP-1.0-8b-Llama-3
|
70a6df202c9df9abdc6928bec5a5ab47f2667aee
| 16.873122
|
apache-2.0
| 46
| 8.03
| true
| false
| false
| true
| 1.441673
| 0.393252
| 39.325213
| 0.453908
| 23.631915
| 0.063444
| 6.344411
| 0.276007
| 3.467562
| 0.38324
| 5.504948
| 0.306682
| 22.964687
| false
| false
|
2024-05-08
|
2024-06-27
| 1
|
meta-llama/Meta-Llama-3-8B
|
Gryphe_Pantheon-RP-1.5-12b-Nemo_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.5-12b-Nemo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.5-12b-Nemo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.5-12b-Nemo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Gryphe/Pantheon-RP-1.5-12b-Nemo
|
00107381f05f69666772d88a1b11affe77c94a47
| 21.323747
|
apache-2.0
| 31
| 12.248
| true
| false
| false
| true
| 3.371166
| 0.476308
| 47.630842
| 0.519582
| 31.750144
| 0.049094
| 4.909366
| 0.272651
| 3.020134
| 0.442031
| 15.053906
| 0.330203
| 25.578088
| false
| false
|
2024-07-25
|
2024-08-04
| 1
|
mistralai/Mistral-Nemo-Base-2407
|
Gryphe_Pantheon-RP-1.6-12b-Nemo_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.6-12b-Nemo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.6-12b-Nemo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.6-12b-Nemo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Gryphe/Pantheon-RP-1.6-12b-Nemo
|
60cf38ae0367baf314e3cce748d9a199adfea557
| 20.566599
|
apache-2.0
| 12
| 12.248
| true
| false
| false
| true
| 3.474506
| 0.448057
| 44.805671
| 0.520401
| 31.687344
| 0.046073
| 4.607251
| 0.277685
| 3.691275
| 0.42876
| 12.928385
| 0.331117
| 25.679669
| false
| false
|
2024-08-18
|
2024-08-31
| 1
|
mistralai/Mistral-Nemo-Base-2407
|
Gryphe_Pantheon-RP-1.6-12b-Nemo-KTO_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.6-12b-Nemo-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO
|
6cb6d8d9a7352d71f539ab5053987e058c090443
| 21.558599
|
apache-2.0
| 5
| 12.248
| true
| false
| false
| true
| 3.364053
| 0.463619
| 46.361875
| 0.527698
| 33.0322
| 0.05287
| 5.287009
| 0.295302
| 6.040268
| 0.424792
| 12.165625
| 0.338182
| 26.464613
| false
| false
|
2024-08-28
|
2024-08-31
| 1
|
mistralai/Mistral-Nemo-Base-2407
|
Gryphe_Pantheon-RP-Pure-1.6.2-22b-Small_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-Pure-1.6.2-22b-Small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small
|
d031830dcb3bc5ad9634374db4dd15b3ef6ebe0f
| 28.138635
|
other
| 29
| 22.247
| true
| false
| false
| true
| 2.90664
| 0.693104
| 69.31043
| 0.530454
| 31.683163
| 0.202417
| 20.241692
| 0.328859
| 10.514541
| 0.376479
| 4.393229
| 0.394199
| 32.688756
| false
| false
|
2024-10-13
|
2024-10-15
| 1
|
mistralai/Mistral-Small-Instruct-2409
|
GuilhermeNaturaUmana_Nature-Reason-1.2-reallysmall_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/GuilhermeNaturaUmana/Nature-Reason-1.2-reallysmall" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GuilhermeNaturaUmana/Nature-Reason-1.2-reallysmall</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GuilhermeNaturaUmana__Nature-Reason-1.2-reallysmall-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GuilhermeNaturaUmana/Nature-Reason-1.2-reallysmall
|
2968459668192def7382b614630cabab48f2c865
| 28.689022
| 0
| 7.616
| false
| false
| false
| true
| 0.689445
| 0.498541
| 49.854054
| 0.564484
| 37.752975
| 0.257553
| 25.755287
| 0.300336
| 6.711409
| 0.437281
| 13.960156
| 0.442902
| 38.100251
| false
| false
|
2025-03-01
| 0
|
Removed
|
||
GuilhermeNaturaUmana_Nature-Reason-1.2-reallysmall_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/GuilhermeNaturaUmana/Nature-Reason-1.2-reallysmall" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GuilhermeNaturaUmana/Nature-Reason-1.2-reallysmall</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GuilhermeNaturaUmana__Nature-Reason-1.2-reallysmall-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
GuilhermeNaturaUmana/Nature-Reason-1.2-reallysmall
|
2968459668192def7382b614630cabab48f2c865
| 28.37477
| 0
| 7.616
| false
| false
| false
| true
| 0.693643
| 0.479107
| 47.910655
| 0.564872
| 37.812776
| 0.25
| 25
| 0.299497
| 6.599553
| 0.443917
| 15.05625
| 0.440824
| 37.869385
| false
| false
|
2025-03-01
| 0
|
Removed
|
||
Gunulhona_Gemma-Ko-Merge_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Gunulhona/Gemma-Ko-Merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gunulhona/Gemma-Ko-Merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gunulhona__Gemma-Ko-Merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Gunulhona/Gemma-Ko-Merge
|
ca6b0eb1405f21db6a7a9cce3b112d21fcfdde97
| 29.044658
| 1
| 10.159
| false
| false
| false
| true
| 6.274495
| 0.641572
| 64.157214
| 0.581303
| 38.787197
| 0.188066
| 18.806647
| 0.33557
| 11.409396
| 0.404698
| 9.120573
| 0.387882
| 31.986924
| false
| false
|
2024-09-04
|
2024-10-23
| 1
|
Gunulhona/Gemma-Ko-Merge (Merge)
|
|
Gunulhona_Gemma-Ko-Merge-PEFT_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/Gunulhona/Gemma-Ko-Merge-PEFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gunulhona/Gemma-Ko-Merge-PEFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gunulhona__Gemma-Ko-Merge-PEFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Gunulhona/Gemma-Ko-Merge-PEFT
|
ca6b0eb1405f21db6a7a9cce3b112d21fcfdde97
| 18.169495
| 0
| 20.318
| false
| false
| false
| false
| 5.876477
| 0.288039
| 28.803907
| 0.515409
| 30.186273
| 0
| 0
| 0.324664
| 9.955257
| 0.40801
| 8.767969
| 0.381732
| 31.303561
| false
| false
|
2024-09-30
|
2024-10-17
| 0
|
Gunulhona/Gemma-Ko-Merge-PEFT
|
|
Gunulhona_Gemma-Ko-Merge-PEFT_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/Gunulhona/Gemma-Ko-Merge-PEFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gunulhona/Gemma-Ko-Merge-PEFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gunulhona__Gemma-Ko-Merge-PEFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Gunulhona/Gemma-Ko-Merge-PEFT
|
ca6b0eb1405f21db6a7a9cce3b112d21fcfdde97
| 18.06624
| 0
| 20.318
| false
| false
| false
| true
| 18.788667
| 0.444135
| 44.41349
| 0.486299
| 26.015069
| 0
| 0
| 0.307047
| 7.606264
| 0.398583
| 7.05625
| 0.309757
| 23.306368
| false
| false
|
2024-09-30
|
2024-10-23
| 0
|
Gunulhona/Gemma-Ko-Merge-PEFT
|
|
HPAI-BSC_Llama3-Aloe-8B-Alpha_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/HPAI-BSC/Llama3-Aloe-8B-Alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HPAI-BSC/Llama3-Aloe-8B-Alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HPAI-BSC__Llama3-Aloe-8B-Alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
HPAI-BSC/Llama3-Aloe-8B-Alpha
|
f0bce5c1fee5ea2a6679bb3dc9de8548e7262c9e
| 20.230447
|
cc-by-nc-4.0
| 59
| 8.03
| true
| false
| false
| true
| 1.59049
| 0.508107
| 50.810738
| 0.483085
| 27.145978
| 0.061178
| 6.117825
| 0.294463
| 5.928412
| 0.367271
| 5.875521
| 0.329538
| 25.504211
| false
| false
|
2024-04-26
|
2024-10-29
| 0
|
HPAI-BSC/Llama3-Aloe-8B-Alpha
|
HPAI-BSC_Llama3.1-Aloe-Beta-8B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/HPAI-BSC/Llama3.1-Aloe-Beta-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HPAI-BSC/Llama3.1-Aloe-Beta-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HPAI-BSC__Llama3.1-Aloe-Beta-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
HPAI-BSC/Llama3.1-Aloe-Beta-8B
|
3f2f0bbfb03cb0a8310efa50659688c1f2c02da0
| 26.524195
|
llama3.1
| 11
| 8.03
| true
| false
| false
| true
| 2.08138
| 0.725328
| 72.532769
| 0.509276
| 30.369625
| 0.182779
| 18.277946
| 0.268456
| 2.46085
| 0.383458
| 6.832292
| 0.358045
| 28.67169
| false
| false
|
2024-10-30
|
2024-11-07
| 0
|
HPAI-BSC/Llama3.1-Aloe-Beta-8B
|
HPAI-BSC_Qwen2.5-Aloe-Beta-7B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/HPAI-BSC/Qwen2.5-Aloe-Beta-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HPAI-BSC/Qwen2.5-Aloe-Beta-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HPAI-BSC__Qwen2.5-Aloe-Beta-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
HPAI-BSC/Qwen2.5-Aloe-Beta-7B
|
853ee78094c4e6ae096fe616fbc7b617dd78f1f5
| 27.826721
|
apache-2.0
| 5
| 7.616
| true
| false
| false
| true
| 1.21672
| 0.455351
| 45.535069
| 0.5049
| 30.331605
| 0.35423
| 35.422961
| 0.291107
| 5.480984
| 0.426031
| 12.920573
| 0.435422
| 37.269134
| false
| false
|
2024-12-09
|
2024-12-17
| 0
|
HPAI-BSC/Qwen2.5-Aloe-Beta-7B
|
HarbingerX_Zeitgeist-3b-V1_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/HarbingerX/Zeitgeist-3b-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HarbingerX/Zeitgeist-3b-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HarbingerX__Zeitgeist-3b-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
HarbingerX/Zeitgeist-3b-V1
|
de159b93ae7c7d816de552025bcbd8a91f8952c1
| 21.705414
| 0
| 3.213
| false
| false
| false
| true
| 0.591134
| 0.671172
| 67.117249
| 0.444079
| 21.64756
| 0.103474
| 10.347432
| 0.281879
| 4.250559
| 0.357938
| 4.542188
| 0.300947
| 22.327497
| false
| false
|
2025-02-17
| 0
|
Removed
|
||
HarbingerX_Zeitgeist-3b-V1.2_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/HarbingerX/Zeitgeist-3b-V1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HarbingerX/Zeitgeist-3b-V1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HarbingerX__Zeitgeist-3b-V1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
HarbingerX/Zeitgeist-3b-V1.2
|
e4679006fa1a030eafa948852a8e084028970405
| 21.62222
| 0
| 3.213
| false
| false
| false
| true
| 0.602677
| 0.675419
| 67.5419
| 0.444065
| 21.629715
| 0.101208
| 10.120846
| 0.277685
| 3.691275
| 0.357906
| 3.904948
| 0.305602
| 22.844637
| false
| false
|
2025-02-26
| 0
|
Removed
|
||
Hastagaras_L3.2-JametMini-3B-MK.III_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Hastagaras/L3.2-JametMini-3B-MK.III" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Hastagaras/L3.2-JametMini-3B-MK.III</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Hastagaras__L3.2-JametMini-3B-MK.III-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Hastagaras/L3.2-JametMini-3B-MK.III
|
54e451f243ab69327068e92925fe2ecbc91ed06e
| 21.750385
|
llama3.2
| 7
| 3.213
| true
| false
| false
| true
| 0.569636
| 0.618266
| 61.82662
| 0.453852
| 22.362059
| 0.14577
| 14.577039
| 0.282718
| 4.362416
| 0.368604
| 5.342188
| 0.298288
| 22.031989
| false
| false
|
2024-10-12
|
2025-02-26
| 1
|
Hastagaras/L3.2-JametMini-3B-MK.III (Merge)
|
Hastagaras_Llama-3.1-Jamet-8B-MK.I_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Hastagaras/Llama-3.1-Jamet-8B-MK.I" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Hastagaras/Llama-3.1-Jamet-8B-MK.I</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Hastagaras__Llama-3.1-Jamet-8B-MK.I-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Hastagaras/Llama-3.1-Jamet-8B-MK.I
|
26cb97042b04fee7d0140375a7babbf92278f8ac
| 25.423806
|
llama3.1
| 1
| 8.03
| true
| false
| false
| true
| 1.43748
| 0.733821
| 73.382071
| 0.504867
| 29.503905
| 0.126888
| 12.688822
| 0.274329
| 3.243848
| 0.372604
| 6.142188
| 0.348238
| 27.582004
| false
| false
|
2024-11-18
|
2024-11-18
| 0
|
Hastagaras/Llama-3.1-Jamet-8B-MK.I
|
Hastagaras_Zabuza-8B-Llama-3.1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Hastagaras/Zabuza-8B-Llama-3.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Hastagaras/Zabuza-8B-Llama-3.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Hastagaras__Zabuza-8B-Llama-3.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Hastagaras/Zabuza-8B-Llama-3.1
|
57ffa92f229b8308916aae1d64d8f0dc9baa0a34
| 19.925827
|
llama3.1
| 1
| 8.03
| true
| false
| false
| true
| 1.350575
| 0.626534
| 62.653426
| 0.453892
| 23.220321
| 0.055136
| 5.513595
| 0.264262
| 1.901566
| 0.356792
| 4.898958
| 0.292304
| 21.367095
| true
| false
|
2024-11-05
|
2024-11-05
| 1
|
Hastagaras/Zabuza-8B-Llama-3.1 (Merge)
|
HelpingAI_Cipher-20B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/HelpingAI/Cipher-20B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HelpingAI/Cipher-20B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HelpingAI__Cipher-20B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HelpingAI/Cipher-20B | a01cc17784a3afa765de402da36805b2adff70f7 | 26.976008 | other | 3 | 20.551 | true | false | false | true | 4.06823 | 0.537758 | 53.775759 | 0.603243 | 43.439736 | 0.199396 | 19.939577 | 0.295302 | 6.040268 | 0.400292 | 8.169792 | 0.374418 | 30.490913 | false | false | 2024-12-14 | 2024-12-14 | 0 | HelpingAI/Cipher-20B |
HelpingAI_Dhanishtha-Large_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/HelpingAI/Dhanishtha-Large" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HelpingAI/Dhanishtha-Large</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HelpingAI__Dhanishtha-Large-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HelpingAI/Dhanishtha-Large | 54544ebab9ef04370a3bb41e18c60e3ce8b41d83 | 19.889173 | apache-2.0 | 2 | 7.613 | true | false | false | false | 0.681002 | 0.245674 | 24.56737 | 0.460365 | 24.002214 | 0.385196 | 38.519637 | 0.302852 | 7.04698 | 0.38451 | 5.697135 | 0.275515 | 19.501699 | false | false | 2025-02-24 | 2025-02-27 | 1 | HelpingAI/Dhanishtha-Large (Merge) |
HelpingAI_Priya-10B_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HelpingAI/Priya-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HelpingAI/Priya-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HelpingAI__Priya-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HelpingAI/Priya-10B | 82f217b1c0b50c3941a6d3f0cff94812aa10c0b9 | 14.143284 | other | 1 | 10.211 | true | false | false | true | 1.815679 | 0.404293 | 40.429283 | 0.444146 | 19.966797 | 0.018882 | 1.888218 | 0.255872 | 0.782998 | 0.379271 | 5.208854 | 0.249252 | 16.583555 | false | false | 2024-12-15 | 2024-12-18 | 1 | HelpingAI/HelpingAI2.5-10B |
HelpingAI_Priya-3B_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HelpingAI/Priya-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HelpingAI/Priya-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HelpingAI__Priya-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HelpingAI/Priya-3B | 43681968e92d52df5b171aff6aa59baf4f3cdeba | 13.429592 | other | 5 | 2.81 | true | false | false | true | 1.28266 | 0.452578 | 45.257805 | 0.396118 | 14.335273 | 0.01435 | 1.435045 | 0.256711 | 0.894855 | 0.371302 | 3.779427 | 0.233876 | 14.875148 | false | false | 2024-12-05 | 2024-12-14 | 0 | HelpingAI/Priya-3B |
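A common use of these rows is ranking models by their Average score under a parameter budget. A minimal sketch in plain Python, with the names, Average scores, and parameter counts transcribed from the rows above (the `top_models` helper is illustrative, not part of any leaderboard API):

```python
# Leaderboard rows transcribed from the table above:
# (fullname, Average score, #Params in billions)
rows = [
    ("Hastagaras/Zabuza-8B-Llama-3.1", 19.925827, 8.03),
    ("HelpingAI/Cipher-20B", 26.976008, 20.551),
    ("HelpingAI/Dhanishtha-Large", 19.889173, 7.613),
    ("HelpingAI/Priya-10B", 14.143284, 10.211),
    ("HelpingAI/Priya-3B", 13.429592, 2.81),
]

def top_models(rows, max_params_b, n=3):
    """Return up to n (name, average) pairs for models below the
    given parameter budget, best Average first."""
    eligible = [(name, avg) for name, avg, params in rows
                if params < max_params_b]
    return sorted(eligible, key=lambda pair: pair[1], reverse=True)[:n]

# Models under 10B parameters, ranked by Average score:
for name, avg in top_models(rows, max_params_b=10):
    print(f"{name}: {avg:.2f}")
```

The same filter-and-rank pattern extends to any column pair in the table, e.g. trading off Average against CO₂ cost.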