eval_name
stringlengths 12
111
| Precision
stringclasses 3
values | Type
stringclasses 7
values | T
stringclasses 7
values | Weight type
stringclasses 2
values | Architecture
stringclasses 64
values | Model
stringlengths 355
689
| fullname
stringlengths 4
102
| Model sha
stringlengths 0
40
| Average ⬆️
float64 0.74
52.1
| Hub License
stringclasses 27
values | Hub ❤️
int64 0
6.09k
| #Params (B)
float64 -1
141
| Available on the hub
bool 2
classes | MoE
bool 2
classes | Flagged
bool 2
classes | Chat Template
bool 2
classes | CO₂ cost (kg)
float64 0.04
187
| IFEval Raw
float64 0
0.9
| IFEval
float64 0
90
| BBH Raw
float64 0.22
0.83
| BBH
float64 0.25
76.7
| MATH Lvl 5 Raw
float64 0
0.71
| MATH Lvl 5
float64 0
71.5
| GPQA Raw
float64 0.21
0.47
| GPQA
float64 0
29.4
| MUSR Raw
float64 0.29
0.6
| MUSR
float64 0
38.7
| MMLU-PRO Raw
float64 0.1
0.73
| MMLU-PRO
float64 0
70
| Merged
bool 2
classes | Official Providers
bool 2
classes | Upload To Hub Date
stringclasses 525
values | Submission Date
stringclasses 263
values | Generation
int64 0
10
| Base Model
stringlengths 4
102
|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
zelk12_MT-Gen4-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Gen4-gemma-2-9B
|
d44beca936d18a5b4b65799487504c1097ae1cb2
| 34.691863
|
gemma
| 1
| 10.159
| true
| false
| false
| true
| 3.563367
| 0.788301
| 78.83006
| 0.610988
| 43.960404
| 0.223565
| 22.356495
| 0.354866
| 13.982103
| 0.422802
| 11.383594
| 0.438747
| 37.63852
| true
| false
|
2024-12-13
|
2024-12-13
| 1
|
zelk12/MT-Gen4-gemma-2-9B (Merge)
|
zelk12_MT-Gen5-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Gen5-gemma-2-9B
|
aef27049b2a3c52138016e9602280150f70eae32
| 34.563843
|
gemma
| 1
| 10.159
| true
| false
| false
| true
| 3.574855
| 0.792322
| 79.232215
| 0.613279
| 44.398244
| 0.215257
| 21.52568
| 0.35151
| 13.534676
| 0.420167
| 10.8875
| 0.440243
| 37.804743
| true
| false
|
2024-12-22
|
2024-12-22
| 1
|
zelk12/MT-Gen5-gemma-2-9B (Merge)
|
zelk12_MT-Gen6-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen6-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen6-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen6-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Gen6-gemma-2-9B
|
bd348fb1c1524e0d7d625200a292e46387b04da2
| 19.816469
|
gemma
| 1
| 10.159
| true
| false
| false
| true
| 4.016062
| 0.161567
| 16.156686
| 0.584467
| 39.396915
| 0.082326
| 8.232628
| 0.333054
| 11.073826
| 0.406927
| 8.865885
| 0.416556
| 35.172872
| true
| false
|
2025-01-23
|
2025-01-23
| 1
|
zelk12/MT-Gen6-gemma-2-9B (Merge)
|
zelk12_MT-Gen6fix-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen6fix-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen6fix-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen6fix-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Gen6fix-gemma-2-9B
|
f733983a7f923b19fb6d1cbc9f1cdffe788984ef
| 20.06441
|
gemma
| 1
| 10.159
| true
| false
| false
| true
| 3.801218
| 0.157595
| 15.759518
| 0.591731
| 40.78635
| 0.081571
| 8.1571
| 0.337248
| 11.63311
| 0.408417
| 9.385417
| 0.411985
| 34.664967
| true
| false
|
2025-02-02
|
2025-02-02
| 1
|
zelk12/MT-Gen6fix-gemma-2-9B (Merge)
|
zelk12_MT-Gen7-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen7-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen7-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen7-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Gen7-gemma-2-9B
|
b9316aea6888346724d9631e1987327e103529eb
| 20.391801
|
gemma
| 2
| 10.159
| true
| false
| false
| true
| 2.058313
| 0.166413
| 16.64129
| 0.593524
| 40.939071
| 0.089124
| 8.912387
| 0.33557
| 11.409396
| 0.409781
| 9.75599
| 0.412234
| 34.692671
| true
| false
|
2025-02-15
|
2025-02-15
| 1
|
zelk12/MT-Gen7-gemma-2-9B (Merge)
|
zelk12_MT-Max-Merge_02012025163610-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Max-Merge_02012025163610-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Max-Merge_02012025163610-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Max-Merge_02012025163610-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Max-Merge_02012025163610-gemma-2-9B
|
2f279c5c648c22e77327d0c0098f90b69312afd3
| 34.703708
|
gemma
| 1
| 10.159
| true
| false
| false
| true
| 3.692239
| 0.790749
| 79.074855
| 0.614224
| 44.501684
| 0.221299
| 22.129909
| 0.35151
| 13.534676
| 0.422802
| 11.25026
| 0.439578
| 37.730866
| true
| false
|
2025-01-02
|
2025-01-02
| 1
|
zelk12/MT-Max-Merge_02012025163610-gemma-2-9B (Merge)
|
zelk12_MT-Merge-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Merge-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Merge-gemma-2-9B
|
f4c3b001bc8692bcbbd7005b6f8db048e651aa46
| 34.878606
| 3
| 10.159
| false
| false
| false
| true
| 6.438112
| 0.803538
| 80.353795
| 0.611838
| 44.320842
| 0.220544
| 22.054381
| 0.348154
| 13.087248
| 0.425625
| 12.103125
| 0.43617
| 37.352246
| false
| false
|
2024-10-22
|
2024-10-22
| 1
|
zelk12/MT-Merge-gemma-2-9B (Merge)
|
|
zelk12_MT-Merge1-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Merge1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Merge1-gemma-2-9B
|
71bb4577c877715f3f6646a224b184544639c856
| 34.855327
| 1
| 10.159
| false
| false
| false
| true
| 6.054681
| 0.790149
| 79.014903
| 0.61
| 44.058246
| 0.228852
| 22.885196
| 0.35151
| 13.534676
| 0.424385
| 12.148177
| 0.437417
| 37.490765
| false
| false
|
2024-11-07
|
2024-11-07
| 1
|
zelk12/MT-Merge1-gemma-2-9B (Merge)
|
|
zelk12_MT-Merge2-MU-gemma-2-MTg2MT1g2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge2-MU-gemma-2-MTg2MT1g2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B
|
6d73ec2204800f7978c376567d3c6361c0a072cd
| 34.891869
| 2
| 10.159
| false
| false
| false
| true
| 3.68977
| 0.795595
| 79.559458
| 0.608389
| 43.8402
| 0.218278
| 21.827795
| 0.350671
| 13.422819
| 0.432229
| 13.228646
| 0.437251
| 37.472296
| false
| false
|
2024-11-25
|
2024-11-28
| 1
|
zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B (Merge)
|
|
zelk12_MT-Merge2-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Merge2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Merge2-gemma-2-9B
|
a695e722e6fab77852f9fe59bbc4d69fe23c4208
| 34.820727
| 2
| 10.159
| false
| false
| false
| true
| 3.701582
| 0.787701
| 78.770108
| 0.610668
| 44.157197
| 0.234894
| 23.489426
| 0.350671
| 13.422819
| 0.421688
| 11.510938
| 0.438165
| 37.573877
| false
| false
|
2024-11-25
|
2024-11-25
| 1
|
zelk12/MT-Merge2-gemma-2-9B (Merge)
|
|
zelk12_MT-Merge3-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Merge3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Merge3-gemma-2-9B
|
3f02f5e76d3aade3340307eb34b15bc9dd5a2023
| 34.63974
|
gemma
| 0
| 10.159
| true
| false
| false
| true
| 3.762041
| 0.785853
| 78.585265
| 0.610211
| 44.066073
| 0.220544
| 22.054381
| 0.348993
| 13.199105
| 0.42575
| 12.452083
| 0.437334
| 37.481531
| true
| false
|
2024-12-11
|
2024-12-11
| 1
|
zelk12/MT-Merge3-gemma-2-9B (Merge)
|
zelk12_MT-Merge4-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Merge4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Merge4-gemma-2-9B
|
5f076ad8a3f3c403840a1cd572a6018bea34e889
| 34.599309
|
gemma
| 1
| 10.159
| true
| false
| false
| true
| 3.74677
| 0.780732
| 78.073179
| 0.611822
| 44.053492
| 0.216767
| 21.676737
| 0.352349
| 13.646532
| 0.429437
| 12.479688
| 0.438996
| 37.666223
| true
| false
|
2024-12-21
|
2024-12-21
| 1
|
zelk12/MT-Merge4-gemma-2-9B (Merge)
|
zelk12_MT-Merge5-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Merge5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Merge5-gemma-2-9B
|
d8adfc6c5395baaeb3f5e0b50c585ed3f662c4d9
| 34.69224
|
gemma
| 2
| 10.159
| true
| false
| false
| true
| 3.60158
| 0.784379
| 78.437878
| 0.612267
| 44.240598
| 0.218278
| 21.827795
| 0.353188
| 13.758389
| 0.428135
| 12.25026
| 0.438747
| 37.63852
| true
| false
|
2024-12-30
|
2024-12-30
| 1
|
zelk12/MT-Merge5-gemma-2-9B (Merge)
|
zelk12_MT-Merge6-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Merge6-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge6-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge6-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Merge6-gemma-2-9B
|
ce24f52c594decba760d14f77cc4d978a2b8f0dd
| 20.203466
|
gemma
| 2
| 10.159
| true
| false
| false
| true
| 2.075151
| 0.16946
| 16.946037
| 0.594911
| 41.321961
| 0.08006
| 8.006042
| 0.328859
| 10.514541
| 0.409781
| 9.822656
| 0.411486
| 34.60956
| true
| false
|
2025-02-13
|
2025-02-13
| 1
|
zelk12/MT-Merge6-gemma-2-9B (Merge)
|
zelk12_MT-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-gemma-2-9B
|
24e1f894517b86dd866c1a5999ced4a5924dcd90
| 33.613227
| 3
| 10.159
| false
| false
| false
| true
| 6.046798
| 0.796843
| 79.684349
| 0.60636
| 43.324243
| 0.205438
| 20.543807
| 0.345638
| 12.751678
| 0.407115
| 9.55599
| 0.422374
| 35.819297
| false
| false
|
2024-10-11
|
2024-10-11
| 1
|
zelk12/MT-gemma-2-9B (Merge)
|
|
zelk12_MT1-Gen1-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-Gen1-gemma-2-9B
|
939ac6c12059a18fc1117cdb3861f46816eff2fb
| 34.931655
| 0
| 10.159
| false
| false
| false
| true
| 6.724969
| 0.797443
| 79.744301
| 0.611779
| 44.273282
| 0.22432
| 22.432024
| 0.34396
| 12.527964
| 0.430958
| 13.103125
| 0.437583
| 37.509235
| false
| false
|
2024-10-23
|
2024-10-24
| 1
|
zelk12/MT1-Gen1-gemma-2-9B (Merge)
|
|
zelk12_MT1-Gen2-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-Gen2-gemma-2-9B
|
aeaca7dc7d50a425a5d3c38d7c4a7daf1c772ad4
| 35.00544
| 2
| 10.159
| false
| false
| false
| true
| 3.99199
| 0.798367
| 79.836722
| 0.609599
| 43.919191
| 0.225076
| 22.507553
| 0.352349
| 13.646532
| 0.428354
| 12.844271
| 0.435505
| 37.278369
| false
| false
|
2024-11-11
|
2024-11-11
| 1
|
zelk12/MT1-Gen2-gemma-2-9B (Merge)
|
|
zelk12_MT1-Gen3-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-Gen3-gemma-2-9B
|
5cc4ee1c70f08a5b1a195d43f044d9bf6fca29f5
| 34.739851
| 0
| 10.159
| false
| false
| false
| true
| 3.889753
| 0.795969
| 79.596914
| 0.610155
| 43.990306
| 0.22432
| 22.432024
| 0.348993
| 13.199105
| 0.424323
| 12.007031
| 0.434924
| 37.213726
| false
| false
|
2024-12-01
|
2024-12-01
| 1
|
zelk12/MT1-Gen3-gemma-2-9B (Merge)
|
|
zelk12_MT1-Gen4-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-Gen4-gemma-2-9B
|
5eaf1ef67f32805c6fbc0b51418a8caf866661a2
| 34.289209
|
gemma
| 1
| 10.159
| true
| false
| false
| true
| 3.484125
| 0.794121
| 79.412071
| 0.605757
| 43.145368
| 0.216012
| 21.601208
| 0.347315
| 12.975391
| 0.423115
| 12.089323
| 0.428607
| 36.511894
| true
| false
|
2024-12-14
|
2024-12-14
| 1
|
zelk12/MT1-Gen4-gemma-2-9B (Merge)
|
zelk12_MT1-Gen5-IF-gemma-2-S2DMv1-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen5-IF-gemma-2-S2DMv1-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen5-IF-gemma-2-S2DMv1-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen5-IF-gemma-2-S2DMv1-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-Gen5-IF-gemma-2-S2DMv1-9B
|
53a780fd3a2d42709a0f517cac019234d7d71267
| 33.781044
| 1
| 10.159
| false
| false
| false
| true
| 3.414254
| 0.792922
| 79.292167
| 0.6
| 42.201028
| 0.203172
| 20.317221
| 0.34396
| 12.527964
| 0.424479
| 12.593229
| 0.421792
| 35.754654
| false
| false
|
2024-12-24
|
2024-12-31
| 1
|
zelk12/MT1-Gen5-IF-gemma-2-S2DMv1-9B (Merge)
|
|
zelk12_MT1-Gen5-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-Gen5-gemma-2-9B
|
4eb54f9a0a9f482537b0e79000ffe7fb9d024c38
| 33.556617
|
gemma
| 1
| 10.159
| true
| false
| false
| true
| 3.510461
| 0.779483
| 77.948288
| 0.601746
| 42.496764
| 0.207704
| 20.770393
| 0.346477
| 12.863535
| 0.419146
| 11.459896
| 0.422207
| 35.800827
| true
| false
|
2024-12-24
|
2024-12-24
| 1
|
zelk12/MT1-Gen5-gemma-2-9B (Merge)
|
zelk12_MT1-Gen6-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen6-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen6-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen6-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-Gen6-gemma-2-9B
|
7834a5b83bf7a9a75a0f7d75603cc166627f1e26
| 19.919694
|
gemma
| 2
| 10.159
| true
| false
| false
| true
| 3.84616
| 0.163365
| 16.336543
| 0.594355
| 41.261989
| 0.080816
| 8.081571
| 0.32802
| 10.402685
| 0.404448
| 8.622656
| 0.413314
| 34.812722
| true
| false
|
2025-02-04
|
2025-02-04
| 1
|
zelk12/MT1-Gen6-gemma-2-9B (Merge)
|
zelk12_MT1-Gen7-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen7-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen7-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen7-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-Gen7-gemma-2-9B
|
41b009dd08c26b26d2cf4df3bc67d822a9e6f38e
| 20.199021
|
gemma
| 2
| 10.159
| true
| false
| false
| true
| 4.130286
| 0.163365
| 16.336543
| 0.593795
| 41.182076
| 0.083082
| 8.308157
| 0.32802
| 10.402685
| 0.411115
| 10.022656
| 0.414478
| 34.942007
| true
| false
|
2025-02-19
|
2025-02-19
| 1
|
zelk12/MT1-Gen7-gemma-2-9B (Merge)
|
zelk12_MT1-Max-Merge_02012025163610-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-Max-Merge_02012025163610-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Max-Merge_02012025163610-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Max-Merge_02012025163610-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-Max-Merge_02012025163610-gemma-2-9B
|
e9177c45a9dc1ff2ace378d4809ea92ff6e477c4
| 34.873001
|
gemma
| 1
| 10.159
| true
| false
| false
| true
| 3.75596
| 0.792872
| 79.28718
| 0.612267
| 44.226377
| 0.22281
| 22.280967
| 0.354866
| 13.982103
| 0.4255
| 11.8875
| 0.438165
| 37.573877
| true
| false
|
2025-01-04
|
2025-01-04
| 1
|
zelk12/MT1-Max-Merge_02012025163610-gemma-2-9B (Merge)
|
zelk12_MT1-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-gemma-2-9B
|
3a5e77518ca9c3c8ea2edac4c03bc220ee91f3ed
| 34.867465
| 2
| 10.159
| false
| false
| false
| true
| 6.691439
| 0.79467
| 79.467036
| 0.610875
| 44.161526
| 0.223565
| 22.356495
| 0.345638
| 12.751678
| 0.432229
| 13.161979
| 0.435755
| 37.306073
| false
| false
|
2024-10-12
|
2024-10-14
| 1
|
zelk12/MT1-gemma-2-9B (Merge)
|
|
zelk12_MT2-Gen1-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT2-Gen1-gemma-2-9B
|
167abf8eb4ea01fecd42dc32ad68160c51a8685a
| 34.461734
| 0
| 10.159
| false
| false
| false
| true
| 6.76642
| 0.785578
| 78.557782
| 0.61008
| 44.141103
| 0.221299
| 22.129909
| 0.343121
| 12.416107
| 0.424323
| 12.007031
| 0.437666
| 37.518469
| false
| false
|
2024-10-24
|
2024-10-27
| 1
|
zelk12/MT2-Gen1-gemma-2-9B (Merge)
|
|
zelk12_MT2-Gen2-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT2-Gen2-gemma-2-9B
|
24c487499b5833424ffb9932eed838bb254f61b4
| 34.641867
| 3
| 10.159
| false
| false
| false
| true
| 4.074883
| 0.7889
| 78.890012
| 0.609292
| 44.044503
| 0.218278
| 21.827795
| 0.346477
| 12.863535
| 0.427021
| 12.577604
| 0.43883
| 37.647754
| false
| false
|
2024-11-12
|
2024-11-12
| 1
|
zelk12/MT2-Gen2-gemma-2-9B (Merge)
|
|
zelk12_MT2-Gen3-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT2-Gen3-gemma-2-9B
|
bb750c2b76328c6dbc9adf9ae3d09551f3723758
| 34.264471
| 1
| 10.159
| false
| false
| false
| true
| 3.848754
| 0.781007
| 78.100662
| 0.610477
| 44.007274
| 0.210725
| 21.072508
| 0.346477
| 12.863535
| 0.423083
| 12.052083
| 0.437417
| 37.490765
| false
| false
|
2024-12-04
|
2024-12-04
| 1
|
zelk12/MT2-Gen3-gemma-2-9B (Merge)
|
|
zelk12_MT2-Gen4-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT2-Gen4-gemma-2-9B
|
7a07de3719c3b8b8e90e79a65798bcc4ef454fc6
| 34.202322
|
gemma
| 1
| 10.159
| true
| false
| false
| true
| 3.572871
| 0.789599
| 78.959937
| 0.609655
| 43.778362
| 0.223565
| 22.356495
| 0.345638
| 12.751678
| 0.412542
| 10.467708
| 0.432098
| 36.899749
| true
| false
|
2024-12-15
|
2024-12-15
| 1
|
zelk12/MT2-Gen4-gemma-2-9B (Merge)
|
zelk12_MT2-Gen5-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT2-Gen5-gemma-2-9B
|
94711cc263eab1464fa6b01c28ee5171b4467d84
| 34.049234
|
gemma
| 0
| 10.159
| true
| false
| false
| true
| 3.524271
| 0.774912
| 77.491168
| 0.606393
| 43.124281
| 0.210725
| 21.072508
| 0.35151
| 13.534676
| 0.424417
| 12.385417
| 0.430186
| 36.687352
| true
| false
|
2024-12-25
|
2024-12-25
| 1
|
zelk12/MT2-Gen5-gemma-2-9B (Merge)
|
zelk12_MT2-Gen6-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen6-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen6-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen6-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT2-Gen6-gemma-2-9B
|
a1b1a2009841dd0b5bf00ca65b631bc771146a65
| 20.837842
|
gemma
| 1
| 10.159
| true
| false
| false
| true
| 3.898815
| 0.166413
| 16.64129
| 0.595965
| 41.771094
| 0.084592
| 8.459215
| 0.338087
| 11.744966
| 0.413719
| 10.748177
| 0.420961
| 35.662308
| true
| false
|
2025-02-05
|
2025-02-05
| 1
|
zelk12/MT2-Gen6-gemma-2-9B (Merge)
|
zelk12_MT2-Gen7-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen7-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen7-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen7-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT2-Gen7-gemma-2-9B
|
ccefd8eab76f7a2a25e2974e9545ba176078fe8f
| 22.282909
|
gemma
| 1
| 10.159
| true
| false
| false
| true
| 2.033677
| 0.176155
| 17.615482
| 0.607892
| 43.574199
| 0.101964
| 10.196375
| 0.354866
| 13.982103
| 0.420323
| 11.540365
| 0.4311
| 36.788933
| true
| false
|
2025-02-22
|
2025-02-22
| 1
|
zelk12/MT2-Gen7-gemma-2-9B (Merge)
|
zelk12_MT2-Max-Merge_02012025163610-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT2-Max-Merge_02012025163610-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Max-Merge_02012025163610-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Max-Merge_02012025163610-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT2-Max-Merge_02012025163610-gemma-2-9B
|
76d8a9cc371af30b5843fb69edc25ff767d6741f
| 34.675341
|
gemma
| 0
| 10.159
| true
| false
| false
| true
| 3.65795
| 0.790149
| 79.014903
| 0.610846
| 44.040817
| 0.22432
| 22.432024
| 0.35151
| 13.534676
| 0.422833
| 11.354167
| 0.439079
| 37.675458
| true
| false
|
2025-01-07
|
2025-01-07
| 1
|
zelk12/MT2-Max-Merge_02012025163610-gemma-2-9B (Merge)
|
| zelk12_MT2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT2-gemma-2-9B | d20d7169ce0f53d586504c50b4b7dc470bf8a781 | 34.516135 |  | 1 | 10.159 | false | false | false | true | 6.38822 | 0.788575 | 78.857542 | 0.611511 | 44.167481 | 0.221299 | 22.129909 | 0.347315 | 12.975391 | 0.421656 | 11.540365 | 0.436835 | 37.426123 | false | false | 2024-10-14 | 2024-10-15 | 1 | zelk12/MT2-gemma-2-9B (Merge) |
| zelk12_MT3-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen1-gemma-2-9B | cd78df9e67e2e710d8d305f5a03a92c01b1b425d | 34.088581 |  | 1 | 10.159 | false | false | false | true | 6.227332 | 0.783779 | 78.377926 | 0.610676 | 44.119495 | 0.214502 | 21.450151 | 0.346477 | 12.863535 | 0.415115 | 10.75599 | 0.43268 | 36.964391 | false | false | 2024-10-24 | 2024-10-28 | 1 | zelk12/MT3-Gen1-gemma-2-9B (Merge) |
| zelk12_MT3-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen2-gemma-2-9B | e4ef057d20751d89934025e9088ba98d89b921b5 | 34.349829 |  | 1 | 10.159 | false | false | false | true | 3.838217 | 0.784329 | 78.432891 | 0.609147 | 43.940226 | 0.223565 | 22.356495 | 0.357383 | 14.317673 | 0.411115 | 10.022656 | 0.433261 | 37.029034 | false | false | 2024-11-20 | 2024-11-20 | 1 | zelk12/MT3-Gen2-gemma-2-9B (Merge) |
| zelk12_MT3-Gen3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen3-gemma-2-9B | 4ad54d6295f6364aa87f7aaa2a7bd112fb92ec00 | 34.437034 |  | 1 | 10.159 | false | false | false | true | 3.808925 | 0.785628 | 78.562769 | 0.608889 | 43.78374 | 0.215257 | 21.52568 | 0.35151 | 13.534676 | 0.42575 | 12.51875 | 0.430269 | 36.696587 | false | false | 2024-12-07 | 2024-12-07 | 1 | zelk12/MT3-Gen3-gemma-2-9B (Merge) |
| zelk12_MT3-Gen4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen4-gemma-2-9B | 22066f5a275797ae870d2c58e8c75ac933ee1adf | 34.517532 | gemma | 4 | 10.159 | true | false | false | true | 3.641707 | 0.773713 | 77.371264 | 0.610084 | 43.779591 | 0.206193 | 20.619335 | 0.347315 | 12.975391 | 0.447635 | 14.721094 | 0.438747 | 37.63852 | true | false | 2024-12-16 | 2024-12-16 | 1 | zelk12/MT3-Gen4-gemma-2-9B (Merge) |
| zelk12_MT3-Gen5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen5-gemma-2-9B | b02315782a4719734b159220ca1eef0770d022a5 | 34.757682 | gemma | 1 | 10.159 | true | false | false | true | 3.881254 | 0.799017 | 79.901661 | 0.609862 | 43.951199 | 0.226586 | 22.65861 | 0.353188 | 13.758389 | 0.419115 | 11.422656 | 0.431682 | 36.853576 | true | false | 2024-12-26 | 2024-12-26 | 1 | zelk12/MT3-Gen5-gemma-2-9B (Merge) |
| zelk12_MT3-Gen5-gemma-2-9B_v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen5-gemma-2-9B_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen5-gemma-2-9B_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen5-gemma-2-9B_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen5-gemma-2-9B_v1 | 40bfcc25ff421ff83d67a9c46474a0b40abf4f4a | 34.734557 | gemma | 2 | 10.159 | true | false | false | true | 3.856304 | 0.799616 | 79.961613 | 0.611333 | 44.159602 | 0.22281 | 22.280967 | 0.348993 | 13.199105 | 0.420385 | 11.48151 | 0.435921 | 37.324542 | true | false | 2024-12-27 | 2024-12-27 | 1 | zelk12/MT3-Gen5-gemma-2-9B_v1 (Merge) |
| zelk12_MT3-Gen6-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen6-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen6-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen6-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen6-gemma-2-9B | a8f0594d31040041bcfa0ab1e9521543d9b91040 | 21.102855 | gemma | 1 | 10.159 | true | false | false | true | 2.047319 | 0.176155 | 17.615482 | 0.602007 | 42.639363 | 0.088369 | 8.836858 | 0.343121 | 12.416107 | 0.412573 | 10.638281 | 0.410239 | 34.47104 | true | false | 2025-02-12 | 2025-02-12 | 1 | zelk12/MT3-Gen6-gemma-2-9B (Merge) |
| zelk12_MT3-Max-Merge_02012025163610-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Max-Merge_02012025163610-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Max-Merge_02012025163610-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Max-Merge_02012025163610-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Max-Merge_02012025163610-gemma-2-9B | 6499758258ac6278e7fdc4ba6719ab28f35709e8 | 22.467708 | gemma | 0 | 10.159 | true | false | false | true | 3.892564 | 0.176155 | 17.615482 | 0.612346 | 44.20652 | 0.101208 | 10.120846 | 0.350671 | 13.422819 | 0.425469 | 11.783594 | 0.438913 | 37.656989 | true | false | 2025-01-09 | 2025-01-09 | 1 | zelk12/MT3-Max-Merge_02012025163610-gemma-2-9B (Merge) |
| zelk12_MT3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-gemma-2-9B | d501b6ea59896fac3dc0a623501a5493b3573cde | 34.215565 |  | 1 | 10.159 | false | false | false | true | 6.273306 | 0.778609 | 77.860854 | 0.613078 | 44.248465 | 0.216767 | 21.676737 | 0.344799 | 12.639821 | 0.424292 | 11.903125 | 0.43268 | 36.964391 | false | false | 2024-10-15 | 2024-10-16 | 1 | zelk12/MT3-gemma-2-9B (Merge) |
| zelk12_MT4-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-Gen1-gemma-2-9B | 6ed2c66246c7f354decfd3579acb534dc4b0b48c | 34.703101 |  | 0 | 10.159 | false | false | false | true | 4.207122 | 0.7895 | 78.949964 | 0.609383 | 44.009524 | 0.219789 | 21.978852 | 0.34396 | 12.527964 | 0.432229 | 13.095313 | 0.438913 | 37.656989 | false | false | 2024-10-25 | 2024-10-29 | 1 | zelk12/MT4-Gen1-gemma-2-9B (Merge) |
| zelk12_MT4-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-Gen2-gemma-2-9B | 4d61a5799b11641a24e8b0f3eda0e987ff392089 | 35.053544 |  | 4 | 10.159 | false | false | false | true | 3.954095 | 0.805062 | 80.506168 | 0.610835 | 44.176658 | 0.232628 | 23.26284 | 0.345638 | 12.751678 | 0.425656 | 12.207031 | 0.436752 | 37.416888 | false | false | 2024-11-22 | 2024-11-22 | 1 | zelk12/MT4-Gen2-gemma-2-9B (Merge) |
| zelk12_MT4-Gen3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-Gen3-gemma-2-9B | f93026d28ca1707e8c21620be8558eed6be43b1c | 34.372682 |  | 0 | 10.159 | false | false | false | true | 3.917403 | 0.784054 | 78.405409 | 0.608711 | 43.89439 | 0.219033 | 21.903323 | 0.34396 | 12.527964 | 0.424323 | 11.940365 | 0.438082 | 37.564642 | false | false | 2024-12-08 | 2024-12-08 | 1 | zelk12/MT4-Gen3-gemma-2-9B (Merge) |
| zelk12_MT4-Gen4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-Gen4-gemma-2-9B | 51f3deb0aba90d82fc3f21894b3171fa5afbffa5 | 34.38114 | gemma | 0 | 10.159 | true | false | false | true | 3.587482 | 0.787426 | 78.742625 | 0.607603 | 43.47581 | 0.214502 | 21.450151 | 0.352349 | 13.646532 | 0.424354 | 12.044271 | 0.432347 | 36.927453 | true | false | 2024-12-19 | 2024-12-19 | 1 | zelk12/MT4-Gen4-gemma-2-9B (Merge) |
| zelk12_MT4-Gen5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-Gen5-gemma-2-9B | 59681ccdc7e6f1991cc5663464806665bc3bf4c8 | 34.720511 | gemma | 2 | 10.159 | true | false | false | true | 3.694723 | 0.778883 | 77.888336 | 0.610666 | 43.947892 | 0.226586 | 22.65861 | 0.356544 | 14.205817 | 0.426833 | 12.020833 | 0.438414 | 37.601581 | true | false | 2024-12-28 | 2024-12-28 | 1 | zelk12/MT4-Gen5-gemma-2-9B (Merge) |
| zelk12_MT4-Max-Merge_02012025163610-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-Max-Merge_02012025163610-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Max-Merge_02012025163610-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Max-Merge_02012025163610-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-Max-Merge_02012025163610-gemma-2-9B | 25e64938f38ed3db0113007a2814b069fd2952b0 | 22.332038 | gemma | 0 | 10.159 | true | false | false | true | 3.9168 | 0.177079 | 17.707904 | 0.612013 | 44.173982 | 0.095166 | 9.516616 | 0.35151 | 13.534676 | 0.422802 | 11.383594 | 0.439079 | 37.675458 | true | false | 2025-01-11 | 2025-01-11 | 1 | zelk12/MT4-Max-Merge_02012025163610-gemma-2-9B (Merge) |
| zelk12_MT4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-gemma-2-9B | 2167ea02baf9145a697a7d828a17c75b86e5e282 | 34.026402 |  | 0 | 10.159 | false | false | false | true | 6.310517 | 0.776161 | 77.616059 | 0.607314 | 43.553827 | 0.208459 | 20.845921 | 0.338087 | 11.744966 | 0.430927 | 12.999219 | 0.436586 | 37.398419 | false | false | 2024-10-16 | 2024-10-20 | 1 | zelk12/MT4-gemma-2-9B (Merge) |
| zelk12_MT5-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT5-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT5-Gen1-gemma-2-9B | 0291b776e80f38381788cd8f1fb2c3435ad891b5 | 34.440432 |  | 0 | 10.159 | false | false | false | true | 4.034506 | 0.78313 | 78.312987 | 0.611048 | 44.183335 | 0.221299 | 22.129909 | 0.347315 | 12.975391 | 0.420385 | 11.614844 | 0.436835 | 37.426123 | false | false | 2024-10-25 | 2024-10-31 | 1 | zelk12/MT5-Gen1-gemma-2-9B (Merge) |
| zelk12_MT5-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT5-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT5-Gen2-gemma-2-9B | 3ee2822fcba6708bd9032b79249a2789e5996b6a | 34.55155 |  | 1 | 10.159 | false | false | false | true | 3.716761 | 0.796244 | 79.624397 | 0.610541 | 44.113215 | 0.220544 | 22.054381 | 0.35151 | 13.534676 | 0.416292 | 10.436458 | 0.437916 | 37.546173 | false | false | 2024-11-23 | 2024-11-23 | 1 | zelk12/MT5-Gen2-gemma-2-9B (Merge) |
| zelk12_MT5-Gen3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT5-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT5-Gen3-gemma-2-9B | 4b3811c689fec5c9cc483bb1ed696734e5e88fcf | 34.488645 |  | 0 | 10.159 | false | false | false | true | 3.874666 | 0.78253 | 78.253035 | 0.609049 | 43.885913 | 0.216767 | 21.676737 | 0.35151 | 13.534676 | 0.423052 | 12.08151 | 0.4375 | 37.5 | false | false | 2024-12-08 | 2024-12-08 | 1 | zelk12/MT5-Gen3-gemma-2-9B (Merge) |
| zelk12_MT5-Gen4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT5-Gen4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-Gen4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT5-Gen4-gemma-2-9B | 2f826d76460a5b7f150622a57f2d5419adfc464f | 34.658891 | gemma | 0 | 10.159 | true | false | false | true | 3.643441 | 0.783455 | 78.345457 | 0.613106 | 44.323211 | 0.22432 | 22.432024 | 0.353188 | 13.758389 | 0.422833 | 11.354167 | 0.439661 | 37.7401 | true | false | 2024-12-20 | 2024-12-20 | 1 | zelk12/MT5-Gen4-gemma-2-9B (Merge) |
| zelk12_MT5-Gen5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT5-Gen5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-Gen5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT5-Gen5-gemma-2-9B | d1f68652d7dda810da8207a371d26376c6a6e847 | 34.634253 | gemma | 2 | 10.159 | true | false | false | true | 3.783291 | 0.79472 | 79.472023 | 0.611166 | 44.115081 | 0.225831 | 22.583082 | 0.348154 | 13.087248 | 0.419115 | 11.55599 | 0.432929 | 36.992095 | true | false | 2024-12-29 | 2024-12-29 | 1 | zelk12/MT5-Gen5-gemma-2-9B (Merge) |
| zelk12_MT5-Max-Merge_02012025163610-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT5-Max-Merge_02012025163610-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-Max-Merge_02012025163610-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Max-Merge_02012025163610-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT5-Max-Merge_02012025163610-gemma-2-9B | a90f9ca13af28c72695fabc56da4ddd8e3a8e173 | 22.353757 | gemma | 0 | 10.159 | true | false | false | true | 4.124419 | 0.176155 | 17.615482 | 0.612679 | 44.274407 | 0.098187 | 9.818731 | 0.35151 | 13.534676 | 0.422771 | 11.213021 | 0.438996 | 37.666223 | true | false | 2025-01-14 | 2025-01-14 | 1 | zelk12/MT5-Max-Merge_02012025163610-gemma-2-9B (Merge) |
| zelk12_MT5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT5-gemma-2-9B | b627ae7d796b1ae85b59c55e0e043b8d3ae73d83 | 34.773049 |  | 0 | 10.159 | false | false | false | true | 6.53966 | 0.804787 | 80.478685 | 0.611223 | 44.271257 | 0.225831 | 22.583082 | 0.343121 | 12.416107 | 0.420385 | 11.48151 | 0.436669 | 37.407654 | false | false | 2024-10-19 | 2024-10-21 | 1 | zelk12/MT5-gemma-2-9B (Merge) |
| zelk12_MTM-Merge-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MTM-Merge-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MTM-Merge-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MTM-Merge-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MTM-Merge-gemma-2-9B | 843f23c68cf50f5bdc0206f93e72ce0f9feeca6e | 34.614985 | gemma | 2 | 10.159 | true | false | false | true | 3.586692 | 0.779808 | 77.980758 | 0.613335 | 44.380677 | 0.217523 | 21.752266 | 0.354866 | 13.982103 | 0.426771 | 11.946354 | 0.43883 | 37.647754 | true | false | 2025-01-01 | 2025-01-01 | 1 | zelk12/MTM-Merge-gemma-2-9B (Merge) |
| zelk12_MTMaMe-Merge_02012025163610-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MTMaMe-Merge_02012025163610-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MTMaMe-Merge_02012025163610-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MTMaMe-Merge_02012025163610-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MTMaMe-Merge_02012025163610-gemma-2-9B | ce68b2468bcba0c5dcde79bbf5346db81f069b12 | 22.385497 | gemma | 0 | 10.159 | true | false | false | true | 3.795679 | 0.178603 | 17.860277 | 0.611679 | 44.160463 | 0.095921 | 9.592145 | 0.352349 | 13.646532 | 0.424104 | 11.479687 | 0.438165 | 37.573877 | true | false | 2025-01-16 | 2025-01-16 | 1 | zelk12/MTMaMe-Merge_02012025163610-gemma-2-9B (Merge) |
| zelk12_Rv0.4DMv1t0.25-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/Rv0.4DMv1t0.25-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Rv0.4DMv1t0.25-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Rv0.4DMv1t0.25-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/Rv0.4DMv1t0.25-gemma-2-9B | 23e7337dabbf023177c25ded4923286a2e3936fc | 34.114018 |  | 0 | 10.159 | false | false | false | true | 3.837289 | 0.749658 | 74.965758 | 0.606971 | 43.664764 | 0.225831 | 22.583082 | 0.345638 | 12.751678 | 0.430927 | 12.932552 | 0.440076 | 37.786274 | false | false | 2024-12-31 | 2024-12-31 | 1 | zelk12/Rv0.4DMv1t0.25-gemma-2-9B (Merge) |
| zelk12_Rv0.4DMv1t0.25Tt0.25-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Rv0.4DMv1t0.25Tt0.25-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B | 28fbcc2fa23f46aaaed327984784251527c78815 | 33.877515 | gemma | 0 | 10.159 | true | false | false | true | 3.825166 | 0.76462 | 76.46201 | 0.609786 | 43.914819 | 0.206949 | 20.694864 | 0.342282 | 12.304251 | 0.428292 | 12.703125 | 0.434674 | 37.186022 | true | false | 2024-12-31 | 2024-12-31 | 1 | zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B (Merge) |
| zelk12_Rv0.4MT4g2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/Rv0.4MT4g2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Rv0.4MT4g2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Rv0.4MT4g2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/Rv0.4MT4g2-gemma-2-9B | ef595241d2c62203c27d84e6643d384a7cf99bd4 | 33.255962 | gemma | 1 | 10.159 | true | false | false | true | 3.706507 | 0.732022 | 73.202215 | 0.60412 | 43.199046 | 0.194864 | 19.486405 | 0.353188 | 13.758389 | 0.423083 | 11.91875 | 0.441739 | 37.970966 | true | false | 2025-01-04 | 2025-01-04 | 1 | zelk12/Rv0.4MT4g2-gemma-2-9B (Merge) |
| zelk12_T31122024203920-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/T31122024203920-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/T31122024203920-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__T31122024203920-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/T31122024203920-gemma-2-9B | 25cb58c73a3adf43cee33b50238b1d332b5ccc13 | 34.209071 | gemma | 0 | 10.159 | true | false | false | true | 3.732738 | 0.767618 | 76.76177 | 0.609563 | 43.728997 | 0.205438 | 20.543807 | 0.350671 | 13.422819 | 0.432198 | 13.32474 | 0.437251 | 37.472296 | true | false | 2024-12-31 | 2024-12-31 | 1 | zelk12/T31122024203920-gemma-2-9B (Merge) |
| zelk12_Test01012025155054_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/Test01012025155054" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Test01012025155054</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Test01012025155054-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/Test01012025155054 | c607186b0b079975e3305e0223e0a55f0cbc19e5 | 3.591417 |  | 0 | 3.817 | false | false | false | true | 1.400948 | 0.155523 | 15.55229 | 0.28295 | 1.280547 | 0 | 0 | 0.241611 | 0 | 0.367021 | 3.710937 | 0.109043 | 1.004728 | false | false | 2025-01-01 | 2025-01-01 | 1 | zelk12/Test01012025155054 (Merge) |
| zelk12_Test01012025155054t0.5_gemma-2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/Test01012025155054t0.5_gemma-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Test01012025155054t0.5_gemma-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Test01012025155054t0.5_gemma-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/Test01012025155054t0.5_gemma-2 | 14fcae0d420d303df84bd9b9c8744a6f0fa147fb | 3.591417 |  | 0 | 3.817 | false | false | false | true | 1.395928 | 0.155523 | 15.55229 | 0.28295 | 1.280547 | 0 | 0 | 0.241611 | 0 | 0.367021 | 3.710937 | 0.109043 | 1.004728 | false | false | 2025-01-01 | 2025-01-01 | 1 | zelk12/Test01012025155054t0.5_gemma-2 (Merge) |
| zelk12_gemma-2-S2MTM-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/gemma-2-S2MTM-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/gemma-2-S2MTM-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__gemma-2-S2MTM-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/gemma-2-S2MTM-9B | fd6860743943114eeca6fc2e800e27c87873bcc5 | 33.89283 | gemma | 0 | 10.159 | true | false | false | true | 3.530205 | 0.782256 | 78.225553 | 0.606084 | 43.115728 | 0.204683 | 20.468278 | 0.345638 | 12.751678 | 0.421844 | 12.163802 | 0.429688 | 36.631944 | true | false | 2024-12-11 | 2024-12-11 | 1 | zelk12/gemma-2-S2MTM-9B (Merge) |
| zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 | b4208ddf6c741884c16c77b9433d9ead8f216354 | 33.919919 |  | 2 | 10.159 | false | false | false | true | 6.886383 | 0.764895 | 76.489492 | 0.607451 | 43.706516 | 0.228097 | 22.809668 | 0.349832 | 13.310962 | 0.413625 | 10.303125 | 0.432098 | 36.899749 | false | false | 2024-10-03 | 2024-10-03 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 (Merge) |
| zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 | e652c9e07265526851dad994f4640aa265b9ab56 | 34.282119 |  | 1 | 10.159 | false | false | false | true | 6.389981 | 0.770665 | 77.066517 | 0.607543 | 43.85035 | 0.214502 | 21.450151 | 0.343121 | 12.416107 | 0.43226 | 13.132552 | 0.439993 | 37.777039 | false | false | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 (Merge) |
| zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 | eb0e589291630ba20328db650f74af949d217a97 | 31.782789 |  | 0 | 10.159 | false | false | false | true | 7.502906 | 0.720806 | 72.080635 | 0.59952 | 42.487153 | 0.201662 | 20.166163 | 0.349832 | 13.310962 | 0.395115 | 7.75599 | 0.414063 | 34.895833 | false | false | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.2_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2
|
76f56b25bf6d8704282f8c77bfda28ca384883bc
| 33.626064
| 1
| 10.159
| false
| false
| false
| true
| 6.827351
| 0.759999
| 75.999902
| 0.606626
| 43.633588
| 0.22281
| 22.280967
| 0.348154
| 13.087248
| 0.410958
| 9.836458
| 0.432264
| 36.918218
| false
| false
|
2024-10-07
|
2024-10-11
| 1
|
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 (Merge)
|
|
zelk12_recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1
|
1e3e623e9f0b386bfd967c629dd39c87daef5bed
| 33.904825
| 1
| 10.159
| false
| false
| false
| true
| 9.69897
| 0.761523
| 76.152276
| 0.609878
| 43.941258
| 0.20997
| 20.996979
| 0.341443
| 12.192394
| 0.431021
| 13.310937
| 0.431516
| 36.835106
| false
| false
|
2024-10-07
|
2024-10-07
| 1
|
zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 (Merge)
|
|
zelk12_recoilme-gemma-2-Ifable-9B-v0.1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ifable-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ifable-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ifable-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/recoilme-gemma-2-Ifable-9B-v0.1
|
8af6620b39c9a36239879b6b2bd88f66e9e9d930
| 34.406991
| 0
| 10.159
| false
| false
| false
| true
| 9.808856
| 0.794396
| 79.439554
| 0.60644
| 43.39057
| 0.220544
| 22.054381
| 0.35151
| 13.534676
| 0.420229
| 11.095313
| 0.432347
| 36.927453
| false
| false
|
2024-10-07
|
2024-10-07
| 1
|
zelk12/recoilme-gemma-2-Ifable-9B-v0.1 (Merge)
|
|
zelk12_recoilme-gemma-2-psy10k-mental_healt-9B-v0.1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-psy10k-mental_healt-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1
|
ced039b03be6f65ac0f713efcee76c6534e65639
| 32.586531
| 1
| 10.159
| false
| false
| false
| true
| 6.264441
| 0.744537
| 74.453672
| 0.597759
| 42.132683
| 0.188822
| 18.882175
| 0.34396
| 12.527964
| 0.429469
| 12.183594
| 0.418052
| 35.339096
| false
| false
|
2024-10-07
|
2024-10-07
| 1
|
zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 (Merge)
|
|
zetasepic_Qwen2.5-32B-Instruct-abliterated-v2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zetasepic/Qwen2.5-32B-Instruct-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zetasepic/Qwen2.5-32B-Instruct-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zetasepic__Qwen2.5-32B-Instruct-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zetasepic/Qwen2.5-32B-Instruct-abliterated-v2
|
5894fbf0a900e682dfc0ed794db337093bd8d26b
| 46.888673
|
apache-2.0
| 14
| 32.764
| true
| false
| false
| true
| 13.489578
| 0.833413
| 83.341312
| 0.693402
| 56.533818
| 0.595166
| 59.516616
| 0.36745
| 15.659955
| 0.435427
| 14.928385
| 0.562168
| 51.35195
| false
| false
|
2024-10-11
|
2024-12-07
| 2
|
Qwen/Qwen2.5-32B
|
zetasepic_Qwen2.5-72B-Instruct-abliterated_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zetasepic/Qwen2.5-72B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zetasepic/Qwen2.5-72B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zetasepic__Qwen2.5-72B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zetasepic/Qwen2.5-72B-Instruct-abliterated
|
af94b3c05c9857dbac73afb1cbce00e4833ec9ef
| 46.337953
|
other
| 28
| 72.706
| true
| false
| false
| false
| 37.618363
| 0.715261
| 71.526106
| 0.715226
| 59.912976
| 0.524169
| 52.416918
| 0.406879
| 20.917226
| 0.471917
| 19.122917
| 0.587184
| 54.131575
| false
| false
|
2024-10-01
|
2024-11-08
| 2
|
Qwen/Qwen2.5-72B
|
zhengr_MixTAO-7Bx2-MoE-v8.1_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zhengr/MixTAO-7Bx2-MoE-v8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zhengr__MixTAO-7Bx2-MoE-v8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zhengr/MixTAO-7Bx2-MoE-v8.1
|
828e963abf2db0f5af9ed0d4034e538fc1cf5f40
| 17.067606
|
apache-2.0
| 55
| 12.879
| true
| true
| false
| true
| 1.85478
| 0.418781
| 41.878106
| 0.420194
| 19.176907
| 0.060423
| 6.042296
| 0.298658
| 6.487696
| 0.397625
| 8.303125
| 0.284658
| 20.517509
| false
| false
|
2024-02-26
|
2024-06-27
| 0
|
zhengr/MixTAO-7Bx2-MoE-v8.1
|