| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| DreadPoor_inexpertus-8B-Model_Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [DreadPoor/inexpertus-8B-Model_Stock](https://huggingface.co/DreadPoor/inexpertus-8B-Model_Stock) [📑](https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__inexpertus-8B-Model_Stock-details) | DreadPoor/inexpertus-8B-Model_Stock | e6da16c921c073facbe15769fae301d02163ef34 | 29.70787 | | 0 | 8.03 | false | false | false | true | 0.665457 | 0.779533 | 77.953275 | 0.528019 | 32.463374 | 0.170695 | 17.069486 | 0.309564 | 7.941834 | 0.411823 | 11.811198 | 0.379072 | 31.008053 | false | false | 2025-03-07 | 2025-03-07 | 1 | DreadPoor/inexpertus-8B-Model_Stock (Merge) |
| DreadPoor_inexpertus_1.1-8B-LINEAR_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [DreadPoor/inexpertus_1.1-8B-LINEAR](https://huggingface.co/DreadPoor/inexpertus_1.1-8B-LINEAR) [📑](https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__inexpertus_1.1-8B-LINEAR-details) | DreadPoor/inexpertus_1.1-8B-LINEAR | 497e83a2ead3a83d693f78531c0bc802849eef64 | 29.609548 | | 2 | 8.03 | false | false | false | true | 0.659765 | 0.752705 | 75.270504 | 0.552464 | 36.199191 | 0.172961 | 17.296073 | 0.297819 | 6.375839 | 0.417344 | 11.101302 | 0.382729 | 31.414376 | false | false | 2025-03-07 | 2025-03-07 | 1 | DreadPoor/inexpertus_1.1-8B-LINEAR (Merge) |
| DreadPoor_inexpertus_1.2-8B-LINEAR_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [DreadPoor/inexpertus_1.2-8B-LINEAR](https://huggingface.co/DreadPoor/inexpertus_1.2-8B-LINEAR) [📑](https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__inexpertus_1.2-8B-LINEAR-details) | DreadPoor/inexpertus_1.2-8B-LINEAR | 9236276e5e9528276e327197b0fa00fb0826e6f9 | 28.786491 | | 0 | 8.03 | false | false | false | true | 0.649394 | 0.734795 | 73.479479 | 0.552344 | 36.056523 | 0.15861 | 15.861027 | 0.295302 | 6.040268 | 0.413344 | 10.301302 | 0.378823 | 30.980349 | false | false | 2025-03-08 | 2025-03-08 | 1 | DreadPoor/inexpertus_1.2-8B-LINEAR (Merge) |
| DreadPoor_mergekit-nuslerp-nqzkedi_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [DreadPoor/mergekit-nuslerp-nqzkedi](https://huggingface.co/DreadPoor/mergekit-nuslerp-nqzkedi) [📑](https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__mergekit-nuslerp-nqzkedi-details) | DreadPoor/mergekit-nuslerp-nqzkedi | 2aa839aaa0200a0b3cd6c6be0b82c30ca0dc84b4 | 30.296302 | | 0 | 8.03 | false | false | false | true | 1.437432 | 0.776485 | 77.648528 | 0.536192 | 34.095228 | 0.188066 | 18.806647 | 0.301174 | 6.823266 | 0.422458 | 11.973958 | 0.391872 | 32.430186 | false | false | 2025-01-29 | | 0 | Removed |
| DreadPoor_remember_to_breathe-8b-Model-Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [DreadPoor/remember_to_breathe-8b-Model-Stock](https://huggingface.co/DreadPoor/remember_to_breathe-8b-Model-Stock) [📑](https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__remember_to_breathe-8b-Model-Stock-details) | DreadPoor/remember_to_breathe-8b-Model-Stock | fa88f1b06cf9ca7bd0d859c6a4b2240485363ae0 | 28.256524 | | 0 | 8.03 | false | false | false | true | 1.32708 | 0.710415 | 71.041503 | 0.541165 | 34.678991 | 0.148792 | 14.879154 | 0.301174 | 6.823266 | 0.414458 | 11.440625 | 0.37608 | 30.675606 | false | false | 2024-12-06 | | 0 | Removed |
| DreadPoor_test_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [DreadPoor/test](https://huggingface.co/DreadPoor/test) [📑](https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__test-details) | DreadPoor/test | 8f4e90a3e665f1d4d7cf737b43e7bdb360de3ffa | 24.952991 | | 0 | 8.03 | false | false | false | true | 2.207847 | 0.493695 | 49.369451 | 0.537187 | 34.287514 | 0.193353 | 19.335347 | 0.270973 | 2.796421 | 0.435083 | 14.51875 | 0.364694 | 29.410461 | false | false | 2025-01-28 | | 0 | Removed |
| DreadPoor_test_ALT_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [DreadPoor/test_ALT](https://huggingface.co/DreadPoor/test_ALT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__test_ALT-details) | DreadPoor/test_ALT | 15f7baaea9416ce8ba8b1ea972969fd54c2bacdd | 24.26567 | | 0 | 8.03 | false | false | false | true | 1.562993 | 0.49969 | 49.968971 | 0.537043 | 33.986912 | 0.170695 | 17.069486 | 0.269295 | 2.572707 | 0.436292 | 14.303125 | 0.349235 | 27.692819 | false | false | 2025-01-28 | | 0 | Removed |
| DreadPoor_tests_pending-do_not_use_yet_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [DreadPoor/tests_pending-do_not_use_yet](https://huggingface.co/DreadPoor/tests_pending-do_not_use_yet) [📑](https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__tests_pending-do_not_use_yet-details) | DreadPoor/tests_pending-do_not_use_yet | 47f0a91f2a6b06724ff51b4fcb4ee6831c1f49e9 | 29.64146 | | 0 | 8.03 | false | false | false | true | 1.263487 | 0.769141 | 76.914143 | 0.54079 | 34.674509 | 0.197885 | 19.78852 | 0.29698 | 6.263982 | 0.400479 | 8.793229 | 0.382729 | 31.414376 | false | false | 2025-02-04 | | 0 | Removed |
| ECE-ILAB-PRYMMAL_ILAB-Merging-3B-V2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | [ECE-ILAB-PRYMMAL/ILAB-Merging-3B-V2](https://huggingface.co/ECE-ILAB-PRYMMAL/ILAB-Merging-3B-V2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ECE-ILAB-PRYMMAL__ILAB-Merging-3B-V2-details) | ECE-ILAB-PRYMMAL/ILAB-Merging-3B-V2 | 6f26ffcb82b8a8d14400471da7047b8b4a8e4d10 | 24.065566 | apache-2.0 | 1 | 3.821 | true | false | false | false | 0.42455 | 0.402894 | 40.289432 | 0.540194 | 36.004037 | 0.151813 | 15.181269 | 0.305369 | 7.38255 | 0.433219 | 13.752344 | 0.386054 | 31.783762 | true | false | 2025-03-09 | 2025-03-09 | 1 | ECE-ILAB-PRYMMAL/ILAB-Merging-3B-V2 (Merge) |
| EVA-UNIT-01_EVA-Qwen2.5-14B-v0.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [EVA-UNIT-01/EVA-Qwen2.5-14B-v0.2](https://huggingface.co/EVA-UNIT-01/EVA-Qwen2.5-14B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EVA-UNIT-01__EVA-Qwen2.5-14B-v0.2-details) | EVA-UNIT-01/EVA-Qwen2.5-14B-v0.2 | | 33.812606 | apache-2.0 | 20 | 14.77 | true | false | false | false | 4.665346 | 0.403843 | 40.384291 | 0.609024 | 43.607849 | 0.340634 | 34.063444 | 0.394295 | 19.239374 | 0.479448 | 19.63099 | 0.513547 | 45.94969 | false | false | 2024-11-06 | 2024-12-26 | 1 | Qwen/Qwen2.5-14B |
| EVA-UNIT-01_EVA-Qwen2.5-72B-v0.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2](https://huggingface.co/EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EVA-UNIT-01__EVA-Qwen2.5-72B-v0.2-details) | EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2 | 2590214b30391392b9a84e7cbe40fff3a92c6814 | 44.221596 | other | 17 | 72.706 | true | false | false | true | 45.910197 | 0.687884 | 68.78837 | 0.708801 | 59.066733 | 0.431269 | 43.126888 | 0.408557 | 21.14094 | 0.471979 | 19.730729 | 0.581283 | 53.475916 | false | false | 2024-11-21 | 2024-11-27 | 1 | Qwen/Qwen2.5-72B |
| Edgerunners_meta-llama-3-8b-instruct-hf-ortho-baukit-34fail-3000total-bf16_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [Edgerunners/meta-llama-3-8b-instruct-hf-ortho-baukit-34fail-3000total-bf16](https://huggingface.co/Edgerunners/meta-llama-3-8b-instruct-hf-ortho-baukit-34fail-3000total-bf16) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Edgerunners__meta-llama-3-8b-instruct-hf-ortho-baukit-34fail-3000total-bf16-details) | Edgerunners/meta-llama-3-8b-instruct-hf-ortho-baukit-34fail-3000total-bf16 | 4b8290f9ef1f7d33df282d3764f795af4e64022c | 23.448867 | cc-by-nc-4.0 | 0 | 8.03 | true | false | false | true | 1.31156 | 0.714711 | 71.471141 | 0.497991 | 28.256392 | 0.090634 | 9.063444 | 0.260067 | 1.342282 | 0.334156 | 1.269531 | 0.363614 | 29.290411 | false | false | 2024-05-12 | 2025-01-30 | 0 | Edgerunners/meta-llama-3-8b-instruct-hf-ortho-baukit-34fail-3000total-bf16 |
| EleutherAI_gpt-j-6b_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | GPTJForCausalLM | [EleutherAI/gpt-j-6b](https://huggingface.co/EleutherAI/gpt-j-6b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-j-6b-details) | EleutherAI/gpt-j-6b | 47e169305d2e8376be1d31e765533382721b2cc1 | 6.570412 | apache-2.0 | 1,488 | 6 | true | false | false | false | 1.534864 | 0.252219 | 25.221856 | 0.319104 | 4.912818 | 0.013595 | 1.359517 | 0.245805 | 0 | 0.36575 | 5.252083 | 0.124086 | 2.676197 | false | true | 2022-03-02 | 2024-08-19 | 0 | EleutherAI/gpt-j-6b |
| EleutherAI_gpt-neo-1.3B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | GPTNeoForCausalLM | [EleutherAI/gpt-neo-1.3B](https://huggingface.co/EleutherAI/gpt-neo-1.3B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-1.3B-details) | EleutherAI/gpt-neo-1.3B | dbe59a7f4a88d01d1ba9798d78dbe3fe038792c8 | 5.391091 | mit | 293 | 1.366 | true | false | false | false | 0.718848 | 0.207905 | 20.790503 | 0.303923 | 3.024569 | 0.010574 | 1.057402 | 0.255872 | 0.782998 | 0.381656 | 4.873698 | 0.116356 | 1.817376 | false | true | 2022-03-02 | 2024-06-12 | 0 | EleutherAI/gpt-neo-1.3B |
| EleutherAI_gpt-neo-125m_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | GPTNeoForCausalLM | [EleutherAI/gpt-neo-125m](https://huggingface.co/EleutherAI/gpt-neo-125m) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-125m-details) | EleutherAI/gpt-neo-125m | 21def0189f5705e2521767faed922f1f15e7d7db | 4.407322 | mit | 199 | 0.15 | true | false | false | false | 0.405805 | 0.190544 | 19.054442 | 0.311516 | 3.436739 | 0.006042 | 0.60423 | 0.253356 | 0.447427 | 0.359333 | 2.616667 | 0.10256 | 0.284427 | false | true | 2022-03-02 | 2024-08-10 | 0 | EleutherAI/gpt-neo-125m |
| EleutherAI_gpt-neo-2.7B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | GPTNeoForCausalLM | [EleutherAI/gpt-neo-2.7B](https://huggingface.co/EleutherAI/gpt-neo-2.7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-2.7B-details) | EleutherAI/gpt-neo-2.7B | e24fa291132763e59f4a5422741b424fb5d59056 | 6.431048 | mit | 478 | 2.718 | true | false | false | false | 1.016763 | 0.258963 | 25.896289 | 0.313952 | 4.178603 | 0.010574 | 1.057402 | 0.26594 | 2.12528 | 0.355365 | 3.520573 | 0.116273 | 1.808141 | false | true | 2022-03-02 | 2024-06-12 | 0 | EleutherAI/gpt-neo-2.7B |
| EleutherAI_gpt-neox-20b_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neox-20b-details) | EleutherAI/gpt-neox-20b | c292233c833e336628618a88a648727eb3dff0a7 | 6.116522 | apache-2.0 | 555 | 20.739 | true | false | false | false | 6.293473 | 0.258688 | 25.868806 | 0.316504 | 4.929114 | 0.013595 | 1.359517 | 0.243289 | 0 | 0.364667 | 2.816667 | 0.115525 | 1.72503 | false | true | 2022-04-07 | 2024-06-09 | 0 | EleutherAI/gpt-neox-20b |
| EleutherAI_pythia-1.4b_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | [EleutherAI/pythia-1.4b](https://huggingface.co/EleutherAI/pythia-1.4b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-1.4b-details) | EleutherAI/pythia-1.4b | fedc38a16eea3bd36a96b906d78d11d2ce18ed79 | 6.008531 | apache-2.0 | 23 | 1.515 | true | false | false | false | 0.387233 | 0.237081 | 23.708095 | 0.315043 | 3.878989 | 0.015106 | 1.510574 | 0.261745 | 1.565996 | 0.353781 | 4.022656 | 0.112284 | 1.364879 | false | true | 2023-02-09 | 2025-01-28 | 0 | EleutherAI/pythia-1.4b |
| EleutherAI_pythia-12b_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | [EleutherAI/pythia-12b](https://huggingface.co/EleutherAI/pythia-12b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-12b-details) | EleutherAI/pythia-12b | 35c9d7f32fbb108fb8b5bdd574eb03369d1eed49 | 6.059841 | apache-2.0 | 135 | 12 | true | false | false | false | 2.236014 | 0.247148 | 24.714757 | 0.317965 | 4.987531 | 0.016616 | 1.661631 | 0.246644 | 0 | 0.364698 | 3.78724 | 0.110871 | 1.20789 | false | true | 2023-02-28 | 2024-06-12 | 0 | EleutherAI/pythia-12b |
| EleutherAI_pythia-160m_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | [EleutherAI/pythia-160m](https://huggingface.co/EleutherAI/pythia-160m) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-160m-details) | EleutherAI/pythia-160m | 50f5173d932e8e61f858120bcb800b97af589f46 | 5.730395 | apache-2.0 | 30 | 0.213 | true | false | false | false | 0.470677 | 0.181552 | 18.155162 | 0.297044 | 2.198832 | 0.009063 | 0.906344 | 0.258389 | 1.118568 | 0.417938 | 10.675521 | 0.111951 | 1.32794 | false | true | 2023-02-08 | 2024-06-09 | 0 | EleutherAI/pythia-160m |
| EleutherAI_pythia-1b_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | [EleutherAI/pythia-1b](https://huggingface.co/EleutherAI/pythia-1b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-1b-details) | EleutherAI/pythia-1b | f73d7dcc545c8bd326d8559c8ef84ffe92fea6b2 | 5.070268 | apache-2.0 | 37 | 1.079 | true | false | false | false | 0.312295 | 0.220794 | 22.079416 | 0.300409 | 2.293986 | 0.009063 | 0.906344 | 0.256711 | 0.894855 | 0.355208 | 2.734375 | 0.113614 | 1.512633 | false | true | 2023-03-10 | 2025-01-27 | 0 | EleutherAI/pythia-1b |
| EleutherAI_pythia-2.8b_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | [EleutherAI/pythia-2.8b](https://huggingface.co/EleutherAI/pythia-2.8b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-2.8b-details) | EleutherAI/pythia-2.8b | 2a259cdd96a4beb1cdf467512e3904197345f6a9 | 5.554946 | apache-2.0 | 30 | 2.909 | true | false | false | false | 1.507804 | 0.217322 | 21.732226 | 0.322409 | 5.077786 | 0.013595 | 1.359517 | 0.25 | 0 | 0.348573 | 3.638281 | 0.113697 | 1.521868 | false | true | 2023-02-13 | 2024-06-12 | 0 | EleutherAI/pythia-2.8b |
| EleutherAI_pythia-410m_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | [EleutherAI/pythia-410m](https://huggingface.co/EleutherAI/pythia-410m) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-410m-details) | EleutherAI/pythia-410m | 9879c9b5f8bea9051dcb0e68dff21493d67e9d4f | 5.227072 | apache-2.0 | 23 | 0.506 | true | false | false | false | 0.754164 | 0.219545 | 21.954525 | 0.302813 | 2.715428 | 0.009819 | 0.981873 | 0.259228 | 1.230425 | 0.357813 | 3.059896 | 0.112783 | 1.420287 | false | true | 2023-02-13 | 2024-06-09 | 0 | EleutherAI/pythia-410m |
| EleutherAI_pythia-6.9b_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | [EleutherAI/pythia-6.9b](https://huggingface.co/EleutherAI/pythia-6.9b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-6.9b-details) | EleutherAI/pythia-6.9b | f271943e880e60c0c715fd10e4dc74ec4e31eb44 | 5.966547 | apache-2.0 | 50 | 6.9 | true | false | false | false | 1.737734 | 0.228114 | 22.811363 | 0.323229 | 5.881632 | 0.01435 | 1.435045 | 0.251678 | 0.223714 | 0.359052 | 3.814844 | 0.114694 | 1.632683 | false | true | 2023-02-14 | 2024-06-12 | 0 | EleutherAI/pythia-6.9b |
| Enno-Ai_EnnoAi-Pro-French-Llama-3-8B-v0.4_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4](https://huggingface.co/Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-French-Llama-3-8B-v0.4-details) | Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4 | 328722ae96e3a112ec900dbe77d410788a526c5c | 15.684469 | creativeml-openrail-m | 0 | 8.031 | true | false | false | true | 2.018256 | 0.418881 | 41.888079 | 0.407495 | 16.875928 | 0.036254 | 3.625378 | 0.270973 | 2.796421 | 0.417 | 10.758333 | 0.263464 | 18.162677 | false | false | 2024-06-27 | 2024-06-30 | 0 | Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4 |
| Enno-Ai_EnnoAi-Pro-Llama-3-8B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [Enno-Ai/EnnoAi-Pro-Llama-3-8B](https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-details) | Enno-Ai/EnnoAi-Pro-Llama-3-8B | 6a5d745bdd304753244fe601e2a958d37d13cd71 | 12.514546 | creativeml-openrail-m | 0 | 8.031 | true | false | false | true | 2.368675 | 0.319538 | 31.953772 | 0.415158 | 17.507545 | 0.021903 | 2.190332 | 0.261745 | 1.565996 | 0.407052 | 9.08151 | 0.215093 | 12.788121 | false | false | 2024-07-01 | 2024-07-08 | 0 | Enno-Ai/EnnoAi-Pro-Llama-3-8B |
| Enno-Ai_EnnoAi-Pro-Llama-3-8B-v0.3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3](https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-v0.3-details) | Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3 | cf29b8b484a909132e3a1f85ce891d28347c0d13 | 18.128287 | creativeml-openrail-m | 0 | 8.03 | true | false | false | true | 2.941671 | 0.508257 | 50.825698 | 0.410058 | 16.668386 | 0.048338 | 4.833837 | 0.265101 | 2.013423 | 0.423573 | 12.313281 | 0.299036 | 22.1151 | false | false | 2024-06-26 | 2024-06-26 | 0 | Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3 |
| Enno-Ai_EnnoAi-Pro-Llama-3.1-8B-v0.9_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9](https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3.1-8B-v0.9-details) | Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9 | c740871122fd471a1a225cf2b4368e333752d74c | 15.5751 | apache-2.0 | 0 | 8.03 | true | false | false | true | 1.865142 | 0.468915 | 46.89147 | 0.416027 | 17.498296 | 0.037764 | 3.776435 | 0.26594 | 2.12528 | 0.383177 | 5.430469 | 0.259558 | 17.72865 | false | false | 2024-08-22 | 2024-09-06 | 0 | Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9 |
| EnnoAi_EnnoAi-7B-French-Instruct-202502_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EnnoAi/EnnoAi-7B-French-Instruct-202502](https://huggingface.co/EnnoAi/EnnoAi-7B-French-Instruct-202502) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EnnoAi__EnnoAi-7B-French-Instruct-202502-details) | EnnoAi/EnnoAi-7B-French-Instruct-202502 | 46438f0966b908da5594a4a2abb0202ef08c0355 | 31.287438 | apache-2.0 | 0 | 7.456 | true | false | false | false | 0.689644 | 0.556442 | 55.644246 | 0.557455 | 36.924131 | 0.372356 | 37.23565 | 0.295302 | 6.040268 | 0.459979 | 18.397396 | 0.401346 | 33.482934 | false | false | 2025-02-11 | 2025-02-11 | 0 | EnnoAi/EnnoAi-7B-French-Instruct-202502 |
| EnnoAi_EnnoAi-Pro-Llama-3.1-8B-v1.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0](https://huggingface.co/EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EnnoAi__EnnoAi-Pro-Llama-3.1-8B-v1.0-details) | EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0 | c740871122fd471a1a225cf2b4368e333752d74c | 15.600496 | apache-2.0 | 0 | 8.03 | true | false | false | true | 1.891283 | 0.470438 | 47.043844 | 0.416027 | 17.498296 | 0.037764 | 3.776435 | 0.26594 | 2.12528 | 0.383177 | 5.430469 | 0.259558 | 17.72865 | false | false | 2024-08-22 | 2024-09-06 | 0 | EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0 |
| Epiculous_Azure_Dusk-v0.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [Epiculous/Azure_Dusk-v0.2](https://huggingface.co/Epiculous/Azure_Dusk-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__Azure_Dusk-v0.2-details) | Epiculous/Azure_Dusk-v0.2 | ebddf1b2efbe7f9cae066d263b0991ded89c88e8 | 14.239649 | apache-2.0 | 8 | 12.248 | true | false | false | true | 3.982823 | 0.346716 | 34.67156 | 0.411972 | 17.396414 | 0.029456 | 2.945619 | 0.260906 | 1.454139 | 0.383458 | 6.365625 | 0.303441 | 22.604536 | false | false | 2024-09-09 | 2024-09-14 | 0 | Epiculous/Azure_Dusk-v0.2 |
| Epiculous_Crimson_Dawn-v0.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [Epiculous/Crimson_Dawn-v0.2](https://huggingface.co/Epiculous/Crimson_Dawn-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__Crimson_Dawn-v0.2-details) | Epiculous/Crimson_Dawn-v0.2 | 4cceb1e25026afef241ad5325097e88eccd8f37a | 15.085951 | apache-2.0 | 14 | 12.248 | true | false | false | true | 5.2534 | 0.310345 | 31.034544 | 0.448238 | 21.688249 | 0.043051 | 4.305136 | 0.276007 | 3.467562 | 0.415177 | 10.897135 | 0.272108 | 19.123079 | false | false | 2024-09-02 | 2024-09-05 | 0 | Epiculous/Crimson_Dawn-v0.2 |
| Epiculous_NovaSpark_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [Epiculous/NovaSpark](https://huggingface.co/Epiculous/NovaSpark) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__NovaSpark-details) | Epiculous/NovaSpark | a46340895859e470c3e69661f0b894677cf4c5cb | 25.253738 | apache-2.0 | 7 | 8.03 | true | false | false | true | 1.63637 | 0.640847 | 64.08474 | 0.506396 | 29.526911 | 0.151813 | 15.181269 | 0.297819 | 6.375839 | 0.388198 | 6.92474 | 0.36486 | 29.42893 | false | false | 2024-10-13 | 2024-10-20 | 1 | Epiculous/NovaSpark (Merge) |
| Epiculous_Violet_Twilight-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [Epiculous/Violet_Twilight-v0.2](https://huggingface.co/Epiculous/Violet_Twilight-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__Violet_Twilight-v0.2-details) | Epiculous/Violet_Twilight-v0.2 | 30c8bad3c1f565150afbf2fc90cacf4f45d096f6 | 18.552773 | apache-2.0 | 31 | 12.248 | true | false | false | true | 1.770436 | 0.453178 | 45.317757 | 0.461455 | 23.940537 | 0.028701 | 2.870091 | 0.26594 | 2.12528 | 0.429938 | 13.608854 | 0.311087 | 23.454122 | true | false | 2024-09-12 | 2024-09-16 | 0 | Epiculous/Violet_Twilight-v0.2 |
| EpistemeAI_Alpaca-Llama3.1-8B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Alpaca-Llama3.1-8B](https://huggingface.co/EpistemeAI/Alpaca-Llama3.1-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Alpaca-Llama3.1-8B-details) | EpistemeAI/Alpaca-Llama3.1-8B | 3152dfa17322dff7c6af6dbf3daceaf5db51e230 | 13.985046 | apache-2.0 | 0 | 8 | true | false | false | false | 1.841705 | 0.159869 | 15.986915 | 0.475526 | 25.935227 | 0.050604 | 5.060423 | 0.290268 | 5.369128 | 0.34026 | 6.599219 | 0.324634 | 24.959368 | false | false | 2024-09-11 | 2024-08-13 | 2 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI_Athena-gemma-2-2b-it_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | [EpistemeAI/Athena-gemma-2-2b-it](https://huggingface.co/EpistemeAI/Athena-gemma-2-2b-it) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athena-gemma-2-2b-it-details) | EpistemeAI/Athena-gemma-2-2b-it | 661c1dc6a1a096222e33416e099bd02b7b970405 | 14.546092 | apache-2.0 | 2 | 2 | true | false | false | false | 3.027716 | 0.313417 | 31.341729 | 0.426423 | 19.417818 | 0.049094 | 4.909366 | 0.268456 | 2.46085 | 0.435052 | 13.348177 | 0.242188 | 15.798611 | false | false | 2024-08-29 | 2024-09-06 | 4 | google/gemma-2-9b |
| EpistemeAI_Athena-gemma-2-2b-it-Philos_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | [EpistemeAI/Athena-gemma-2-2b-it-Philos](https://huggingface.co/EpistemeAI/Athena-gemma-2-2b-it-Philos) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athena-gemma-2-2b-it-Philos-details) | EpistemeAI/Athena-gemma-2-2b-it-Philos | dea2b35d496bd32ed3c88d42ff3022654153f2e1 | 15.663946 | apache-2.0 | 0 | 2 | true | false | false | true | 2.257186 | 0.462095 | 46.209502 | 0.379478 | 13.212088 | 0.037009 | 3.700906 | 0.28104 | 4.138702 | 0.431365 | 12.853906 | 0.224817 | 13.868573 | false | false | 2024-09-05 | 2024-09-05 | 1 | unsloth/gemma-2-2b-it-bnb-4bit |
| EpistemeAI_Athene-codegemma-2-7b-it-alpaca-v1.3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GemmaForCausalLM | [EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3](https://huggingface.co/EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athene-codegemma-2-7b-it-alpaca-v1.3-details) | EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3 | 9c26e1242a11178b53937bc0e9a744ef6141e05a | 17.314022 | apache-2.0 | 1 | 7 | true | false | false | false | 1.943956 | 0.402994 | 40.299406 | 0.433192 | 20.873795 | 0.061934 | 6.193353 | 0.280201 | 4.026846 | 0.450302 | 14.854427 | 0.258727 | 17.636303 | false | false | 2024-09-06 | 2024-09-06 | 2 | Removed |
| EpistemeAI_DeepPhi-3.5-mini-instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/DeepPhi-3.5-mini-instruct](https://huggingface.co/EpistemeAI/DeepPhi-3.5-mini-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__DeepPhi-3.5-mini-instruct-details) | EpistemeAI/DeepPhi-3.5-mini-instruct | 8fd61f3c0003a629524752d2f857c01d2f9843f4 | 3.464329 | mit | 0 | 3.821 | true | false | false | false | 0.423738 | 0.132592 | 13.259152 | 0.288229 | 1.667358 | 0.006798 | 0.679758 | 0.233221 | 0 | 0.365625 | 4.036458 | 0.110289 | 1.143248 | false | false | 2025-02-28 | 2025-02-28 | 2 | microsoft/Phi-3.5-mini-instruct |
| EpistemeAI_DeepThinkers-Phi4_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/DeepThinkers-Phi4](https://huggingface.co/EpistemeAI/DeepThinkers-Phi4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__DeepThinkers-Phi4-details) | EpistemeAI/DeepThinkers-Phi4 | 3e2b390dc391232880542300f3ca1578f3b53ef5 | 39.407109 | mit | 4 | 14.66 | true | false | false | true | 0.914303 | 0.693979 | 69.397864 | 0.679042 | 53.786669 | 0.458459 | 45.845921 | 0.340604 | 12.080537 | 0.398063 | 8.024479 | 0.525765 | 47.307181 | false | false | 2025-02-28 | 2025-03-01 | 2 | microsoft/phi-4 |
| EpistemeAI_FineLlama3.1-8B-Instruct_4bit | 4bit | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | [EpistemeAI/FineLlama3.1-8B-Instruct](https://huggingface.co/EpistemeAI/FineLlama3.1-8B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__FineLlama3.1-8B-Instruct-details) | EpistemeAI/FineLlama3.1-8B-Instruct | a8b0fc584b10e0110e04f9d21c7f10d24391c1d5 | 11.239256 | | 0 | 14.483 | false | false | false | false | 4.709922 | 0.08001 | 8.000993 | 0.455736 | 23.506619 | 0.034743 | 3.47432 | 0.280201 | 4.026846 | 0.348167 | 4.954167 | 0.311253 | 23.472592 | false | false | 2024-08-10 | | 0 | Removed |
| EpistemeAI_Fireball-12B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [EpistemeAI/Fireball-12B](https://huggingface.co/EpistemeAI/Fireball-12B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-12B-details) | EpistemeAI/Fireball-12B | e2ed12c3244f2502321fb20e76dfc72ad7817d6e | 15.534531 | apache-2.0 | 1 | 12.248 | true | false | false | false | 3.237042 | 0.18335 | 18.335018 | 0.511089 | 30.666712 | 0.040785 | 4.07855 | 0.261745 | 1.565996 | 0.423635 | 12.521094 | 0.334358 | 26.03982 | false | false | 2024-08-20 | 2024-08-21 | 2 | Removed |
| EpistemeAI_Fireball-12B-v1.13a-philosophers_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [EpistemeAI/Fireball-12B-v1.13a-philosophers](https://huggingface.co/EpistemeAI/Fireball-12B-v1.13a-philosophers) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-12B-v1.13a-philosophers-details) | EpistemeAI/Fireball-12B-v1.13a-philosophers | 7fa824d4a40abca3f8c75d432ea151dc0d1d67d6 | 14.466041 | apache-2.0 | 2 | 12 | true | false | false | false | 3.325327 | 0.087553 | 8.755325 | 0.51027 | 30.336233 | 0.046073 | 4.607251 | 0.301174 | 6.823266 | 0.408073 | 9.975781 | 0.336686 | 26.298389 | false | false | 2024-08-28 | 2024-09-03 | 1 | Removed |
| EpistemeAI_Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200](https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200-details) | EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200 | 27d67626304954db71f21fec9e7fc516421274ec | 21.129914 | apache-2.0 | 0 | 8 | true | false | false | false | 1.844762 | 0.457724 | 45.772439 | 0.48384 | 26.377774 | 0.123112 | 12.311178 | 0.300336 | 6.711409 | 0.394458 | 6.907292 | 0.358295 | 28.699394 | false | false | 2024-09-16 | 2024-09-16 | 4 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI_Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta](https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta-details) | EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta | 2851384717556dd6ac14c00ed87aac1f267eb263 | 25.242228 | apache-2.0 | 0 | 8 | true | false | false | true | 1.77129 | 0.727401 | 72.740107 | 0.486489 | 26.897964 | 0.152568 | 15.256798 | 0.280201 | 4.026846 | 0.361938 | 4.275521 | 0.354305 | 28.256132 | false | false | 2024-09-12 | 2024-09-14 | 5 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI_Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2](https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2-details) | EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2 | b19336101aa5f4807d1574f4c11eebc1c1a1c34e | 22.550477 | apache-2.0 | 0 | 8 | true | false | false | false | 1.623486 | 0.467316 | 46.731561 | 0.493203 | 28.247009 | 0.123867 | 12.386707 | 0.286074 | 4.809843 | 0.462365 | 16.995573 | 0.335189 | 26.132166 | false | false | 2024-09-14 | 2024-09-14 | 3 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto-details) | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto | 19b23c434b6c4524e2146926cdbf4f0e927ae3ab | 21.567362 | apache-2.0 | 0 | 8 | true | false | false | false | 1.389989 | 0.443186 | 44.31863 | 0.482364 | 26.832967 | 0.132931 | 13.293051 | 0.312081 | 8.277405 | 0.406646 | 8.730729 | 0.351563 | 27.951389 | false | false | 2024-11-14 | 2024-11-15 | 2 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-details) | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K | b4a88fb5fb27fc5d8a503303cdb7aaeff373fd92 | 20.627168 | apache-2.0 | 3 | 8 | true | false | false | false | 1.629573 | 0.445734 | 44.573399 | 0.489732 | 28.025161 | 0.120846 | 12.084592 | 0.294463 | 5.928412 | 0.376229 | 4.895312 | 0.354305 | 28.256132 | false | false | 2024-09-26 | 2024-10-05 | 1 | Removed |
| EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-details) | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code | 8e8f1569a8a01ed3d6588f2669c730d4993355b5 | 23.934714 | apache-2.0 | 2 | 8 | true | false | false | false | 1.708636 | 0.597533 | 59.753343 | 0.490419 | 28.171888 | 0.133686 | 13.36858 | 0.302013 | 6.935123 | 0.401031 | 8.46224 | 0.342254 | 26.91711 | false | false | 2024-10-04 | 2024-10-05 | 2 | Removed |
| EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-details) | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds | 8b73dd02349f0544c48c581cc73ada5cac6ff946 | 23.144165 | llama3.1 | 2 | 8 | true | false | false | true | 2.593826 | 0.669099 | 66.90991 | 0.466807 | 24.462654 | 0.133686 | 13.36858 | 0.272651 | 3.020134 | 0.341781 | 4.55599 | 0.33893 | 26.547725 | false | false | 2024-10-14 | 2024-10-15 | 4 | Removed |
| EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-details) | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto | f18598c62a783bcc0d436a35df0c8a335e8ee5d7 | 23.749941 | apache-2.0 | 7 | 8.03 | true | false | false | true | 2.285306 | 0.730498 | 73.049841 | 0.464925 | 24.586737 | 0.139728 | 13.97281 | 0.26594 | 2.12528 | 0.320885 | 1.210677 | 0.347989 | 27.5543 | false | false | 2024-10-21 | 2024-10-29 | 1 | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge) |
| EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-details) | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto | 055e87600d18e58594a8d193f45c0ee9a90e1780 | 23.627287 | apache-2.0 | 7 | 8.03 | true | false | false | true | 1.344136 | 0.720707 | 72.070661 | 0.461009 | 23.544253 | 0.13142 | 13.141994 | 0.270134 | 2.684564 | 0.34324 | 4.171615 | 0.335356 | 26.150635 | false | false | 2024-10-21 | 2024-11-27 | 1 | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge) |
| EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT-details) | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT | bb90c19dc7c4a509e7bd73f4620dca818b58be25 | 20.857427 | apache-2.0 | 0 | 8 | true | false | false | false | 1.678073 | 0.457824 | 45.782413 | 0.476052 | 25.820865 | 0.138218 | 13.821752 | 0.293624 | 5.816555 | 0.388135 | 6.45026 | 0.347074 | 27.452719 | false | false | 2024-10-11 | 2024-10-11 | 3 | Removed |
| EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto-details) | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto | db5ddb161ed26bc16baa814e31892dbe2f22b7a0 | 23.874259 | apache-2.0 | 1 | 8 | true | false | false | true | 1.490262 | 0.720482 | 72.048166 | 0.48178 | 26.45206 | 0.143505 | 14.350453 | 0.248322 | 0 | 0.33 | 2.083333 | 0.354804 | 28.31154 | false | false | 2024-11-14 | 2024-11-14 | 1 | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto (Merge) |
| EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Math_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Math](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Math) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Math-details) | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Math | 677c97b4f92bfc330d4fae628e9a1df1ef606dcc | 20.557929 | apache-2.0 | 0 | 8.03 | true | false | false | false | 1.820543 | 0.462296 | 46.22956 | 0.498295 | 28.959344 | 0.108006 | 10.800604 | 0.291107 | 5.480984 | 0.364073 | 5.975781 | 0.333112 | 25.9013 | false | false | 2024-09-23 | 2024-09-23 | 3 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI_Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO-details) | EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO | b3c0fce7daa359cd8ed5be6595dd1a76ca2cfea2 | 21.293561 | apache-2.0 | 1 | 8 | true | false | false | false | 1.667152 | 0.461097 | 46.109656 | 0.480101 | 26.317878 | 0.125378 | 12.537764 | 0.300336 | 6.711409 | 0.399823 | 8.077865 | 0.352061 | 28.006797 | false | false | 2024-10-08 | 2024-10-09 | 3 | Removed |
| EpistemeAI_Fireball-Mistral-Nemo-Base-2407-v1-DPO2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2](https://huggingface.co/EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Mistral-Nemo-Base-2407-v1-DPO2-details) | EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2 | 2cf732fbffefdf37341b946edd7995f14d3f9487 | 15.33934 | apache-2.0 | 0 | 12.248 | true | false | false | false | 3.542538 | 0.186073 | 18.607295 | 0.496777 | 28.567825 | 0.036254 | 3.625378 | 0.291946 | 5.592841 | 0.40401 | 9.501302 | 0.335273 | 26.141401 | false | false | 2024-08-19 | 2024-08-19 | 1 | Removed |
| EpistemeAI_Fireball-R1-Llama-3.1-8B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-R1-Llama-3.1-8B](https://huggingface.co/EpistemeAI/Fireball-R1-Llama-3.1-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-R1-Llama-3.1-8B-details) | EpistemeAI/Fireball-R1-Llama-3.1-8B | 7d7ca4fa9887a0c6d721353fa962ed93e633d856 | 14.729863 | llama3.1 | 0 | 8.03 | true | false | false | true | 0.751711 | 0.442736 | 44.273638 | 0.36435 | 10.273656 | 0.311178 | 31.117825 | 0.248322 | 0 | 0.328792 | 1.432292 | 0.111536 | 1.281767 | false | false | 2025-02-11 | 2025-02-12 | 2 | deepseek-ai/DeepSeek-R1-Distill-Llama-8B |
| EpistemeAI_Fireball-R1-Llama-3.1-8B-Medical-COT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-R1-Llama-3.1-8B-Medical-COT](https://huggingface.co/EpistemeAI/Fireball-R1-Llama-3.1-8B-Medical-COT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-R1-Llama-3.1-8B-Medical-COT-details) | EpistemeAI/Fireball-R1-Llama-3.1-8B-Medical-COT | 66b8420e4c1003aedfc809f68e8f346ae972710a | 14.486213 | apache-2.0 | 1 | 8.03 | true | false | false | true | 0.747616 | 0.321611 | 32.16111 | 0.371627 | 12.153439 | 0.327039 | 32.703927 | 0.274329 | 3.243848 | 0.311365 | 2.18724 | 0.140209 | 4.467716 | false | false | 2025-02-16 | 2025-02-28 | 3 | deepseek-ai/DeepSeek-R1-Distill-Llama-8B |
| EpistemeAI_Fireball-R1.1-Llama-3.1-8B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Fireball-R1.1-Llama-3.1-8B](https://huggingface.co/EpistemeAI/Fireball-R1.1-Llama-3.1-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-R1.1-Llama-3.1-8B-details) | EpistemeAI/Fireball-R1.1-Llama-3.1-8B | 863882f3081647135d269b82698e079b4c78d9ee | 10.130882 | llama3.1 | 2 | 8.03 | true | false | false | true | 0.80739 | 0.367623 | 36.762346 | 0.3326 | 6.286856 | 0.138218 | 13.821752 | 0.251678 | 0.223714 | 0.341938 | 2.408854 | 0.111536 | 1.281767 | false | false | 2025-02-27 | 2025-03-02 | 3 | deepseek-ai/DeepSeek-R1-Distill-Llama-8B |
| EpistemeAI_Llama-3.2-3B-Agent007-Coder_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Llama-3.2-3B-Agent007-Coder](https://huggingface.co/EpistemeAI/Llama-3.2-3B-Agent007-Coder) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Llama-3.2-3B-Agent007-Coder-details) | EpistemeAI/Llama-3.2-3B-Agent007-Coder | 7ff4e77796b6d308e96d0150e1a01081c0b82e01 | 18.914562 | apache-2.0 | 0 | 3 | true | false | false | false | 1.421631 | 0.539956 | 53.995621 | 0.430376 | 19.025809 | 0.111027 | 11.102719 | 0.25755 | 1.006711 | 0.366802 | 7.783594 | 0.285156 | 20.572917 | false | false | 2024-10-08 | 2024-10-08 | 2 | meta-llama/Llama-3.2-3B-Instruct |
| EpistemeAI_Mistral-Nemo-Instruct-12B-Philosophy-Math_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [EpistemeAI/Mistral-Nemo-Instruct-12B-Philosophy-Math](https://huggingface.co/EpistemeAI/Mistral-Nemo-Instruct-12B-Philosophy-Math) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Mistral-Nemo-Instruct-12B-Philosophy-Math-details) | EpistemeAI/Mistral-Nemo-Instruct-12B-Philosophy-Math | 1ac4205f8da109326b4a5cf173e5491a20087d76 | 16.603997 | apache-2.0 | 1 | 12.248 | true | false | false | false | 2.727215 | 0.069468 | 6.94679 | 0.536493 | 33.835811 | 0.095921 | 9.592145 | 0.331376 | 10.850112 | 0.429219 | 12.885677 | 0.329621 | 25.513446 | false | false | 2024-09-15 | 2024-09-26 | 1 | unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit |
| EpistemeAI_OpenReasoner-Llama-3.2-3B-rs1.0_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/OpenReasoner-Llama-3.2-3B-rs1.0](https://huggingface.co/EpistemeAI/OpenReasoner-Llama-3.2-3B-rs1.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__OpenReasoner-Llama-3.2-3B-rs1.0-details) | EpistemeAI/OpenReasoner-Llama-3.2-3B-rs1.0 | 94ac34a32fd2266e84f92e60eab63131540fce2e | 22.939579 | llama3.2 | 1 | 3.213 | true | false | false | true | 0.592436 | 0.727401 | 72.740107 | 0.451859 | 22.807808 | 0.134441 | 13.444109 | 0.271812 | 2.908277 | 0.346063 | 2.024479 | 0.313414 | 23.712692 | false | false | 2025-02-14 | 2025-02-15 | 2 | Removed |
| EpistemeAI_Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Empathy_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Empathy](https://huggingface.co/EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Empathy) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Empathy-details) | EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Empathy | daabf0dcd2915991531abac59da346f27864c7e7 | 23.321646 | apache-2.0 | 0 | 8 | true | false | false | true | 1.335663 | 0.71009 | 71.009034 | 0.462799 | 24.419414 | 0.139728 | 13.97281 | 0.276846 | 3.579418 | 0.31949 | 1.269531 | 0.331117 | 25.679669 | false | false | 2024-12-13 | 2024-12-13 | 2 | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge) |
| EpistemeAI_Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Logic_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Logic](https://huggingface.co/EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Logic) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Logic-details) | EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Logic | 20a0141e08db10f1d0ffb771676e56c7d2045acf | 23.204901 | apache-2.0 | 1 | 8.03 | true | false | false | true | 1.368996 | 0.712214 | 71.221359 | 0.456594 | 23.576451 | 0.124622 | 12.462236 | 0.284396 | 4.58613 | 0.32349 | 1.269531 | 0.335023 | 26.113697 | false | false | 2024-12-13 | 2024-12-20 | 2 | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge) |
| EpistemeAI_Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-divergent_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-divergent](https://huggingface.co/EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-divergent) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-divergent-details) | EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-divergent | 3cba0f0085c1f95f011cbf76d35a2303c54b2141 | 23.027256 | apache-2.0 | 1 | 8.03 | true | false | false | true | 1.395884 | 0.691531 | 69.153069 | 0.452473 | 22.890368 | 0.129154 | 12.915408 | 0.266779 | 2.237136 | 0.35775 | 5.51875 | 0.329039 | 25.448803 | false | false | 2024-12-13 | 2024-12-20 | 2 | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge) |
| EpistemeAI_Reasoning-Llama-3.1-CoT-RE1-NMT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Reasoning-Llama-3.1-CoT-RE1-NMT](https://huggingface.co/EpistemeAI/Reasoning-Llama-3.1-CoT-RE1-NMT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Reasoning-Llama-3.1-CoT-RE1-NMT-details) | EpistemeAI/Reasoning-Llama-3.1-CoT-RE1-NMT | 3ae39e39a02ff222a7436499462261b22ca28367 | 19.208175 | apache-2.0 | 1 | 8.03 | true | false | false | true | 1.44784 | 0.482853 | 48.285327 | 0.473576 | 25.544054 | 0.129909 | 12.990937 | 0.260906 | 1.454139 | 0.318219 | 0.94401 | 0.334275 | 26.030585 | false | false | 2025-01-29 | 2025-01-29 | 0 | EpistemeAI/Reasoning-Llama-3.1-CoT-RE1-NMT |
| EpistemeAI_Reasoning-Llama-3.1-CoT-RE1-NMT-V2-ORPO_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Reasoning-Llama-3.1-CoT-RE1-NMT-V2-ORPO](https://huggingface.co/EpistemeAI/Reasoning-Llama-3.1-CoT-RE1-NMT-V2-ORPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Reasoning-Llama-3.1-CoT-RE1-NMT-V2-ORPO-details) | EpistemeAI/Reasoning-Llama-3.1-CoT-RE1-NMT-V2-ORPO | d37dcdb2f9a663c356fb670b6e449b4ef1b54977 | 21.33158 | apache-2.0 | 0 | 8.03 | true | false | false | false | 1.442784 | 0.455326 | 45.532631 | 0.480422 | 25.895601 | 0.129154 | 12.915408 | 0.307047 | 7.606264 | 0.393125 | 7.173958 | 0.359791 | 28.865618 | false | false | 2025-01-31 | 2025-01-31 | 2 | EpistemeAI/Reasoning-Llama-3.1-CoT-RE1-NMT |
| EpistemeAI_Reasoning-Llama-3.2-1B-Instruct-v1.2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Reasoning-Llama-3.2-1B-Instruct-v1.2](https://huggingface.co/EpistemeAI/Reasoning-Llama-3.2-1B-Instruct-v1.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Reasoning-Llama-3.2-1B-Instruct-v1.2-details) | EpistemeAI/Reasoning-Llama-3.2-1B-Instruct-v1.2 | a72b9f8f059647f799209c19931e263be79fbc03 | 9.508932 | apache-2.0 | 0 | 1.236 | true | false | false | true | 0.903168 | 0.408714 | 40.871443 | 0.33245 | 6.577215 | 0.050604 | 5.060423 | 0.260906 | 1.454139 | 0.322188 | 1.106771 | 0.117852 | 1.983599 | false | false | 2025-02-04 | 2025-02-04 | 1 | Removed |
| EpistemeAI_Reasoning-Llama-3.2-1B-Instruct-v1.3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Reasoning-Llama-3.2-1B-Instruct-v1.3](https://huggingface.co/EpistemeAI/Reasoning-Llama-3.2-1B-Instruct-v1.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Reasoning-Llama-3.2-1B-Instruct-v1.3-details) | EpistemeAI/Reasoning-Llama-3.2-1B-Instruct-v1.3 | b7dfff75dc619c3a5705a5ffbdea2310db121b96 | 8.170099 | apache-2.0 | 0 | 1.236 | true | false | false | true | 0.753863 | 0.327282 | 32.728161 | 0.326282 | 6.111151 | 0.050604 | 5.060423 | 0.258389 | 1.118568 | 0.326 | 2.083333 | 0.117271 | 1.918957 | false | false | 2025-02-04 | 2025-02-05 | 2 | Removed |
| EpistemeAI_Reasoning-Llama-3.2-3B-Math-Instruct-RE1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Reasoning-Llama-3.2-3B-Math-Instruct-RE1](https://huggingface.co/EpistemeAI/Reasoning-Llama-3.2-3B-Math-Instruct-RE1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Reasoning-Llama-3.2-3B-Math-Instruct-RE1-details) | EpistemeAI/Reasoning-Llama-3.2-3B-Math-Instruct-RE1 | a40bd9becd2d0bd8ed6ca5727d5b2b4f5cb75393 | 17.780032 | | 0 | 3.213 | false | false | false | true | 1.226327 | 0.511954 | 51.195384 | 0.438108 | 20.728882 | 0.108006 | 10.800604 | 0.264262 | 1.901566 | 0.343521 | 2.173438 | 0.278923 | 19.880319 | false | false | 2025-02-04 | | 0 | Removed |
| EpistemeAI_Reasoning-Llama-3.2-3B-Math-Instruct-RE1-ORPO_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/Reasoning-Llama-3.2-3B-Math-Instruct-RE1-ORPO](https://huggingface.co/EpistemeAI/Reasoning-Llama-3.2-3B-Math-Instruct-RE1-ORPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Reasoning-Llama-3.2-3B-Math-Instruct-RE1-ORPO-details) | EpistemeAI/Reasoning-Llama-3.2-3B-Math-Instruct-RE1-ORPO | 563f7d4f8cd930e2b8079ec4844f8259ac19ad1c | 23.43064 | apache-2.0 | 0 | 3.213 | true | false | false | true | 1.35051 | 0.728975 | 72.897468 | 0.451819 | 23.00465 | 0.153323 | 15.332326 | 0.27349 | 3.131991 | 0.348667 | 2.883333 | 0.310007 | 23.334072 | false | false | 2025-01-31 | 2025-02-04 | 1 | Removed |
| EpistemeAI_ReasoningCore-1.0-3B-Instruct-r01-Reflect-Math_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/ReasoningCore-1.0-3B-Instruct-r01-Reflect-Math](https://huggingface.co/EpistemeAI/ReasoningCore-1.0-3B-Instruct-r01-Reflect-Math) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__ReasoningCore-1.0-3B-Instruct-r01-Reflect-Math-details) | EpistemeAI/ReasoningCore-1.0-3B-Instruct-r01-Reflect-Math | 28e5169a7406b61fa7bbfbeecf8a8d544f1650dd | 19.475017 | llama3.2 | 0 | 3.213 | true | false | false | true | 0.596276 | 0.590289 | 59.028932 | 0.43638 | 19.821269 | 0.148036 | 14.803625 | 0.260067 | 1.342282 | 0.331427 | 1.595052 | 0.28233 | 20.258939 | false | false | 2025-03-08 | 2025-03-08 | 4 | meta-llama/Llama-3.2-3B-Instruct |
| EpistemeAI_ReasoningCore-3B-0_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/ReasoningCore-3B-0](https://huggingface.co/EpistemeAI/ReasoningCore-3B-0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__ReasoningCore-3B-0-details) | EpistemeAI/ReasoningCore-3B-0 | 8eebaf7d2bef9d80ba3e99c19e61f46da7bd83a9 | 23.526304 | llama3.2 | 2 | 3.213 | true | false | false | true | 0.585392 | 0.734145 | 73.41454 | 0.444607 | 22.166823 | 0.15861 | 15.861027 | 0.272651 | 3.020134 | 0.355396 | 2.557813 | 0.317237 | 24.137485 | false | false | 2025-02-07 | 2025-02-07 | 2 | meta-llama/Llama-3.2-3B-Instruct |
| EpistemeAI_ReasoningCore-3B-Instruct-r01-Reflect_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/ReasoningCore-3B-Instruct-r01-Reflect](https://huggingface.co/EpistemeAI/ReasoningCore-3B-Instruct-r01-Reflect) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__ReasoningCore-3B-Instruct-r01-Reflect-details) | EpistemeAI/ReasoningCore-3B-Instruct-r01-Reflect | 62e3cb0026f2cbd1d70e8fb45fcaf26d7256cc7d | 23.51163 | llama3.2 | 1 | 3.213 | true | false | false | true | 0.567969 | 0.733496 | 73.349601 | 0.444963 | 22.265679 | 0.154079 | 15.407855 | 0.27349 | 3.131991 | 0.352729 | 3.091146 | 0.314412 | 23.823508 | false | false | 2025-02-28 | 2025-02-28 | 2 | meta-llama/Llama-3.2-3B-Instruct |
| EpistemeAI_ReasoningCore-3B-R01_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/ReasoningCore-3B-R01](https://huggingface.co/EpistemeAI/ReasoningCore-3B-R01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__ReasoningCore-3B-R01-details) | EpistemeAI/ReasoningCore-3B-R01 | 046404c2e8b0c956f0c50f0e5e8f423455306ff1 | 14.035245 | apache-2.0 | 1 | 3.213 | true | false | false | true | 0.607064 | 0.297606 | 29.760591 | 0.437252 | 20.624367 | 0.129909 | 12.990937 | 0.260906 | 1.454139 | 0.319458 | 1.698958 | 0.259142 | 17.682476 | false | false | 2025-02-07 | 2025-02-08 | 3 | meta-llama/Llama-3.2-3B-Instruct |
| EpistemeAI_ReasoningCore-3B-RE1-V2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/ReasoningCore-3B-RE1-V2](https://huggingface.co/EpistemeAI/ReasoningCore-3B-RE1-V2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__ReasoningCore-3B-RE1-V2-details) | EpistemeAI/ReasoningCore-3B-RE1-V2 | 429b18420956128532a4286a9d2180f6ba3aacae | 23.570873 | llama3.2 | 0 | 3.213 | true | false | false | true | 0.582416 | 0.739316 | 73.931613 | 0.446239 | 22.472884 | 0.156344 | 15.634441 | 0.27349 | 3.131991 | 0.354063 | 2.024479 | 0.318068 | 24.229832 | false | false | 2025-02-18 | 2025-02-19 | 3 | meta-llama/Llama-3.2-3B-Instruct |
| EpistemeAI_ReasoningCore-3B-RE1-V2A_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/ReasoningCore-3B-RE1-V2A](https://huggingface.co/EpistemeAI/ReasoningCore-3B-RE1-V2A) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__ReasoningCore-3B-RE1-V2A-details) | EpistemeAI/ReasoningCore-3B-RE1-V2A | d8a325429306b3993f731eb580187d545e221de6 | 18.398292 | llama3.2 | 0 | 3.213 | true | false | false | true | 0.585427 | 0.573253 | 57.325341 | 0.41899 | 18.059429 | 0.0929 | 9.29003 | 0.277685 | 3.691275 | 0.335208 | 2.734375 | 0.273604 | 19.289303 | false | false | 2025-02-24 | 2025-02-25 | 4 | meta-llama/Llama-3.2-3B-Instruct |
| EpistemeAI_ReasoningCore-3B-RE1-V2B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/ReasoningCore-3B-RE1-V2B](https://huggingface.co/EpistemeAI/ReasoningCore-3B-RE1-V2B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__ReasoningCore-3B-RE1-V2B-details) | EpistemeAI/ReasoningCore-3B-RE1-V2B | a66f75bde34c2d08062cb1ae455a789e0bba9e1d | 16.803596 | llama3.2 | 1 | 3.213 | true | false | false | true | 0.606126 | 0.50511 | 50.510978 | 0.416789 | 17.62919 | 0.107251 | 10.725076 | 0.261745 | 1.565996 | 0.344823 | 1.802865 | 0.267287 | 18.58747 | false | false | 2025-02-24 | 2025-02-25 | 5 | meta-llama/Llama-3.2-3B-Instruct |
| EpistemeAI_ReasoningCore-3B-RE1-V2C_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/ReasoningCore-3B-RE1-V2C](https://huggingface.co/EpistemeAI/ReasoningCore-3B-RE1-V2C) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__ReasoningCore-3B-RE1-V2C-details) | EpistemeAI/ReasoningCore-3B-RE1-V2C | aa270574bdcf8d85d273d7bc74ce85de5d9505b1 | 16.648062 | llama3.2 | 0 | 3.213 | true | false | false | true | 0.598041 | 0.505709 | 50.57093 | 0.417746 | 17.793271 | 0.097432 | 9.743202 | 0.260906 | 1.454139 | 0.342156 | 1.536198 | 0.269116 | 18.790632 | false | false | 2025-02-26 | 2025-02-27 | 6 | meta-llama/Llama-3.2-3B-Instruct |
| EpistemeAI_ReasoningCore-3B-T1-V1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/ReasoningCore-3B-T1-V1](https://huggingface.co/EpistemeAI/ReasoningCore-3B-T1-V1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__ReasoningCore-3B-T1-V1-details) | EpistemeAI/ReasoningCore-3B-T1-V1 | a6f33848cdb2eb198d4ea44b1988238c25a2501b | 23.243643 | llama3.2 | 0 | 3.213 | true | false | false | true | 0.576198 | 0.720756 | 72.075648 | 0.451691 | 23.065333 | 0.14577 | 14.577039 | 0.276007 | 3.467562 | 0.354031 | 2.720573 | 0.312001 | 23.555703 | false | false | 2025-02-10 | 2025-02-11 | 3 | Removed |
| EpistemeAI_ReasoningCore-3B-T1_1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI/ReasoningCore-3B-T1_1](https://huggingface.co/EpistemeAI/ReasoningCore-3B-T1_1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__ReasoningCore-3B-T1_1-details) | EpistemeAI/ReasoningCore-3B-T1_1 | a1564589a7b766e23d444f2adfdfaef21aee3ce3 | 23.492475 | apache-2.0 | 0 | 3.213 | true | false | false | true | 1.165466 | 0.727451 | 72.745094 | 0.452394 | 23.094999 | 0.154079 | 15.407855 | 0.276007 | 3.467562 | 0.355365 | 2.720573 | 0.311669 | 23.518765 | false | false | 2025-02-06 | 2025-02-06 | 2 | Removed |
| EpistemeAI2_Athene-codegemma-2-7b-it-alpaca-v1.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GemmaForCausalLM | [EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2](https://huggingface.co/EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Athene-codegemma-2-7b-it-alpaca-v1.2-details) | EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2 | 21b31062334a316b50680e8c3a141a72e4c30b61 | 15.718391 | apache-2.0 | 0 | 7 | true | false | false | false | 1.93927 | 0.435118 | 43.511771 | 0.417542 | 18.97137 | 0.042296 | 4.229607 | 0.270973 | 2.796421 | 0.416969 | 10.38776 | 0.229721 | 14.413416 | false | false | 2024-08-26 | 2024-08-26 | 2 | Removed |
| EpistemeAI2_Fireball-12B-v1.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [EpistemeAI2/Fireball-12B-v1.2](https://huggingface.co/EpistemeAI2/Fireball-12B-v1.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-12B-v1.2-details) | EpistemeAI2/Fireball-12B-v1.2 | 57af42edf8232189ee99e9a21e33a0c306e3f561 | 15.200287 | apache-2.0 | 1 | 12 | true | false | false | false | 3.745129 | 0.135539 | 13.553926 | 0.501858 | 29.776014 | 0.041541 | 4.154079 | 0.298658 | 6.487696 | 0.417313 | 11.264062 | 0.333693 | 25.965943 | false | false | 2024-08-27 | 2024-08-28 | 1 | Removed |
| EpistemeAI2_Fireball-Alpaca-Llama3.1-8B-Philos_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1-8B-Philos-details) | EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos | 3dcca4cf9bdd9003c8dc91f5c78cefef1d4ae0d7 | 22.551674 | apache-2.0 | 1 | 8 | true | false | false | false | 1.696664 | 0.49864 | 49.864027 | 0.497758 | 29.259226 | 0.11858 | 11.858006 | 0.292785 | 5.704698 | 0.427667 | 11.891667 | 0.340592 | 26.732417 | false | false | 2024-08-29 | 2024-08-29 | 3 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI2_Fireball-Alpaca-Llama3.1.01-8B-Philos_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.01-8B-Philos-details) | EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos | f97293ed5cec7fb9482b16600259967c6c923e4b | 21.567144 | apache-2.0 | 0 | 8 | true | false | false | false | 1.741143 | 0.421179 | 42.117914 | 0.495611 | 28.628475 | 0.135952 | 13.595166 | 0.288591 | 5.145414 | 0.437062 | 13.432813 | 0.338348 | 26.483082 | false | false | 2024-09-03 | 2024-09-03 | 3 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI2_Fireball-Alpaca-Llama3.1.03-8B-Philos_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.03-8B-Philos-details) | EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos | 6e60f783f80f7d126b8e4f2b417e14dea63d2c4f | 20.274573 | apache-2.0 | 1 | 8 | true | false | false | false | 1.595046 | 0.388081 | 38.80814 | 0.495087 | 27.992549 | 0.128399 | 12.839879 | 0.278523 | 3.803132 | 0.42801 | 12.034635 | 0.335522 | 26.169105 | false | false | 2024-09-04 | 2024-09-04 | 3 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI2_Fireball-Alpaca-Llama3.1.04-8B-Philos_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.04-8B-Philos-details) | EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos | efd0c251373e1a2fa2bc8cead502c03ff6dc7c8b | 21.094517 | apache-2.0 | 0 | 8 | true | false | false | false | 1.530496 | 0.40844 | 40.843961 | 0.493001 | 27.963798 | 0.120091 | 12.009063 | 0.290268 | 5.369128 | 0.437219 | 13.685677 | 0.340259 | 26.695479 | false | false | 2024-09-05 | 2024-09-05 | 3 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI2_Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo-details) | EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo | 3e76f190b505b515479cc25e92f8229c2b05159f | 21.867632 | apache-2.0 | 0 | 8 | true | false | false | false | 1.869548 | 0.486576 | 48.657562 | 0.488077 | 27.207177 | 0.130665 | 13.066465 | 0.297819 | 6.375839 | 0.393188 | 6.848437 | 0.361453 | 29.05031 | false | false | 2024-09-09 | 2024-09-09 | 5 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI2_Fireball-Alpaca-Llama3.1.07-8B-Philos-Math_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-details) | EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math | 0b2842bddfa6c308f67eb5a20daf04536a4e6d1a | 21.97087 | apache-2.0 | 0 | 8 | true | false | false | false | 1.804059 | 0.507908 | 50.790791 | 0.484702 | 26.901201 | 0.120091 | 12.009063 | 0.296141 | 6.152125 | 0.406302 | 7.854427 | 0.353059 | 28.117612 | false | false | 2024-09-10 | 2024-09-10 | 4 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI2_Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection-details) | EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection | dc900138b4406353b7e84251bc8649d70c16f13f | 20.894625 | apache-2.0 | 0 | 8 | true | false | false | false | 1.767948 | 0.395226 | 39.522578 | 0.495531 | 27.571611 | 0.124622 | 12.462236 | 0.299497 | 6.599553 | 0.404813 | 10.401563 | 0.359292 | 28.81021 | false | false | 2024-09-16 | 2024-09-16 | 6 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI2_Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1-details) | EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1 | c57c786426123635baf6c8b4d30638d2053f4565 | 22.511188 | apache-2.0 | 0 | 8 | true | false | false | false | 1.819518 | 0.531638 | 53.163828 | 0.482793 | 26.763685 | 0.123867 | 12.386707 | 0.29698 | 6.263982 | 0.410302 | 8.454427 | 0.352311 | 28.034501 | false | false | 2024-09-13 | 2024-09-13 | 4 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI2_Fireball-Llama-3.1-8B-Philos-Reflection_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection](https://huggingface.co/EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Llama-3.1-8B-Philos-Reflection-details) | EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection | 4b0b75d9235886e8a947c45b94f87c5a65a81467 | 20.376721 | apache-2.0 | 0 | 8 | true | false | false | false | 1.789887 | 0.359605 | 35.960474 | 0.489769 | 27.769796 | 0.128399 | 12.839879 | 0.307886 | 7.718121 | 0.395729 | 9.632813 | 0.355053 | 28.339243 | false | false | 2024-09-17 | 2024-09-17 | 5 | meta-llama/Meta-Llama-3.1-8B |
| EpistemeAI2_Fireball-MathMistral-Nemo-Base-2407-v2dpo_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo](https://huggingface.co/EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-MathMistral-Nemo-Base-2407-v2dpo-details) | EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo | 6b7d851c66359f39d16da6fbcf810b816dc6e4bc | 11.369983 | apache-2.0 | 1 | 11.58 | true | false | false | true | 3.762851 | 0.30972 | 30.972043 | 0.432764 | 21.145528 | 0.037009 | 3.700906 | 0.263423 | 1.789709 | 0.402958 | 8.969792 | 0.114777 | 1.641918 | false | false | 2024-08-21 | 2024-08-24 | 2 | unsloth/Mistral-Nemo-Base-2407-bnb-4bit |
| EpistemeAI2_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math](https://huggingface.co/EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math-details) | EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math | aa21037cf0984cb293facb69c41895e7fccb1340 | 22.727957 | apache-2.0 | 0 | 8 | true | false | false | false | 1.583365 | 0.551547 | 55.154656 | 0.480756 | 26.743767 | 0.135196 | 13.519637 | 0.30453 | 7.270694 | 0.36925 | 6.789583 | 0.342005 | 26.889406 | false | false | 2024-10-11 | 2024-10-12 | 3 | Removed |
| EpistemeAI2_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT](https://huggingface.co/EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT-details) | EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT | cf8b99d4aa00c18fdaebfb24fa3c674ee6defa1a | 21.037759 | apache-2.0 | 0 | 8 | true | false | false | false | 1.601635 | 0.46332 | 46.331955 | 0.479083 | 26.400992 | 0.117069 | 11.706949 | 0.312081 | 8.277405 | 0.377438 | 5.013021 | 0.356466 | 28.496232 | false | false | 2024-10-11 | 2024-10-11 | 3 | Removed |
| EpistemeAI2_Fireball-Phi-3-medium-4k-inst-Philos_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [EpistemeAI2/Fireball-Phi-3-medium-4k-inst-Philos](https://huggingface.co/EpistemeAI2/Fireball-Phi-3-medium-4k-inst-Philos) [📑](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Phi-3-medium-4k-inst-Philos-details) | EpistemeAI2/Fireball-Phi-3-medium-4k-inst-Philos | 147715051102034fac98091e2a0cae6cade15ae0 | 29.676367 | apache-2.0 | 0 | 13.96 | true | false | false | true | 1.543628 | 0.531288 | 53.128809 | 0.617784 | 46.208873 | 0.170695 | 17.069486 | 0.332215 | 10.961969 | 0.413906 | 10.704948 | 0.459857 | 39.984116 | false | false | 2024-09-19 | 2024-09-20 | 1 | unsloth/phi-3-medium-4k-instruct-bnb-4bit |
| Eric111_CatunaMayo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [Eric111/CatunaMayo](https://huggingface.co/Eric111/CatunaMayo) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Eric111__CatunaMayo-details) | Eric111/CatunaMayo | 23337893381293975cbcc35f75b634954fbcefaf | 21.273979 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.101649 | 0.407416 | 40.741566 | 0.524364 | 33.299426 | 0.084592 | 8.459215 | 0.291946 | 5.592841 | 0.45399 | 15.348698 | 0.317819 | 24.202128 | true | false | 2024-02-15 | 2024-07-03 | 0 | Eric111/CatunaMayo |
| Eric111_CatunaMayo-DPO_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [Eric111/CatunaMayo-DPO](https://huggingface.co/Eric111/CatunaMayo-DPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Eric111__CatunaMayo-DPO-details) | Eric111/CatunaMayo-DPO | 6bdbe06c10d57d152dd8a79a71edd8e30135b689 | 21.292885 | apache-2.0 | 1 | 7.242 | true | false | false | false | 1.108045 | 0.421454 | 42.145396 | 0.522399 | 33.089952 | 0.081571 | 8.1571 | 0.291946 | 5.592841 | 0.445031 | 14.66224 | 0.316988 | 24.109781 | true | false | 2024-02-21 | 2024-06-27 | 0 | Eric111/CatunaMayo-DPO |
| Etherll_Chocolatine-3B-Instruct-DPO-Revised-Ties_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Phi3ForCausalLM | [Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties](https://huggingface.co/Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Chocolatine-3B-Instruct-DPO-Revised-Ties-details) | Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties | 8a9c3d745e0805e769b544622b3f5c039abc9b07 | 24.981821 | | 0 | 3.821 | false | false | false | false | 1.270994 | 0.372469 | 37.246949 | 0.541065 | 35.583343 | 0.163142 | 16.314199 | 0.323826 | 9.8434 | 0.464938 | 17.817187 | 0.397773 | 33.085845 | false | false | 2024-10-28 | | 0 | Removed |
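
The rows above follow the leaderboard schema introduced at the top of this table, and the normalized columns are internally consistent: the reported average is in line with the arithmetic mean of the six normalized benchmark scores. For example, for EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO, (46.109656 + 26.317878 + 12.537764 + 6.711409 + 8.077865 + 28.006797) / 6 ≈ 21.293561, which matches the listed value. Below is a minimal sketch, not part of any official leaderboard tooling, showing one way to load and query this data; it assumes the rows come from the open-llm-leaderboard/contents dataset on the Hugging Face Hub, that the `datasets` and `pandas` libraries are installed, and that the column names match the table header (e.g. "Average ⬆️", "#Params (B)", "Official Providers").

```python
# Minimal sketch (assumptions: dataset id "open-llm-leaderboard/contents" and the
# column names used in this table; adjust if the schema differs).
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")
df = ds.to_pandas()

# Sanity check: the reported average should track the mean of the six normalized scores.
score_cols = ["IFEval", "BBH", "MATH Lvl 5", "GPQA", "MUSR", "MMLU-PRO"]
deviation = (df[score_cols].mean(axis=1) - df["Average ⬆️"]).abs()
print("max deviation from mean of six scores:", deviation.max())

# Example query: the ten highest-scoring officially provided models under 70B parameters.
top = (
    df[(df["Official Providers"]) & (df["#Params (B)"] < 70)]
    .sort_values("Average ⬆️", ascending=False)
    .head(10)
)
print(top[["fullname", "#Params (B)", "Average ⬆️"]])
```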