Update README.md
README.md CHANGED

@@ -17,8 +17,8 @@ configs:
 ---
 
-<img src="[dolma-mix.png](https://cdn-uploads.huggingface.co/production/uploads/65316953791d5a2611426c20/9PIqq_MtdFV8epxvE38Up.png)" alt="logo for the mix for Dolma 3" width=300>
+<img src="https://cdn-uploads.huggingface.co/production/uploads/65316953791d5a2611426c20/JopP0oxXQlhiB7YHQGZhY.png" width="300" alt="dolma-mix">
 
 # Dolma 3 Mix (5.5T)
 The Dolma 3 Mix (5.5T) is the collection of data used during the pretraining stage to train the Olmo-3-1125-32B model. This dataset is made up of ~5.5 trillion tokens from a diverse mix of web content, academic publications, code, and more. The majority of this dataset comes from Common Crawl.
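At ~5.5 trillion tokens, downloading the full mix up front is rarely practical; streaming a few records is the usual way to inspect it. Below is a minimal sketch using the `datasets` library with `streaming=True`. The repository ID and record fields are assumptions for illustration, not taken from this diff; check the dataset page for the real repo ID and the config names declared under `configs:` in the README front matter.

```python
# Minimal sketch: stream a handful of records instead of downloading the whole mix.
# "allenai/dolma3-mix-5.5T" is a hypothetical repo ID; the actual ID and any
# required config name come from the dataset page, not from this commit.
from datasets import load_dataset

ds = load_dataset(
    "allenai/dolma3-mix-5.5T",  # hypothetical; pass a config name if the repo defines several
    split="train",
    streaming=True,             # iterate lazily over shards rather than materializing them
)

for i, record in enumerate(ds):
    print(record)               # typically a dict with a "text" field plus source metadata
    if i >= 2:
        break
```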