| Model | Train size | Test size | LoRA target modules | Parameters | Trainable parameters | r | Memory allocation | Training time | Accuracy | F1 (macro) | F1 (weighted) | Precision | Recall |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| FacebookAI/roberta-large | 50,775 | 12,652 | `dense` | 394,184,730 | 38,811,661 | 128 | 6011.75 | 1801.62 | 0.895511 | 0.891005 | 0.895706 | 0.891305 | 0.89089 |

Training arguments:

```json
{
  "adafactor": false,
  "adam_beta1": 0.9,
  "adam_beta2": 0.999,
  "adam_epsilon": 1e-8,
  "bf16": false,
  "fp16": false,
  "fp16_opt_level": "O1",
  "gradient_accumulation_steps": 4,
  "half_precision_backend": "auto",
  "label_smoothing_factor": 0,
  "learning_rate": 0.00005,
  "lr_scheduler_type": "linear",
  "max_grad_norm": 1,
  "max_steps": -1,
  "n_gpu": 2,
  "num_train_epochs": 1,
  "optim": "adamw_8bit",
  "optim_args": null,
  "per_device_eval_batch_size": 8,
  "per_device_train_batch_size": 8,
  "warmup_ratio": 0,
  "warmup_steps": 5,
  "weight_decay": 0.01
}
```
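As a sanity check on the parameter counts in the table, the trainable share of the LoRA setup works out to roughly 9.85% of the full model:

```python
# Trainable-parameter share implied by the table above
# (38,811,661 trainable out of 394,184,730 total parameters).
total_params = 394_184_730
trainable_params = 38_811_661

share = trainable_params / total_params
print(f"LoRA trains {share:.2%} of the model's parameters")  # ≈ 9.85%
```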
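A setup along these lines could be reproduced with the `peft` and `transformers` libraries. This is only a sketch: `r=128`, the `dense` target module, and the training arguments come from the table above, while `lora_alpha`, `lora_dropout`, and `task_type` are assumptions not recorded there.

```python
# Hypothetical reconstruction of the configuration behind the table.
# Values marked "assumption" are NOT recorded in the results above.
from peft import LoraConfig
from transformers import TrainingArguments

lora_config = LoraConfig(
    r=128,                      # rank, from the table
    target_modules=["dense"],   # from the "LoRA target modules" column
    lora_alpha=256,             # assumption: a common choice is 2 * r
    lora_dropout=0.1,           # assumption
    task_type="SEQ_CLS",        # assumption: the metrics suggest classification
)

training_args = TrainingArguments(
    output_dir="out",
    learning_rate=5e-5,
    num_train_epochs=1,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,
    warmup_steps=5,
    weight_decay=0.01,
    optim="adamw_8bit",
    lr_scheduler_type="linear",
    max_grad_norm=1.0,
)
```

With 2 GPUs, batch size 8 per device, and 4 gradient-accumulation steps, the effective batch size is 64.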