Instructions for using ModelTC/bert-base-uncased-cola with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use ModelTC/bert-base-uncased-cola with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="ModelTC/bert-base-uncased-cola")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("ModelTC/bert-base-uncased-cola")
model = AutoModelForSequenceClassification.from_pretrained("ModelTC/bert-base-uncased-cola")
```

- Notebooks
- Google Colab
- Kaggle
Evaluation results:

```json
{
  "epoch": 5.0,
  "eval_loss": 0.7111271023750305,
  "eval_matthews_correlation": 0.5960380981891474,
  "eval_runtime": 12.4873,
  "eval_samples": 1043,
  "eval_samples_per_second": 83.525,
  "eval_steps_per_second": 10.491
}
```
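The `eval_matthews_correlation` value above is the Matthews correlation coefficient (MCC), the standard metric for CoLA (1.0 is perfect agreement, 0.0 is chance level). As a minimal sketch of how that number is computed from binary predictions (the function below is illustrative and not part of the model's evaluation code):

```python
import math

def matthews_correlation(preds, labels):
    """Matthews correlation coefficient for binary (0/1) predictions."""
    tp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 1)
    tn = sum(1 for p, y in zip(preds, labels) if p == 0 and y == 0)
    fp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 0)
    fn = sum(1 for p, y in zip(preds, labels) if p == 0 and y == 1)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    if denom == 0:
        # Undefined when any marginal count is zero; 0.0 by convention
        return 0.0
    return (tp * tn - fp * fn) / denom

# Perfect agreement scores 1.0; chance-level agreement scores 0.0
print(matthews_correlation([1, 0, 1, 0], [1, 0, 1, 0]))  # 1.0
print(matthews_correlation([1, 1, 0, 0], [1, 0, 1, 0]))  # 0.0
```

Unlike plain accuracy, MCC accounts for all four confusion-matrix cells, which matters on CoLA because its acceptable/unacceptable labels are imbalanced.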