How to use StevenLimcorn/bert-large-uncased-semeval2015-laptops with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="StevenLimcorn/bert-large-uncased-semeval2015-laptops")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("StevenLimcorn/bert-large-uncased-semeval2015-laptops")
model = AutoModelForMaskedLM.from_pretrained("StevenLimcorn/bert-large-uncased-semeval2015-laptops")
```

This model is a fine-tuned version of bert-large-uncased on the Yaxin/SemEval2015Task12Raw laptops dataset. It achieves the following results on the evaluation set:
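Each call to the fill-mask pipeline returns a list of candidate completions for the `[MASK]` token, each a dict with `score`, `token`, `token_str`, and `sequence` keys. A minimal post-processing sketch is shown below; the prediction values are mock data for illustration, since it assumes the real model output is not available here:

```python
# Mock fill-mask predictions (illustrative values, not real model output).
# Each dict follows the structure returned by the Transformers fill-mask pipeline.
predictions = [
    {"score": 0.41, "token": 2307, "token_str": "great",
     "sequence": "the battery life of this laptop is great."},
    {"score": 0.18, "token": 6581, "token_str": "excellent",
     "sequence": "the battery life of this laptop is excellent."},
    {"score": 0.07, "token": 6659, "token_str": "terrible",
     "sequence": "the battery life of this laptop is terrible."},
]

# Keep only candidates above a confidence threshold, then take the best one.
confident = [p for p in predictions if p["score"] >= 0.10]
best = max(confident, key=lambda p: p["score"])
print(best["token_str"])  # highest-scoring fill for the masked position
```

With real output, the same filtering applies unchanged; the pipeline returns the top-k candidates (5 by default) sorted by score.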
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training: