FrugalScore: Learning Cheaper, Lighter and Faster Evaluation Metrics for Automatic Text Generation
Paper • 2110.08559 • Published
How to use moussaKam/frugalscore_tiny_bert-base_bert-score with Transformers:

# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("text-classification", model="moussaKam/frugalscore_tiny_bert-base_bert-score")

# Load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("moussaKam/frugalscore_tiny_bert-base_bert-score")
model = AutoModelForSequenceClassification.from_pretrained("moussaKam/frugalscore_tiny_bert-base_bert-score")
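Beyond loading the model, it can help to see an end-to-end scoring call. The sketch below assumes (as in the FrugalScore setup) that the checkpoint is a single-logit regression head trained to mimic BERTScore on a reference/candidate sentence pair; the example sentences and the pair ordering are illustrative, not taken from the model card.

```python
# Sketch: scoring a candidate sentence against a reference with FrugalScore.
# Assumption: the model emits one regression logit approximating BERTScore
# for the tokenized sentence pair.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "moussaKam/frugalscore_tiny_bert-base_bert-score"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)
model.eval()

reference = "The cat sat on the mat."   # illustrative example
candidate = "A cat was sitting on the mat."

# Encode the two sentences as a single pair (sentence A, sentence B)
inputs = tokenizer(reference, candidate, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()

print(f"FrugalScore: {score:.4f}")
```

Because the heavy metric is distilled into this small fixed model, a single forward pass replaces the full BERTScore computation at inference time.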
FrugalScore is an approach for learning a fixed, low-cost version of any expensive NLG metric while retaining most of its original performance.
Paper: https://arxiv.org/abs/2110.08559?context=cs
Project github: https://github.com/moussaKam/FrugalScore
The pretrained checkpoints presented in the paper: