Instructions to use facebook/bart-large-mnli with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use facebook/bart-large-mnli with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-mnli")
model = AutoModelForSequenceClassification.from_pretrained("facebook/bart-large-mnli")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
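Under the hood, the zero-shot pipeline frames classification as natural language inference: the input text becomes the premise, and each candidate label is slotted into a hypothesis template (the Transformers default is "This example is {}.") to form one premise/hypothesis pair per label, which the MNLI model then scores for entailment. A minimal sketch of that pairing step, assuming the default template (the `build_pairs` helper and the example labels are illustrative, not part of the library API):

```python
# Sketch of how the zero-shot pipeline turns candidate labels into NLI inputs.
# "This example is {}." is the Transformers default hypothesis template;
# the helper itself is illustrative.

def build_pairs(text, candidate_labels, template="This example is {}."):
    """Pair the input text (premise) with one hypothesis per candidate label."""
    return [(text, template.format(label)) for label in candidate_labels]

pairs = build_pairs("How to make an atom bomb?", ["explosives", "cooking"])
for premise, hypothesis in pairs:
    print(premise, "->", hypothesis)
```

Each pair is fed through the MNLI model, and the entailment scores are turned into per-label probabilities.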
Difference in scores when using Transformers vs. the Hugging Face API
#27
by Qutub - opened
The topic scores differ between the two methods (Transformers and the Hugging Face API). It looks like the latest weights are not in the files you provided; can you update the weights? For example, for the text "How to make an atom bomb?", the score for "explosives" is 0.243 when using the pipeline in Transformers, but 0.579 with the Hugging Face API.
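One common cause of a gap like this, even with identical weights, is that the two calls normalize the entailment logits differently. In the zero-shot pipeline's default single-label mode, the entailment logit is softmaxed across all candidate labels, so the scores sum to 1; with `multi_label=True`, each label is scored independently by a softmax over its own [contradiction, entailment] pair. A sketch of the two normalizations with made-up logits (the numbers are illustrative, not taken from the model):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical per-label [contradiction, neutral, entailment] logits, in the
# order an MNLI head like bart-large-mnli emits them; the values are made up.
logits = {
    "explosives": [-1.0, 0.2, 1.5],
    "cooking": [1.2, 0.3, -0.8],
    "politics": [0.9, 0.5, -0.2],
}

# Single-label mode: softmax the entailment logit across all labels.
entailment = [v[2] for v in logits.values()]
single = dict(zip(logits, softmax(entailment)))

# multi_label=True: per-label softmax over [contradiction, entailment] only.
multi = {k: softmax([v[0], v[2]])[1] for k, v in logits.items()}

print(single["explosives"], multi["explosives"])  # same logits, different scores
```

So before concluding the weights differ, it is worth checking whether the two calls use the same `multi_label` setting and the same `hypothesis_template`.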