Instructions for using facebook/bart-large-mnli with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use facebook/bart-large-mnli with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-mnli")
model = AutoModelForSequenceClassification.from_pretrained("facebook/bart-large-mnli")
```
- Inference
- Notebooks
- Google Colab
- Kaggle
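As a quick sketch of what a zero-shot call looks like in practice (the input sentence and candidate labels below are illustrative, not from the model card), the pipeline takes free-form label names and scores each one against the text:

```python
from transformers import pipeline

# Build the zero-shot classification pipeline around bart-large-mnli
pipe = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

# Candidate labels are arbitrary strings chosen at call time
result = pipe(
    "I love hiking in the mountains.",
    candidate_labels=["travel", "cooking", "politics"],
)

# result["labels"] is sorted by score, highest first;
# with the default multi_label=False the scores sum to 1
print(result["labels"][0], result["scores"][0])
```

With the default `multi_label=False`, the labels compete against each other via a softmax; pass `multi_label=True` to score each label independently instead.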
Maximum number of classes
#4
by Nafise - opened
What is the maximum number of classes we can define for the model?
Is there any way we can increase it?
It's limited when you use the web interface, but I ran it in a local environment and there is no maximum number of classes.
The maximum appears to be 10, at least through this site.
> It's limited when you use the web interface, but I ran it in a local environment and there is no maximum number of classes.

How can I do this?
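Running locally, there is no hard cap on the label count because zero-shot classification turns each label into a separate NLI entailment check. A minimal sketch of that loop, following the pattern from the model card (the example text and 12-label list are made up for illustration):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "facebook/bart-large-mnli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

sequence = "The new phone has an amazing camera and battery life."
# Any number of labels works: each one becomes its own premise/hypothesis pair
labels = ["technology", "sports", "politics", "finance", "health",
          "travel", "food", "education", "music", "fashion",
          "science", "gaming"]  # 12 labels, more than the web UI's limit of 10

scores = []
for label in labels:
    hypothesis = f"This example is {label}."
    inputs = tokenizer(sequence, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # bart-large-mnli's label order is [contradiction, neutral, entailment];
    # take P(entailment) over {contradiction, entailment}, as in the model card
    entail_contra = logits[0, [0, 2]]
    prob_entail = entail_contra.softmax(dim=0)[1].item()
    scores.append(prob_entail)

best = labels[scores.index(max(scores))]
print(best)
```

The same effect is available with less code by calling the zero-shot pipeline with a `candidate_labels` list of any length; the loop above just makes it visible why no maximum exists.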