Instructions for using PeppoCola/IssueReportClassifier-NLBSE22 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use PeppoCola/IssueReportClassifier-NLBSE22 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="PeppoCola/IssueReportClassifier-NLBSE22")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("PeppoCola/IssueReportClassifier-NLBSE22")
model = AutoModelForSequenceClassification.from_pretrained("PeppoCola/IssueReportClassifier-NLBSE22")
```

- Notebooks
- Google Colab
- Kaggle
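A minimal sketch of running the pipeline on an issue report. The `issue_to_text` helper and the example issue text are hypothetical, as is the assumption that the model expects the issue title and body joined into one string; check the model card for the exact input format. The first call downloads roughly 500 MB of weights and needs network access.

```python
def issue_to_text(title: str, body: str) -> str:
    # Assumption: title and body are concatenated into a single input string.
    return f"{title}. {body}".strip()

def classify_issue(title: str, body: str):
    # Import here so the helper above stays usable without transformers installed.
    from transformers import pipeline

    pipe = pipeline(
        "text-classification",
        model="PeppoCola/IssueReportClassifier-NLBSE22",
    )
    return pipe(issue_to_text(title, body))

if __name__ == "__main__":
    # Hypothetical issue report; prints a list like [{'label': ..., 'score': ...}]
    print(classify_issue(
        "App crashes on startup",
        "After updating, the app closes immediately with a stack trace.",
    ))
```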