Follow the links below to get started using SmartPy/readability-longformer with libraries, inference providers, notebooks, and local apps.
- Libraries
- Transformers
How to use SmartPy/readability-longformer with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="SmartPy/readability-longformer")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("SmartPy/readability-longformer")
model = AutoModelForSequenceClassification.from_pretrained("SmartPy/readability-longformer")
```

- Notebooks
- Google Colab
- Kaggle
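As a minimal sketch of running the pipeline on a passage (the label names and scores depend on this model's configuration, so none are asserted here; the text sample is illustrative only):

```python
from transformers import pipeline

# Load the readability classifier (downloads the model weights on first use)
pipe = pipeline("text-classification", model="SmartPy/readability-longformer")

# Classify a short passage; a Longformer backbone also accepts much longer inputs
text = "The quick brown fox jumps over the lazy dog."
result = pipe(text)
print(result)  # a list of dicts with "label" and "score" keys
```

The same `pipe` object can be called on a list of strings to classify several passages in one batch.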