## Model description

An xlm-roberta-large model fine-tuned on UK survey questions (in English), labeled with the 16 major topic codes from the [CLOSER topic codebook](https://ucldata.atlassian.net/wiki/spaces/CLOS/pages/37323020/Topics).
## Fine-tuning procedure
This model was fine-tuned with the following key hyperparameters:
- Number of Training Epochs: 10
- Batch Size: 8
- Learning Rate: 5e-06
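
The labeled training data behind this model is not public, so the following is only a minimal sketch of a fine-tuning run with the reported hyperparameters. The CSV file name, the `max_length` value, and the output directory are illustrative assumptions, not part of the original procedure:

```python
# Minimal fine-tuning sketch using the reported hyperparameters.
# Substitute your own labeled data: a CSV with "text" and "label"
# columns, where labels cover the 16 topic classes.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-large", num_labels=16
)

# Hypothetical training file; the original dataset is not released.
dataset = load_dataset("csv", data_files={"train": "train.csv"})

def tokenize(batch):
    # max_length=256 is an assumption for illustration.
    return tokenizer(
        batch["text"], truncation=True, padding="max_length", max_length=256
    )

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="./xlm-roberta-large_ontolisst_v1",  # assumed path
    num_train_epochs=10,             # reported value
    per_device_train_batch_size=8,   # reported value
    learning_rate=5e-6,              # reported value
)

Trainer(model=model, args=args, train_dataset=dataset["train"]).train()
```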
## How to Use the Model
```python
from transformers import AutoTokenizer, pipeline

# The tokenizer comes from the base model; the fine-tuned weights
# come from the gated repository below.
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large")
pipe = pipeline(
    model="poltextlab/xlm-roberta-large_ontolisst_v1",
    task="text-classification",
    tokenizer=tokenizer,
    use_fast=False,
    token="<your_hf_read_only_token>",
)

text = "Apart from yourself (and your husband/wife/partner), does anyone else living in your household make a contribution towards the cost of the accommodation?"
pipe(text)
```
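
The pipeline returns a list with one dict per input, holding the predicted label and its confidence. The values below are illustrative only; the exact label string depends on the model's `id2label` mapping:

```python
print(pipe(text))
# e.g. [{'label': 'LABEL_8', 'score': 0.97}]  (illustrative values)
```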
## Gated Access

This model requires gated access: our models are intended for academic use only, and users not affiliated with an academic institution are asked to provide a rationale for use. Access requests are reviewed manually and may take a few business days. Once access is granted, pass your Hugging Face token via the `token` parameter when loading the model; in earlier versions of the `transformers` package, use the `use_auth_token` parameter instead (see the sketch below).
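A minimal sketch of the older-style call, assuming a `transformers` release that predates the `token` argument:

```python
# For transformers releases that predate the `token` argument,
# pass the same read-only token via `use_auth_token` instead:
pipe = pipeline(
    model="poltextlab/xlm-roberta-large_ontolisst_v1",
    task="text-classification",
    tokenizer=tokenizer,
    use_fast=False,
    use_auth_token="<your_hf_read_only_token>",
)
```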
## Model Performance
The model was evaluated on a test set of 4,216 English examples.
- Accuracy: 0.88
- Precision (weighted average): 0.88
- Recall (weighted average): 0.88
- F1-score (weighted average): 0.88
| label | precision | recall | f1-score | support |
|---|---|---|---|---|
| 0 | 0.88 | 0.86 | 0.87 | 111 |
| 1 | 0.88 | 0.82 | 0.85 | 207 |
| 2 | 0.89 | 0.89 | 0.89 | 789 |
| 3 | 0.89 | 0.90 | 0.90 | 504 |
| 4 | 0.67 | 0.73 | 0.70 | 98 |
| 5 | 0.89 | 0.93 | 0.91 | 297 |
| 6 | 0.90 | 0.89 | 0.89 | 416 |
| 7 | 0.90 | 0.89 | 0.89 | 301 |
| 8 | 0.91 | 0.91 | 0.91 | 426 |
| 9 | 0.77 | 0.81 | 0.79 | 79 |
| 10 | 0.90 | 0.93 | 0.91 | 244 |
| 11 | 0.86 | 0.92 | 0.89 | 48 |
| 12 | 0.00 | 0.00 | 0.00 | 1 |
| 13 | 0.89 | 0.69 | 0.78 | 61 |
| 14 | 0.76 | 0.75 | 0.76 | 113 |
| 15 | 0.91 | 0.89 | 0.90 | 521 |
| accuracy | | | 0.88 | 4216 |
| macro avg | 0.81 | 0.80 | 0.80 | 4216 |
| weighted avg | 0.88 | 0.88 | 0.88 | 4216 |
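
The per-class table above follows the layout of scikit-learn's `classification_report`. A minimal sketch of how such a report is produced, assuming `y_true` and `y_pred` arrays from running the model over the test set (the placeholder values below are not real data):

```python
# Hypothetical evaluation sketch: in practice y_true / y_pred would come
# from the 4,216-example test set, which is not public.
from sklearn.metrics import classification_report

y_true = [2, 3, 15, 2]   # placeholder gold labels
y_pred = [2, 3, 15, 5]   # placeholder model predictions
print(classification_report(y_true, y_pred, digits=2))
```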