# My First Question Answering Model ❓
A fine-tuned DistilBERT model for extractive question answering. Given a context paragraph and a question, it extracts the answer directly from the text.
## How to Use
```python
from transformers import pipeline

qa = pipeline("question-answering", model="Kapilydv6/my-qa-model")

result = qa(
    question="Who created Python?",
    context="Python is a programming language created by Guido van Rossum.",
)
print(result["answer"])  # "Guido van Rossum"
```
## Training Details
| Parameter | Value |
|---|---|
| Base model | distilbert-base-uncased |
| Dataset | SQuAD v1.1 (3000 samples) |
| Epochs | 3 |
| Learning rate | 3e-5 |
| Max sequence len | 384 |
| Framework | PyTorch + Transformers |
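Fine-tuning for extractive Q&A needs one preprocessing step the table doesn't show: SQuAD answers come as character offsets, which must be converted into start/end *token* indices before training. Here is a minimal sketch of that mapping using a toy whitespace tokenizer; the actual training pipeline does the same thing with the DistilBERT fast tokenizer's offset mappings (`return_offsets_mapping=True`).

```python
# Sketch: map a SQuAD-style character answer span to token indices.
# Toy whitespace tokenizer for illustration; real training uses the
# DistilBERT fast tokenizer's offset mappings for the same purpose.

def whitespace_tokenize_with_offsets(text):
    """Return tokens plus their (start_char, end_char) offsets."""
    tokens, offsets, pos = [], [], 0
    for tok in text.split():
        start = text.index(tok, pos)
        end = start + len(tok)
        tokens.append(tok)
        offsets.append((start, end))
        pos = end
    return tokens, offsets

def char_span_to_token_span(offsets, answer_start, answer_end):
    """Find the first and last tokens overlapping the character span."""
    start_token = end_token = None
    for i, (s, e) in enumerate(offsets):
        if start_token is None and e > answer_start:
            start_token = i
        if s < answer_end:
            end_token = i
    return start_token, end_token

context = "Python is a programming language created by Guido van Rossum."
answer = "Guido van Rossum"
answer_start = context.index(answer)        # character offset, as in SQuAD
answer_end = answer_start + len(answer)

tokens, offsets = whitespace_tokenize_with_offsets(context)
start_tok, end_tok = char_span_to_token_span(offsets, answer_start, answer_end)
print(tokens[start_tok:end_tok + 1])        # ['Guido', 'van', 'Rossum.']
```

These token indices become the labels: the model is trained to predict `start_tok` and `end_tok` for each (question, context) pair.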
## What I Learned
This is my second Hugging Face model! Key concepts:
- Extractive Q&A: The model doesn't generate text — it finds the answer span within the given context
- Token-level prediction: Unlike classification (one label per text), Q&A predicts start and end token positions
- Stride: Long documents are split into overlapping chunks so no answer is missed
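The token-level prediction above can be sketched with toy numbers: the model assigns every token a start score and an end score, and the answer is the highest-scoring *valid* pair (end not before start). The logits below are made up for illustration, not real model output.

```python
# Sketch: pick the best answer span from per-token start/end scores.
# Logits are hypothetical numbers, not actual DistilBERT output.

tokens = ["python", "created", "by", "guido", "van", "rossum"]
start_logits = [0.1, 0.2, 0.3, 5.0, 0.4, 0.2]  # "guido" scores high as a start
end_logits   = [0.2, 0.1, 0.2, 0.3, 0.5, 4.8]  # "rossum" scores high as an end

best_score, best_span = float("-inf"), None
for i, s in enumerate(start_logits):
    for j, e in enumerate(end_logits):
        if j >= i and s + e > best_score:      # valid span: end >= start
            best_score, best_span = s + e, (i, j)

start, end = best_span
print(" ".join(tokens[start:end + 1]))         # guido van rossum
```

This is why the model can only ever return text that appears in the context: it selects token positions, it never generates new words.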
## Limitations
Trained on a small subset of SQuAD for learning purposes, so accuracy is limited. For production use, train on the full dataset (~87k examples).