NSFW Detection Dataset
Summary
This dataset is designed for training NSFW (Not Safe For Work) detection models on artistic content and image classification tasks. It follows the categorization format established by popular NSFW detection implementations, providing a benchmark for content moderation systems. The images are organized into five classes representing the levels of appropriateness and content types commonly encountered on online platforms.
The classification framework divides images into drawing, hentai, neutral, porn, and sexy categories, letting models distinguish between types of potentially sensitive content with fine-grained precision. This multi-class approach enables more nuanced detection than simple binary classification, which is particularly valuable for content filtering in creative and artistic contexts, where binary NSFW detectors often produce false positives.
The dataset's structure and categorization methodology are derived from well-established NSFW detection implementations, including GantMan/nsfw_model and yangbisheng2009/nsfw-resnet, ensuring compatibility with existing training pipelines and evaluation frameworks. With a size in the 10K-100K sample range, the dataset provides enough diversity to train robust classifiers while keeping computational requirements manageable.
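For filtering applications, the five-class output described above can be collapsed into a binary SFW/NSFW decision. A minimal sketch in Python (the class ordering, NSFW grouping, and threshold here are illustrative assumptions, not part of the dataset):

```python
# Class order and NSFW grouping are assumptions for illustration
CLASSES = ["drawing", "hentai", "neutral", "porn", "sexy"]
NSFW_CLASSES = {"hentai", "porn", "sexy"}

def is_nsfw(scores, threshold=0.5):
    """Collapse per-class probabilities into a binary decision by
    summing the probability mass assigned to NSFW-leaning classes."""
    nsfw_score = sum(s for c, s in zip(CLASSES, scores) if c in NSFW_CLASSES)
    return nsfw_score >= threshold

# A model confident the image is a neutral photograph
print(is_nsfw([0.05, 0.02, 0.85, 0.03, 0.05]))  # -> False
```

Whether "sexy" counts as NSFW is a policy choice; adjusting the grouping or threshold tunes the filter's strictness.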
Keywords: NSFW detection, image classification, content moderation, artistic content, multi-class categorization
Usage
The dataset is provided as a single ZIP file containing organized image folders. To use this dataset:
# Download and extract the dataset
wget https://huggingface.co/datasets/deepghs/nsfw_detect/resolve/main/nsfw_dataset_v1.zip
unzip nsfw_dataset_v1.zip
The extracted folder will contain images organized into the following subdirectories:
- drawing/ - Artistic drawings and illustrations
- hentai/ - Anime/manga-style explicit content
- neutral/ - Safe and appropriate content
- porn/ - Explicit adult content
- sexy/ - Suggestive but not explicit content
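Because the folders are named after the classes, the extracted tree can be indexed with the standard library alone. A small sketch (the extension filter and root path are assumptions):

```python
from pathlib import Path

CLASSES = ["drawing", "hentai", "neutral", "porn", "sexy"]
IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".webp"}  # assumed file extensions

def index_dataset(root):
    """Walk the five class subdirectories and return (path, label)
    pairs, where label is the index of the class in CLASSES."""
    samples = []
    for label, cls in enumerate(CLASSES):
        for img in sorted(Path(root, cls).glob("*")):
            if img.suffix.lower() in IMAGE_EXTS:
                samples.append((img, label))
    return samples
```

This folder-per-class layout is also what torchvision's `ImageFolder` expects, so the extracted root can be passed to it directly when building a training pipeline.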
Original Content
The dataset used for training the NSFW Detect classification model is divided into five categories: drawing, hentai, neutral, porn, and sexy, following the format mentioned in GantMan/nsfw_model and yangbisheng2009/nsfw-resnet.
Citation
@misc{nsfw_detect_dataset,
  title        = {NSFW Detection Dataset},
  author       = {deepghs},
  howpublished = {\url{https://huggingface.co/datasets/deepghs/nsfw_detect}},
  year         = {2023},
  note         = {Multi-class image classification dataset for NSFW content detection with five categories: drawing, hentai, neutral, porn, and sexy}
}