NSFW Detection
Instantly check images for inappropriate content with our powerful NSFW detection API. Get detailed content classification with confidence scores to make informed moderation decisions and keep your platform safe in real time.
API Endpoint
POST https://api.vecstore.app/nsfw-detection
How It Works
Upload any image and receive a detailed assessment of its content. Our AI not only determines if an image contains NSFW content, but also provides specific content labels with confidence percentages for precise categorization.
Our detection system is trained on millions of images and can rapidly identify inappropriate content with high precision. Get detailed results in milliseconds with confidence scores that help you make informed moderation decisions.
We do not store or retain the images you submit for NSFW detection. All processing is done on the fly, ensuring your data remains private and secure while still providing comprehensive content analysis.
Parameters
Authorization
Header - Required. Your API key for authentication.
image
JSON - Required. Base64-encoded image data to check for NSFW content. Supports JPEG, PNG, and WebP formats.
Python Example
import requests
import base64

# Configuration
API_KEY = "your_api_key_here"
IMAGE_PATH = "path/to/your/image.jpg"

# Read and encode image as base64 for the JSON payload
with open(IMAGE_PATH, "rb") as image_file:
    image_data = base64.b64encode(image_file.read()).decode("utf-8")

# Prepare JSON payload
payload = {
    "image": image_data
}

headers = {
    "Authorization": API_KEY,
    "Content-Type": "application/json"
}

# Send request for NSFW detection
response = requests.post(
    "https://api.vecstore.app/nsfw-detection",
    headers=headers,
    json=payload
)

# Process response
if response.status_code == 200:
    result = response.json()
    if result["nsfw"]:
        print("Image contains NSFW content")
        # Display content labels and confidence scores
        print("Content categories detected:")
        for label in result["labels"]:
            print(f"- {label['label']}: {label['confidence']}")
    else:
        print("Image is safe for work")
    print(f"Processing time: {result['time']} ms")
else:
    print(f"Request failed with status {response.status_code}: {response.text}")
Response Format
{
  "nsfw": true,
  "time": 1340,
  "labels": [
    {
      "label": "Explicit",
      "confidence": "94.3%"
    },
    {
      "label": "Exposed Female Private parts",
      "confidence": "94.3%"
    },
    {
      "label": "Explicit Nudity",
      "confidence": "94.3%"
    }
  ]
}
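Note that the confidence values in the response are percentage strings (e.g. "94.3%"), not numbers. A minimal sketch of parsing them into floats and keeping only labels above a chosen threshold (the `parse_labels` helper name, the threshold value, and the abbreviated example response are illustrative, not part of the API):

```python
def parse_labels(result, threshold=80.0):
    """Return (label, confidence) pairs at or above the threshold.

    Confidence values arrive as strings like "94.3%", so the percent
    sign is stripped before the numeric comparison.
    """
    flagged = []
    for entry in result.get("labels", []):
        confidence = float(entry["confidence"].rstrip("%"))
        if confidence >= threshold:
            flagged.append((entry["label"], confidence))
    return flagged

# Abbreviated example response in the format shown above
example = {
    "nsfw": True,
    "time": 1340,
    "labels": [
        {"label": "Explicit", "confidence": "94.3%"},
        {"label": "Explicit Nudity", "confidence": "94.3%"},
    ],
}
print(parse_labels(example))  # [('Explicit', 94.3), ('Explicit Nudity', 94.3)]
```

Parsing once at the boundary keeps the rest of your moderation logic working with plain numbers.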
User Uploads
Screen user-uploaded images in real-time to prevent inappropriate content from appearing on your platform.
Content Moderation
Automate your content moderation workflow by flagging potentially inappropriate images for review.
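One way to sketch such a workflow, assuming the request and response formats documented above (the `moderation_decision` and `check_image` helper names and the 50% review threshold are illustrative choices, not part of the API):

```python
API_URL = "https://api.vecstore.app/nsfw-detection"

def moderation_decision(result, review_threshold=50.0):
    """Map an API response dict to 'approve' or 'review'.

    Any image the API marks NSFW, or carrying any label at or above
    the threshold, is routed to human review.
    """
    if result.get("nsfw"):
        return "review"
    for entry in result.get("labels", []):
        if float(entry["confidence"].rstrip("%")) >= review_threshold:
            return "review"
    return "approve"

def check_image(image_b64, api_key):
    """Call the NSFW detection endpoint and return a moderation decision."""
    # Imported here so the decision logic above has no network dependency
    import requests

    response = requests.post(
        API_URL,
        headers={"Authorization": api_key, "Content-Type": "application/json"},
        json={"image": image_b64},
    )
    response.raise_for_status()
    return moderation_decision(response.json())
```

Keeping the decision rule separate from the HTTP call makes the flagging policy easy to test and adjust without issuing real requests.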
Child-Safe Platforms
Maintain a safe environment for all ages by ensuring no inappropriate content reaches your younger audience.
API Credit Usage
Each NSFW detection request consumes one API credit.