NSFW Detection
Analyze images for NSFW content. Send a base64-encoded image or an image URL and receive a moderation result with detailed labels and confidence scores.
Endpoint
POST /nsfw/detect
Request Body
Send either a base64-encoded image or an image URL for NSFW analysis. Provide exactly one of the following fields:

image (string · optional): Base64-encoded image to analyze
image_url (string · optional): URL of the image to analyze

Example Request (base64)
{
"image": "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAADUlEQVR42mNk+M9QDwADhgGAWjR9awAAAABJRU5ErkJggg=="
}
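The image field carries the file's raw bytes encoded as base64, with no data: URI prefix (as the example above shows). A minimal Python sketch of producing that value from a local file; the filename is illustrative:

import base64

# Read the image bytes and encode them as a base64 string for the "image" field
with open("photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {"image": image_b64}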
Example Request (URL)
{
"image_url": "https://example.com/images/photo.jpg"
}
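For context, here is a sketch of a complete request in Python using the requests library. The base URL and the bearer-token header are assumptions; this reference does not specify a host or an authentication scheme:

import requests

API_BASE = "https://api.example.com"  # assumption: substitute the service's actual base URL
API_KEY = "your-api-key"              # assumption: auth is not documented in this section

response = requests.post(
    f"{API_BASE}/nsfw/detect",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"image_url": "https://example.com/images/photo.jpg"},
    timeout=30,
)
response.raise_for_status()
result = response.json()
print(result["nsfw"], [entry["label"] for entry in result["labels"]])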
Response
Returns a boolean NSFW flag and an array of moderation labels with confidence scores.
nsfw (boolean): Whether the image contains NSFW content
labels (array): Array of moderation label objects

Moderation Label Object
label (string): Name of the detected moderation category
confidence (string): Confidence percentage for the label

Example Response
{
"nsfw": true,
"labels": [
{
"label": "Non-Explicit Nudity of Intimate parts and Kissing",
"confidence": "98.2%"
},
{
"label": "Non-Explicit Nudity",
"confidence": "98.2%"
},
{
"label": "Partially Exposed Buttocks",
"confidence": "98.2%"
},
{
"label": "Explicit",
"confidence": "90.1%"
},
{
"label": "Exposed Buttocks or Anus",
"confidence": "90.1%"
},
{
"label": "Explicit Nudity",
"confidence": "90.1%"
},
{
"label": "Bare Back",
"confidence": "87.6%"
}
]
}
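Because confidence is returned as a percentage string rather than a number, comparing it against a threshold requires parsing it first. A small Python sketch using values from the example response above; the labels_above helper and the 90.0 threshold are illustrative, not part of the API:

def labels_above(result, threshold):
    """Return moderation labels whose confidence is at least `threshold` percent."""
    picked = []
    for entry in result["labels"]:
        # Confidence arrives as a string like "98.2%"; drop the "%" and parse as float
        confidence = float(entry["confidence"].rstrip("%"))
        if confidence >= threshold:
            picked.append({"label": entry["label"], "confidence": confidence})
    return picked

# Data taken from the example response; the threshold is an arbitrary choice
result = {
    "nsfw": True,
    "labels": [
        {"label": "Explicit Nudity", "confidence": "90.1%"},
        {"label": "Bare Back", "confidence": "87.6%"},
    ],
}
print(labels_above(result, 90.0))  # [{'label': 'Explicit Nudity', 'confidence': 90.1}]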