checkToxicityImage
Description
Checks the toxicity of an image, given a valid image URL. Two different models are used to score toxicity: Bumble's [private-detector](https://github.com/bumble-tech/private-detector) and Yahoo's [Open NSFW](https://github.com/yahoo/open_nsfw) model.
Parameters
imageUrl (string): the URL of the image to check
Response
message (string): a message describing the result of the request
status (int): 1 for successful execution; -1 if any error occurs during execution of the request
response (object): contains three properties:
- bumble: toxicity score from the Bumble model
- opennsfw: toxicity score from the Open NSFW model
- is_toxic: true if the image is toxic, false otherwise
Example Request and Response
Prerequisites
Before making requests with the Volary SDK, you must have it installed. You can install the Volary SDK using either npm or yarn. Use the following commands to install it:
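The install commands were missing from this page; the following is a sketch that assumes the SDK is published under the package name `volary-sdk` (the actual package name may differ):

```shell
# Install with npm (package name "volary-sdk" is assumed)
npm install volary-sdk

# Or install with yarn
yarn add volary-sdk
```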
Request
Here is an example of how to make a checkToxicityImage request using the Volary SDK:
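The original request example is not shown on this page, and the SDK's exact import path and client setup are not documented here. The sketch below therefore stubs the client in place so it runs standalone; it illustrates the assumed call shape (an `imageUrl` argument) and how to handle the `status`, `message`, and `response` fields described above:

```javascript
// Hypothetical sketch: this stub stands in for the real Volary SDK client,
// whose import path and constructor are not shown in these docs.
const client = {
  async checkToxicityImage({ imageUrl }) {
    // A real call would send imageUrl to the toxicity-check service.
    // These example values mirror the documented response shape.
    return {
      message: 'Toxicity check completed',
      status: 1,
      response: { bumble: 0.12, opennsfw: 0.08, is_toxic: false },
    };
  },
};

async function main() {
  const result = await client.checkToxicityImage({
    imageUrl: 'https://example.com/photo.jpg',
  });

  if (result.status === 1) {
    // Successful execution: inspect the per-model scores and the verdict.
    console.log('Bumble score:', result.response.bumble);
    console.log('Open NSFW score:', result.response.opennsfw);
    console.log('Toxic?', result.response.is_toxic);
  } else {
    // status === -1 indicates an error occurred during the request.
    console.error('Request failed:', result.message);
  }
}

main();
```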
Response
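The original response example is missing from this page. The JSON below illustrates the documented response shape; the score values are made up for illustration only:

```json
{
  "message": "Toxicity check completed",
  "status": 1,
  "response": {
    "bumble": 0.12,
    "opennsfw": 0.08,
    "is_toxic": false
  }
}
```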