Vigil AI CAID Classifier, the “best-trained eyes to help platforms spot and stop child sexual abuse imagery at scale”.


The Vigil AI CAID Classifier is a deep-learning-based AI tool that spots and classifies child abuse imagery just as the best-trained law enforcement operators in the world do, only a million times faster (and without the emotional and psychological burden).

It is grounded in deep technical skill, law enforcement and victim identification expertise, and unbeatable training data. It is the only classifier in the world trained on the UK Child Abuse Image Database (CAID), and it implements the finest-grained categorization standards available anywhere.

Better-trained engine  ➜  better grading and fewer false alarms  ➜  more illegal content identified and removed.

The solution is built on state-of-the-art machine learning technology and know-how, fit for scanning large volumes of images and video* (known or unknown), determining whether or not the content is CSE material, establishing the severity of each image, and attributing a percentage confidence score to each category; a sketch of this kind of per-image output follows the notes below. It is also better at identifying and categorising entirely new images/videos.

* The Vigil AI CAID Video Classifier is available by invitation only.
* It can be deployed to process pre-recorded content or live streams.
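To illustrate the kind of per-image output described above, the sketch below shows a hypothetical result structure: a percentage confidence score per category plus a simple flagging rule. The category labels, threshold, and `ClassificationResult` class are illustrative assumptions only, not the actual Vigil AI API or the CAID grading taxonomy.

```python
from dataclasses import dataclass

# Hypothetical category labels for illustration; the real classifier's
# taxonomy follows the CAID grading standards and is not reproduced here.
CATEGORIES = ["category_a", "category_b", "category_c", "not_cse"]


@dataclass
class ClassificationResult:
    """Illustrative per-image result: one percentage confidence per category."""
    scores: dict[str, float]  # e.g. {"category_a": 3.1, ..., "not_cse": 88.4}

    def top_category(self) -> str:
        # The category with the highest confidence score.
        return max(self.scores, key=self.scores.get)

    def is_flagged(self, threshold: float = 50.0) -> bool:
        # Flag the image if any non-benign category exceeds the (assumed) threshold.
        return any(
            score >= threshold
            for name, score in self.scores.items()
            if name != "not_cse"
        )


# Example usage with made-up scores
result = ClassificationResult(
    scores={"category_a": 2.0, "category_b": 7.5, "category_c": 61.2, "not_cse": 29.3}
)
print(result.top_category())  # -> "category_c"
print(result.is_flagged())    # -> True
```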