The UK Child Abuse Image Database (CAID) was created to help identify and safeguard victims, make investigating child sexual exploitation and abuse (CSE) faster and more effective, and support international efforts to remove CSE content from the internet.
The Home Office developed CAID in collaboration with the police, industry partners, and British and international SMEs. It went live in 2014 and was rolled out across UK territorial police forces and the National Crime Agency (NCA) the following year. It now houses all CSE images encountered by UK police forces and the NCA.
Assessing case images against CAID helps analysts and investigators quickly establish which images are already known, what categories they fall into, and whether they contain an identified victim. Unknown images must receive three votes from qualified image analysts before being submitted to CAID with an allocated category.
CAID is an ever-growing, invaluable, collaborative effort. It helps to streamline the investigation and prosecution of offenders, and more quickly identify and protect victims.
The Vigil AI CAID classifier supercharges this innovation, helping analysts work through huge volumes of unknown images at high speed; once these images have passed through the voting system, they can be added to CAID. As the database grows, so too does its power as a resource to end sexual violence against children.