Overview of Cloud Scan categories

Cloud Scan looks for concerning images saved in cloud storage and sorts them into categories that Safeguarding Leads reference when reviewing flagged content.

Cloud Scan categories

Category | Description
Alcohol | Images of alcohol and depictions of alcohol consumption and alcoholism
Drugs | Depictions of drug abuse or images of drug paraphernalia and prohibited substances
Extremism | Depictions of extremist activities and extremist messaging, including fanaticism, dogmatism and bigotry
Gore | Depictions of graphic violence and injuries resulting from accidents and gruesome acts
Pornography | Depictions of sexual activity, nudity and erotica
Risqué | Depictions of sexually suggestive behaviour or entertainment
Weapons | Depictions of guns, weapons of war, or any other device intended to inflict bodily harm or physical damage (e.g. explosives)

In addition to assigning a category, Cloud Scan provides details about the flagged content, such as who uploaded it and when. For images flagged for the first time, Cloud Scan also shows the number of users each image has been shared with.
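As an illustration only, the details accompanying a flagged image can be thought of as a simple record like the sketch below. The field names and types are assumptions made for clarity, not Cloud Scan's actual data model.

    // Hypothetical sketch of a flagged-image record; field names are
    // illustrative assumptions, not Cloud Scan's actual schema.
    type CloudScanCategory =
      | "Alcohol" | "Drugs" | "Extremism" | "Gore"
      | "Pornography" | "Risqué" | "Weapons";

    interface FlaggedImage {
      category: CloudScanCategory; // category assigned by Cloud Scan
      uploader: string;            // user who uploaded the image
      uploadedAt: Date;            // date the image was uploaded
      shareCount?: number;         // number of users the image has been
                                   // shared with; shown only for images
                                   // flagged for the first time
    }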

While Cloud Scan identifies and categorises images with a high degree of accuracy, a small number of images may still be assigned to unexpected categories. Cloud Scan's results also include an image's details, but they make no inferences about the image's intended use or the uploader's intent.

Cloud Scan flags an image based solely on what it shows, regardless of the cultural landscape or prevailing customs where the organisation operates. For instance, an image may be flagged for violence even though it only depicts a contact sport (for example, kickboxing) or recreation (for example, a paintball gun). That is why it is up to Safeguarding Leads to decide whether to retain or remove images based on community and organisational standards.

Important

Your organisation is responsible for ensuring that its users, and the staff reviewing flagged content, adhere to policies, laws and regulations on information technology and security, data management, and privacy. Organisations are also responsible for warning staff about potentially disturbing themes in flagged images and for providing further support where necessary.
