Protecting children from abuse in the AI era
Charity Agasaro – 31 July 2023
The National Crime Agency (NCA) has warned that the growing use of artificial intelligence (AI) could worsen the problem of child sexual abuse. According to the NCA, around 830,000 adults, roughly 1.6% of the adult population, pose some degree of sexual risk to children. The easy availability of abuse images online has had a "radicalising" effect, normalising such behaviour. With the rapid adoption of AI technology, the risk to young people is expected to rise as fake images flood the internet. Offenders have been using AI tools to create realistic abuse images, making it harder to identify real victims. The Internet Watch Foundation (IWF) has also discovered online guides that help offenders train AI tools to produce lifelike abuse images. This prompted the IWF to urge the government to prioritise the issue at the upcoming global AI summit in London.
Under the UK's Coroners and Justice Act 2009, producing AI-generated child sexual abuse images is illegal under the provisions on making and possessing indecent pseudo-photographs. However, the IWF argues that the legislation may need to be updated to address AI-generated images directly. The Ada Lovelace Institute likewise suggests that the UK should strengthen its domestic regulation of AI, proposing the introduction of an 'AI ombudsman' and new legislation for improved protection. The government's forthcoming Online Safety Bill includes provisions to remove child sexual abuse material from online platforms.