Deepfake, AI or real? It’s getting harder for police to protect children from sexual exploitation online

Research output: Contribution to journal › Article › Research › peer-review

Abstract

Artificial intelligence (AI), now an integral part of everyday life, is becoming increasingly accessible and ubiquitous. Consequently, AI advances are increasingly being exploited for criminal activity.

One significant concern is that AI gives offenders the ability to produce images and videos depicting real or deepfake child sexual exploitation material.

This is particularly important in Australia. The Cyber Security Cooperative Research Centre has identified the country as the third-largest market for online sexual abuse material.

So, how is AI being used to create child sexual exploitation material? Is it becoming more common? And importantly, how do we combat this crime to better protect children?
Original language: English
Journal: The Conversation
Publication status: Published - 25 Jun 2024
