
AI use in child exploitation on the rise: RCMP


“The wave of AI-generated child sexual abuse material is coming,” said Cpl. Phillipe Gravel.

REGINA — With increasing public access to artificial intelligence (AI) technologies, both police and criminals benefit from its capabilities.

“Like anything, it can be used for good or for bad,” said Cpl. Phillipe Gravel, a researcher at the RCMP’s National Child Exploitation Coordination Centre (NCECC).

According to Gravel, police are seeing AI being used for the first time to create child sexual abuse material.

“Their world works on a barter system – they have to have material to trade in order to get more,” says Gravel. “And some are using AI to create it for them.”

Over the past year, the RCMP has handled only a small number of cases in which AI-generated child sexual abuse material was found on a suspect’s hard drive. But Gravel predicts more will follow.

“The wave of AI-generated child sexual abuse material is coming,” Gravel said.

“New ways of doing things are emerging all the time and as criminals learn them, we can expect more.”

Technical talk


AI image-generation technology works by creating a new image from bits and pieces of existing images it pulls from the internet, based on the parameters a user provides.

Pornographic drawings and digital images of children are illegal in Canada. Under the Criminal Code, it is a criminal offence to possess or create them.

“The material used for AI-generated pornography comes from somewhere; they are real people,” he says, emphasizing that the crime is not without victims.

Hidden dangers


In a digital culture where posting photos and videos of every aspect of life has become the norm, Gravel offers advice to prevent more people from becoming victims.

“When you post something, remember that it will be there forever even if you delete it, because you won’t know who shared, saved or screenshotted the image,” he says.

How your children’s images might be used is just one thing parents should consider. Objects in the background of photos and videos can easily give criminals the information they need to find you or your child. “The best way to avoid exposing yourself and your children is to simply not post,” Gravel adds.

Alert and aware


The Canadian Anti-Fraud Centre (CAFC) is also aware that the technology is available and can be used in scams and fraud, said Jeff Horncastle, the centre’s acting client and communications officer.

He says victims have reported the suspected use of AI in telephone scams and fake investment opportunities.

According to Horncastle, there are indications that scammers could use voice cloning technology to impersonate a real person. “The technology is not like it was two years ago, when the voice was robotic,” he says. “It’s so believable, it’s almost like you’re communicating with a real person.”

He says it is highly likely that the technology is also used in fraud and extortion. Fraudsters can now use software that uses deep-fake personas and images to appear in video chats and virtual meetings looking and sounding like a real human. Armed with a photo of a victim, they can also use the same technology to make it appear as if the victim is present and speaking.

Deep-fake video technology has also been seen spoofing celebrity endorsements of fraudulent products and cryptocurrency, Horncastle says. AI is also suspected in phishing and spearphishing emails, which are used to trick people into revealing personal and banking information.

“They can intercept a business email request for money, such as accounts payable or invoices,” says Horncastle. “The victim thinks they are emailing the company president, a colleague or a supplier and provides them with the necessary bank details and releases money directly to the scammer.”

But, he says, the way to avoid being scammed hasn’t changed. “Scammers thrive on creating a sense of urgency, panic or fear,” says Horncastle. “So take your time, do not respond and verify the authenticity of suspicious communications.”

Keeping an eye on the evidence


Despite the potential for criminal use, the technology can be and is being used for good. For example, the National Child Exploitation Coordination Centre uses it to accelerate investigations into suspected online child sexual exploitation, helping to identify children and remove them from abusive environments or other harm.

Police who scan a suspect’s hard drive can use AI technology to identify images that meet the criteria for child sexual exploitation material, a task normally done manually.

“Some of the investigators and specialists on the unit are parents – for some, that’s why they do this work – so the less time they have to spend on this kind of material, the better for their mental health,” Gravel says. It also means less exposure for employees and faster assessment of the material, which is increasing in volume every day.

Culture of transparency


Operational technologies such as AI play a crucial role in modern police work. An operational technology is any technology-based tool, technique, device, software, application or data set that will be used to support an investigation or to gather criminal intelligence. They include things like search tools, automated license plate recognition, voice dictation, data extraction filtering, translation, facial recognition, and text-to-speech functionality.

These tools are used to fight crime, investigate suspects, protect children and vulnerable groups, collect evidence, improve data analysis, strengthen police accountability, and promote law enforcement and public safety objectives.

RCMP units like the NCECC consult with the National Technology Onboarding Program (NTOP) when considering the use of new technologies to ensure they are effective and comply with both the Canadian Charter of Rights and Freedoms and the Privacy Act.

“Our team is motivated to help all RCMP program areas bring together the information they need to ensure they use these tools appropriately and effectively,” said Michael Billinger, responsible for transparency and outreach at NTOP.

The program was created in 2021 in response to privacy concerns over the RCMP’s use of Clearview AI facial recognition technology and the subsequent investigation, which resulted in the Privacy Commissioner’s recommendation that the organization take a more structured approach to adopting and integrating new technologies. NTOP was founded to promote the responsible use of operational technologies by the RCMP and to encourage greater public transparency. To that end, NTOP recently published the Transparency Blueprint: Snapshot of Operational Technologies.

Billinger says he has seen a culture change in the way the RCMP looks at privacy and in its commitment to increasing transparency.

“RCMP employees are thinking about privacy first and educating themselves on privacy issues, rather than just relying on us to educate them,” says Billinger. “It’s a shift from thinking about information as classified and confidential to greater transparency that builds trust with the public. I think that’s a big win.”