
ALERT Internet Child Exploitation unit members present about online safety in Ponoka

ICE: 'Your presence can make a tremendous difference in these kids' safety'

NOTE: The following story discusses child sexual exploitation and may be triggering or disturbing for some readers.

Members of the Alberta Law Enforcement Response Team (ALERT) Internet Child Exploitation (ICE) unit spoke to parents and caregivers at an information session held at the old pink school in Ponoka on May 7.

ICE unit Community Engagement Team members Cst. Stephanie Bosch and Cst. Scott Sterling presented.

Some of the topics discussed included what ALERT is and what it does, the internet and social media, child luring and sextortion, artificial intelligence, emerging online sites and apps, and how police and the public can work together to keep kids safe online.

The presentation was offered in partnership with BGC Wolf Creek and Wolf Creek Public Schools. 

ICE says it can't defeat online child exploitation, so it works to prevent it, and for that, it needs parents' help.

"I need you as parents, as service providers, as educators. This may require you to change some fundamental beliefs that you currently hold," said Sterling.

"The first thing you need to understand is you are bad at the internet. That's okay, you're not supposed to be good at it."

Bosch said most parents come to their presentations for information about sextortion.

'Sextortion' is extortion with a sexual element. In these cases, an online predator grooms a minor and eventually asks for a nude photo or video, and once they have it, they threaten to release it to the child's contacts in order to demand more images, videos or money from the victim.

"They fear that social death more than physical death," said Bosch.

"Some of the work that we do, you can't help us with, but there's that 20 per cent that you absolutely can, and it's as easy as having a conversation," she said.

Those conversations should include setting rules in a collaborative way, learning more about their online lives, talking about consequences of sending nude images without fearmongering, and talking about how the adults will support the children in the event they're victimized, the presenters said.

The ICE members stressed that children need to feel safe to speak to parents or the adults in their lives so if sextortion happens, they report it and the abuse doesn't continue. 

"It's not on them to start these conversations it's on us, and we've done a disservice as parents and as adults not having a conversation," said Bosch.

According to the ICE unit, the majority of victims they're seeing are boys aged 13 to 18. 

There is no particular app that is problematic, and parents can't rely on parental controls to keep their kids safe, they said.

"There is no such thing as a bad app, just the bad behaviour that occurs on any given platform," said Sterling.

"The predators go where the kids are. The platform is irrelevant," said Sterling, adding the predator's goal is to get kids off the app and onto what they mistakenly believe is a secure messaging app where they think they won't get caught.

In Canada, child pornography is defined as any visual depiction of sexually explicit conduct involving a person under 18 years old, meaning it doesn't matter if it's a photograph, cartoon or AI-generated image - it's all illegal.

Bosch added there has been a national push to change the terminology to "child sexual abuse material" rather than "pornography."

"When you add the word 'child' in front of it, it creates this illusion that there's this subset of pornography that is acceptable, which is absolutely not right. We can all agree on that," Bosch said. 

In fact, Bill C-291, an act to amend the Criminal Code and other acts regarding child sexual abuse and exploitation material, was passed in October 2024.

While taking a nude selfie and sending it to another teen is technically producing child pornography, ICE said it's the non-consensual sharing of images where there's an imbalance of power that they're concerned with. 

'Sexting' has become a normalized social practice, said Sterling. 

Young people often extend and nurture their relationships in the digital world, according to Thorn, a nonprofit organization that builds technology to defend children from sexual abuse.

Rather than assuming sexting or the consensual sharing of images won't happen, ICE recommends talking to children about the permanency of such images once they're on the internet and teaching them how to protect their digital privacy.

It's also important to remain calm and neutral when a child makes a disclosure of abuse so shame doesn't become a barrier to reporting, they said.

Be an active listener and write down what they said, support and validate their feelings, thank them for sharing, and report it to police, bringing along the device the communication took place on. It's also important not to delete the messages or the app the conversations took place on, as they're evidence, the presenters explained.

ICE also warned parents about AI chatbots. At worst, they lack adequate safety protocols, and at best, they're harmful because they're always affirming, which can skew a child's social development.

In conclusion, Sterling said the least victimized kids in their files had the most involved parents. 

"Your presence can make a tremendous difference in these kids' safety."

About the Author: Emily Jaycox

I'm a reporter for Ponoka News and have lived in Ponoka since 2015.