Kids and screens: it’s an inevitable and ubiquitous pairing these days. For many parents trying to juggle all the things, it’s a lifesaver to be able to stream some kid-friendly content, pass out the iPads or hand your kid a phone. Everyone does it – but we all know, vaguely at least, that the internet isn’t always a safe place for children.
By now, we’ve heard about the dangers of social media for teens and about how important it is to limit and supervise screen time for younger kids. But many children go online each day for school, for connecting with loved ones or just for entertainment. All of these are valid reasons to be online. Yet common misconceptions persist about the dangerous reality of online child sexual abuse and exploitation – and parents need accurate information in order to protect their children.
Here are five common myths around online child sexual abuse that everyone should know.
Myth 1: My child isn’t at risk of online sexual abuse.
FALSE: You might think your child is too young, only goes online to play video games or watch cartoons and uses a device with parental guardrails in place, and that this keeps them safe. The reality is that online sexual exploitation is pervasive. All children – especially those who use the internet – are at risk of experiencing online sexual harms. Child sexual abuse material (CSAM) can be accessed on any platform, even the same ones that host children’s content. Grooming and sextortion are on the rise on gaming platforms, and any platform with a messaging feature can be exploited by perpetrators.
Myth 2: This doesn’t happen in our country.
FALSE: This is a problem in every country, including ours. In the U.S., a “substantial proportion” of young people have experienced child sexual abuse online. According to the WeProtect Global Survey in 2021, more than 71% of respondents in North America reported experiencing at least one online harm in childhood. The internet has no borders. Perpetrators can abuse children from across the world, and images posted in one country can be shared and reshared globally, perpetuating the trauma and abuse of that child indefinitely.
Myth 3: Tech companies quickly find and remove child sexual abuse material.
MOSTLY FALSE: Although most tech companies do have mechanisms that allow their users to report child sexual abuse material, there are no standards for how companies should address this material on their platforms. Nor are there consequences if they fail to find and remove it quickly. As a result, the material can live on the internet forever. Even if it’s removed from a site after a few hours, it can be downloaded and redistributed again and again, retraumatizing victims for years.
Tech companies have the capability to instantly remove offending or illegal content – they already do so with copyright-protected music, because there are financial consequences for failing to protect the publishing rights of songs. So why not take child sexual abuse content down just as quickly? Tech companies often cite their users’ privacy rights as an excuse not to take stronger action against these criminal activities. But what about the privacy rights of the children being abused?
The real reason: It’s expensive for tech companies to revamp their platforms to be more proactive in protecting children, and there’s no financial incentive for them to take action.
Myth 4: The law requires tech companies to take this material down.
KIND OF: There are laws that require tech companies to take down child sexual abuse material once they become aware of it. But there are no laws in place that require them to proactively search for, detect and promptly remove it. Plenty of child sexual abuse material still exists on mainstream platforms. In fact, more than 84.9 million images, videos and other content featuring children in suspected situations of sexual exploitation and abuse were submitted to the National Center for Missing and Exploited Children (NCMEC) in 2021 – a 22.8% increase over 2019. And those were only the materials reported – almost all of them from mainstream tech companies.
Myth 5: There’s nothing parents can do to protect their children.
FALSE: There are excellent resources available to help parents better understand online risks, learn to use parental controls and have honest, supportive and nonjudgmental conversations with their child about the risks they may face and how they can protect themselves online. There are also resources for children, like NCMEC’s online safety program, NetSmartz. For adolescents, Thorn’s NOFILTR offers resources, quizzes and advice, all designed with an authentic youth voice.
Finally, you can use your voice to demand that tech companies take down child sexual abuse materials. We’ve waited for years for them to “do the right thing.” Now it’s time for us to take action to help protect children. New laws alone won’t solve the massive global problem of online child sexual exploitation, but they’re an important first step – especially since tech companies aren’t doing enough to prevent the creation and distribution of these materials and keep kids safe online.