RCMP says most reports of child porn on Mindgeek’s platforms don’t meet legal threshold for charges

The RCMP says it’s investigating all claims about sexual abuse material being hosted by the Montreal-based company Mindgeek, but most of the referrals to date don’t meet the Criminal Code definition of child pornography.

Pornhub says it has removed all content uploaded by non-verified users. The sex website faced accusations it hosted illegal content. (The Canadian Press) 

Chief Supt. Marie-Claude Arsenault of RCMP specialized investigative services told a parliamentary committee today that while the national police force has received about 120 reports of child sexual abuse material (CSAM) on Mindgeek’s platforms since June 2020, only about 25 cases have been passed along to other law enforcement agencies to investigate.

“Without the evidence, there’s limited action we can do, so we are assessing all the reports that we’re getting now and determining if charges are likely to happen,” she told the access to information, privacy and ethics committee, which is studying privacy concerns related to streaming platforms such as Pornhub.

Normand Wong, senior counsel with the Department of Justice, said investigators often struggle to determine if an individual in a pornographic video or image is under the age of 18.

“The definition of child pornography in the Criminal Code is amongst the broadest in the world. We protect children under 18,” he said. “[But] unless you are dealing with an identifiable person, it’s very difficult to tell if that person is above 18 or below 18. So a lot of that material is not captured.”

The committee has launched a study of the “protection of privacy and reputation on platforms such as Pornhub” in response to a New York Times article that questioned the site’s practices.

It included comments from various people who said their lives were ruined as minors after their nude images were displayed without their knowledge on the website.

Pornhub defended its efforts to remove illegal content and said it has taken steps over the last year to improve its verification, moderation and detection process.

“MindGeek has zero tolerance for non-consensual content, child sexual abuse material (CSAM), and any other content that lacks the consent of all parties depicted,” the company said in a statement to CBC News Monday.

“That said, we are continually improving our processes. Every online platform has a responsibility to join this fight, and it requires collective action and constant vigilance. We are committed to this fight and will continue to work with law enforcement globally to stamp out CSAM and non-consensual material on our platforms and on the internet as a whole.”

Mindgeek, the parent company of the popular site Pornhub, is legally headquartered in Luxembourg but has its main office in Montreal.

In 2011, the government brought in a law making it mandatory for those who supply an internet service to report online child pornography.

Close to 10 years later, the RCMP says online cases of child exploitation remain difficult to prosecute.

“There are a number of elements to when and what types of charges will be laid,” Stephen White, the RCMP’s deputy commissioner of specialized policing services, told the parliamentary committee today.

“Obviously, when we’re talking about these corporations, which are service providers, hosting platforms, other individuals have the ability to automatically load their content onto the platforms. There’s jurisdiction issues. Every case is different.”

Charity says it’s flagged child abuse content to Mindgeek

A Canadian charity that scours the web to identify child pornography posts says it has flagged close to 200 examples of child sexual abuse material — many of them showing young children — on Mindgeek’s platforms over the last three years.

Lianna McDonald, executive director of the Canadian Centre for Child Protection, told the committee today that it sometimes takes days for exploitive images and videos to be removed, further damaging the victims.

Her charity runs cybertip.ca, which for the past 18 years has processed tips from the public about possibly illegal material online — including allegations about child sexual abuse material, child trafficking and online luring. It’s also behind a web crawler known as Project Arachnid that searches the internet for child sexual abuse materials and sends takedown notices to platforms.

“Arachnid has detected and confirmed instances of what we suspect to be CSAM on [Mindgeek’s] platform at least 193 times in the past three years. These sightings include 66 images of prepubescent CSAM involving very young children,” she said.

“We do not believe the above numbers are representative of the scope and scale of this problem. These numbers are limited to obvious CSAM of very young children and of identified teenagers. There is likely CSAM involving many other teens that we would not know about because many victims and survivors are trying to deal with the removal issue on their own.”

Lianna McDonald, executive director of the Canadian Centre for Child Protection, holds textbooks as she speaks at a press conference at the Canadian Centre for Child Protection in Winnipeg Wednesday, October 1, 2014. (John Woods/The Canadian Press)

The Canadian Centre for Child Protection said that in every case, the company eventually complied with the notice and removed the material — but the amount of time it took to get the material deleted varied from case to case.

In 63 per cent of the 2020 cases, it said, the service provider removed the material after receiving a first notice. Another 26 per cent of cases took three or more notices to have the material removed from the service.

The data come as McDonald urges the federal government to do more to keep child sexual abuse materials from circulating online.

“We have allowed digital spaces — where children and adults intersect — to operate with no oversight. We have allowed companies to unilaterally determine the scale and scope of their moderation practices,” she said in her prepared remarks today.

“These failures have left victims and survivors at the mercy of these companies to decide if they take action or not.”

McDonald said her organization’s requests for removal cover reported or detected cases of online luring and child sexual abuse materials and could include multiple reports about the same image or video.

Pornhub says it has removed all non-verified uploads

McDonald is pushing for stricter laws in Canada to require tech companies to use available technology to combat the re-uploading of illegal content, and to hire and train staff to carry out large-scale moderation and content removal requests.

“The current model that puts criminal law responses at the forefront and relies upon the voluntary actions of a largely unregulated industry, with no transparency and no accountability, has failed children,” she said.

Pornhub says it removed all videos uploaded by non-verified users after the adult website was accused of hosting illegal content.

Mindgeek is being sued by 40 women in California who claim it continues to profit from pornographic videos of them that were published without their full consent.
