
Meta scrambles to fix Instagram algorithm connecting ‘vast paedophile network’

2023-06-08 10:13

Meta has launched an investigation into reports that Instagram is promoting child sexual abuse material through its algorithm.

Facebook’s parent company set up a taskforce to investigate the claims after the Stanford Internet Observatory (SIO) said it found “large-scale communities” sharing paedophilia content on the platform.

The SIO said it discovered the child sexual abuse material (CSAM) following a tip from the Wall Street Journal, whose report on Wednesday detailed how Instagram’s recommendation algorithm helped connect a “vast pedophile network” of sellers and buyers of illegal material.

Instagram’s ‘suggested for you’ feature also linked users to off-platform content sites, according to the report, with the SIO describing Instagram as “currently the most important platform” for these networks.

“Instagram has emerged as the primary platform for such networks, providing features that facilitate connections between buyers and sellers,” Stanford’s Cyber Policy Center wrote in a blog post.

“Instagram’s popularity and user-friendly interface make it a preferred option for these activities.”

Instagram users were able to find child abuse content through explicit hashtags such as #pedowhore, which the platform has since blocked.

“Child exploitation is a horrific crime,” a Meta spokesperson said.

“We’re continuously investigating ways to actively defend against this behaviour, and we set up an internal task force to investigate these claims and immediately address them.”

Meta said that it had already dismantled 27 paedophile networks on Instagram over the past two years, and had removed 490,000 accounts that violated its child safety policies in January alone.

Other social media platforms hosting this type of content were also identified by the SIO, though to a much lesser extent.

The SIO called for an industry-wide initiative to limit production, discovery and distribution of CSAM, while also urging companies to devote more resources to proactively identifying and stopping abuse.

“Given the multi-platform nature of the problem, addressing it will require better information sharing about production networks, countermeasures, and methods for identifying buyers,” the organisation said.

“SIO hopes that this research aids industry and non-profits in their efforts to remove child sexual abuse material from the internet.”

