Facebook apps responsible for nearly 50% of all online CHILD GROOMING cases during UK lockdown in 2020 – report

In a new report, the National Society for the Prevention of Cruelty to Children (NSPCC) noted that the various Facebook-owned platforms – including Instagram, WhatsApp and Messenger – accounted for nearly half of some 5,441 ‘sexual communication with a child’ offences recorded by police since April 2020.

However, the charity said the actual scale of online grooming was “likely to be higher” as a result of “tech failures” by the social media giant that resulted in a drop in the removal of abuse material during the 12-month period ending in March 2021.

According to data from 42 police forces across England and Wales, Instagram was the most common site used by groomers. It was flagged by police in 32% of the crimes where a platform was identified. Since 2017, the number of cases linked to the picture- and video-sharing site has almost doubled, police data showed.

Meanwhile, Snapchat was the second most flagged platform, linked to a quarter of the cases where a platform was identified. In all, the “big four” – Facebook’s apps plus Snapchat – were responsible for nearly 75% of all cases where the platform used for grooming was known to police.

Noting that “online child abuse is inherently preventable,” NSPCC child safety online policy head Andy Burrows told The Herald newspaper that the high figures were caused by the “inaction of social media firms” and their adoption of a “piecemeal approach … instead of taking proactive steps to make sure that their sites are safe.”

As an example of the “far easier” ways for offenders to “contact and exploit children,” Burrows noted that groomers are able to simply “refresh the page” on some of the worst-performing platforms to “get a fresh list of children to contact as a result of the site algorithmically recommending them.”

In response, Facebook said it “works quickly to find, remove and report” this “abhorrent behaviour.” It claimed that changes were made earlier this year to “block adults from messaging under-18s they are not connected with” and said it had “introduced technology that makes it harder for potentially suspicious accounts to find young people.”

Although the tech firm said it scans images and videos on Instagram and Facebook to flag exploitative material so that it can be removed, the NSPCC said that, over the last six months of 2020, the company had “removed less than half” as much child abuse content as it had previously.

According to Burrows, this meant less “actionable intelligence” was passed to police during the “perfect storm” of a pandemic – at a time when children were online more than ever before. The NSPCC also called on Facebook to ensure its end-to-end encryption tech does not “compromise” child-protection tools.

Despite safety measures announced recently by Facebook, Apple and other firms, the charity said the platforms were “playing catch up” due to “historically poorly designed sites that fail to protect young users” – even though sending sexual messages to children has been a crime since 2015.

It said this showed the importance of the draft Online Safety Bill – to be considered by a parliamentary committee next month – which would hold “named managers personally liable for design choices that put children at risk.”
