WhatsApp is being used to distribute child porn via third party apps that have received funding from Google and Facebook's advertising networks.
That's the finding of an investigation into the growing problem of illegal images of children on the instant messaging service.
Paedophiles have been abusing the protection offered by the app's data encryption to spread vile images of abuse on private groups.
These groups can be discovered through specially created third-party software, downloadable from app stores, which offenders use to seek them out.
Now it has been discovered that these third-party apps were receiving funding via Google and Facebook's advertising networks.
Facebook, Google and WhatsApp say they are aware of the problem of child porn and are actively working to tackle its spread and to stop people making money from it via ads.
Steps already taken include the removal from Google Play of third-party apps that had been flagged for hosting illegal content.
In a series of in-depth reports for TechCrunch, Josh Constine has revealed the extent of the issue, uncovered with the help of a number of organisations.
These include Israeli startup firm AntiToxin, which uses algorithms to protect people from what it terms 'Online Toxicity' - particularly children from harassment, bullying, predatory behaviour and sexually explicit activity.
AntiToxin studied some of the apps that hosted links to child porn sharing rings on WhatsApp that had been removed from Google Play.
It uncovered the funding connection with two of the world's largest tech companies, via their advertising services.
Six of these apps ran Google AdMob, one ran Google Firebase, two ran Facebook Audience Network and one ran StartApp, a firm specialising in online ads.
These networks earned a cut of the money from adverts hosted on the third-party apps used to search for illegal images.
Adverts featured some big brand names - including Amazon, Microsoft, Motorola, Sprint, Sprite, Western Union, Dyson, DJI, Gett, Yandex Music, Q Link Wireless, Tik Tok and more - according to TechCrunch.
WhatsApp says it does not provide a search function for people or groups and does not encourage publication of invite links to private groups.
It also says it works closely with Google and Apple to ensure that their terms of service are enforced against apps that attempt to encourage abuse on WhatsApp.
'WhatsApp has a zero-tolerance policy around child sexual abuse,' said a spokesman for the firm.
'We deploy our most advanced technology, including artificial intelligence to scan profile photos and actively ban accounts suspected of sharing this vile content.
'We also respond to law enforcement requests around the world and immediately report abuse to the National Center for Missing and Exploited Children.
'Sadly, because both app stores and communications services are being misused to spread abusive content, technology companies must work together to stop it.'
Facebook says its apps are automatically disabled from Audience Network when they are removed from the Google Play Store.
This in turn ends monetisation for any users that still have the app, cutting off funding for its creators.
Facebook has also run a scan of apps supported by the Audience Network and has removed support for WhatsApp group sharing apps - even if they are still under review by Google.
A spokesman for Facebook said: 'Facebook Audience Network is committed to building a safe and quality environment for our partners - we do not support the monetization of harmful content.
'We disable apps and withhold revenue from accounts found to be in violation of our policies or that are removed from third party app stores.'
Google says it is following its own policies for terminating illegal accounts and refunding advertisers where appropriate.
A spokesperson added: 'Google has a zero tolerance approach to child sexual abuse material and we thoroughly investigate any claims of this kind.
'As soon as we became aware of these WhatsApp group link apps using our services, we removed them from the Play store and stopped ads.
'These apps earned very little ad revenue and we’re terminating these accounts and refunding advertisers in accordance with our policies.'
The true scale of the problem emerged after Israeli researchers warned Facebook - the owner of WhatsApp - in September how easy it was to find and join group chats dedicated to the spread of child pornography.
In some cases, up to 256 people were sharing sexual images and videos of children, according to reports in the Financial Times (FT).
The matter escalated in recent weeks when two charities in Israel dedicated to online safety, Netivei Reshet and Screensaverz, made their findings public.
Over the course of several months they discovered group identifiers - which were not encrypted and could be viewed publicly - advertising the disturbing contents of said groups.
Speaking to the FT on December 20, Netivei Reshet’s Yona Pressburger said: 'It is a disaster: this sort of material was once mostly found on the darknet, but now it’s on WhatsApp.'