Meta Platforms Inc., the parent company of Facebook, Instagram, and WhatsApp, has deactivated 63,000 accounts affiliated with the notorious “Yahoo Boys” scam organisation, according to the company’s Q1 2024 Adversarial Threat Report released on Wednesday.
The accounts, which were removed in recent weeks, were used to conduct financial sextortion scams and distribute blackmail scripts.
Meta revealed that a smaller, coordinated network of around 2,500 of those accounts, linked to a group of roughly 20 individuals, predominantly targeted adult men in the United States using fake identities.
Meta said it identified and disabled the accounts using a combination of new technical signals and in-depth investigations, work that also helped improve its automated detection systems.
“Financial sextortion is a borderless crime, fueled in recent years by the increased activity of Yahoo Boys, loosely organised cybercriminals operating largely out of Nigeria that specialise in different types of scams.
“We have removed around 63,000 accounts in Nigeria attempting to target people with financial sextortion scams, including a coordinated network of around 2,500 accounts.”
“We have also removed a set of Facebook accounts, pages, and groups run by Yahoo Boys—banned under our dangerous organisations and individuals policy—that were attempting to organise, recruit, and train new scammers,” the company explained.
During the investigation, Meta found that the majority of the fraudsters’ attempts were unsuccessful; where attempts targeted minors, the company reported them to the National Center for Missing and Exploited Children (NCMEC).
Meta said it also shared information with other tech companies through the Tech Coalition’s Lantern programme to help combat such scams across platforms.
Furthermore, the Facebook parent company said it removed approximately 7,200 assets in Nigeria, comprising 1,300 Facebook accounts, 200 pages, and 5,700 groups that provided scam-related material.
These assets were found to be selling scam scripts and guides, and sharing links to collections of photos used to populate fake accounts, according to the report.
Since the disruption, Meta said its systems have been actively blocking attempts by these groups to return, further strengthening its detection capabilities.
The company also said it has been working closely with law enforcement, supporting investigations and prosecutions by responding to legal requests and alerting authorities to potential threats.
The social media company indicated that its actions went beyond account deactivation.
“We also fund and support NCMEC and the International Justice Mission to run Project Boost, a programme that trains law enforcement agencies around the world in processing and acting on NCMEC reports.
“We’ve conducted several training sessions so far, including in Nigeria and Cote d’Ivoire, with our most recent session taking place just last month,” the firm revealed.
To protect users, particularly teenagers, Meta said it has introduced stricter default messaging settings for users under 16 (under 18 in some countries) and displays safety notices to encourage cautious behaviour online.
Separately, Nigeria’s Federal Competition and Consumer Protection Commission this week fined Meta $220 million for multiple violations of data protection regulations relating to WhatsApp.
The investigation, which began in May 2021, found that Meta’s privacy practices violated consumers’ rights, including through unauthorised data sharing and discriminatory practices.
Meta has said it intends to challenge the ruling, stating that it disagrees with both the findings and the penalty imposed.