- As per a new report from Ekō, a nonprofit watchdog organisation, Meta approved 14 highly inflammatory ads between May 8-13
- The ads called for violent uprisings targeting Muslim minorities and disseminated ‘blatant’ disinformation exploiting communal and religious conspiracy theories
- The adverts were created and submitted to Meta’s ad library, which contains all adverts on Facebook and Instagram
Meta’s advertising policies are under intense scrutiny once again as a watchdog group reports that the company approved over a dozen “highly inflammatory” ads targeting Indian audiences. These ads, which appeared on Facebook and Instagram, spread disinformation, incited religious violence, and promoted conspiracy theories related to the ongoing general elections.
As per a new report by Ekō, a nonprofit watchdog, between May 8-13, 2024, Meta approved 14 highly inflammatory ads. The ads were placed in English, Hindi, Bengali, Gujarati, and Kannada.
These ads called for violent uprisings targeting Muslim minorities, disseminated ‘blatant’ disinformation exploiting communal or religious conspiracy theories prevalent in India’s political landscape, and incited violence through Hindu supremacist narratives, the report said.
One approved ad also mimicked the messaging of a recently doctored video of Home Minister Amit Shah threatening to remove affirmative action policies for oppressed caste groups, a video that has led to notices and arrests of functionaries from parties opposing the BJP.
In addition, each ad was accompanied by AI-generated images. Meta’s systems also failed to block the researchers from posting political and incendiary ads during the election “silence period”. Setting up Facebook accounts was simple enough that the researchers could post these ads from outside India, the report added.
The adverts were created and submitted to Meta’s ad library, which contains all adverts on Facebook and Instagram, by India Civil Watch International (ICWI) and Ekō, a corporate accountability organization. The purpose was to test Meta’s ability to detect and block political content that could be inflammatory or harmful during India’s six-week election.
In total, 14 out of 22 ads were approved by Meta within 24 hours; all of the approved ads broke Meta’s own policies on hate speech and misinformation, among other violations.
Several ads targeted parties opposing the BJP with messaging about their alleged favouritism towards Muslims. Other ads played on fears of India being swarmed by Muslim “invaders”. One ad even claimed that Muslims had attacked a Ram temple and called for violent retaliation.
Even as the incendiary ads were approved, five other creatives were rejected for breaking Meta’s Community Standards on hate speech and on violence and incitement. An additional three ads were rejected on the grounds that they might qualify as social issue, electoral or political ads, not on the basis of hate speech, incitement to violence, or spreading disinformation.
Inc42 has reached out to Meta. The story will be updated based on their response.
Facebook India Online Services, the Indian arm of Meta, reported gross advertisement revenues of INR 18,308 Cr in FY23, up 13% year-on-year from INR 16,189 Cr in FY22. In FY23, the company’s net profit surged by 19% to INR 352 Cr from INR 297 Cr in FY22.
Interestingly, before India’s General Elections kicked off on April 19, Meta said it would continue efforts “to limit misinformation, remove voter interference, and enhance transparency and accountability on our platforms to support free and fair elections”.
The tech major also started actively blocking general election-related queries on Meta AI, its AI chatbot currently under trial, restricting certain election-related keywords during the ongoing test phase.