Meta and Google Face Scrutiny for Targeting Teens with Instagram Ads
A quietly run ad campaign by tech giants Meta and Google is under close scrutiny. It targeted teenagers on YouTube with Instagram ads, in violation of Google's own policies.
The Financial Times has detailed how the two firms may have circumvented safeguards to reach a young audience, a move that has sparked controversy.
The campaign grew out of a collaboration between Meta and Google to serve Instagram ads on YouTube to 13- to 17-year-olds, which runs against Google's own policy forbidding personalized advertising to anyone under 18.
The ads targeted a segment labeled "unknown" inside Google's ad platform, a grouping of users whose age, gender, or other demographics are not known to the company.
The targeting reportedly took place in Canada earlier this year, with expansion to other countries planned.
According to the documents, the campaign was run with the support of Spark Foundry, a US-based advertising agency. It came at a time when Google's ad revenues were falling and young users were migrating from Meta to rival platforms such as TikTok.
According to a report by Quartz, Google investigated the matter following the outlet's inquiry and axed the project. In an email, Google told Quartz that the campaign was "small in nature." The company also said it has "thoroughly reviewed the allegations concerning the circumvention of our policies" and is taking "appropriate steps." It also announced plans to update its training so that sales representatives better understand the rules.
Following the newspaper's exposé, Google told the Financial Times, "We prohibit ads being personalised to people under 18, period."
The US Senate has just passed legislation designed to hold tech giants liable for a range of harms against minors. One such measure is COPPA 2.0, the Children and Teens' Online Privacy Protection Act, which bars targeted advertising to minors and forbids collecting their data without consent.
A separate bill, the Kids Online Safety Act, requires tech companies to design online platforms that protect young users from harms such as cyberbullying, sexual exploitation, and drug use. The fallout from the global outrage could also prompt regulatory action in India.