
Google, Meta Launched Secret Ad Campaign Targeting Teens: Report


Photo: Sheldon Cooper/SOPA Images/LightRocket (Getty Images)

Google and Meta ran a secret advertising campaign targeting teenagers in violation of Google’s own rules, the Financial Times reports. The YouTube ads were aimed at bringing more 13- to 17-year-olds to Instagram as TikTok’s dominance grows.

The companies planned to expand the ads overseas, but Google investigated and shut down the project after being approached by the Financial Times.

Google told Quartz that the campaign was “small in nature” but that it has “thoroughly reviewed allegations of circumvention of our policies” and is taking “appropriate action.” Specifically, the company said it will refresh its training to ensure that sales representatives understand they are prohibited from helping advertisers target sensitive audiences. Google also pointed to a history of company initiatives aimed at protecting children and teens online. Meta did not immediately respond to Quartz’s request for comment, but denied wrongdoing in statements to the Financial Times.

Why it matters

Meta has been under intense scrutiny for its failure to protect teenage users, and CEO Mark Zuckerberg publicly apologized for those failures at a Senate hearing in January. While ads designed to get teens to use Instagram are a far cry from the more extreme “exploitative” issues on Meta’s platforms, ads targeting teens can still cause harm and have been linked to negative health outcomes. And the industry as a whole is facing heat for profiting from advertising to children.

The U.S. Senate just passed legislation designed to hold tech giants accountable for any harm their platforms may cause to minors. One of the bills, the Children and Teens’ Online Privacy Protection Act, or COPPA 2.0, prohibits advertising directed at minors and collecting their data without consent. It also gives parents and children the option to delete their information from social media platforms.

The second bill, the Kids Online Safety Act, asks technology companies to design online platforms in ways that would mitigate or prevent harm to users, including cyberbullying, sexual exploitation, and drug use. The bill would require platforms to limit adult users’ ability to contact minors and to provide parental tools that let guardians manage their children’s privacy.
