European regulators are intensifying scrutiny over a now-terminated secret advertising partnership between Google and Meta Platforms that reportedly violated Google’s own policies for protecting minors online, according to a Financial Times report on Tuesday.
The partnership, revealed in an August investigation by the Financial Times, involved a marketing campaign targeting 13- to 17-year-old users of YouTube, owned by Google-parent Alphabet. The initiative aimed to promote Instagram, Meta’s photo-sharing platform, but ran counter to Google’s restrictions on ad personalization for minors.
Regulatory Response
The partnership, which expanded to the U.S. and was intended to go global, has since been canceled. However, European Commission officials have been investigating the arrangement. In October, they directed Alphabet’s lawyers to compile and review internal communications, presentations, and other materials related to the ad campaigns.
The digital advertising partnership between Google and Meta raises serious ethical and regulatory concerns, highlighting troubling contradictions in both companies’ public commitments to safeguarding minors online. While the partnership has been scrapped, its implications warrant scrutiny and criticism of both companies.
Google’s Contradictions
Google has long touted its “industry-leading” safeguards to protect teens, including prohibiting ad personalization for users under 18. However, the secret collaboration with Meta targeting 13- to 17-year-olds directly undermines these claims. This raises questions about whether Google’s stated policies are genuinely enforced or serve merely as PR tools. The involvement of YouTube, a platform frequented by teens, further complicates the narrative of Google as a responsible guardian of youth online.
Even as Google claims to have updated internal training and reinforced its safeguards, the fact that such a partnership was allowed to take shape suggests systemic lapses in governance and accountability within the company. If Google cannot enforce its own policies, how can it assure regulators and the public of its commitment to ethical practices?
Meta’s Responsibility
Meta, already under fire for the negative effects of its platforms on young users, has once again shown a troubling willingness to prioritize profit over ethics. Despite rolling out enhanced privacy and parental controls earlier this year, this partnership undermines Meta’s credibility. It suggests that the company continues to see teens as a lucrative demographic, even at the expense of their privacy and well-being.
Targeting teens with tailored advertising is particularly concerning given the well-documented vulnerabilities of younger users to manipulative marketing tactics. This partnership exposes Meta’s inability — or unwillingness — to reconcile its profit-driven motives with the duty to protect its youngest users.
The fact that this collaboration extended beyond Europe to the U.S. and was poised for global rollout demonstrates a reckless disregard for regulatory frameworks and public trust. It also highlights the inadequacy of current regulatory measures in holding tech giants accountable.
Both companies need to be held accountable not only by regulators but also by the public. This incident underscores the need for stricter oversight and transparent enforcement mechanisms to ensure that minors’ rights and safety are not subordinated to corporate interests. European regulators must act decisively, and similar scrutiny should follow in the U.S. and other jurisdictions.
If tech giants like Google and Meta cannot self-regulate, it is imperative for governments to step in and impose rigorous checks to prevent the exploitation of vulnerable demographics for profit.
Baburajan Kizhakedath