
Danish Minister Criticizes Meta for Prioritizing Propaganda Over Child Protection
In a searing statement that is reverberating across Europe and the tech world, Denmark’s Minister of Digital Governance, Katrine Vestergaard, has publicly condemned Meta Platforms Inc. (formerly Facebook) for allegedly prioritizing political propaganda and engagement algorithms over the protection of children and young users on its platforms. Speaking at the 2025 Nordic Tech and Society Summit in Copenhagen on June 21, Minister Vestergaard described Meta's recent practices as "morally indefensible" and "technologically irresponsible," reigniting the heated debate over big tech accountability, digital child safety, and algorithmic transparency.
This marks one of the most direct and forceful criticisms from a sitting EU government official against the tech giant since the EU Digital Services Act (DSA) began applying to the largest online platforms in 2023. The Danish minister’s comments reflect growing dissatisfaction among European governments with Meta’s compliance with child protection guidelines, even as the company continues to expand its presence through newer platforms like Threads and the metaverse initiative Horizon Worlds.
Meta Under Fire: An Escalating Controversy
At the heart of Minister Vestergaard’s criticism is a set of internal documents recently leaked by a whistleblower from within Meta’s European Policy Division. The documents allegedly show that Meta’s content recommendation systems—particularly on Instagram and Facebook—are still amplifying politically charged content, misinformation, and viral outrage posts while failing to limit the exposure of harmful content to minors.
These revelations come just months after the release of an EU-mandated transparency report which found that despite Meta's promises, the company still lacks effective age verification systems, parental controls, and real-time moderation tools tailored to children’s digital wellbeing. According to Vestergaard, “Meta continues to design for maximum engagement, not for user safety, and certainly not for the mental health of our children.”
The Danish Ministry of Digital Governance also revealed preliminary findings from a national audit, conducted in collaboration with academic researchers at Aarhus University, which found that children aged 11–16 in Denmark were still being exposed to graphic content, cyberbullying, body image pressures, and disinformation at alarming rates on Meta’s platforms.
The EU’s Pushback Against Big Tech
While Denmark's stance is drawing global attention, it is far from isolated. The EU has increasingly stepped into a regulatory leadership role regarding digital ethics, with laws like the Digital Services Act (DSA) and Digital Markets Act (DMA) setting clear expectations for how online platforms must operate. Under these regulations, platforms like Facebook and Instagram are classified as Very Large Online Platforms (VLOPs) and must adhere to stringent safety, transparency, and moderation standards.
The DSA mandates regular independent audits, algorithmic transparency, and swift removal of harmful or illegal content. However, the Danish government argues that Meta’s compliance has been largely superficial, often focusing on public relations efforts rather than systemic reform. “Meta knows what it’s doing. It’s making calculated choices that favor profit over protection,” Vestergaard said.
She also warned that Denmark may consider national-level sanctions or restrictive legislation if Meta fails to demonstrate real progress in the next reporting cycle due in October 2025. Denmark is also pushing for the EU Commission to invoke Article 66 of the DSA, which allows emergency measures against a platform when there is a serious risk to public security or the rights of users—particularly minors.
Child Protection vs. Profit-Driven Algorithms
One of the key concerns raised by Danish authorities is Meta’s reliance on engagement-driven algorithms that disproportionately promote content based on virality rather than value or safety. These algorithms, often powered by machine learning models trained on attention metrics, have been found to exploit user behavior, especially among young users who are more susceptible to compulsive scrolling, validation-seeking, and peer pressure.
Critics argue that Meta’s algorithmic design inherently favors divisive content, political echo chambers, and emotional outrage because these types of content generate more likes, shares, and comments—key indicators for ad revenue. The consequence is a digital ecosystem that undermines mental health, stokes anxiety, and facilitates online radicalization.
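To make that incentive concrete, here is a deliberately simplified sketch, written in Python purely for illustration. It is not Meta’s actual ranking code; every signal, weight, and post in it is hypothetical. It shows how an objective built only on predicted engagement can surface risky content that a safety-adjusted objective would demote.

```python
# Illustrative sketch only -- not Meta's actual ranking system.
# All signals, weights, and posts below are hypothetical.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_clicks: float   # hypothetical engagement signal
    predicted_shares: float   # hypothetical engagement signal
    harm_risk: float          # hypothetical safety score in [0, 1]

def engagement_score(post: Post) -> float:
    """Rank purely on predicted attention metrics (ad-revenue proxy)."""
    return post.predicted_clicks + 2.0 * post.predicted_shares

def safety_adjusted_score(post: Post) -> float:
    """Same signal, but heavily penalize content flagged as high-risk."""
    return engagement_score(post) * (1.0 - post.harm_risk) ** 2

posts = [
    Post("Outrage bait", predicted_clicks=9.0, predicted_shares=8.0, harm_risk=0.8),
    Post("Benign update", predicted_clicks=5.0, predicted_shares=2.0, harm_risk=0.05),
]

# The engagement-only ranking puts the risky post first;
# the safety-adjusted ranking demotes it.
print(sorted(posts, key=engagement_score, reverse=True)[0].title)       # Outrage bait
print(sorted(posts, key=safety_adjusted_score, reverse=True)[0].title)  # Benign update
```

The point of the toy example is the objective function, not the numbers: as long as likes, shares, and comments are the quantity being maximized, provocative content wins by construction unless safety is an explicit term in the score.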
In a 2024 internal memo leaked to investigative journalists at Politiken, Meta’s internal ethics team reportedly warned leadership that the company’s youth engagement strategy was dangerously close to replicating “tobacco industry tactics,” relying on psychological manipulation rather than transparent value delivery.
Meta's Response: Corporate Doublespeak?
Meta has not remained silent. In a statement released to media outlets shortly after Vestergaard’s address, a Meta spokesperson claimed the company was “committed to ensuring online safety for young people” and “continually investing in technology, partnerships, and policy frameworks that prioritize digital wellbeing.”
However, critics have called this statement vague and evasive. Danish child advocacy groups such as Digitalt Tryg Børn and SafeScreen DK argue that Meta’s measures remain insufficient and poorly enforced. Despite features like Instagram’s “Take a Break” notifications and content filters, implementation gaps, missing default safety settings, and opaque moderation policies persist.
Meta’s track record on these issues is also under scrutiny following recent lawsuits in the United States where parents of teenagers alleged that Instagram knowingly contributed to mental health disorders such as eating disorders, depression, and anxiety. These legal challenges have only strengthened Denmark’s and other nations’ resolve to act decisively.
The Bigger Picture: A Call for Ethical Design
Minister Vestergaard’s remarks resonate with a broader international movement calling for ethical tech design, particularly in platforms used by vulnerable populations like children and adolescents. She emphasized that the issue isn’t just about moderation—it’s about reimagining the architecture of online platforms to protect fundamental human rights, especially in digital spaces.
Digital ethics experts have long advocated for “safety by design” principles, including the following (see the sketch after the list):
- Age-appropriate design codes
- Default private settings for minors
- Transparency in algorithmic recommendations
- Robust and accessible reporting mechanisms
- Human oversight in content moderation
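As a minimal sketch of what “default private settings for minors” could mean in practice, consider the following Python snippet. It references no real platform API; the account fields and option values are hypothetical, and the only point is that the strictest configuration is the default when the user is a minor, rather than something a child must opt into.

```python
# Hypothetical "safety by design" defaults -- illustrative only,
# not based on any real platform's API or settings schema.

from dataclasses import dataclass

@dataclass
class AccountSettings:
    profile_private: bool
    direct_messages_from: str        # "everyone" | "followers" | "no_one"
    recommendation_transparency: bool
    sensitive_content_filter: str    # "off" | "standard" | "strict"

def default_settings(age: int) -> AccountSettings:
    """Apply the strictest defaults when the user is a minor."""
    if age < 18:
        return AccountSettings(
            profile_private=True,
            direct_messages_from="no_one",
            recommendation_transparency=True,
            sensitive_content_filter="strict",
        )
    return AccountSettings(
        profile_private=False,
        direct_messages_from="followers",
        recommendation_transparency=True,
        sensitive_content_filter="standard",
    )

print(default_settings(14))  # minor: private profile, strict filters by default
```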
Denmark is now urging the EU to standardize these safety features across all member states and treat child digital safety as a core public health issue, not just a policy afterthought.
What Comes Next?
The next few months will be critical in determining the trajectory of EU–Meta relations. With mounting pressure from EU lawmakers, watchdog organizations, and national governments like Denmark, Meta may soon face tighter regulatory restrictions and substantial financial penalties if it does not show measurable improvements in its safety protocols.
Meanwhile, Denmark is also working with other Nordic countries to draft a Nordic Digital Ethics Accord, aimed at creating unified safety standards for online platforms, digital education reforms, and youth empowerment programs.
Minister Vestergaard closed her speech with a stark warning: “If platforms like Meta continue to choose profit over protection, then governments will have no choice but to act. We will not stand idly by while our children are exploited by opaque algorithms and corporate negligence.”
Final Thoughts: Public Pressure as a Catalyst for Change
As this story unfolds, it becomes increasingly clear that public scrutiny and political accountability are vital in shaping a safer digital future. Parents, educators, lawmakers, and tech workers alike must join the call for ethical digital governance. The Danish Minister's bold stance offers a template for other nations grappling with the same challenges: how to balance technological innovation with fundamental rights and child safety in an increasingly connected world.
It’s no longer enough for platforms like Meta to release public statements and self-regulate. The future of digital trust—and the mental health of the next generation—depends on real, enforceable change.