Citing a desire for more free speech, Meta CEO Mark Zuckerberg has announced significant changes to the company's moderation policies. These changes will impact content across Facebook, Instagram, and Threads, and could present new challenges for marketers and their media agencies, especially in Canada.
It is likely more than a coincidence that this move, one that stands to win Zuckerberg favour with Donald Trump, follows a series of high-profile court cases addressing content moderation, privacy, and antitrust in the U.S. This includes the FTC v. Meta case, which has the potential to break up Zuckerberg's platforms.
With Meta finally ending its third-party fact-checking program and replacing it with a community-driven system similar to the Community Notes system at X (introduced by Trump bestie Elon Musk), we are going to have massive issues across its platforms, with advertising nudging up against all kinds of unmoderated, uncontrolled, and unfortunate content and "information."
Community Notes are written and rated by anonymous users who sign up to label posts. These labels offer fact-checks, context, or additional information to clarify or correct misleading content.
A community-moderated platform allows for more inaccuracy, bias, manipulation, and abuse of its rankings. Community Notes has had mixed reviews on effectiveness, culminating in a series of reports this past fall suggesting it is enabling election manipulation.
Misinformation spreads incredibly quickly on social channels, and community fact-checking like Community Notes simply can't keep up. In the fall of 2024, the Center for Countering Digital Hate (CCDH) reported finding on X "283 misleading posts about the 2024 US elections with proposed Community Notes, with a total of 2.9 billion views."
On Meta, the risk of unchecked content is massive: Zuckerberg's platforms have a combined three billion active users, compared to 350 million on X. Imagine how quickly a lie will spread on Meta, and how people's perceptions of the truth can be altered. Imagine.
Meta has had a complex history with content moderation as its platforms have grown in content volume and global adoption. In the beginning (circa 2004 to 2010), Facebook had basic community guidelines to prevent hate speech, illegal content, and abuse. Since then, it has become clear that Facebook couldn't keep up with the scale and growing demands of content moderation, controls and restrictions.
Then came a series of events that created headlines for Facebook around misinformation, electoral manipulation, hate crimes and abuse.
These were headlines that could not be ignored, and the company struggled to respond. It all came to a head during the 2016 U.S. election, when it became clear the platform had been manipulated to spread misinformation and disinformation, concerns confirmed by the 2018 Cambridge Analytica bombshell, which revealed how user data was harvested to target and manipulate voters. Not long after, the true horror of the power and reach of hate speech on Facebook was laid bare in the investigation of the 2019 Christchurch mosque shootings.
Meta has also censored professionally reported news and legitimate journalism in Canada, leaving space for more rumours, gossip and opinion masquerading as fact and news.
We have seen the real impact here in Canada, with Russian misinformation campaigns and a growing polarization of political views.
All of this is to say the digital and social landscape, the context in which advertising appears, hasn't been getting better. And with Zuckerberg's decision also revising policies around political content, reversing prior efforts to limit political posts in users' feeds, it is going to get worse.
Despite the growth of deceptive content, advertising spending on Meta continues to climb.
Brand leaders have done risk assessments, created inclusion and exclusion lists, pulled agency programmatic teams into crisis meetings over an ad placed on Rebel Media or Breitbart, and built playbooks for brand safety. Yet dollars continue to flow into the largest platform of misinformation and disinformation, one that is changing our society by the minute and tampering with democratic systems.
In Canada, Meta fired its entire agency support team, yet marketers continue to spend on the platform. Since 2019, Zuckerberg has been asked repeatedly to appear in Canada in front of a federal committee, and has never shown up. There are real concerns about the harm being done to Canada and Canadians because of Meta’s decisions and actions, but it has proven it is not interested in acknowledging those concerns. Canadian advertisers should keep this in mind when deciding where to spend their media budgets.
I have worked with brands that removed advertising from Meta's platforms and saw no decline in sales. In other cases, I have argued against increased spending on Meta because it didn't generate a better return on investment.
Yet here we are, continuing to fund what is now the largest misinformation engine in the world, one that clearly doesn't care for facts.
We need to recalibrate and assess what works in advertising, not just where we can spend dollars. In fact, I have yet to see a media mix model produced for a client that said they were investing the right amount in Meta; it has always been too much. And even with that counsel, nothing changes, because it is easy to spend money with Meta.
As a society, we wouldn't tolerate a mainstream, legitimate media publication in Canada doing what Meta has been doing: allowing disinformation to go unchecked and spread as fact.
If, in this article, I had substituted any Canadian publisher for Meta, you would most likely be outraged. Still, here we are, suggesting that the scale and value of Meta for advertising outweigh any societal impact of this decision. It is time for our industry to check the facts.
Sarah Thompson is a Toronto-based media expert who has held executive leadership roles at a number of media agencies.