Meta Content Moderation Changes: What to Know

On January 7, 2025, Meta announced significant changes to its approach to content moderation and fact-checking in the U.S. These changes include:
  • Phasing out third-party fact-checking and adopting a “community-driven system similar to X’s Community Notes.”
  • Reversing its 2021 decision to reduce political content that appears in feeds.
  • Focusing on removing only “high-severity violations” and illegal content (including terrorism, child sexual exploitation, drugs, fraud, and scams). Meta will stop proactively scanning for hate speech and other lower-severity rule-breaking and will instead enforce those rules only on content reported by users.


What this means for our clients 

  • While Meta implements and fine-tunes this system over the next year, organizations should expect a period of uncertainty and ongoing changes in content moderation.
  • Meta’s decision to eliminate its algorithmic downranking of political content will likely increase the reach and visibility of political content on Meta platforms. 
  • Any Meta user will be able to add a Community Note to content you share, including users who are not knowledgeable about your issue areas or who disagree with your organization’s viewpoints. This will necessitate active community management, potentially including employee and supporter training and engagement, for organizations that wish to influence Community Notes on their own posts or those of their competitors.
  • Less content will be removed from the platform for rule violations. In the announcement, Zuckerberg acknowledges: “This is a trade off. It means we’re going to catch less bad stuff, but we’ll also reduce the number of innocent people’s posts and accounts that we accidentally take down.”

Recommendations 

Meta’s changes represent a full embrace of a community-based, “hands-off” approach to content moderation. That, coupled with its decision to eliminate downranking for political content, will have an outsized impact on the experience on Meta’s platforms in the months to come. 

Some immediate next steps that organizations should take:

  • Invest in increased social listening and online monitoring: If you do not already have regular social monitoring, now is the time to set it up. This includes monitoring engagement on your own content and across the broader digital ecosystem. 
    • Understand the conversation: Establish a baseline for the level and tenor of online discussion about your issue or brand, so that you can spot significant spikes or shifts in the months to come.
    • Regularly review your comments and engagements: Ensure there are processes in place to regularly review and track the quantity, quality, and tone of the engagements your content receives.
    • Identify parameters for when an online conversation warrants a response from your team or community: These parameters may look different for every organization, but it’s important to establish where the line is for you; the monitoring sketch after this list shows one simple way to codify a spike threshold.
  • Develop a Community Notes response strategy: Community moderation means you need a rapid response plan. 
    • Designate at least one member of your team to respond to Community Notes: This person or team should be empowered to quickly request reviews if your posts are flagged.
    • Create rapid response templates: When possible, anticipate notes or flags you expect to receive from certain audiences and develop tools and templates your team can act on quickly.  
    • Develop a plan for how to engage your online community: Community Notes offers the opportunity for your online audience to engage directly as a voice of support for your issue or perspective. Explore ways to empower your community to engage with content directly on your behalf and consider adopting tools like Broadcast Channels that allow you to communicate directly with your audience.  
  • Proactively tell your story and grow your audience of supporters with an always-on approach to communications: In this new era of community moderation, it will be more critical than ever for organizations to adopt a proactive, always-on approach that persuades audiences and keeps their values, messages, and beliefs at the forefront of conversations. 
    • Take advantage of increased reach: Advocacy and political organizations may see increased reach on Meta platforms in the months to come. Take advantage of it.  
    • Make sure you can stand behind what you are sharing: When sharing creative assets, videos, or text, clearly distinguish quotes and opinions from statements of fact to avoid being flagged with a Community Note.
    • Don’t be afraid to get in the weeds: Staying ahead of Meta’s new era of content moderation means getting your hands dirty — frequently responding to people in the comments, shutting down any misleading Community Notes, and only posting content that you are prepared to back up.
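To make the “significant spikes” idea above concrete, here is a minimal sketch of a daily spike check. It assumes you can export one engagement count per day (comments, mentions, or shares) from whatever monitoring tool you use; the 14-day baseline window, the 3-sigma threshold, and the find_spikes name are illustrative assumptions, not Meta features or recommendations.

```python
# Minimal spike detector for daily engagement counts (comments, mentions,
# shares). Assumes one count per day exported from your monitoring tool;
# the 14-day baseline window and 3-sigma threshold are illustrative.

from statistics import mean, stdev

def find_spikes(daily_counts, window=14, threshold=3.0):
    """Return indices of days whose count sits more than `threshold`
    standard deviations above the trailing `window`-day baseline."""
    spikes = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            sigma = 1.0  # avoid divide-by-zero on a perfectly flat baseline
        if (daily_counts[i] - mu) / sigma > threshold:
            spikes.append(i)
    return spikes

# Example: steady chatter, then a sudden jump on the last day.
counts = [40, 38, 45, 42, 39, 41, 44, 40, 43, 37,
          42, 41, 39, 44, 43, 40, 210]
print(find_spikes(counts))  # -> [16]
```

In practice you would tune the window and threshold against your own baseline and route flagged days to whoever owns your response plan.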

Further Background 

Details and timeline: These changes will affect Meta’s social platforms: Facebook, Instagram, and Threads. As of now, the changes will be rolled out in the U.S. only, and Meta’s fact-checking program will continue as is in places with more stringent regulation, like the EU. Meta said it would begin phasing in Community Notes “over the next couple of months and improve it over the year.”

X / Twitter’s Community Notes as a model: In 2021, Twitter (now X) introduced the feature, originally called Birdwatch and renamed Community Notes in 2022, as a way for users to provide additional context to potentially misleading posts. 

  • Eligibility criteria: Any X user can sign up to be a Community Notes contributor if they meet the eligibility criteria (no recent platform violations, X user for >6 months, and a verified phone number).
  • Editing Community Notes: Notes cannot be edited or modified by X’s teams, and a post with a Community Note will not be labeled, removed, or addressed by X unless it is found to violate the X Rules, Terms of Service, or Privacy Policy. Contributors also rate how helpful they think a note is; a note only becomes publicly visible once it has been rated helpful by contributors who have tended to disagree in their past ratings, and if the author of a post disagrees with a Community Note that has been added, they can request an additional review. A simplified sketch of this “bridging” rating logic follows.
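For readers who want intuition for how notes get surfaced, here is a deliberately simplified sketch of the bridging idea. X’s actual open-source algorithm uses matrix factorization over the full rating history; the two-group labels, the note_is_helpful name, and the thresholds below are hypothetical simplifications for illustration only.

```python
# Simplified illustration of "bridging-based" note rating: a note is
# surfaced only when raters from differing perspectives find it helpful.
# This is NOT X's production algorithm (which uses matrix factorization);
# the grouping and thresholds below are hypothetical.

from collections import defaultdict

def note_is_helpful(ratings, min_ratings=5, min_rate_per_group=0.6):
    """ratings: list of (rater_group, rated_helpful) tuples, where
    rater_group is a coarse label such as "A" or "B" inferred from a
    rater's past rating history."""
    if len(ratings) < min_ratings:
        return False  # not enough signal yet
    by_group = defaultdict(list)
    for group, helpful in ratings:
        by_group[group].append(helpful)
    if len(by_group) < 2:
        return False  # require agreement across perspectives, not within one
    # Every perspective group must independently find the note helpful.
    return all(sum(votes) / len(votes) >= min_rate_per_group
               for votes in by_group.values())

# Helpful across both groups -> shown
print(note_is_helpful([("A", True), ("A", True), ("B", True),
                       ("B", True), ("B", False)]))   # True
# Only one group finds it helpful -> not shown
print(note_is_helpful([("A", True), ("A", True), ("A", True),
                       ("B", False), ("B", False)]))  # False
```

The property this preserves is that unanimous support from a single perspective is not enough; a note must earn agreement across perspectives before it appears.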

Meta’s existing fact-checking program: Currently, Meta works directly with independent fact-checkers certified by the International Fact-Checking Network (IFCN) to review and flag content across its platforms. The fact-checkers do not remove potentially misleading content, but they do mark it as false, which reduces its distribution on the platforms and limits how many people see inaccurate or misleading information. Under the new policy, Meta will maintain its Trust and Safety and Content Moderation teams; however, they will be moved from California to Texas in an effort to “build trust to do this work in places where there is less concern about the bias of our [Meta’s] teams.”

What This Looks Like on Meta vs X: While X’s Community Notes can provide an idea of how this program will look on Meta’s platforms, X’s user base is smaller than Meta’s and remains populated with subject-matter experts, journalists, and users who are invested in keeping X a verifiable source of news. It remains to be seen how Meta’s users will engage with the Community Notes feature and what exact criteria Meta will implement for its version of this policy.
