On Facebook

Meta CEO Mark Zuckerberg has decided that moderation on his platforms has become too biased and is shifting to a Community Notes model, the same approach used on X (formerly Twitter).

ABC: After Trump’s election win, Meta is firing fact checkers and making big changes

As mentioned in my first, and somewhat timely, post, this is not Meta's first foray into misconduct on its social media platforms. One such event revolved around Cambridge Analytica, which in 2018 was revealed to have been illegally harvesting the personal information of users of Meta's social networks to aid political campaigns, including the Trump 2016 presidential campaign.

Upon discovery, Meta took some surface-level actions to mitigate the damage to its corporate reputation: highlighting privacy to users and making their settings more obvious, rolling out GDPR compliance across its platform worldwide rather than just in the European Union, and raising staffing levels and standards for moderation and community standards enforcement.

To the surprise of absolutely no-one, now that enough time has passed and with more friends on the way in government circles, Zuckerberg has decided it's time to revert some of these policies. Fact-checkers are being removed, and content moderation staff are being relocated from California, with its high minimum wage, to Texas, where staff who develop PTSD after reviewing some of this content may be paid a princely minimum wage of $7.25 per hour. The stated reasoning: the existing moderators in blue-state California are too liberal, and the change reinforces freedom of speech.

This shift is especially troubling given reports of suppressed comments about Palestine since 2023, raising concerns about whose voices will be amplified and whose will be silenced under the new model.

Don’t get me wrong, I’m all for freedom of speech, but not at the expense of common courtesy and certainly not at the expense of facts.

The simple fact is that Facebook and its associated social media networks have largely changed from what they originally were, a place where opinions are swapped and shared, into a place where opinions are measured, shaped and governed.

This happens because we allow it to. Every post or comment we add to Meta's servers provides more content to be analysed, parsed and dissected, so that the news, posts and advertisements you see are tailored to you and your experience. After all, you don't have to think about things you already know or agree with.

Of course, this is not new information – we already know this.

And yet, there are still three billion people on Meta’s platforms.

Perhaps it’s time to rethink our participation in platforms that shape our discourse for profit.