Mark Zuckerberg has announced a significant policy change, ending Meta's content moderation review system, which had aimed to protect users from harmful content.
The new policy shifts responsibility for identifying harmful or misleading content onto users themselves, a move critics read as Meta stepping back from accountability.
Zuckerberg's decision appears timed to Donald Trump's political resurgence, suggesting a retreat from the company's earlier commitments to user safety and corporate responsibility.
Critics argue that the shift reflects not a failure of fact-checking itself but a reordering of priorities, with harm reduction demoted in favor of advertisers' interests.
As Meta pivots, questions remain about Zuckerberg's vision for the company amid rising political power and cultural insecurity in the current Trump era.