Facebook’s Betrayal of Progress in the Arab World
The irony of "fact-checking" and Facebook’s long-overdue turn toward progress
Mark Zuckerberg’s decision to end Facebook’s fact-checking program is, at its core, a welcome step. By declaring a renewed commitment to “free expression,” Zuckerberg acknowledges what many of us have known for years: the platform’s approach to misinformation and moderation has been deeply flawed. Yet, while this shift holds promise, it arrives far too late for those of us who have struggled against the limitations imposed by Facebook’s policies.
In the digital age, platforms like Facebook are not just communication tools—they are the arbiters of information, deciding which narratives are amplified and which are suppressed. At Ideas Beyond Borders (IBB), we recognize the immense influence these platforms hold. We built the largest independent pro-liberty multimedia content platform in the Middle East, providing an extensive array of original videos, podcasts, and a rich assortment of translated books and articles in the region’s languages. With a vibrant community of over 8.6 million followers, we are at the forefront of championing liberty, free enterprise, science, critical thinking, and advancing knowledge across the Middle East. In a region where access to unbiased information is often limited by censorship, extremism, and authoritarian narratives, young people are hungry for this information—a positive alternative to the regressive ideologies driving conflict in the region.
Our mission, through our distribution network Bayt Al Hikma 2.0 (House of Wisdom 2.0), is to spread these values and ideas across the Arab world. However, Facebook’s previous actions force us to question its commitment to free expression.
Facebook once claimed to combat falsehoods through "fact-checking" and misinformation policies. Yet its actions suggested otherwise. Despite a wealth of progressive voices in the Middle East, Facebook appointed just one representative from the region to its Oversight Board: an individual affiliated with the Muslim Brotherhood, a transnational extremist organization with branches around the world that seeks to impose theocratic governments across the Middle East.
This decision was not just baffling, but profoundly troubling. The Muslim Brotherhood is an organization widely criticized for promoting regressive ideologies, undermining democratic principles, and stifling freedom of expression. By giving such a figure a seat on the Oversight Board, Facebook effectively endorsed a worldview that runs counter to the very ideals it claims to protect. For an organization like IBB, which works tirelessly to combat extremism and promote open, critical discourse, this feels like a betrayal.
Our challenges didn’t stop there. Time and again, educational content aimed at fostering understanding and knowledge was flagged and removed without explanation. Videos on topics like Dubai's economic transformation, tributes to Nelson Mandela, or analyses of Lebanon’s economic crisis were labeled as political or election-related despite containing no such content. Even videos on basic economic principles, such as the economies of India and Saudi Arabia, were rejected. These rejections were upheld through review processes that lacked transparency, forcing us to navigate unnecessary roadblocks.
For example, a video about the Indian economy was rejected on the grounds that it was election-related, despite containing no mention of elections and being targeted at an Arab audience with no influence on Indian elections.
Similarly, other videos were rejected, including one about the development of Saudi Arabia, another about Dubai titled "From a Small Village to the World's Hub," and one on Lebanon's economic crisis. In most cases, the rejection came directly from Facebook, and when we submitted a request to review the decision, it was denied without any clear explanation. These obstacles significantly limited our ability to disseminate meaningful educational content and reach our intended audience.
Now, Zuckerberg’s announcement to end fact-checking—framed as a move away from “too many mistakes and too much censorship”—acknowledges some of the platform’s past failures. While we welcome this pivot, it is impossible to ignore the harm already done. The delay in addressing these issues has stymied progress, allowing misinformation and regressive ideologies to gain a foothold, while voices promoting reason and liberty have struggled to reach their audiences.
Facebook’s past moderation policies often seemed to ignore the realities of the region. By prioritizing voices tied to ideologies rejecting critical thought, they undermined the ideals that empower societies to thrive. This was not merely a policy failure but a failure of principle. Social media platforms should champion free inquiry and the exchange of ideas, yet Facebook’s actions have too often done the opposite.
The promise of a more user-driven moderation approach, inspired by Elon Musk’s changes at X, carries both potential and risk. While it may reduce accusations of bias, it must not become an excuse for abdicating responsibility. Platforms like Facebook have a profound influence on global discourse.
We hope Zuckerberg’s decision marks the beginning of a new chapter for Facebook, one that truly prioritizes free expression and progress. However, this shift must be accompanied by a candid acknowledgment of past missteps and a commitment to ensuring that all voices, especially those championing liberty and knowledge, can thrive. Without this, the promise of free expression risks becoming an empty platitude.
At Ideas Beyond Borders, our resolve remains steadfast. We believe in the power of unfettered access to knowledge and the tenacity of those who pursue freedom. With this change, we hope to finally share our work and ideas freely, reaching those who need them most. Yet, the irony persists: the tools meant to lead us to a freer, enlightened world sometimes threaten to drag us back into darkness. The Arab world deserves nothing less than platforms that empower progress and enlightenment. We hope Zuckerberg will make good on his promise and deliver a platform where ideas can flourish, promoting progress and peace at home and abroad.
I share your concern about how dangerous it is to put obviously biased or compromised figures in charge of “fact-checking,” and I respect the frustration you voice over Facebook’s murky, often self-defeating policies in the Middle East. But let’s not forget that when Facebook fails to moderate quickly and decisively, people die. The Rohingya Genocide is a tragic example of a platform turning a blind eye to hate speech, letting viral misinformation fuel mass violence. This is the frightening cost of inaction.
If Zuckerberg wants to pivot toward "community-driven" moderation, then it can't be limited to English. There must be robust detection of hate speech and incitement in every language, and real accountability for extremist content. Otherwise, we risk replaying the horrors of Myanmar wherever marginalized groups are scapegoated by hate campaigns.
You're right that fact-checking is no trivial job, especially when the gatekeepers have questionable affiliations. Yet if Facebook and other platforms shirk responsibility entirely, we know exactly how catastrophic that can be. Swift, sweeping moderation isn’t anti-free-speech; it’s about protecting real people from harm.