Facebook has repeatedly allowed world leaders to use its platform to deceive the public or harass opponents despite being alerted to evidence of the wrongdoing.
The Guardian has seen extensive internal documentation showing how Facebook handled more than 30 cases across 25 countries of politically manipulative behavior that was proactively detected by company staff.

The investigation shows how Facebook has allowed major abuses of its platform in poor, small and non-western countries in order to prioritize addressing abuses that attract media attention or affect the US and other wealthy countries.
The company acted quickly to address political manipulation affecting countries such as the US, Taiwan, South Korea and Poland, while moving slowly or not at all on cases in Afghanistan, Iraq, Mongolia, Mexico and much of Latin America.
“There is a lot of harm being done on Facebook that is not being responded to because it is not considered enough of a PR risk to Facebook,” said Sophie Zhang, a former data scientist at Facebook who worked within the company’s “integrity” organization to combat inauthentic behavior. “The cost isn’t borne by Facebook. It’s borne by the broader world as a whole.”
Facebook pledged to combat state-backed political manipulation of its platform after the historic fiasco of the 2016 US election, when Russian agents used inauthentic Facebook accounts to deceive and divide American voters.
But the company has repeatedly failed to take timely action when presented with evidence of rampant manipulation and abuse of its tools by political leaders around the world.
Facebook fired Zhang for poor performance in September 2020. On her final day, she published a 7,800-word farewell memo describing how she had “found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry” and lambasting the company for its failure to address the abuses.

“I know that I have blood on my hands by now,” she wrote. News of the memo was first reported in September by BuzzFeed News.
Zhang is coming forward now in the hopes that her disclosures will force Facebook to reckon with its impact on the rest of the world.
“Facebook doesn’t have a strong incentive to deal with this, except the fear that someone might leak it and make a big fuss, which is what I’m doing,” she told the Guardian. “The whole point of inauthentic activity is not to be found. You can’t fix something unless you know that it exists.”
Liz Bourgeois, a Facebook spokesperson, said: “We fundamentally disagree with Ms Zhang’s characterization of our priorities and efforts to root out abuse on our platform.
“We aggressively go after abuse around the world and have specialized teams focused on this work. As a result, we’ve taken down more than 100 networks of coordinated inauthentic behavior. Around half of them were domestic networks that operated in countries around the world, including those in Latin America, the Middle East and North Africa, and in the Asia Pacific region. Combatting coordinated inauthentic behavior is our priority. We’re also addressing the problems of spam and fake engagement. We investigate each issue before taking action or making public claims about them.”
Facebook did not dispute Zhang’s factual assertions about her time at the company.
With 2.8 billion users, Facebook plays a dominant role in the political discourse of nearly every country in the world. But the platform’s algorithms and features can be manipulated to distort political debate.
One way to do this is by creating fake “engagement” – likes, comments, shares and reactions – using inauthentic or compromised Facebook accounts. In addition to shaping public perception of a political leader’s popularity, fake engagement can affect Facebook’s all-important news feed algorithm. Successfully gaming the algorithm can make the difference between reaching an audience of millions – or shouting into the wind.
Zhang was hired by Facebook in January 2018 to work on the team dedicated to rooting out fake engagement. She found that the vast majority of fake engagement appeared on posts by individuals, businesses or brands, but that it was also being used on what Facebook called “civic” – ie political – targets.
The most blatant example was Juan Orlando Hernández, the president of Honduras, whose posts in August 2018 were receiving 90% of all the known civic fake engagement in the small Central American country. That month, Zhang uncovered evidence that Hernández’s staff was directly involved in the campaign to boost content on his page with hundreds of thousands of fake likes.
One of the administrators of Hernández’s official Facebook Page was also administering hundreds of other Pages that had been set up to resemble user profiles. The staffer used the dummy Pages to deliver fake likes to Hernández’s posts, the digital equivalent of bussing in a fake crowd for a speech.
This method of acquiring fake engagement, which Zhang calls “Page abuse”, was made possible by a loophole in Facebook’s policies. The company requires user accounts to be authentic and bars users from having more than one, but it has no comparable rules for Pages, which can perform many of the same engagements that accounts can, including liking, sharing and commenting.
The loophole has remained open due to a lack of enforcement, and it appears that it is currently being used by the ruling party of Azerbaijan to leave millions of harassing comments on the Facebook Pages of independent news outlets and Azerbaijani opposition politicians.