Facebook recently announced that it has tied up with Boom Live, an Indian fact-checking agency, to fight fake news during the Karnataka elections. This is the latest in a series of partnerships that the social media giant has entered into over the past year to improve the accuracy of information shared on its platform.
Fake news has become a pressing concern for Facebook in light of the Cambridge Analytica revelations, which suggested that the social platform was being used as a tool to manipulate voters and swing elections.
With a fortnight to go till polling day, Karnataka is awash in fake news.
Election season kicked off with the arrest of the editor of right-wing website Postcard News for publishing a photo of an injured Jain monk and claiming that he had been attacked by Muslims. It transpired that the monk had been injured in an accident. Fake intelligence reports, candidate lists, pre-poll surveys and even a video claiming to depict the Chief Minister dancing to a raunchy song have emerged since then.
Much of this misinformation found its audience on Facebook and WhatsApp.
"We know that Facebook is the source of the misinformation for most people, so we want to distribute our fact-checks on the same platform. As part of our battle, reach is very important," said Govindraj Ethiraj, Founder-Editor of Boom Live.
Old School meets New School
Boom employs a combination of old school journalistic methods and new age tech tools to fight fake news. "We look at publicly available data and numbers to identify broad trends. We use tools like reverse image searches and Photoshop to detect photos that have been doctored or morphed. And we also reach out to people, speak to them, and get facts," Ethiraj said.
The company is a year and four months old, and has a team of five fact-checkers based in Mumbai. It is now hiring two more, paid for by Facebook, specifically to fact-check the Karnataka polls.
Crucially, the pilot program will focus only on links in the English language.
Facebook has also provided Boom's fact-checkers access to a couple of software tools – one to flag potential pieces of misinformation and another to identify links that are going viral.
Jency Jacob, Boom's Managing Editor, said that while these tools are helpful, his fact-checkers weren't really dependent on them. "The tools provide a broad overview, but for our project, we are looking at a very niche category – misinformation related to the Karnataka election. And for that, we are monitoring pages and websites that peddle misinformation through opinion-based stories and we also have a WhatsApp helpline."
Once a post has been fact-checked and rated as false, Facebook says that it will reduce its distribution and penalise the pages that generated or shared it. It will also show users articles created by fact-checkers in the related news section below a flagged post.
Will it work?
Can such measures really stem the tide of fake news though?
Facebook believes the answer is "Yes". The company has claimed that its fact-checking measures were successful in reducing the distribution of fake news by 80%.
This claim, ironically, is impossible to fact-check.
"Facebook hasn't shared hard data on the impact of its fact-checking efforts. Until they do, it is ultimately impossible for outsiders to judge whether any of this is effective," said Alexios Mantzarlis, Director of the International Fact-Checking Network, of which Boom is a member. Apart from being tight-lipped about data, Facebook has also been accused by fact-checkers of taking too long to apply a disputed label, and in some cases, of not applying a label even after a story was flagged.
"They are basically buying good PR by paying us," said one disgruntled fact-checker to The Guardian.
Facebook did not respond to an emailed questionnaire about these allegations.
Boom says that it is aware of the criticisms and that it is going into the project with limited expectations.
"If this partnership is aimed at getting PR and fighting whichever fire they are fighting at this point, that is not going to solve the problem," said Ethiraj. "The platforms are the primary propagators of fake news. If they treat this like a PR exercise, the whole thing will bounce back and hit them in a worse way."
While there is little to no indication that flagging posts as disputed reduces the spread of fake news, there is a mounting pile of evidence suggesting that such labels make readers more susceptible to the misinformation that goes unflagged.
A recent study published by researchers at Yale University notes that while slapping a disputed label on an article would make a reader 3.7% less likely to believe it, the presence of such labels would make the same reader more likely to believe other pieces of misinformation that are not flagged. Considering the volume of fake news on the Internet these days, and the limited resources most fact-checkers have available to them, that is a dangerous tradeoff.
The findings of the Yale paper are disputed by Facebook. But they are in line with concerns raised by a clutch of slightly older studies that indicated that marking an article as fake news was unlikely to decrease readers' belief in it if the post aligned with their political leanings and that in some cases, such warnings could actually backfire and increase belief.
So is there any point at all to fact-checking?
Jacob, Boom's managing editor, concedes that it is an uphill battle, perhaps even an impossible task, to challenge entrenched ideological positions with fact-checking. But he believes that fact-checking still serves several vital functions.
"There are rational people who are caught in the middle. While they may hold their ideological beliefs, they are worried about the fact that they can't believe everything that comes on Facebook or WhatsApp. And that to me is a good sign," says Jacob. "After we have taken on some ministers and ministry handles and fact-checked them openly, we haven't been able to catch them on anything after that. That's a sign of the fact that they appreciate that they have to be careful. That is a good sign."