
Election Experiment Proves Facebook Just Doesn't Care About Fake News In India

The much-hyped fact-checking initiative identified only 30 pieces of fake news in the month-long Karnataka campaign. Yup, 30!

Mumbai — On April 16, a little less than a month before Karnataka went to the polls, Facebook announced a partnership with Boom Live, an Indian fact-checking website, to fight fake news during the Karnataka assembly polls.

Five days before the partnership was announced, an embattled Mark Zuckerberg stood before the US Congress. Under fire for having allowed his platform to be used to manipulate elections, he declared that his company would do everything it could to protect the integrity of elections in India and elsewhere.

Facebook's press release promised as much:

We have learned that once a story is rated as false, we have been able to reduce its distribution by 80%, and thereby improve accuracy of information on Facebook and reduce misinformation.

Yet the pilot project in Karnataka suggests Facebook has a long way to go to keep Zuckerberg's promise. In an election cycle widely acknowledged as rife with misinformation, fake polls and surveys, communally coloured rumours, and blatant lies peddled by campaigners, rating stories as "false" proved to be so difficult and time-consuming that the Facebook partnership was only able to debunk 30 pieces of misinformation — 25 in the run-up to the polls, and 5 in the immediate aftermath — in the month-long campaign.

The much-ballyhooed partnership added up to a small financial contribution from Facebook that allowed Boom to hire two fact-checkers, one in its offices in Mumbai and one based on the ground in Bengaluru, specifically to track the election. The fact-checkers were also given access to a Facebook dashboard that could be used to discover and counter misinformation on the platform.

Boom did not reveal the sum involved or allow HuffPost India access to the dashboard, citing a non-disclosure agreement. Facebook's representatives declined comment on a detailed questionnaire sent to them.

A Gushing Sewer of Fake News

Globally, Facebook's fact-checking initiative is a little over a year old, but the partnership with Boom marks its advent in India, the company's largest market.

"It's a late start, a very late start." says Pratik Sinha, co-founder of AltNews, another prominent fact-checking website. "But they're doing something now, which is good."

Yet Govindraj Ethiraj, Founder-Editor of Boom Live, said the social networking giant's contribution to their fact-checking efforts was of limited utility. "Facebook's involvement didn't really help us," he said. "This was more about us helping them."

Ethiraj identified Facebook-owned WhatsApp as the primary medium for the propagation of fake news during the Karnataka election. Each of the three major parties in the fray reportedly set up tens of thousands of groups on the platform in an effort to spread their message. Facebook is yet to figure out a way to allow fact-checkers into the platform without breaking the end-to-end encryption which makes it impossible for messages to be tracked.

But even on Facebook, which lends itself far more easily to tracking and monitoring, the tools that the company has built to track fake news are not particularly effective.


In his office in the aging Sun Mill Compound in Mumbai's Lower Parel, Jency Jacob, Managing Editor of Boom, logged into the dashboard and scrolled through the gushing sewer of user-flagged content pouring in from around the world: stories about dinosaur remains and ancient caves, tales of celebrities battling mysterious diseases, and ordinary people undergoing plastic surgeries to look like celebrities, mixed in with news – both real and fake – that users found objectionable. There's one about the rise in fuel prices and there's even a HuffPost India story, about a Dalit being flogged to death in Gujarat. (The HuffPost India story, the editorial board can affirm, is true.)

"I can't claim that it doesn't affect me," admitted Jacob. "This morning, the first thing I saw after waking up was a video of a woman kicking a 3-year-old baby and slamming her on the ground. We are in the rush of it right now, but I don't think we will enjoy doing this all our lives."

"A lot of it is dependent on how users are reporting," Jacob continued, explaining that the dashboard tool relies on users to flag potentially "fake" news. "If the users aren't reporting it, it isn't going to come into the queue."

This is a blind spot: Facebook allows advertisers to micro-target content at users based on specific attributes, and users are unlikely to report content that agrees with their ideological biases.

Everything But English

Facebook's dashboard cannot be used to report non-English content. In India, local language users outnumber English language users and more are coming online every day. The dashboard is also unable to filter stories relevant to a specific location, despite Facebook allowing advertisers to geo-target their advertisements with reasonable accuracy.

Jacob reckons the tool will get better at dealing with the Indian context over time. "This was always intended to be a pilot project. It will take them time to figure out how to get us more relevant leads," he said.

With not much help forthcoming from Facebook, Boom relied on its own tried and tested methods of tracking misinformation. Its fact-checkers monitored pages and websites known to be potential sources of fake news, told friends and family to forward anything suspicious they came across, and maintained their own reporting channel: a dedicated WhatsApp helpline to which users could forward suspicious-looking links.

These methods threw up about 4-5 actionable leads every day. To fact-check them, Boom deployed a combination of old-school journalistic practices, such as getting fact-checkers to call sources, and tech tools like video- and image-matching software.

Fact-checking is a painstaking process that involves a great deal of manual effort.

"The way we measure virality is a bit of a crude method. We check whether several of us have received it or not, and whether it is being shared on all three platforms."

"Essentially, we are saying what we are saying is true, don't believe others," said Sinha. "That's a very arrogant position to take. To say that in a world full of information, there has to be a process where we take the audience from the claim to the truth. Gathering the information required to do that takes a lot of time."

According to Jacob, it sometimes takes 2-3 people working all day to fact-check a single video. And Boom only has 6 fact-checkers in all, including the two Facebook-funded hires. Given these constraints, they could act on only a fraction of the tip-offs.

"We were not looking at volume, but at impact," said Jacob, indicating that they focused their attention on misinformation that was going viral. "The way we measure virality is a bit of a crude method. We check whether several of us have received it or not, and whether it is being shared on all three platforms."

Jacob admits that there were many more stories that they could have tackled, but he says that it was impossible to address them all with the limited resources available to them.

Sinha reckons that Facebook already has the technology to significantly alleviate the manpower issue. "If you upload a video to Facebook and there's a copyright violation, they pull the video. So they know how to match videos. If they leverage that technology and apply it to fake news, it'll reduce the mundane work we have to do by half," he said.
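Facebook's internal matching systems are not public, but the technique Sinha is pointing to — comparing new uploads against a library of already-debunked content — can be sketched with open-source tools. The snippet below is a minimal illustration using the imagehash perceptual-hashing library; the folder name, file names and distance threshold are hypothetical, and nothing here reflects how Facebook or Boom actually work.

```python
# Minimal sketch: matching a newly flagged image against already-debunked
# images using perceptual hashing. Facebook's internal matching system is
# not public; this only illustrates the general technique Sinha describes.
# Requires: pip install pillow imagehash
from pathlib import Path

from PIL import Image
import imagehash

# Hypothetical folder of images attached to earlier fact-checks.
DEBUNKED_DIR = Path("debunked_images")
# Hamming-distance cut-off: smaller means "near-identical". The value 8 is
# a common starting point for 64-bit perceptual hashes, chosen here purely
# for illustration.
MATCH_THRESHOLD = 8


def build_index(folder: Path) -> dict[str, imagehash.ImageHash]:
    """Hash every previously debunked image once, up front."""
    return {p.name: imagehash.phash(Image.open(p)) for p in folder.glob("*.jpg")}


def find_match(candidate: Path, index: dict[str, imagehash.ImageHash]):
    """Return the closest already-debunked image, if it is close enough."""
    candidate_hash = imagehash.phash(Image.open(candidate))
    best_name, best_distance = None, None
    for name, known_hash in index.items():
        distance = candidate_hash - known_hash  # Hamming distance in bits
        if best_distance is None or distance < best_distance:
            best_name, best_distance = name, distance
    if best_distance is not None and best_distance <= MATCH_THRESHOLD:
        return best_name, best_distance
    return None


if __name__ == "__main__":
    index = build_index(DEBUNKED_DIR)
    hit = find_match(Path("newly_flagged.jpg"), index)
    if hit:
        print("Near-duplicate of already-debunked image:", hit)
    else:
        print("No match; send to a fact-checker for manual review")
```

A system built this way would only automate the repeat sightings of content a human has already debunked; the first fact-check still has to be done by hand, which is Sinha's point about halving, not eliminating, the mundane work.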

While Facebook's contribution to Boom's sourcing and fact-checking processes was minimal, it does seem to have had a significant impact on how fact-checks were disseminated. The Facebook dashboard allows fact-checkers to tag content with ratings ranging from 'true' to 'false' with a few options in between and also attach their fact-check articles to the content. The platform then attempts to reduce distribution of the content and display the fact-check article to users whenever they encounter it on the news feed or attempt to share it.


Major Victory

This system claimed its first major victory within a week of the partnership being announced, when several major media outlets, including NDTV India, India Today and Republic, published a list of purported star campaigners for the Congress party that turned out to be fabricated.

Boom rated the articles false and linked their fact-check. Jacob could not verify if this reduced the articles' distribution by the 80% figure touted by Facebook, but said there was a clear impact.

"NDTV India carried the story and we noticed that their traffic dropped after we linked our fact-check to their article," said Jacob. With traffic plummeting and users being shown fake news warnings when interacting with their content, most of the media houses that published the list either issued clarifications or took their articles down.

After the initial success, Boom quickly ran into the limitations of the ratings system. Fact-checks could only be done on links and not on image, video, or text posts. Facebook eventually granted Boom access to image and video posts, but text posts are still beyond the purview of fact-checkers.

While that change was likely a simple fix that only required a switch to be flipped, there are other restrictions on the ratings system that are unlikely to be lifted as easily.

From the beginning of the election cycle, false statements by prominent politicians, including the Prime Minister, were an everyday affair. As is the norm, they were faithfully reported by most media outlets without critique or context. Misinformation masquerading as opinion, wherein a set of legitimate facts is presented out of context to arrive at a blatantly false conclusion, was also a persistent feature during the polls. Such articles add to the whirlwind of campaign misinformation, but are exempted from the rating system.

"Facebook needs to figure out a more aggressive model of showing the explanatory article to the reader."

Sinha believes that misinformation that falls into these grey areas cannot be laid at Facebook's door.

But Pranesh Prakash, Fellow at the Centre for Internet and Society, said such restrictions were "extraordinarily stupid."

"As long as the distinction is made that the publication isn't msiquoting and the politician is saying something that is false - and that's easy enough to do - I can't think of a possible justification," he said, regarding false statements made by public figures.

As for misleading opinion pieces, Prakash said, "Most falsehoods are not just statements that present incorrect facts, but that present facts in an incorrect context. It's clearly the context that speaks to how people interpret facts. Fact checkers can't be people who only look at facts as black and white things."

Facebook's suggested method of dealing with such articles is to attach fact-check articles to them while assigning them a 'not eligible' rating. Jacob reckons that this is yet another blind spot.

"Facebook needs to figure out a more aggressive model of showing the explanatory article to the reader. The way it is designed now, with the article showing up below as a related link, not many people will bother to go and click on that."

The WhatsApp Problem

For all its flaws, the fact-checking initiative appears to be making an attempt at solving the problem of misinformation on Facebook's news feed. But the company hasn't even begun to address the 800-pound gorilla that is WhatsApp.

While Facebook has been castigated for playing fast and loose with privacy on its primary platform, the inherently better privacy features of the fully encrypted WhatsApp platform have made it lethal when it comes to fake news. The lack of third-party access, which has so far prevented Facebook from monetising WhatsApp chats and security agencies from spying on them, has also made WhatsApp messages impossible to fact-check.

In Karnataka, WhatsApp was the primary vector for the spread of a series of fake polls, some of which were eventually picked up and published by mainstream media outlets. Unlike fake news that emerges on Facebook and Twitter, it is impossible to trace the source of misinformation on WhatsApp.

"Just as spam can be flagged and people can be barred if they're flagged as spammers, similarly, if people have been flagged as serial promoters of fake news, you can use that to nudge people's behaviour."

"If Whatsapp had a trending list, our jobs would've been a lot easier," lamented Jacob. "By and large, we have figured out what goes viral on Facebook and Twitter. It might take a day to reach us, but eventually we catch anything that's going viral on these platforms. But Whatsapp is a black box."

Prakash asserts that while encryption is a barrier, it does not make it impossible to police fake news on WhatsApp. "Just as spam can be flagged and people can be barred if they're flagged as spammers, similarly, if people have been flagged as serial promoters of fake news, you can use that to nudge people's behaviour."
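Prakash's analogy is essentially a reputation counter. As a rough illustration, and not a description of any system Facebook or WhatsApp has actually built, the bookkeeping could look something like this, with the account name and the threshold of five flags purely hypothetical:

```python
# Minimal sketch of the reputation-based nudge Prakash describes: count how
# often an account's posts are rated false, and throttle accounts that cross
# a threshold. Purely illustrative; neither Facebook nor WhatsApp has
# described such a system, and all names and numbers here are hypothetical.
from collections import Counter
from dataclasses import dataclass, field

FLAG_THRESHOLD = 5  # hypothetical cut-off for a "serial promoter"


@dataclass
class FlagLedger:
    false_ratings: Counter = field(default_factory=Counter)

    def record_false_rating(self, account_id: str) -> None:
        """Called each time a fact-checker rates one of the account's posts false."""
        self.false_ratings[account_id] += 1

    def should_throttle(self, account_id: str) -> bool:
        """Accounts past the threshold get reduced distribution or a warning."""
        return self.false_ratings[account_id] >= FLAG_THRESHOLD


ledger = FlagLedger()
for _ in range(5):
    ledger.record_false_rating("account_42")
print(ledger.should_throttle("account_42"))  # True: nudge or limit this account
```

Crucially, this approach needs only metadata about who posted what was later debunked, not the contents of encrypted messages, which is why Prakash argues encryption is not a complete excuse.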

There are indications that WhatsApp is attempting to develop features to tackle fake news. The platform has beta-tested features that would clearly identify forwarded messages and warn users if a message has been forwarded more than 25 times. Jacob said that Facebook was working on a product that would throw up fact-check articles when a user interacts with a fake news URL on WhatsApp. Whether or when any of these features will actually make it to users is a matter of conjecture.
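The forward-count warning, at least, is straightforward to describe. Below is a minimal sketch of the idea, assuming a per-message counter carried along with each forward; WhatsApp has not said how its beta actually implements the feature under end-to-end encryption, so the threshold of 25 comes from the reports above and everything else is illustrative:

```python
# Minimal sketch of the forward-count warning described above: label a message
# as "forwarded" and warn once it has been forwarded more than 25 times. This
# illustrates the idea only; WhatsApp's actual beta implementation is not
# public, and a real system would have to count forwards client-side to
# preserve end-to-end encryption.
from dataclasses import dataclass

FORWARD_WARNING_LIMIT = 25  # threshold reported for the beta feature


@dataclass
class Message:
    text: str
    forward_count: int = 0

    def forward(self) -> "Message":
        """Forwarding creates a copy and increments the running count."""
        return Message(self.text, self.forward_count + 1)

    @property
    def label(self) -> str:
        if self.forward_count > FORWARD_WARNING_LIMIT:
            return "Forwarded many times - treat with caution"
        if self.forward_count > 0:
            return "Forwarded"
        return ""


msg = Message("Unverified exit poll graphic")
for _ in range(26):
    msg = msg.forward()
print(msg.label)  # "Forwarded many times - treat with caution"
```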

Prakash said the slow pace of progress on WhatsApp is just a reflection of the company's priorities. "It speaks to how American a company Facebook is. WhatsApp is the real network for fake news in India, but it gets the least amount of attention."
