NewsGuard Wants to Fight Fake News With Humans, Not Algorithms

Say you’re scrolling through Facebook, see an article that seems a little hinky, and flag it. If Facebook’s algorithm has decided you’re trustworthy, the report might then go to the social network’s third-party fact-checkers. If they mark the story as false, Facebook will make sure fewer people see it in the News Feed. For those who see it anyway, Facebook will surface related articles with an alternative viewpoint just below the story.

Every major platform—Twitter, YouTube, Reddit, and more—has some version of this process. But they all go about it in completely different ways, with every tech company writing its own rules and using black-box algorithms to put them into practice. The patchwork nature of promoting trustworthy sources online has had the unintended consequence of seeding fears of bias.

That’s one reason why a group of journalists and media executives is launching a tool called NewsGuard, a browser plug-in for Chrome and Microsoft Edge that transcends platforms, giving trustworthiness ratings to most of the internet’s top-trafficked sites. Those ratings are based on assessments from an actual newsroom: the dozens of reporters who make up NewsGuard’s staff. They hail from a range of news organizations, including the New York Daily News and GQ. Together, they’ve spent the last several months scoring thousands of news sites.

To vet the sites, they use a checklist of nine criteria that typically denote trustworthiness. Sites that don’t clearly label advertising lose points, for example. Sites that have a coherent correction policy gain points. If you install NewsGuard and browse Google, Bing, Facebook, or Twitter, you’ll see either a red or green icon next to every news source, a binary indicator of whether it meets NewsGuard’s standards. Hover over the icon, and NewsGuard offers a full “nutrition label,” with point-by-point descriptions of how it scored the site, and links to the bios of whoever scored it.
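
NewsGuard hasn’t published its internals, but the record behind that icon and hover label is easy to picture. Here’s a minimal sketch in Python; the field names and sample data are hypothetical, not NewsGuard’s actual format:

# A sketch of the per-site record a plug-in like NewsGuard's might carry:
# a binary verdict that drives the red/green icon, plus the point-by-point
# breakdown and reviewer bios shown in the hover "nutrition label."
# Field names and sample data are hypothetical.
from dataclasses import dataclass

@dataclass
class CriterionResult:
    description: str  # e.g. "Clearly labels advertising"
    met: bool         # did the site satisfy this criterion?

@dataclass
class SiteRating:
    domain: str
    meets_standards: bool             # drives the red or green icon
    checklist: list[CriterionResult]  # the point-by-point breakdown
    reviewer_bio_urls: list[str]      # links to the scorers' bios

    def label(self) -> str:
        """Render the hover 'nutrition label' as plain text."""
        icon = "GREEN" if self.meets_standards else "RED"
        rows = [f"  {'yes' if c.met else 'no'}: {c.description}"
                for c in self.checklist]
        return "\n".join([f"[{icon}] {self.domain}", *rows,
                          "Reviewed by: " + ", ".join(self.reviewer_bio_urls)])

rating = SiteRating(
    domain="example-news.com",
    meets_standards=True,
    checklist=[CriterionResult("Clearly labels advertising", met=False),
               CriterionResult("Has a coherent correction policy", met=True)],
    reviewer_bio_urls=["https://example.com/bios/jane-doe"],
)
print(rating.label())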

The tool is designed to maximize transparency, says Steve Brill, NewsGuard’s cofounder, best known for founding the cable channel Court TV. “We’re trying to be the opposite of an algorithm,” he says. Brill started NewsGuard with Gordon Crovitz, former publisher of The Wall Street Journal.

Along with the launch of the plug-in, NewsGuard is announcing a partnership with Microsoft as part of the tech giant’s Defending Democracy Program. The startup has also forged a deal with libraries in at least five states, which plan to install the extension on their own computers and educate members about how to use it at home. “Adding this service on computers used by our patrons continues the long tradition of librarians arming readers with more information about what they are reading,” Stacey Aldrich, the state librarian of Hawaii, said in a statement.

Brill and Crovitz launched NewsGuard in response to two dueling crises facing journalism: the declining trust in mainstream media, and the proliferation of fake news that masquerades as legitimate. To fend off the threat of heavy-handed regulation, tech companies have unleashed artificially intelligent tools, which in turn have sparked charges of censorship. Recent changes to Facebook’s algorithm, for example, led to traffic declines at a range of media outlets. But Republican members of Congress have since seized on the shrinking reach of sites like The Gateway Pundit as evidence that Facebook censors conservatives.

Brill and Crovitz view NewsGuard as a sort of compromise. “We see ourselves as the logical, classic, free market American way to solve the problem of unreliable journalism online,” Brill says. “The alternatives out there are either government regulation, which most people should rightly hate, and the second-worst idea, which is: Let’s let the platforms continue to say they’re working on algorithms to deal with this, which will never work.”

NewsGuard’s staff of nearly 40 reporters and dozens of freelancers are still working their way through 4,500 websites that they say account for 98 percent of the content shared online; the creators say they’re on track to finish rating all of them by October. Sites can score up to 100 points on the NewsGuard rubric, with certain offenses, like repeatedly publishing stories identified as false, carrying extra weight. Any site that receives fewer than 60 points gets marked as red. The NewsGuard staff calls all of these organizations to discuss their shortcomings, and to ensure that they’ve characterized the site fairly.
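
That rubric amounts to a weighted checklist, which is simple to model. In the Python sketch below, the 100-point scale, the under-60 cutoff, and the extra weight on publishing false stories come from NewsGuard’s stated rules; the individual weights are invented for illustration, using criteria mentioned in this article:

# Sketch of the scoring rubric described above. The 100-point scale and the
# under-60 "red" cutoff are NewsGuard's stated rules; the per-criterion
# weights below are invented for illustration.

RED_THRESHOLD = 60  # fewer than 60 points means a red rating

WEIGHTS = {
    "does not repeatedly publish false stories": 30,  # weighted most heavily
    "has a coherent correction policy": 20,
    "separates news and opinion responsibly": 15,
    "discloses its financing": 15,
    "avoids deceptive headlines": 10,
    "clearly labels advertising": 10,
}
assert sum(WEIGHTS.values()) == 100

def rate(failed_criteria: set[str]) -> tuple[int, str]:
    """Deduct the weight of every failed criterion, then apply the cutoff."""
    points = 100 - sum(WEIGHTS[c] for c in failed_criteria)
    return points, ("red" if points < RED_THRESHOLD else "green")

# A site that fails three lighter criteria (the Daily Caller pattern
# described later in this article) can still clear the bar:
print(rate({"avoids deceptive headlines",
            "discloses its financing",
            "separates news and opinion responsibly"}))  # (60, 'green')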

“Algorithms don’t call for comment,” Brill says, adding that dozens of sites have already improved their scores by adopting NewsGuard’s criteria.

Political leaning doesn’t come into play; a conservative site like Fox News gets the same green light as a left-leaning one like ThinkProgress. Similarly, both InfoWars and Daily Kos, which sit on opposite ends of the ideological spectrum, scored in the red.

NewsGuard’s generous threshold does mean that sites can get away with quite a bit and still score a green rating. “Not all greens are the same,” Crovitz cautions. That’s why NewsGuard publishes its nutrition labels, to help inform users about where a given site might fall short. The Daily Caller, for one, passes NewsGuard’s test, despite losing points for deceptive headlines, failing to disclose its financing, and failing to separate news and opinion responsibly.

NewsGuard’s long list of advisors includes marquee names from the federal government, including former homeland security secretary Tom Ridge, former undersecretary of state for public diplomacy Richard Stengel, and former Central Intelligence Agency director General Michael Hayden. But high-profile names alone won’t be enough to convince people that NewsGuard’s ratings are the ground truth. “In a world in which 10 or 15 percent of people think Barack Obama wasn’t born in the United States and another 10 or 15 percent probably still think 9/11 was an inside job, obviously not everyone is going to believe it,” Brill says. “We think more people will than won’t, or at least, more people will be more informed and maybe more hesitant about sharing.”

One recent study by Gallup and the Knight Foundation suggests that could be true. Researchers tested NewsGuard’s ratings, asking more than 2,000 US adults to rate the accuracy of 12 news articles on a five-point scale; some participants saw the articles with NewsGuard’s icons attached, and some didn’t. The researchers found that subjects perceived news sources to be more accurate when they carried a green icon than a red one. Subjects were also more likely to trust articles with a green icon than articles with no icon at all.

NewsGuard’s leaders hope that the tech companies that already dictate much of the world’s information diet will license its nutrition labels in some form. That’s one way the company plans to make money. It also licenses the ratings to brands that want to create an advertising whitelist, preventing their ads from appearing on unsavory sites.

Microsoft is sponsoring the tool as part of its recently formed Defending Democracy Program. “As a NewsGuard partner we’re really interested in seeing how their service helps provide an additional resource of information for people reading news,” Tom Burt, Microsoft’s corporate vice president of customer security and trust, told WIRED in a statement. “As we see how the technology is adopted in the market we’ll also consider other opportunities.”

The NewsGuard team has also had meetings with Facebook, though the social networking giant wouldn’t confirm whether it’s considering a partnership. It’s a lot to ask of a company like Facebook, which has kept its processes secret so it doesn’t have to engage in debate about every judgment call. The few times Facebook’s secret sauce has been exposed, it has backfired. Two years ago, when word got out about how Facebook picked trustworthy news outlets for its Trending Topics section, it kicked off accusations of political bias that continue to this day. This year, the company axed Trending Topics altogether.

Tech companies are gradually coming around to the idea of working with certain well-known third parties on content moderation. YouTube, for one, has begun surfacing Wikipedia and Encyclopaedia Britannica content alongside common conspiracy theories about topics like the moon landing.

It may take time to convince these same giants, already reluctant to pick favorites, to adopt this still untested methodology. But that shouldn’t stop the rest of us from getting a head start.

https://www.wired.com/story/newsguard-extension-fake-news-trust-score/