New York: Google’s YouTube says it is taking several steps to ensure the veracity of news on its service by cracking down on misinformation and supporting news organizations.
The company said on Monday it will make “authoritative” news sources more prominent, especially in the wake of breaking news events when misinformation can spread quickly. At such times, YouTube will begin showing users short text previews of news stories in video search results, as well as warnings that the stories can change.
The goal is to counter the fake videos that can proliferate immediately after shootings, natural disasters and other major happenings.
For example, YouTube search results prominently showed videos purporting to “prove” that mass shootings like the one that killed at least 59 in Las Vegas were fake, acted out by “crisis actors.”
In these urgent cases, traditional video won’t do, since it takes time for news outlets to produce and verify high-quality clips. So YouTube aims to short-circuit the misinformation loop with text stories that can quickly provide more accurate information.
Company executives announced the effort at YouTube’s New York offices. Those officials, however, offered only vague descriptions of which sources YouTube will consider authoritative.
Chief Product Officer Neal Mohan said the company is not simply compiling a list of trusted news outlets. He noted that the definition of authoritative is “fluid” and cautioned that it will not boil down to whichever sources are popular on YouTube.
He added that 10,000 human reviewers at Google, so-called search quality raters who monitor search results around the world, are helping determine what will count as authoritative sources and news stories.
Alexios Mantzarlis, a Poynter Institute faculty member who helped Facebook team up with fact-checkers (including The Associated Press), said the text story snippet at the top of search results was “cautiously a good step forward.”
But he worried about fake news videos that YouTube’s recommendation engine surfaces in users’ feeds without ever being searched for.
He said it would be preferable if Google used people instead of algorithms to vet fake news. “Facebook was reluctant to go down that path two and a half years ago and then they did,” he said.
YouTube also said it will commit USD 25 million over the next several years to improving news on YouTube and tackling “emerging challenges” such as misinformation. That sum includes funding to help news organizations around the world build “sustainable video operations,” such as by training staff and improving production facilities.
The money will not fund video creation. The company is also testing ways to counter conspiracy videos with generally trusted sources such as Wikipedia and Encyclopedia Britannica.
For common conspiracy subjects — what YouTube delicately calls “well-established historical and scientific topics that have often been subject to misinformation,” such as the moon landing and the 1995 Oklahoma City bombing — Google will add information from such third parties for users who search on these topics.