Google starts highlighting fact-checks in News

Oct 14, 2016 22:25

Today Google added a new “fact-check” tag to its popular Google News service. The site aggregates timely news stories from multiple sources and has traditionally grouped them with tags like “opinion,” “local source” and “highly cited.” Now readers can see highlighted fact-checks right next to trending stories.

The company cites the growing prominence of fact-checking sites as one of the reasons for creating the tag. Content creators can apply the new fact-check tag to their own posts using a fixed set of pre-defined source labels.

Google will use Schema.org’s ClaimReview markup to compile and organize stories offering factual background. The Schema.org community builds markup vocabularies for structured data on the web. The group is sponsored by Google but also has support from Microsoft, Yahoo and Yandex.
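In practice, a publisher adopts the markup by embedding a ClaimReview object in the article page, typically as JSON-LD. Here is a minimal sketch; the property names follow the Schema.org ClaimReview vocabulary, while the URL, claim text and rating values are purely illustrative:

```json
{
  "@context": "http://schema.org",
  "@type": "ClaimReview",
  "url": "https://example.com/fact-checks/earth-is-flat",
  "datePublished": "2016-10-14",
  "author": {
    "@type": "Organization",
    "name": "Example Fact Check"
  },
  "claimReviewed": "The earth is flat.",
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": "1",
    "bestRating": "5",
    "worstRating": "1",
    "alternateName": "False"
  }
}
```

The `claimReviewed` and `reviewRating` fields are what let an aggregator like Google News pair a trending claim with its verdict without parsing the article body.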

Casual readers in the U.S. and U.K. can find nuggets of fact within the expanded view of news stories on the web and mobile versions of the service. Fingers crossed the new tool keeps stories claiming the earth is really flat from rising to the top of our feeds.

On a support page, Google explains that it reserves the right to intervene if posts are improperly tagged as fact-checks.

“Please note, that if we find sites not following those criteria for the ClaimReview markup, we may, at our discretion, either ignore that site’s markup or remove the site from Google News.”

This may not prevent false stories from rising up in Google News, but it should make it a lot harder for them. It doesn’t appear that many fact-check stories have propagated yet on Google News; we couldn’t find any in a quick visit to the site. But given the imminent presidential election in the U.S., we can only expect the system to be put through its paces in the coming weeks.

Source 1

Google added fact checking: Facebook, it’s your move now

Google yesterday announced it will introduce a fact check tag on Google News in order to display articles that contain factual information next to trending news items. Now it’s time for Facebook to take fact-checking more seriously, too.

Facebook has stepped into the role of being today’s newspaper: that is, it’s a single destination where a large selection of news articles is displayed to those who visit its site. Yes, they appear amidst personal photos, videos, status updates, and ads, but Facebook is still the place where nearly half of American adults get their news.

Facebook has a responsibility to do better, then, when it comes to informing this audience what is actually news: what is fact-checked, reported, vetted, legitimate news, as opposed to a rumor, hoax or conspiracy theory.

It’s not okay that Facebook fired its news editors in an effort to appear impartial, deferring only to its algorithms to inform readers what’s trending on the site. Since then, the site has repeatedly trended fake news stories, according to a Washington Post report released earlier this week.

The news organization tracked every news story that trended across four accounts during the workday from August 31 to September 22, and found that Facebook trended five stories that were either “indisputably fake” or “profoundly inaccurate.” It also regularly featured press releases, blog posts, and links to online stores, like iTunes - in other words, trends that didn’t point to news sites.

Facebook claimed in September that it would roll out technology that would combat fake stories in its Trending topics, but clearly that has not yet come to pass - or the technology isn’t up to the task at hand.

In any event, Facebook needs to do better.

It’s not enough for the company to merely reduce the visibility of obvious hoaxes in its News Feed - not when so much of the content that circulates on the site is posted by people - your friends and family - right on their profiles, which you visit directly.

Plus, the more the items are shared, the more they have the potential to go viral. And viral news becomes Trending news, which is then presented to all of Facebook’s users in that region.

This matters. Facebook has trended a story from a tabloid news source that claimed 9/11 was an inside job involving planted bombs. It ran a fake story about Fox News anchor Megyn Kelly which falsely claimed she was fired. These aren’t mistakes: they are disinformation.

Facebook has apologized for the above, but declined to comment to The Washington Post regarding its new findings that fake news continues to be featured on the platform.

In addition, Facebook not only fails at vetting its Trending news links; it also has no way of flagging the links that fill the rest of its site.

Outside of Trending, Facebook continues to be filled with inaccurate, poorly sourced, or outright fake news stories, rumors and hoaxes. Maybe you’re seeing fewer of them in the News Feed, but there’s nothing to prevent a crazy friend from commenting on your post with a link to a well-known hoax site, as if it’s news. There’s no tag or label. They get to pretend they’re sharing facts.

Meanwhile, there’s no way for you to turn off commenting on your own posts, even when the discussion devolves into something akin to “sexual assault victims are liars” (to reference a recent story).

Because perish the thought that Facebook would turn off the one mechanism that triggers repeat visits to its site - even if keeping it on means triggering traumatic recollections for its users instead.

There is a difference between a post that’s based on fact-checked articles, and a post from a website funded by an advocacy group. There’s a difference between Politifact and some guy’s personal blog. Facebook displays them both equally, though: here’s a headline, a photo, some summary text.

Of course, it would be a difficult job for a company that only wants to focus on social networking and selling ads to get into the media business - that’s why Facebook loudly proclaims it’s “not a media company.”

Except that it is one. It’s serving that role, whether it wants to or not.

Google at least has stepped up to the plate and is trying to find a solution. Now it’s Facebook’s turn.

Facebook may have only unintentionally become a media organization, but it is one. And it’s doing a terrible job.

Source 2
