Google to Curb ‘Fake News’ with Improved Search Ranking, Human Evaluators & User Feedback

Apr 25, 2017

Google has been struggling for the past few months to curb fake news. We have previously reported on various tools in the works that Google planned to use against such news sources. Today, the search engine giant has introduced “structural changes” to curb the “spread of blatantly misleading, low quality, offensive or downright false information.”

The company indexes billions of pages for its search results, and around 0.25 percent of daily search queries surface “offensive or clearly misleading content.” Google treats this as a different problem from earlier fake news issues, but its goal stays the same: to reduce the visibility of such sources.


In a blog post, Google writes:

While this problem is different from issues in the past, our goal remains the same—to provide people with access to relevant information from the most reliable sources available. And while we may not always get it right, we’re making good progress in tackling the problem.

In addition to algorithmic changes, the tech giant is also using human evaluators to assess the quality of search results for problematic queries. Google will record these evaluations and use them as a feedback mechanism to identify problem areas and improve its algorithms, but it won’t use the assessments to determine the ranking of any individual page.

Google has also updated its Search Quality Rater Guidelines to flag low-quality pages that contain false information, unexpectedly offensive results, hoaxes, and unsupported conspiracy theories. With these guidelines in place, its algorithms will start pushing such low-quality pages down the search result rankings.


Google’s second set of changes, “Ranking Changes,” concerns how pages are ranked on their content. The company has readjusted the signals used to determine search results. The new changes should help promote “authoritative pages and demote low-quality content, so that issues similar to the Holocaust denial results that we saw back in December are less likely to appear.”

Google will also take help from users by allowing them to flag faulty Featured Snippets and Autocomplete predictions. Users can report such content by clicking “Report inappropriate predictions” and “Feedback.” Clicking either option opens a box with labeled categories, and users can also add their own comments. Google will use this feedback to improve its search algorithms.

These changes will also address issues with Google Home and earlier complaints about the search bar showing “shocking or offensive predictions in autocomplete.” Google has documented the changes in its Help Center, along with information for website owners in the “How Search Works” section.
