Financial Times
Business
Gillian Tett

Facebook or Google — which should worry us more?

A couple of months ago, a veteran investor in Silicon Valley conducted an experiment: he extracted all the data that Facebook and Google each held about him and compared the files.

The results startled him: Google held many times more information about him than Facebook did. “It’s amazing,” he told me over breakfast in San Francisco. “Why is nobody talking about that?”

It is an interesting question, particularly if you use Google’s services numerous times each day, as I do. One answer might be that Google executives have been savvy in building political support networks. Another is that Google hangs on to the data it collects itself, and then uses it to create targeted search-and-advertising offerings, customised for users. Facebook lets third-party developers access its data, which is why the antics of Cambridge Analytica have sparked so much furore.

This distinction may make Google sound more benign, but does it mean we can relax? Robert Epstein, a psychologist with the American Institute for Behavioral Research and Technology in California, thinks not. In recent years, he has conducted extensive research with fellow psychologists and data scientists into Google’s “autocomplete” search-suggestion function. This has left him convinced that search engines, too, can sway our minds in extraordinarily powerful and largely unnoticed ways — and not only about politics.

“A search engine has the power to manipulate people’s searches from the very first character people type into the search bar,” says a research paper that this group presented to a psychology conference in Oregon last month. “A simple yet powerful way for a search-engine company to manipulate elections is to suppress negative search suggestions for the candidate it supports, while allowing one or more negative search suggestions to appear for the opposing candidate.”


Epstein’s group asked 661 Americans to pick one of two candidates in an Australian election. Since it was presumed they did not know much about Antipodean politics, the participants were instructed to research them with a Google-type search engine that offered the usual autocomplete suggestions when words were typed in.

However, the researchers also varied the search suggestions shown beneath a candidate’s name, mixing positive and negative words. The results were stark. When participants were later questioned about their voting preferences, changing the ratio of positive to negative autocomplete suggestions could shift the preferences of undecided voters by nearly 80 per cent — even though participants seemed free to search for any material they wanted. Another study found that participants offered only four autocomplete suggestions were very easily manipulated; those given 10 to choose from were not.

These results do not demonstrate that Google — or any other search-engine company, such as Bing or Yahoo — has used this power to manipulate its users. But Epstein’s paper highlights some patterns that he considers strange. At the time of his research, he found that negative search suggestions about Bing and Yahoo appeared on Google, Bing and Yahoo alike, but negative suggestions about Google appeared only on Bing and Yahoo.


Another striking pattern cropped up in August 2016. When the words “Hillary Clinton is” were typed into Google’s search engine, the autocomplete offered phrases such as “Hillary Clinton is winning”; on Yahoo and Bing, the autocomplete suggested “Hillary Clinton is a liar” and “Hillary Clinton is a criminal”.

Google executives say these different auto-suggestion patterns arose because the company has a policy of removing offensive auto-predictions. “Google removes predictions that are against our autocomplete policies, which bar . . . hateful predictions against groups and individuals on the basis of race, religion or several other demographics,” wrote Danny Sullivan, a senior Google executive, in a blog post last month.

They have also firmly denied they have ever tried to use the autocomplete tool to manipulate users. They have said Epstein’s work was based on a small sample size, using a Google-style search engine rather than Google’s own data. In a rare official comment in 2015 about some of Epstein’s work, executives said: “Google has never ever re-ranked search results on any topic (including elections) to manipulate user sentiment.”

If nothing else, this research should make us all ponder the way in which we use that “autocomplete” function. The better auto-prediction becomes, the greater the potential risk that users will become lazily sucked into digital echo chambers. Among other things, Epstein believes search engines should put a simple “health warning” about the dangers of echo chambers — and manipulation — on their sites to counter these possible risks.

Whether or not you accept Epstein’s research, this seems a good idea. But don’t expect it to happen soon — or not unless more consumers, and regulators, do what my Silicon Valley breakfast companion did: namely look at the data that all the biggest tech companies hold on us, starting — but not finishing — with Facebook.

This article has been amended since publication to clarify findings on Google, Bing and Yahoo made in a paper by Dr Epstein



Copyright The Financial Times Limited 2018
