Google Search quietly undermines democracy

Google’s aesthetic has always been rooted in a clean look – a homepage free of ads and pop-up clutter, adorned only with a signature “doodle” decorating its name. Part of the reason many users love Google is its sleek design and its ability to return remarkably accurate results. Yet the simplicity of Google’s homepage is deceptive. Over time, the way the company returns information has changed, bit by bit. These incremental changes go largely unnoticed by the millions of users who rely on the search engine daily, but they have fundamentally altered the process of finding information, and not necessarily for the better.

When Google first launched, queries returned a simple list of linked websites. Slowly, this format changed. Google first launched AdWords, allowing businesses to buy space at the top of results pages and tailor placement to maximize exposure for their products. In the early 2000s, it began correcting spelling, providing news summaries beneath headlines, and anticipating our queries with autocomplete. In 2007, it launched Universal Search, gathering relevant information across formats (news, images, video) on a single results page. And in 2012, it introduced the Knowledge Graph, which presents a snapshot of information in a panel separate from the search results – a source many of us now rely on exclusively for quick searches.

As research has shown, many of these design changes are now tied to Google properties, placing its own products above those of its competitors. Instead of just showing a series of blue links, Google’s goal, according to Alphabet’s official SEC filings, is increasingly to “provide direct answers.” By adding all of these features, Google – along with competitors such as DuckDuckGo and Bing, which also summarize content – has effectively changed the experience from an exploratory search environment to a platform built around verification, replacing a process that allows for learning and investigation with one that works more like a fact-checking service.

Google’s drive to answer our questions for us, rather than requiring us to click through the results and find the answers ourselves, isn’t particularly problematic if what you’re looking for is a simple fact, like the number of ounces in a gallon. The problem is that many people rely on search engines to find information on more complicated topics. And, as my research reveals, this shift can produce incorrect results that disrupt democratic participation, confirm unsubstantiated claims, and are easily manipulated by people seeking to spread falsehoods.

For example, when users asked “When is the North Dakota caucus?” during the 2020 presidential election, Google highlighted the wrong information, saying it was Saturday, March 28, 2020. In fact, the firehouse caucus was held on March 10, 2020; it was the Republican convention that took place on the 28th. Worse still, when errors like this occur, there is no mechanism for users who notice discrepancies to flag them for review.

Google’s summaries can also mislead the public on matters of great importance to the maintenance of our democracy. When Trump supporters stormed the Capitol on January 6, 2021, conservative politicians and pundits were quick to brand the rioters “anti-Trumpers,” spreading the lie that antifa (a loose affiliation of people who believe in actively and aggressively opposing far-right movements) was to blame for the violence. On the day of the attack, The Washington Times published an article titled “Facial Recognition Identifies Extremists Storming the Capitol” that supported this claim, and the story was repeated on the House floor and on Twitter by elected officials.

Yet even though the FBI found no evidence to support these claims, and The Washington Times eventually published a correction to the article, the misinformation remains widely accessible through a simple Google search. If one searches for “Washington Times Antifa Evidence,” the top result (at the time of writing) is the original article with the headline “Facial Recognition Identifies Extremists Storming Capitol Building.” Beneath it, Google summarizes the article’s inaccurate claim that those identified as extremists were antifa. Perpetuating these lies has lasting effects, especially since the people in my study described Google as a neutral purveyor of news and information. According to an April 2021 poll, more than 20% of Republican voters still blame antifa for the violence that occurred that day.

The problem is that many users still rely on Google to verify information, and doing so can reinforce their belief in false claims. That’s not only because Google sometimes provides misleading or incorrect information, but also because the people I spoke with for my research believed that Google’s top search results were “more important,” “more relevant,” and “more accurate,” and they trusted Google more than the news, seeing it as a more objective source. Many said the Knowledge Graph might be the only source they consulted, but few realized how much Google had changed – that it is no longer the search engine it once was. In an effort to “do their own research,” people tend to search for something they’ve seen on Facebook or other social media platforms, but because of how that content has been tagged and categorized, they fall into an information trap.

This brings me to what I describe in my book, The Propagandists’ Playbook, as “the IKEA effect of misinformation.” Business scholars have found that when consumers build their own merchandise, they value the product more than an already assembled item of similar quality – they feel more competent and therefore more satisfied with their purchase. Conspiracy theorists and propagandists draw on the same strategy, lending a tangible, do-it-yourself quality to the information they provide. Independently conducting research on a given topic makes audiences feel like they are engaging in an act of self-discovery, when they are actually participating in a scavenger hunt staged by those spreading the lies.

To combat this, users need to recalibrate their thinking about what Google is and how information is returned to them, especially as we approach a busy midterm election season. Rather than assuming that the results validate the truth, we need to apply the same scrutiny we’ve learned to bring to posts on social media. Googling the exact same phrase you see on Twitter will likely return the same information you saw on Twitter; just because it comes from a search engine doesn’t make it more trustworthy. We need to be careful about the keywords we start with, and we also need to take a bit more time to explore the information that comes back to us. Rather than relying on quick answers to tough questions, take the time to click on the links, look into who is doing the reporting, and read information from a variety of sources. Then run the search again from a different angle, to see how slight changes in phrasing affect your results.

After all, something we don’t even think to consider might be just a click away.
