University of New Hampshire, Department of Computer Science

Information Technology 502, Intermediate Web Design

Spring 2024


Design Template by Anonymous

Biases Within the Google Search Algorithm

Understanding the Google Search Algorithm

To understand search engine optimization (SEO), we first have to understand how the search engine itself works. But we can't just look at its successes; I want to take this time to look at its problematic past and its failures.

The search engine operates by applying a detailed ranking algorithm to each search query, trying to surface the results most likely to earn the user's click.

Google places importance on these clicks because it earns more money from advertisers with each one. This creates a financial incentive to put sponsored links at the top of the search page, taking those crucial first few spots away from other links that might hold more meaningful or academically sound information.
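To make that incentive concrete, here is a minimal sketch, not Google's actual algorithm: a toy ranker where sponsored links receive a flat score boost. The URLs, relevance scores, and the SPONSOR_BOOST weight are all hypothetical, invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    relevance: float   # hypothetical relevance score, 0.0 to 1.0
    sponsored: bool

SPONSOR_BOOST = 0.5    # assumed weight; illustrates the incentive, not a real value

def rank(results):
    """Order results by relevance plus a flat boost for sponsored links."""
    return sorted(
        results,
        key=lambda r: r.relevance + (SPONSOR_BOOST if r.sponsored else 0.0),
        reverse=True,
    )

results = [
    Result("https://example.edu/peer-reviewed-study", relevance=0.9, sponsored=False),
    Result("https://example.com/ad-landing-page", relevance=0.5, sponsored=True),
]

for r in rank(results):
    print(r.url)
# The sponsored page (0.5 + 0.5 = 1.0) outranks the more relevant
# academic page (0.9), claiming one of the crucial top spots.
```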

The first two links of any search result gather around 50% of all clicks for that query. These first two spots are essentially all the user will see after a search. So when they are filled with the most popular or sponsored links instead of the most factually accurate ones, it's easy for misinformation to spread.
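A quick sketch shows why a figure like that is plausible. Under a simple assumed position-bias model (the chance a user even examines position k decays as 1/k; this model is an illustration, not measured Google data), the top two positions capture about half of all clicks:

```python
def click_share(num_results=10):
    """Share of clicks per position, assuming attention decays as 1/rank."""
    weights = [1.0 / k for k in range(1, num_results + 1)]
    total = sum(weights)
    return [w / total for w in weights]

shares = click_share()
top_two = shares[0] + shares[1]
print(f"Top two positions: {top_two:.0%} of clicks")  # ~51% under this model
```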

Can Algorithms Even Be Biased?

It's easy to assume that code, and specifically algorithms, can't be biased. But look at what's going on in the courts! Judges are using algorithms that predict whether an individual will commit another crime to set their sentencing length. This is meant to be less biased than the judges deciding alone, where race might play a role in their preconceived notions about the individual. But these algorithms can still be racist. They don't ask what an individual's race is, but they take into account whether the individual grew up with both parents, whether they have felons in their family, their job history, and so on. None of these factors should influence whether or not someone will be a repeat offender, but they do in these algorithms.

Algorithms like this don't take race into account explicitly. However, they can reinforce existing biases. For example, black people are five times more likely to have felony records than white people. This stems from systemic racism and unfair prosecution, as well as insufficient resources for people being released from jail. So when the algorithm asks "Do you have felons in your family?", the question disproportionately flags black people. That leads to more black people in jail, and the cycle continues.

Algorithms can build upon biased data, causing a feedback loop that reinforces itself.
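The feedback loop can be simulated directly. The sketch below uses made-up starting rates and a made-up 0.2 reinforcement factor, purely to show the shape of the loop: two groups with identical underlying behavior, where one starts with a higher felony rate due to historical over-policing, drift further apart each round.

```python
def simulate(rate_a=0.05, rate_b=0.25, rounds=5):
    """Toy feedback loop: a risk score built on a proxy feature
    (felons in the family) flags group B more often, producing more
    convictions there, which strengthens the proxy next round."""
    for i in range(rounds):
        # The 'algorithm' scores risk purely from the proxy feature.
        flagged_a, flagged_b = rate_a, rate_b
        # Flagged people receive longer sentences / more convictions,
        # nudging the next round's felony rate upward.
        rate_a += 0.2 * flagged_a
        rate_b += 0.2 * flagged_b
        print(f"round {i + 1}: group A {rate_a:.2f}, group B {rate_b:.2f}")

simulate()
# The gap between the groups widens every round, even though the
# underlying behavior never differed: biased input, biased output.
```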

This is one of many general examples. I want to take this time, however, to look specifically at the Google search algorithm.

Google gets more clicks when it surfaces popular sites. Those sites don't need to be appropriate, or even accurate; they just need to get people interested. So by placing sites at the top of the page that may contain harmful stereotypes, Google is signaling that it cares more about engagement than about keeping its results unbiased.

This is also present in the autocomplete feature. Google provides a list of suggestions for completing your search, drawn from what other users commonly type. These suggestions aren't closely regulated, meaning harmful stereotypes can work their way in.
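A toy sketch makes the mechanism clear; this is not Google's implementation, and the query log below is entirely hypothetical. When suggestions are ranked purely by how often other users searched them, popularity alone decides what surfaces:

```python
from collections import Counter

# Hypothetical query log; counts stand in for aggregate user searches.
query_log = Counter({
    "why do cats purr": 9000,
    "why do cats knead": 4000,
    "why do cats hate water": 2500,
})

def autocomplete(prefix, log, k=3):
    """Return the k most-searched queries starting with the prefix."""
    matches = [(q, n) for q, n in log.items() if q.startswith(prefix)]
    return [q for q, _ in sorted(matches, key=lambda x: -x[1])[:k]]

print(autocomplete("why do cats", query_log))
# Nothing in this pipeline checks whether a popular suggestion is
# accurate or harmful; frequency is the only signal.
```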

Both of these examples are presented in further detail on the following page.