The world of online marketing turns on keywords. With search engine optimization of vital concern to business owners and advertisers alike, tools like Google's Keyword Planner can be essential to identifying which keywords to buy ads against in Google search results.
But until recently, when advertisers typed in "Black girls," "Latina girls" (we know, it's redundant) and "Asian girls," the majority of the keyword ideas suggested by the Keyword Planner were pornographic. Under "Black girls," for instance, Google returned "naked black girls," "big booty black girls" and "ebony cam" as suggested terms.
But if you typed "white girls" (or, for that matter, "white boys") into the Keyword Planner, no suggested terms appeared at all.
The discovery was recently made by The Markup, a tech-focused nonprofit news site. When it reached out to Google for comment about the issue, the tech giant quickly blocked its ad portal from returning results featuring any combination of "boys" or "girls" and racial or ethnic terms.
But these results have a direct effect on marketers attempting to build campaigns specifically for young, nonwhite audiences. From The Markup:
These findings indicate that, until The Markup brought it to the company's attention, Google's systems contained a racial bias that equated people of color with objectified sexualization while exempting White people from any associations whatsoever. In addition, by not offering a significant number of non-pornographic suggestions, this system made it more difficult for marketers attempting to reach young Black, Latinx, and Asian people with products and services relating to other aspects of their lives.
As The Markup notes, Google's Keyword Planner is "an important part of the company's online advertising ecosystem," generating more than $134 billion in revenue in 2019 alone.
"The language that surfaced in the keyword planning tool is offensive and while we use filters to block these kinds of terms from appearing, it did not work as intended in this instance," Google spokesperson Suzanne Blackburn wrote in a statement responding to the findings. "We've removed these terms from the tool and are looking into how we stop this from happening again."
Google stands apart from other search engines, commanding some 70 percent of search market share in 2018, which makes its missteps all the more visible and impactful. But what's true of Google is true of other kinds of technology: These tools and systems mirror, and in some cases amplify, the biases and blind spots of their makers and users.
Google has had several high-profile problems over the last decade with racism in its algorithms. In 2012, UCLA professor Safiya Noble called out how the search engine would regularly surface porn sites in top results when people searched for "Black girls." Google subsequently and quietly fixed the issue ("Black girls" now returns the nonprofit Black Girls Code as its top search result), but searches for "Asian girls" still return mostly fetishistic sites.
In 2016, researchers in Brazil found that Google was more likely to return images of white women than of Black or Asian women when users searched for "beautiful woman" images. The inverse was true when people searched for "ugly woman."