Google's Online Marketing Tool Identified 'Black Girls' Searches With Porn: Report

The world of online marketing turns on keywords, and with Search Engine Optimization of vital concern to business owners and advertisers alike, tools like Google’s Keyword Planner can be essential to identifying which keywords to buy ads against in Google search results.

But until recently, when advertisers typed in β€œBlack girls,” β€œLatina girls” (we know, it’s redundant) or β€œAsian girls,” the majority of the keyword ideas suggested by the Keyword Planner were pornographic. For β€œBlack girls,” for instance, Google returned β€œnaked black girls,” β€œbig booty black girls” and β€œebony cam” among its suggested terms.

But if advertisers typed β€œwhite girls” or, for that matter, β€œwhite boys” into the Keyword Planner, no suggested terms appeared at all.

The discovery was recently made by The Markup, a tech-focused nonprofit news site. When it reached out to Google for comment about the issue, the tech giant quickly blocked its ad portal from returning results featuring any combination of β€œboys” or β€œgirls” and racial or ethnic terms.

But these results have a direct effect on marketers attempting to build campaigns specifically for young, nonwhite audiences. From The Markup:

These findings indicate that, until The Markup brought it to the company’s attention, Google’s systems contained a racial bias that equated people of color with objectified sexualization while exempting White people from any associations whatsoever. In addition, by not offering a significant number of non-pornographic suggestions, this system made it more difficult for marketers attempting to reach young Black, Latinx, and Asian people with products and services relating to other aspects of their lives.

As The Markup notes, Google’s Keyword Planner is β€œan important part of the company’s online advertising ecosystem,” which generated more than $134 billion in revenue in 2019 alone.

β€œThe language that surfaced in the keyword planning tool is offensive and while we use filters to block these kinds of terms from appearing, it did not work as intended in this instance,” Google spokesperson Suzanne Blackburn wrote in a statement responding to the findings. β€œWe’ve removed these terms from the tool and are looking into how we stop this from happening again.”

Google stands apart from other search engines, commanding 70 percent of the search market share in 2018, which makes its missteps all the more visible and impactful. But what’s true of Google is true of other kinds of technology: these tools and systems mirror, and in some cases amplify, the biases and blind spots of their makers and users.

Google has had several high-profile problems in the last decade with regard to racism in its algorithms. In 2012, UCLA professor Safiya Noble called out how the search engine would regularly bring up porn sites in top results when people searched for β€œBlack girls.” Google quietly fixed the issueβ€”β€œBlack girls” now returns the nonprofit Black Girls Code as its top search resultβ€”but searches for β€œAsian girls” still return mostly fetishistic sites.

In 2016, researchers in Brazil found that Google was more likely to return images of white people than of Black or Asian women when users searched for β€œbeautiful woman” images. The inverse was true when they searched for β€œugly woman.”
