Google Equates Black Girls With Sex; Why?

The search engine's profit motive doesn't always work in the best interests of women of color.


No matter the Web mechanisms, sex is a lucrative industry online. The relationship between ads and websites in searches for women and girls is no surprise. Anyone can buy "black girls" or any other word combination. In this business model, communities have less control over the ways in which they are represented. Identity is for sale to the highest bidder.

Latanya Sweeney of Harvard University just released a study (pdf) about the power of Google's AdWords and racial identity. She found that "black-sounding" names are more likely than "white-sounding" names to bring up advertising for criminal-background checks. Sweeney's work points to the ways that racial bias happens in Google search.

What is worrisome about the commercial-search business model is that companies have a vested interest in our clicking on websites and ads that make them the most money. This model has little relationship to the best way for us to find knowledge or information that we can trust. It drives home the reason we cannot substitute search engines for great schools, competent teachers and well-funded libraries. It raises questions about how companies, not communities, control what is found in a search engine. For marginalized groups that need continued advocacy for social, economic or political justice, co-optation of identity by these processes is problematic.

But it's not just girls and women who should take note. In 2004, the Anti-Defamation League challenged Google when searches on "Jew" brought back anti-Semitic, neo-Nazi and Holocaust-denial websites. Google responded by issuing an explanation that signaled it could do little to affect search results. It claimed that its algorithm technology was neutral, and search results were a matter of how people use Google, rather than the technology itself.

Consider the website dedicated to discrediting Dr. King. Stormfront, the site's owner, can game the algorithm to get first-page placement. Despite protests, Google will not demote the racist site. Consider another example: In November 2009, a search for Michelle Obama produced a photoshopped image of her as a monkey. Rather than remove the image from its database, Google ran a disclaimer about offensive search results. Many politicians and celebrities, including George W. Bush and Rick Santorum, have also been humiliated by losing control of their identity in search engines.

As with other forms of hate speech protected under the First Amendment, Google's position supports the idea that racists can co-opt the names of Dr. King, Michelle Obama or the Jewish community under the auspices of freedom of speech, without regard to the social ramifications of their words. By this logic, Google bears no social responsibility for the results of its algorithm.

Many are at a disadvantage when it comes to how search works. Racist (and sexist) values are clearly embedded in some of the digital technologies we use. What both Sweeney's and my research efforts show is how racism and sexism function on the Web. This is heavy stuff, but many of us are trying to call attention to Google's bias because we want to see change. Change is possible.

In the end, it's really not so complicated.

Google is a social technology whose effectiveness is determined by its use. It is a reflection of a profit model, and it is a reflection of our societal values, with one big exception: Google's algorithms are made by people, and hence, they can be made more socially responsible. An unwillingness to do so has real civic costs that we are just now beginning to uncover.

Racial bias is real on the Internet, and the search engines we use reflect that. The research raises the question: If Google can't do anything about this kind of sexism and racism, then what search engine can?