Latanya Sweeney’s name produces a different view than yours does. It’s another example of how a “passive” or “neutral” algorithm reflects racism and other biases from its context. It’s nice to see a response to this situation that asks how algorithms can reflect better politics, rather than asking how people can be more forgiving of the programmers who didn’t think about this.
Professor Latanya Sweeney found that searching “Black-identifying” names like hers led Google.com and Reuters.com to generate ads “suggestive of an arrest in 81 to 86 per cent of name searches on one website and 92 to 95 per cent on the other.” This means that when Professor Sweeney (who has no criminal record) googles herself, or when anyone googles her, one of the top results is “Latanya Sweeney: Arrested?” According to the study, when we google Black-identifying names, we’re very likely to see the words “criminal record” or “arrest.” That view sucks! And it only serves to reinforce negative stereotypes, which can limit people with “Black” names from accessing equal means of sustenance and amenities. Meanwhile, googling a white-identifying name produces “neutral” content. (The ads that come up when I google my own name offer viewers private information for a fee.)
And it is how this digital view is shaped that is most disturbing: Google maintains that there is no racial bias in the algorithms it uses to position ads. Rather, the algorithms “learn over time” which ads are selected most frequently and then display those. The algorithms simply reflect the dominant values of our time, but demonstrate them to each of us differently, depending on our own particularities and on what is known from our individual and collective clicks: these algorithms cannot produce a more panoramic view. So, thank you to Latanya Sweeney for rubbing the fog off of my view, for now at least. Otherwise, because of my race and my name, I may not have seen the racist outcomes these algorithms are producing.
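To see why “learning which ads get clicked” is not neutral, here is a toy sketch of that feedback loop. This is an invented model, not Google’s actual ad system: the ad names, the baseline click-through rate, and the `click_bias` parameter are all assumptions for illustration. The point is only that a selector which never looks at race, and simply shows whichever ad has been clicked more, will amplify any small skew in what users click.

```python
def simulate_ad_loop(click_bias, rounds=1000):
    """Toy click-feedback loop: two ads compete for one name, and the
    chance an ad is shown is proportional to its accumulated clicks.
    This is a deterministic expected-value version (fractional
    impressions and clicks) so the dynamics are easy to inspect.

    `click_bias` is the extra click-through rate users give the
    "arrest" ad -- an invented parameter, not a measured figure.
    """
    clicks = {"arrest": 1.0, "neutral": 1.0}  # equal starting priors
    shown_arrest = 0.0
    base_rate = 0.10  # assumed baseline click-through rate
    for _ in range(rounds):
        total = clicks["arrest"] + clicks["neutral"]
        for ad in clicks:
            share = clicks[ad] / total  # impressions follow past clicks
            if ad == "arrest":
                shown_arrest += share
            rate = base_rate + (click_bias if ad == "arrest" else 0.0)
            clicks[ad] += share * rate  # more impressions -> more clicks
    # Average share of impressions the "arrest" ad received.
    return shown_arrest / rounds
```

With `click_bias=0.0` the split stays exactly even. With even a small positive bias (say `0.02`), the “arrest” ad’s average share of impressions climbs well above half over the run, because every extra click buys it more future impressions. The algorithm is “just reflecting” user behavior, and that is exactly the problem.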