Gordon, who relies on stock image libraries to create the photo composites and collages she makes for news stories, spoke with “Marketplace” host Kai Ryssdal about a problem she’s encountered within them: It can be tricky to find suitable images of people of color.
The following is an edited transcript of their conversation.
Kai Ryssdal: So, if you went right now and looked for, I don’t know, pictures of people’s hands, what would you find just in whatever system you use put “hands” in the query box?
Shoshana Gordon: So, you would probably get a bunch of hands that look like white people’s hands, with some token black and brown folks in there. Axios uses iStock; the other main companies are Shutterstock and Adobe Stock, and a similar thing happens when you put “hand” into those search engines as well.
Ryssdal: Explain to me how that happens. How do you just get pictures of white people’s hands?
Gordon: Right? So, I talked with Getty Images, which owns iStock, and what they told me is basically, there are a couple of different algorithms controlling what you see when you type “hand” into the search bar. The primary algorithm is based on keywords and popularity and engagement — so, how many times that photo has been downloaded.
Ryssdal: So, the more people choose a white hand, the more white hands are going to show up?
Gordon: Exactly, yeah. So there’s this positive feedback loop of whiteness. They know that, and they’ve created a secondary algorithm to counteract that historical bias. It looks at the region where that search is being made and it tries to match the racial makeup of that region with the search results you’re seeing.
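The two-stage system Gordon describes — rank by keyword match and download popularity, then re-order so the top results roughly mirror the searcher’s regional demographics — can be sketched in a few lines. This is a hypothetical illustration of the general technique, not Getty’s actual implementation; the field names, quota approach, and demographic shares are all assumptions for the sake of the example.

```python
# Hypothetical sketch of a two-stage stock-image ranking pipeline.
# NOT Getty's actual system: field names ("downloads", "group"),
# the quota-based re-ranking, and the demographic shares are assumptions.

def primary_rank(photos, query):
    """Stage 1: keyword match, ordered by download count (popularity).
    This is where the 'positive feedback loop' lives: the more a photo
    is downloaded, the higher it ranks, so it gets downloaded more."""
    matches = [p for p in photos if query in p["keywords"]]
    return sorted(matches, key=lambda p: p["downloads"], reverse=True)

def rerank_for_region(ranked, region_shares, top_n):
    """Stage 2: re-order so the top_n results roughly match the
    demographic makeup of the region the search came from."""
    quotas = {group: round(share * top_n)
              for group, share in region_shares.items()}
    result, remainder = [], []
    for photo in ranked:
        if quotas.get(photo["group"], 0) > 0:
            result.append(photo)
            quotas[photo["group"]] -= 1
        else:
            remainder.append(photo)
    return result + remainder  # unquota'd photos fall to the bottom

# Toy library skewed toward white hands, mimicking the historical bias.
photos = [
    {"keywords": ["hand"], "downloads": 900, "group": "white"},
    {"keywords": ["hand"], "downloads": 800, "group": "white"},
    {"keywords": ["hand"], "downloads": 700, "group": "white"},
    {"keywords": ["hand"], "downloads": 100, "group": "black"},
    {"keywords": ["hand"], "downloads": 50,  "group": "brown"},
]
ranked = primary_rank(photos, "hand")
balanced = rerank_for_region(
    ranked, {"white": 0.5, "black": 0.25, "brown": 0.25}, top_n=4)
print([p["group"] for p in balanced])
# → ['white', 'white', 'black', 'brown', 'white']
```

Without the second stage, popularity alone would put all three white-hand photos at the top; the re-ranking pulls the less-downloaded photos into the first page of results, counteracting the feedback loop rather than eliminating it.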
Ryssdal: Do you think that’s enough?
Gordon: Well, I don’t want to be prescriptive, but I know that these companies are also taking a lot of different approaches to rectify this problem. They’ve all created different image libraries that highlight diversity. They’ve also started trying to educate customers and contributors alike about how to make and use more visually inclusive stories.
Ryssdal: The challenge, of course, is that when you’re on deadline and you don’t have a whole lot of time to go hunting for an image that’s representative of what’s in the story, what you end up using kind of misrepresents what’s going on, right?
Gordon: Oh, completely. And I think that’s why this is such an important issue to me, because stock images are everywhere, whether we notice them or not. And when white people dominate the search results, that means that the final images that we see out in the world are also probably going to be dominated by white people.
Ryssdal: So, as much as this is an interview about stock images, it’s not really about stock images, right? It’s about the algorithm.
Gordon: Right. And I would say that is a lot of how we get our information these days. We go onto Google or DuckDuckGo, and we type in something, but I don’t really know how those results have come to me. I’m not a computer scientist. I’m not a programmer, but these algorithms determine what we see, and we don’t know how it works.
Ryssdal: Yeah, and the way we all consume information today is rapidly scrolling by, and an image strikes our cerebral cortex, and we’re like, “OK, white person, oh, white person,” and that’s the image that we get.
Gordon: Totally! And I think, yeah, just bringing it back to the importance of [stock] images — they have this invisible power of shaping those implicit biases, and they could reinforce them or they could dismantle them. I think we need to stop and look at an image that seems anodyne on its face and understand that every image we see is telling a narrative about race, about class, about all these different things.