Tech and Fish: What does DEI have to do with Tech Ethics?
DEI and Responsible Tech go hand-in-hand - but why? What is it about DEI that leads to better outcomes?
If you've been around the AI Ethics, Tech Ethics, and responsible tech space for a minute, you've probably noticed something: DEI and tech ethics tend to go hand-in-hand. Many diverse folks lead AI and tech ethics efforts, and many responsible tech and responsible AI programs are closely tied to DEI programs and initiatives.
It makes sense. The people most often harmed by unintentionally unethical tech and AI are minorities. Whether veterans, people with medical needs, racial minorities, or sexual minorities, unethical tech has hurt each of those groups at one time or another. Responsible tech and AI programs naturally gravitate toward the groups they want to help.
The practical reason is straightforward. Who's best equipped to tell you if something is going to screw over a group of people inadvertently? That group of people themselves. But it's hard to articulate why that is. Saying that you need diverse perspectives to achieve better outcomes is self-evident to some of us and really hard to grok for others. I've heard the parable of the blind men and the elephant used to explain the value of diverse perspectives, but I felt like it was missing something.
Then I came across this quote on Substack: "We don't know who discovered water, but we know it wasn't the fish." This metaphor articulates the problem of a single perspective and a lack of diversity perfectly. Let me explain.
The quote was made famous by media and cultural theorist Marshall McLuhan. His argument goes like this: to a fish, water is so much a part of its environment that it doesn't notice it, and barring exposure to some other environment (like air), there is no reason to notice that water is a thing - it's so ubiquitous to the experience of being a fish that it's invisible. When presented with an "anti-environment" like air, the fish suddenly becomes aware that it lives in water, and that water shapes its experience and perspective of the world.
The Water and the Fish and the … Technologist?
We can apply the same logic of the fish and the water to technologists and their experiences and backgrounds. Technologists are often unaware of their own experiences and backgrounds because those are so ubiquitous they become invisible. Without contrary experiences to draw their attention to it, they don't notice that the environment is even there. More importantly, it colors their perspective of the world without them knowing it. We don't know who discovered the racial, socio-economic, gender, and sexual orientation bias in the model, but we know it wasn't the stereotypical technologist.
Before I go on, it's important to note that there's no blame here. No one sets out to create a biased algorithm (at least, I hope not...). The argument is not that developers who code biased algorithms are bad people, just that they are people with their own blind spots, like we all have. Instead of pretending those blind spots don't exist or aren't important, we can advocate for a more inclusive path, one where other voices join our existing developers and lead to better outcomes.
We have plenty of case studies showing how this plays out, like the Google Photos incident, where Google's ML algorithm incorrectly labeled several Black people as gorillas, or Kodak's Shirley Card, which Kodak used to color-balance photos for decades, resulting in photos of people of color often being poorly rendered.
According to a 2022 McKinsey poll, 25% of folks involved in AI development self-identified as part of a minority. That means a whopping 75% of folks have very similar backgrounds and experiences, similar "water," making it less likely they'll see the crap floating in the water.
No One Can See the Whole Picture Alone
So how do you make the fish aware of the water? Add some other folks who can already see the water and help point it out. I don't expect our stereotypical technologist to understand why pronouns are important or why listing homosexuality as a medical diagnosis is offensive. In the same way, it would be unfair to expect me, as a white person, to understand every nuance of how racial minorities are represented. Even if I were on the lookout and actively trying to mitigate that bias, there would always be some 'water' that I don't see.
This is where DEI comes into play. Diversity in development helps technologists see things they wouldn't otherwise and helps them mitigate or avoid unintended consequences altogether.
If we look at the Google Photos controversy, it's a safe guess that most of the people who worked on the algorithm were white, probably straight, and very likely middle to upper class. They trained the models on pictures that looked like them and reflected their experiences. What they didn't notice was the lack of Black, Indigenous, and People of Color (BIPOC). Their life experiences at that point didn't involve a lot of BIPOC, so they didn't notice the absence of those faces in their training data.
But if you had put that training data in front of a diverse team of reviewers, it would have taken mere minutes for that group to notice that no one in the data set looked like them, and to ask whether that was a problem.
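Even a simple automated audit can surface that kind of gap before training starts. Here's a minimal sketch, assuming a hypothetical metadata file (training_metadata.csv) that tags each training image with a self-reported demographic group; none of this reflects Google's actual pipeline.

```python
from collections import Counter
import csv

# Hypothetical metadata file: one row per training image, with a
# self-reported demographic group for the person pictured.
METADATA_PATH = "training_metadata.csv"

def audit_representation(path: str, min_share: float = 0.05) -> None:
    """Count how often each demographic group appears in the training
    set and flag any group that falls below a minimum share."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["demographic_group"]] += 1

    total = sum(counts.values())
    print(f"Total labeled images: {total}")
    for group, count in counts.most_common():
        share = count / total
        flag = "  <-- underrepresented" if share < min_share else ""
        print(f"{group:<25} {count:>8}  ({share:.1%}){flag}")

audit_representation(METADATA_PATH)
```

A check like this is no substitute for diverse reviewers - it only counts the labels someone thought to collect - but it does force the question of who is and isn't in the data before the model ships.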
Once users identified the problem, Google moved quickly to address it (with limited success). Wouldn't it have been better (and less costly) to avoid the problem completely by diversifying the technologists working on the algorithm? That's the value of DEI.
AI Ethics, Tech Ethics, and DEI, Oh My!
At the end of the day, AI and tech ethics are about developing innovative technologies while following a set of guiding principles that help ensure those technologies cause as little harm as possible. We know, based on past experience, that the folks most likely to be harmed are minorities, because the technologists creating these technologies bring just one perspective and lack the breadth of experience to anticipate the problems their work might cause.
One additional thought: diversity is often understood in terms of demographics, as in the examples above. But it can also mean diversity of thought and perspective within a particular demographic. There's a lot of diversity of thought and perspective within groups, and that has value as well.