Dr. Joy Buolamwini on Unmasking AI: My Mission to Protect What Is Human in a World of Machines AI transcript and summary - episode of podcast Dare to Lead with Brené Brown
Episode: Dr. Joy Buolamwini on Unmasking AI: My Mission to Protect What Is Human in a World of Machines
Author: Vox Media Podcast Network
Duration: 01:25:46
Episode Shownotes
In this episode, Brené and Dr. Joy discuss fighting bias in algorithms, Gender Shades - the accuracy of AI-powered gender classification products - and her amazing perspective on technology as a poet, artist, and scientist.
Learn more about your ad choices. Visit podcastchoices.com/adchoices
Full Transcript
00:00:10 Speaker_06
Hi everyone, I'm Brené Brown and this is Dare to Lead. If it sounds like I've got a big grin on my face, it's because I have a huge smile on my face. If you do not know Dr. Joy Buolamwini, you are missing out. Let me just say that.
00:00:25 Speaker_06
I'm just going to draw a line. She's one of the most amazing guests I've ever had the privilege to talk to. We are going to dig into her book, Unmasking AI, My Mission to Protect What is Human in a World of Machines.
00:00:37 Speaker_06
She's got all the tech bona fides you can imagine. Let me tell you a little bit about her. She is the founder of the Algorithmic Justice League, a groundbreaking MIT researcher, and an artist. She is the author of the national bestseller, Unmasking AI.
00:00:53 Speaker_06
She advises world leaders on preventing AI harms. Her research on facial recognition technologies transformed the field of AI auditing and has been covered in over 40 countries.
00:01:04 Speaker_06
Her Gender Shades paper is one of the most cited peer-reviewed AI ethics publications in the world. She's got an amazing TED Talk. We'll link to it on the page on brenebrown.com.
00:01:17 Speaker_06
She's a Rhodes Scholar, a Fulbright Fellow, a recipient of the Technological Innovation Award from the Martin Luther King Jr. Center. She's the 2024 winner of the NAACP-Archewell Foundation Digital Civil Rights Award.
00:01:30 Speaker_06
She has a PhD from MIT, an honorary doctor of fine arts from Knox College. She's a poet. She's a coder. She is light. She is magic, and I cannot wait for y'all to hear this interview. Let's jump in.
00:01:51 Speaker_12
Thumbtack presents the ins and outs of caring for your home. Out: indecision, overthinking, second-guessing every choice you make. In: plans and guides that make it easy to get home projects done. Out: beige on beige on beige. In:
00:02:14 Speaker_12
Knowing what to do, when to do it, and who to hire. Start caring for your home with confidence. Download Thumbtack today.
00:02:25 Speaker_00
How much money do you make? My name is Vivian Tu, better known as your rich BFF and your favorite Wall Street girlie. My podcast, Net Worth and Chill, is back and better than ever for season two.
00:02:35 Speaker_00
We've got finance experts and your favorite celebs answering all those taboo money questions you've been too afraid or too embarrassed to ask. With new episodes dropping every Wednesday, you can watch or listen.
00:02:45 Speaker_00
Sit back and relax and get ready to Net Worth and Chill.
00:02:50 Speaker_06
Welcome to Dare to Lead, Dr. Joy. Thank you so much for having me. I totally stalked you in your DMs.
00:03:02 Speaker_07
DMs have been working out. That's how the cover of the book got made. I actually reached out to the illustrator, sent her an Instagram DM, and she just happened to have time.
00:03:13 Speaker_07
She'd seen the documentary Coded Bias, so she knew a little bit about the work, but it was a last-minute request, so Instagram DMs for the win.
00:03:24 Speaker_06
I wasn't even shy either. I'm like, Hey, I love your work.
00:03:32 Speaker_07
I was suspect. I was highly suspect. I said, all right, let's double check everything.
00:03:39 Speaker_06
So before we get started, we always start with the same question, but I had to start with telling you like how absolutely obsessed I am with your work.
00:03:48 Speaker_06
I think that I can get out of bed in the morning and face AI because there are people like you in the world asking really hard questions, holding people accountable. I'm just so grateful for you and your work.
00:04:02 Speaker_07
Well, I'm grateful to be here and to have the opportunity to share more about it.
00:04:07 Speaker_06
So let's start with your story. And I mean from baby joy.
00:04:15 Speaker_07
Yes, so Baby Joy actually popped up in Canada, of all places. I was born in Edmonton, Alberta. I was there because of education. My dad was getting his PhD at the University of Alberta.
00:04:30 Speaker_07
I was born the year he defended, you know, so I really had no choice but to go down this path. And so I'm the daughter of a scientist, and I'm also the daughter of an artist.
00:04:42 Speaker_07
And funny enough, my mom's dad, Dr. Daniel Jumabadu, so I'm third generation when it comes to this academic tree, actually taught my father, right? So there's this whole lineage. And a few weeks ago, I visited Ghana for the first time in 30 years.
00:05:03 Speaker_07
and I went to the university where he had been dean of the School of Pharmacy. And so it was very nice, full circle, because my mother would tell me all of these stories growing up, and the places she described I finally got to see in person.
00:05:22 Speaker_07
And so I'm the daughter of an artist and a scientist. I was born in Canada. When I was about two years old, I moved to Kumasi, Ghana. And so that's the heart of the Ashanti region. And those are where my earliest memories are.
00:05:37 Speaker_07
I lived with my maternal grandmother. And then when I was about four years old, I was reunited with my parents. And I didn't know who these people were forcing me to eat vegetables, but I was told we were related.
00:05:51 Speaker_07
And then we moved to Oxford, Mississippi. So my dad was teaching at Ole Miss. You can tell we were immigrants by how we were saying Ole Miss, so it was... clearly, there's some outsiders. At Ole Miss, yeah. You're right, at Ole Miss.
00:06:08 Speaker_07
I learned quite quickly that my British English would not fly at school, so definitely living in between cultures. And when I was around 10, we moved to Memphis, Tennessee, so keeping it Southern, and then went to undergraduate.
00:06:24 Speaker_07
I studied computer science at Georgia Tech in Atlanta, Georgia. So despite the very cold start in Edmonton, Alberta, I kept it pretty warm. Wow, so Georgia Tech.
00:06:38 Speaker_06
Yes, Yellow Jacket, Ramblin' Wreck. Yeah, yes, we've done some Dare to Lead work at Georgia Tech. A couple of my favorite people are there. It's a really interesting school. Absolutely. What were you like and what did you love as a teenager?
00:06:54 Speaker_07
As a teenager, what was I like? I was the kid who was skateboarding, who had my electric guitar, and would make up songs and turn up my amp really loudly. And then I was also very athletic, so I was on all the sports teams, cross country, basketball.
00:07:16 Speaker_07
I did track and field, and my favorite event was pole vaulting. The risk-taker in me was all for that.
00:07:23 Speaker_07
I used to skateboard, and so that just allowed... Actually, no, the reason I got into pole vaulting was I was at track practice one day, and we were doing push-ups. The coach comes around. He's like, man, those are some really nice push-ups.
00:07:38 Speaker_07
I'm like, yeah, I got some good push-ups. But he was talking about the girl behind me, and I was like, what? You think her push-ups are good? And so he is telling this girl that she should go try pole vaulting.
00:07:49 Speaker_07
So I was like, if you think she can pole vault, then I can definitely pole vault. The thing with pole vaulting is it's so much more than strength. You have to be a little bit of a daredevil. She didn't have that daredevil side.
00:08:00 Speaker_07
I definitely had that daredevil side. So they say it takes the speed of a sprinter, the strength of a thrower, and the body awareness of a gymnast. I had two out of three, so... That was my thing. So I was just full of energy.
00:08:17 Speaker_07
Sometimes I would skip basketball practice to be at Knowledge Bowl sessions, and the coach didn't know what to do with me because it's like, it's Quiz Bowl, you know, so it's nerdy educational.
00:08:30 Speaker_07
And so the coach, the Knowledge Bowl coach and the basketball coach, would be trying to, you know, split time. Right? I think I was probably more of an asset to the Knowledge Bowl team than the basketball team, to be honest.
00:08:47 Speaker_07
I'm vertically challenged here, which is why I like the football team. So yeah, me as a teenager, full of energy, exploring all kinds of things. building websites so that I wouldn't have to pay for my track uniform and my basketball stuff.
00:09:03 Speaker_07
So I was like, oh yes, I'll make some cash. I can barter with these tech skills. So a little entrepreneurial, just testing and experimenting and trying many things. Showing up with all my AP books, I remember this.
00:09:17 Speaker_07
We would go to basketball games and in the locker room I'd have like AP physics, AP chemistry or whatever else. So I would be doing that homework during halftime. So I would have time to get the work done.
00:09:32 Speaker_07
So yeah, I think that's probably a snapshot of teenage me all over the place having too much fun.
00:09:40 Speaker_06
Wow. I mean, you are engaged left brain, right brain, full body contact. You are all in. Yes. Full brain, full human. Love it. Okay. Georgia Tech. What do you learn about yourself at Georgia Tech?
00:09:56 Speaker_07
Oh. Georgia Tech was such a fascinating experience for me.
00:10:01 Speaker_07
I came in international affairs, then biomedical engineering, then switched to computer science quite quickly, then switched to some other field, computational media, then switched back to computer science, and then along the way started a hair care technology company.
00:10:18 Speaker_07
I was doing all of these entrepreneurial projects. And so I think the biggest lesson I got from Georgia Tech was finding ways to use my official schoolwork to explore what I wanted to do regardless.
00:10:34 Speaker_07
So, for example, my senior project ended up being an experiment, really, with the Carter Center. And so, for my capstone project, I was actually working with the Carter Center. We went to Ethiopia, and we were exploring ways of
00:10:52 Speaker_07
transforming their paper-based methods of assessing the effectiveness of their campaigns, so monitoring and evaluation. At that time, Android tablets were coming on the scene, and because they were open source, I could actually program them.
00:11:07 Speaker_07
And so I was working with them to program and transform their paper-based way of testing their campaigns for various neglected tropical diseases. So in this case, trachoma, which is an infectious disease that can lead to blindness.
00:11:22 Speaker_07
But it was actually something there was medication for. So Pfizer had this drug called Zithromax that they were distributing, but they weren't actually so clear on how effective it was.
00:11:32 Speaker_07
So they had this MalTra program, malaria and trachoma, where they would go to different villages, distribute it, and so forth.
00:11:39 Speaker_07
The problem with the paper-based way of trying to assess it is there's a lot of data entry, and that data transcription is not exact. And also when it came to the locations, trying to put in the GPS coordinates was not exact at all.
00:11:54 Speaker_07
And so being able to use the GPS on the Android tablets helped to make it a bit more precise. So for me, this was so exciting because I was motivated to get into computer science to try to use tech for good.
00:12:06 Speaker_07
And so at Georgia Tech, I found that it was possible to translate these technical skills towards big problems in the world and make a meaningful contribution. So I think that was one of the biggest takeaways for me.
00:12:23 Speaker_06
Oh, yellow jackets. Yeah, I mean, I guess I'm in a state of shock because this seems like the progress of a 50-year-old at the Gates Foundation or something.
00:12:33 Speaker_06
So I'm so amazed that you're programming these new, and I'm doing the math in my head, so the Android tablets were relatively new.
00:12:40 Speaker_07
Yes, this was around 2011, 2012. Yes. Okay, where do you go after Georgia Tech? So after Georgia Tech, I take a detour and I do a hair care technology company. Send us strands of your hair, we analyze it, give you unique product recommendations.
00:12:58 Speaker_07
I'm having a good old time. I get my mom saying, wait, have you thought about applying for the Rhodes Scholarship? And I'm like, ah, Oxford sounds cold. How about the Fulbright Fellowship? So I applied for the Fulbright Fellowship.
00:13:12 Speaker_07
And Fulbright Fellowship, just like pole vaulting, I was hanging out with some friends one day, and they were talking about what they were planning to do for the next, you know, year or so. And one's like, oh, yeah, I'm applying for the Fulbright.
00:13:24 Speaker_07
I'm like, oh, she applied? I could probably apply, right? I applied. I ended up getting it. She didn't, unfortunately, but she inspired me to apply, so shout out.
00:13:38 Speaker_07
And then, as I was going to do that, my mom was still adamant about applying for the Rhodes Scholarship, and I was still, it's cold, mom. But anyways, I applied for it, and thankfully, both came through, so I went to...
00:13:56 Speaker_07
It sounds like an AI-generated story, but this is truth.
00:14:01 Speaker_06
It does sound like an AI-generated story, but keep going.
00:14:03 Speaker_07
It doesn't even sound real, but that's what happened. So I was in Lusaka, Zambia, and inspired by some of the work I had done with the Carter Center and in Ethiopia, I was saying, okay,
00:14:15 Speaker_07
I wanted to really think through the part that I was playing in trying to use tech for good, because in the situation of working with the Carter Center, I was literally the Westerner parachuting in with, you know, a tech project.
00:14:30 Speaker_07
And when I did that, I found out that the assumptions I made when I was programming this in my childhood bedroom in Memphis, Tennessee didn't actually hold out to the real-world conditions.
00:14:40 Speaker_07
And I was also wondering, well, why aren't the local people developing these systems? Why am I coming in? And so with my Fulbright Fellowship in Lusaka, Zambia, the focus really was equipping Zambian youth
00:14:53 Speaker_07
to make mobile apps for things that they cared about. So we partnered with iSchool Zambia, and we worked on these ZEduPads, which were Android tablets that had the entire Zambian school curriculum on there.
00:15:08 Speaker_07
And I got to train youth who, it was funny, they'd walk into class, they wouldn't ever think I was the instructor, so I would learn a lot before they realized. And then I mentioned I used to skateboard. There were Zambian skateboarders there.
00:15:25 Speaker_07
They would get their parts from South Africa. And then we also found this poetry scene. So I was hanging out with the poet. It was a great time. Meanwhile, I was teaching people. how to code.
00:15:37 Speaker_07
And so then when the Peace Corps folks would come in and they would see photos of my projects, they would always be confused because it's usually like some white guy and a bunch of Black people out with a fishery.
00:15:49 Speaker_07
And my photos were all of these dark-skinned people programming apps. On skateboards! Right! Skateboarding, having a good time. I'll have to send you some of the skateboard photos and videos. We had way too much fun. We made music videos.
00:16:07 Speaker_07
It was a great time. So then, after around seven months, it was time to go to Oxford. The second Oxford. Oxford, England. Not Oxford, Mississippi, yeah. Time for Oxford, England. And so I went there to start the adventure of being a Rhodes Scholar.
00:16:26 Speaker_07
And I made the mistake of enrolling in African Studies. After the first lecture, I knew I was in the wrong place. In the wrong place. So...
00:16:43 Speaker_07
It was just such a disconnect from the work I had done in Ethiopia, the work that I was doing in Lusaka, and they were taking more of an anthropology perspective, which is fine, just not where I was at the time.
00:16:56 Speaker_07
So, I quickly switched to another discipline. So, I switched to the education department, learning and technology, which built more on the work I had been doing teaching youth how to make mobile apps that they cared about.
00:17:09 Speaker_07
And so, yes, just like when I went to Georgia Tech and I'm like, okay, international relations, probably not me. Oxford, African studies, probably not me. So I got through the first year. It was cool.
00:17:25 Speaker_07
Then with the Rhodes Scholarship, two years are funded, and the program I did was a one-year program. So I had applied to a master's, I think in public health, global health, right? Because the work in Ethiopia, I had a story. It made sense. Yeah. Right?
00:17:40 Speaker_07
Somewhere along the line, I realized I really didn't want to do global health. They didn't like my jokes. It was like a very somber kind of culture. And I'm not that person. So I was like, maybe I should do something else. So I had an idea.
00:17:59 Speaker_07
So I proposed a year of service where one such scholar, hoping to use their year to make a meaningful impact and fight the world's fight, might instead of doing a course pursue some other idea. So I
00:18:15 Speaker_07
pretty much wrote a 40-page plea or proposal to describe this year of service, how I would do it, why it was impactful, and then basically I said I would do it one way or the other so they might as well get the credit for it.
00:18:32 Speaker_07
It's been 100 years, we haven't changed our way. They told me not to get my hopes up, but long story short, they allowed me to do it. And so my second year at Oxford was actually teaching Oxford youth how to code apps they cared about.
00:18:48 Speaker_07
And I used a very similar curriculum. Yeah, we actually did a first responders app focused on campus sexual violence. The limestone at Oxford was just about as impenetrable as anywhere else, so low-connectivity issues also transferred.
00:19:04 Speaker_07
There was a lot going on there. So I had an opportunity to do that. And along the way, I get a call from my dad. I mentioned the PhDs earlier. He's like, so what about the terminal degree? Think about graduate school. So I applied to just one place.
00:19:24 Speaker_07
One. Look, if I apply to one place, I've done what my dad wants me to; I've applied to graduate school. If I make it, great; if I don't, I keep on with my entrepreneurial adventures. So I applied to MIT, I get in, and then I'm off to MIT.
00:19:42 Speaker_06
I think Dr. Joy's got this whole thing where it's like, hey, if those are your pushups, I can go to MIT. If those are your pushups, I can be a Rhodes Scholar, a Fulbright.
00:19:55 Speaker_07
I'm like, hmm, it wasn't on my radar, but I think I got this. No, I mean, to be fair, when I was little, MIT was the dream tech school because in all of these documentaries I would watch, it was always the backdrop: such and such happening at MIT.
00:20:11 Speaker_07
I didn't know there were requirements, but I was inspired.
00:20:15 Speaker_06
You just thought it was gonna be a great narrative documentary experience, but it turns out there's like prerequisites. That's terrible. Right, things of that nature. So you're at MIT.
00:20:27 Speaker_07
Yes. And you love it? Oh, my goodness. So, I am finally at the Future Factory, and I'm there when they're celebrating the 30th anniversary of the Media Lab.
00:20:39 Speaker_07
Martha Stewart is there, you have the magicians, they're talking about magic and mischief, and I was so excited. I mean, the nerd in me could not have been more thrilled. And one of my professors was a Japanese pop star, the design one.
00:20:58 Speaker_07
There's this other lady making synthetic estrogen and she had this project, Sad Bad Chickens. It was just like, I can't even make it up. I can't make it up.
00:21:15 Speaker_06
I can't make it up.
00:21:16 Speaker_07
I can't make it up. So I'm thrilled. And also, one of the friends who'd also been a Rhodes Scholar with me, she was from Zimbabwe, she and her husband were at Harvard as resident tutors. And so they needed scholarship tutors.
00:21:32 Speaker_07
And having done a bunch of scholarships, I was like, oh, I could probably do that. So thankfully, they accepted me as a resident tutor. So I was a resident tutor at Adams House. So I was living in Harvard Square and then going to school at MIT.
00:21:48 Speaker_07
So I had these two communities. So that was how I entered the Cambridge, Massachusetts experience. And now that I look back on it, I can appreciate and can be very grateful for the amount of privilege it was to be in both spaces simultaneously.
00:22:09 Speaker_07
At the time, I was like, yay, free housing and free toilet paper! Yo! Harvard's making these kids soft! What is going on? Right? Wow. They edited out the free toilet paper from the book, but I think the world should know. I really do.
00:22:29 Speaker_07
It might have changed since my time. This was in 2013, so things could have changed.
00:22:36 Speaker_06
How would you describe where you are, what you do today, and how did you get here?
00:22:43 Speaker_07
So I am a poet of code. I tell stories that make Daughters of Diasporas dream and Sons of Privilege pause. I am able to do AI research that shows discrimination in products as a way of actually preventing harmful deployments of AI systems.
00:23:04 Speaker_07
And the way I got to doing this work, which really allows me to embrace the artist within while also having my research hat on, was a bit accidental. When I finally made it to my dream school at the Future Factory, the MIT Media Lab,
00:23:19 Speaker_07
I took my first course, one of them was science fabrication. You read science fiction, you try to build something you might otherwise not do, as long as you can do it in six weeks.
00:23:30 Speaker_07
So I wanted to shapeshift, but the six-week part made it a little bit challenging. So I was like, all right, not going to be able to change the laws of physics, but maybe I could change my reflection in a mirror.
00:23:42 Speaker_07
And so I learned about this special kind of material called half-silvered glass. And when you put a black background behind it, it reflects just like a regular mirror. But if you shine light through, the light will come through.
00:23:58 Speaker_07
And so I was like, oh, if I put something completely black and then have a mask on it, it can look like a filter.
00:24:06 Speaker_07
But instead of a filter on a camera, like you would have with Snapchat where you have the dog ears, I could actually make that effect on the mirror. because of the material.
00:24:18 Speaker_07
So when I figured that out, I was like, oh, that's how I can shapeshift in less than six weeks. So I started experimenting with that and I got it to work.
00:24:26 Speaker_07
And then I thought, okay, right now it's like a fun, it's kind of like when you go to an amusement park and they have the cutout and you have to put your face in the right place.
00:24:36 Speaker_07
So that's fun, but I was like, ooh, what if it could actually follow my face in the mirror? So that's when I thought, oh, well, we have technology that can actually detect faces and track faces.
00:24:48 Speaker_07
So let me add some of that kind of technology into my project. So I went and I found the kind of software that's meant to do it, and I integrated it into the project I was doing.
00:24:59 Speaker_07
I know how to program, so I was getting the right code into my project, and then I was excited. I wanted to try it, but it wasn't actually following my face, so the cool effect I wanted to have wasn't happening.
00:25:13 Speaker_07
And so, you know, scientists, I start to debug. So I was like, all right, let me see if it can get any face. So I held up my hand, and I drew, like, basically a smiley face, right? Two eyes, a nose, and a mouth. Just on the palm of your hand.
00:25:26 Speaker_07
On the palm of my hand. I held that to the camera, and it detected not my dark-skinned face, but the face I had drawn on my palm. So after that, I was like, yo! Anything's up for grabs.
00:25:39 Speaker_07
So I look around my office, and I happen to have a white mask because it was Halloween time, and a friend had wanted to do a girls' night out. So for the girls' night out, they said, bring masks.
00:25:50 Speaker_07
They actually meant beauty masks, and she had a bunch of Korean beauty masks, so I didn't need the mask anyways. But because of that, that's why I had a white mask in my office. So, thank you, Cindy. Right? So, I have the mask.
00:26:05 Speaker_07
I put the... Actually, before I even got the white mask all the way over my face, it was already being detected.
00:26:12 Speaker_07
So, it was that moment of putting on a white mask over my dark-skinned face to have it detected by a machine that I was like, oh, my goodness. What's going on here?
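[For the technically curious: below is a minimal sketch of the kind of webcam face-detection loop being debugged in this story. The library is an assumption; the transcript never names the software Dr. Joy integrated. OpenCV's stock Haar-cascade detector just makes the idea concrete.]

```python
# Minimal sketch of a webcam face-detection loop, in the spirit of the
# experiment described above. OpenCV's bundled Haar cascade is assumed
# for illustration only.
import cv2

# Load the frontal-face Haar cascade that ships with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # detectMultiScale returns bounding boxes for every face it finds.
    # A detector trained on unrepresentative data can miss real faces
    # while accepting crude face-like patterns (two eyes, nose, mouth).
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("face detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```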
00:26:24 Speaker_07
So that's what led me to start asking questions like, how do machines see us in the first place? How are we even training machines to detect people? And also, are AI systems as neutral as I assumed they would be? Because it's math after all.
00:26:44 Speaker_07
So that's what really got me started down this long rabbit hole art project.
00:26:52 Speaker_07
trying to see if I could shapeshift, seeing that the face tracking software wasn't detecting me until I put on a white mask and then really thinking about, oh my goodness, what does this mean?
00:27:06 Speaker_07
Maybe this is what I should research during my master's time at MIT. And so that's what I did. So I had that white mask experience and I shared it in a TED Talk, TEDx Talk, and it got a lot of views. But you know what, people might question my claims.
00:27:25 Speaker_07
Let me check myself. So I took my TED profile image and I ran that headshot through online AI demos from a number of different tech companies. And some didn't detect my face at all, just kind of like the coding in the white mask.
00:27:42 Speaker_07
And the other ones that detected my face labeled me male. I'm like, wait, phenomenal woman here, what's happening? Why am I being labeled a male? And so that's what actually led me to focus my research on gender classification.
00:27:59 Speaker_07
How are machines even reading gender in the first place? And does it matter the skin type of the person behind the photo? So those were the types of explorations that led to my research.
00:28:17 Speaker_07
And so, long story short, with my research, I tested AI systems from giants we know, right? So, IBM, Microsoft, later on Amazon.
00:28:30 Speaker_07
And for commercially sold products that they had, AI-powered products, they were getting it wrong when it came to facial analysis, guessing the gender of a face. And so, they would overall work better on male-labeled faces than female-labeled faces.
00:28:47 Speaker_07
They would overall work better on lighter-skinned faces than darker-skinned faces.
00:28:52 Speaker_07
And then when you did a deeper analysis and looked at it by skin type and gender, on some of them there would be flawless performance, like Microsoft was 100% accurate on lighter males, whereas they were closer to 80% on darker females.
00:29:09 Speaker_07
And those were the good numbers, right? So then when we go to other companies and we pull it apart, I saw that in some cases the accuracy was around 68% for women with darker skin.
00:29:22 Speaker_07
And mind you, at this point, we've broken gender into a binary just for the study. So you had a 50-50 shot of just guessing it right. The toss of a coin. Exactly.
00:29:36 Speaker_07
And if we broke it down even further by specific skin type, for Type VI skin, right, it was close to a coin toss in terms of the accuracy. I was actually curious. I was like, hmm, I wonder if the companies know. So I sent them the results.
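[A hedged sketch of the disaggregated-evaluation idea behind Gender Shades: overall accuracy can look fine while hiding large gaps between intersectional subgroups. All data below is synthetic; only the rough subgroup accuracy levels echo the figures mentioned in the conversation.]

```python
# Synthetic illustration: overall accuracy hides intersectional gaps.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "skin": rng.choice(["lighter", "darker"], n),
    "gender": rng.choice(["male", "female"], n),
})

# Assumed per-subgroup accuracy of a biased classifier (illustrative).
subgroup_acc = {
    ("lighter", "male"): 0.99,
    ("lighter", "female"): 0.93,
    ("darker", "male"): 0.88,
    ("darker", "female"): 0.68,
}
p = df.apply(lambda r: subgroup_acc[(r["skin"], r["gender"])], axis=1)
df["correct"] = rng.random(n) < p

# The headline number looks decent; the breakdown tells the real story.
print("overall accuracy:", round(df["correct"].mean(), 3))
print(df.groupby(["skin", "gender"])["correct"].mean().round(3))
```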
00:29:56 Speaker_07
Once the paper had been officially accepted to conference, then I shared pre-results with all of the companies, and I shared all of the results, but I blacked out their competitors' names.
00:30:09 Speaker_06
Oh, interesting.
00:30:10 Speaker_07
Right. And so I had different responses from the various companies over the two studies that we did. So one response was no response. Which is a response. Yes. And then the other, IBM was actually proactive.
00:30:27 Speaker_07
They said, oh, come to our office, show us what's happening. And part of that is because I had been at this Aspen Institute event. And IBM's head of AI research was there.
00:30:42 Speaker_07
And so the day I was turning in my MIT master's, I was walking from Harvard Square over to MIT, and he was there with his two daughters. And he introduced me to his daughters, and I said, oh, there are findings in here you're going to want to know.
00:30:58 Speaker_07
And so he kind of already knew that this work was going on, and I had met him. So I think that was part of why IBM was a bit more responsive, and they've played this game a lot longer than, let's say, Amazon. Amazon was combative, to say the least.
00:31:16 Speaker_07
So they actually had a corporate vice president attempt to discredit the research, and it got so bad that a whole petition was put together by, I think, over 70 researchers, including a former principal AI researcher
00:31:31 Speaker_07
of Amazon saying that if you're attacking this research, it's actually going to impede important work in the field of AI because we have to know our limitations in order to reach our aspirations.
00:31:45 Speaker_10
Ooh.
00:31:58 Speaker_02
Vox Creative. This is advertiser content from Zelle. When you picture an online scammer, what do you see?
00:32:07 Speaker_11
— For the longest time, we'd have these images of somebody sitting crouched over their computer with a hoodie on, just kind of typing away in the middle of the night. And honestly, that's not what it is anymore.
00:32:16 Speaker_02
— That's Ian Mitchell, a banker-turned-fraud fighter. These days, online scams look more like crime syndicates than individual con artists. And they're making bank. Last year, scammers made off with more than $10 billion.
00:32:32 Speaker_11
It's mind-blowing to see the kind of infrastructure that's been built to facilitate scamming at scale. There are hundreds, if not thousands, of scam centers all around the world. These are very savvy business people. These are organized criminal rings.
00:32:47 Speaker_11
And so once we understand the magnitude of this problem, we can protect people better.
00:32:53 Speaker_02
One challenge that fraud fighters like Ian face is that scam victims sometimes feel too ashamed to discuss what happened to them. But Ian says one of our best defenses is simple.
00:33:05 Speaker_11
We need to talk to each other. We need to have those awkward conversations around what do you do if you have text messages you don't recognize? What do you do if you start getting asked to send information that's more sensitive?
00:33:16 Speaker_11
Even my own father fell victim to a, thank goodness, a smaller dollar scam, but he fell victim, and we have these conversations all the time. So we are all at risk, and we all need to work together to protect each other.
00:33:29 Speaker_02
Learn more about how to protect yourself at vox.com slash zelle. And when using digital payment platforms, remember to only send money to people you know and trust.
00:33:42 Speaker_04
You're okay, you're okay.
00:33:44 Speaker_03
Wolves and dogs are pretty closely related. They actually share 99.9% of their genetics. But even when they're just a few months old, even when they're raised by human scientists, wolves are pretty different from dogs.
00:34:01 Speaker_01
They start biting you in the ears when you're lying down, if you don't sit up fast enough. And you hear this wonderful noise, this little... And then they chomp you in the ear, and you're like, oh!
00:34:13 Speaker_03
— And when they grow up, these differences get even bigger. Dogs are our friends. Wolves are hunters.
00:34:20 Speaker_01
— If I had a sore shoulder, I wouldn't go in with the adult wolves, even if I raised them, because it could trigger their hunting behavior.
00:34:28 Speaker_03
— This week on Unexplainable, how did we get the nice, friendly dogs we know and love today from wolves? Follow Unexplainable wherever you listen for new episodes every Wednesday.
00:34:44 Speaker_06
When I watched the Gender Shades video, and we'll link to this, if you're listening, you're gonna wanna see this. It's just, I mean, the way you made it so accessible for me to understand.
00:34:56 Speaker_06
Now, I will say there's a part after Gender Shades that follows the colon, as all good academics have a snappy title followed by a really shitty, serious subtitle. That's our training, right?
00:35:10 Speaker_06
So the full thing is Gender Shades, which I love, but what comes after that in the actual title?
00:35:16 Speaker_07
Intersectional Analysis of Commercial Gender Classifiers.
00:35:20 Speaker_06
Hey, if she can do that kind of push-up, she can call it anything she wants.
00:35:28 Speaker_07
It was actually funny because I went back and forth with my committee about the title. So that was my compromise. I just wanted it to be gender shades. Like for the book, I went back and forth with the publishers. I just wanted it to be Unmasking AI.
00:35:44 Speaker_07
They're like, that's not how it's done.
00:35:48 Speaker_06
We are a simple title in a complex subtitle world. I never want a subtitle either, but they're not having it.
00:35:53 Speaker_07
Exactly. So that was our compromise. And they're like, are you sure, Gender Shades? I'm like, I'm positive, Gender Shades. This is what people can remember. The subtitle, no one's going to remember that.
00:36:06 Speaker_06
So anyways. When I started my shame research, I was just like, can I just call it not good enough? And they're like, we need a subtitle. And I was like, variables mitigating self-conscious affect. They're like, now you're talking. I'm like, no.
00:36:19 Speaker_06
Now I'm losing 90% of the people who related to that feeling of not being good enough, so damn it.
00:36:25 Speaker_07
Exactly. But now you get to write books the way you want, so there you go.
00:36:31 Speaker_06
Actually, I get to a little bit sometimes. Let me ask you this.
00:36:35 Speaker_06
I don't know if you know S. Craig Watkins, who works at UT on AI but also spent some time at MIT. We had him on, and he's been in AI for 20 years. One of the things that he talked about that I learned, and I'm curious on your thoughts about this too as you tell us what surprised you the most, is he just said the idea that when we're creating AI and building algorithms, we can only have computer scientists and
00:37:03 Speaker_06
computational mathematicians at the table, and not humanists, people with lived experiences, liberal arts people, and ethicists. He's like, that's got to end now. And so tell me, one, what surprised you the most?
00:37:16 Speaker_06
And do you agree about building longer, wider tables when we're developing AI?
00:37:24 Speaker_07
Yes, I think what surprised me the most also answers that question. So for a really long time, I felt that to be taken seriously as a computer scientist, I had to hide the art part of myself.
00:37:43 Speaker_07
And so I struggled for a really long time to embody this notion of being a poet of code. So by the time I was doing my PhD, I asked myself, what would a poetic PhD look like? And...
00:38:01 Speaker_07
I wasn't sure if I was willing to take that risk, but because I had done other academic degrees, I knew I could go through that process, right? So it was saying, how can I push myself further this time around?
00:38:16 Speaker_07
And when it came to the gender shades paper, which we've talked about that research, you know, showing gender bias, showing skin-type bias in all of these major tech companies with lots of money and resources.
00:38:31 Speaker_07
I was struck by the fact that all of the companies failed so miserably on women like me, women with dark skin. And so, I wanted to take a look into that part a bit more.
00:38:47 Speaker_07
And this is what allowed me to start exploring using poetry as a way of sharing what was going on with AI systems. And so, after Gender Shades came this poem that I now call "AI, Ain't I a Woman?", inspired by Sojourner Truth's 19th century
00:39:08 Speaker_07
speech in Akron, Ohio, really talking about the need for women's rights to be intersectional and think through the lived experiences of other women, not just those who had the podium or the platform at the time.
00:39:22 Speaker_07
And so I remember waking up with these words in my head. My heart smiles as I bask in their legacies, knowing their lives have altered many destinies. In her eyes, I see my mother's poise. In her face, I glimpse my auntie's grace.
00:39:37 Speaker_07
In this case of déjà vu, a 19th century question comes into view in a time when Sojourner Truth asked, ain't I a woman? Today, we pose this question to new powers, making bets on artificial intelligence, hope towers,
00:39:52 Speaker_07
The Amazonians peek through windows blocking deep blues as faces increment scars. Old burns, new urns, collecting data chronicling our past, often forgetting to deal with gender, race, and class. Again, I ask, ain't I a woman?
00:40:08 Speaker_07
Face by face, the answers seem uncertain. Young and old, proud icons are dismissed. Can machines ever see my queens as I view them? Can machines ever see our grandmothers as we knew them?
00:40:22 Speaker_07
And then as this was coming, I also started testing images of iconic women, historic women, women from contemporary times.
00:40:32 Speaker_07
So Ida B. Wells, data science pioneer, hanging facts, stacking stats on the lynching of humanity, teaching truths hidden in data, each entry and omission, a person worthy of respect.
00:40:46 Speaker_07
Shirley Chisholm, unbought and unbossed, the first Black congresswoman, but not the first to be misunderstood by machines well-versed in data-driven mistakes.
00:40:54 Speaker_07
Michelle Obama, unabashed and unafraid to wear her crown of history, yet her crown seems a mystery to systems unsure of her hair. A wig, a bouffant, a toupee, maybe not. Are there no words for our braids and our locks? The sunny skin and relaxed hair,
00:41:10 Speaker_07
Make Oprah the first lady? Even for her face well known, some algorithms fault her, echoing sentiments that strong women are men. We laugh, celebrating the successes of our sisters with Serena smiles. No label is worthy of our beauty.
00:41:31 Speaker_07
And so that was really my first way of saying, OK, let me let the artist take the stage. And in letting the artist take the stage, those words are also accompanied by visuals. So when I'm talking about Serena smiles, you see IBM labeling the GOAT,
00:41:50 Speaker_07
and her sister Venus, right, as men. When I'm talking about Ida B. Wells, data science pioneer, you see Microsoft describing her as a small boy wearing a hat, right? And so, you're seeing these images.
00:42:06 Speaker_07
So, for me, these started to become counter-demos or counter-narratives to that master narrative of the superiority of tech and tech companies. And it was this way that I realized I could allow people to bear witness to AI failure.
00:42:23 Speaker_07
So instead of the typical tech demo, which is about, oh, look at all of the things it can do, this kind of demo, which I then ended up calling an evocative audit, which was part of what my PhD became, was to say, how do we actually allow people to see what's at stake?
00:42:42 Speaker_07
when you have a study like gender shades.
00:42:44 Speaker_07
So I wanted to go from performance metrics to performance arts because that's when I knew it would actually touch people so they could see what it means when I'm asking, can machines ever see my queens as I view them?
00:42:57 Speaker_07
Can machines ever see our grandmothers as we knew them? So to that question of who is and who isn't at the table, and who's bringing their experiences or not.
00:43:08 Speaker_07
What I learned was the scholarship that came before me was from Black women who were bringing in their lived experience. One was Dr. Latanya Sweeney.
00:43:19 Speaker_07
She was one of my PhD committee members and actually the first Black woman to get a computer science PhD from MIT. And some of the work she did came about accidentally, almost like my white mask experience.
00:43:34 Speaker_07
She was being interviewed by a reporter, and she was trying to look up one of her papers, so she searched it, and I want to say it was Google.
00:43:43 Speaker_07
And what came up wasn't just her search for the paper, but also advertisements indicating she might have been arrested. And so the reporter was saying, wait, wait, wait, hold on, hold on.
00:43:55 Speaker_07
Your paper was interesting, but what about the supposed arrest record? And she was saying, wait, what? I've never been arrested, so what's going on here? And then the reporter was saying, it's because you have one of those Black-sounding names.
00:44:10 Speaker_07
Latanya, and she's like, look, I'm a computer scientist. This seems far-fetched.
00:44:14 Speaker_07
So she set up this experiment trying to prove this guy wrong, where she had a list of names that were more likely to be assigned to Black babies than to white babies, and they did the comparison.
00:44:26 Speaker_07
And in fact, it was the case, right, that the ads the search engine would put up were more likely to suggest a false arrest record for people with Black-sounding names.
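[A hedged sketch of the kind of comparison described here: do ads implying an arrest record appear at different rates for the two name lists? The counts below are invented for illustration; a chi-square test of independence is one standard way to check such a difference.]

```python
# Hypothetical counts comparing ad-serving rates across two name groups.
from scipy.stats import chi2_contingency

#           ad shown, ad not shown   (illustrative numbers only)
table = [
    [60, 40],  # names more often given to Black babies (hypothetical)
    [25, 75],  # names more often given to white babies (hypothetical)
]
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p_value:.3g}")
# A small p-value suggests the ad-serving rates differ between groups.
```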
00:44:40 Speaker_07
And that work came before my work, but that was showing bias in search engines.
00:44:46 Speaker_07
And then I was looking at the work of Safiya Noble, also great researcher, Dr. Noble, author of Algorithms of Oppression, MacArthur Genius Award winner, so much respect for her work. And her work, she was looking for activities for her
00:45:04 Speaker_07
her daughter to do. I think there was a sleepover happening. Kids were around middle school. So she searched "Black girls." What she pulled up, instead of activities for Black girls, were pornographic images, more often than not.
00:45:19 Speaker_07
Similar thing if you put in Asian girls. White girls, a little more innocent content. And so that also set her down this exploration that then led to the book, Algorithms of Oppression, the MacArthur Fellowship, and all of that. And so I say this to
00:45:38 Speaker_07
really point to the legacy of Black women in the tech space, right, who have used their own lived experience combined with their academic training, combined with their technical expertise to open up new schools of scholarship.
00:45:57 Speaker_07
And so, in my PhD, I make this argument about the importance of Black feminist thought and then ethics of care and going beyond positivism
00:46:08 Speaker_07
when it comes to the way in which we do research and saying lived experience truly matters, and then I map it with the research that they did, and then how my personal experience of coding in the white mask builds on that. Then I go back to...
00:46:26 Speaker_07
Sojourner Truth, and I go back to Frederick Douglass, and I show how they were using daguerreotypes and imaging, the advanced technology of their era. Like now we have AI and, oh, everyone's excited; back then, to have a photo of somebody was incredible.
00:46:42 Speaker_07
But those kinds of images were being used to support scientific racism, to say, look at the shape of this head, clearly an inferior specimen of a human, you know, this sort of thing.
00:46:54 Speaker_07
And so they actually took that same technology and they had dignified portraits of themselves made. So in "AI, Ain't I a Woman?", I intentionally use one of Sojourner Truth's images, taken like that.
00:47:10 Speaker_07
She said, "I sell the shadow to support the substance," because she would sell these images. And that's the image where I show Google labeling her as a clean-shaven old gentleman. So there are all of these historic connections.
00:47:25 Speaker_07
And so as somebody who's a researcher and an artist, I do oftentimes, and I did especially earlier in my career, feel like I wouldn't be respected as much if I talked about my lived experience or if I let the art aspect come through.
00:47:43 Speaker_07
It would be viewed as too subjective and thus disqualify the data that I had gathered. And so, in conceptualizing this notion of the evocative audit, this was actually saying, no, this is also a valid form of knowledge.
00:48:00 Speaker_07
And in fact, it builds on Black feminist epistemology, you know? And so, this is all within my MIT PhD alongside the algorithmic audits of Amazon and others.
00:48:15 Speaker_07
So I was saying you can put the algorithmic audit with the evocative audit and you can actually reach more people that way.
00:48:24 Speaker_06
When you were reciting your poetry, I didn't understand the bias. I felt the bias. It takes me on a trail from Frederick Douglass to someone we both know, Sarah Lewis at Harvard, to the power of art and imagery, aesthetic force.
00:48:49 Speaker_07
Absolutely.
00:48:52 Speaker_06
And when you think of the evocative audit, I've mentored so many female PhD students, Latina, black, white, and the compelling need to orphan the art in them, in themselves, to be seen, to feel like a place is earned at the table when that is their power.
00:49:15 Speaker_06
Do you think that's changed at all, Dr. Joy? I mean, do you think... or do you think there's a season for it in a career? What do you make of that? Because I still see it with the PhD students.
00:49:28 Speaker_07
I think for me, there's a reason I started with the tech, right? So by the time I'm doing "AI, Ain't I a Woman?", I have three degrees: technical degrees, learning and technology. In some ways, I felt that I had, quote unquote, paid my dues.
00:49:51 Speaker_07
And I think that's not necessarily a good thing. That was the armor I felt I needed to have to let the artist come out. And what I found so fascinating with "AI, Ain't I a Woman?" was how far it traveled.
00:50:08 Speaker_07
I think I was most surprised when, at some point, through one of my mentors, I was recommended to be on the EU Global Tech Panel. So the president of Microsoft, others.
00:50:18 Speaker_07
And I remember being in Belgium and showing "AI, Ain't I a Woman?" for the first time on this EU Global Tech Panel, and just the response to it, because it's so visceral. And the only other dark faces in that room were just the faces in the video and mine.
00:50:37 Speaker_07
And even before showing it, I almost didn't make it into that room, because the human gatekeepers were asking me what business I had, you know, at this high-level convening. And I mean, I'm 5'2", I barely looked 28 at the time, you know.
00:50:58 Speaker_07
And so, it was the thinking about how the physical gatekeepers become algorithmic gatekeepers. and then looking at who has, again, a seat at the table.
00:51:09 Speaker_07
So the head of the World Economic Forum was there, and he saw it, and then later I was invited to Davos. And so I presented AI Ain't I a Woman in that space, and later it was shown to EU defense ministers ahead of a conversation
00:51:26 Speaker_07
on lethal autonomous weapons. And again, this is what I mean by how far it could travel.
00:51:32 Speaker_07
Because at the end of the day, the storytelling connects all of us, and that's what the poetry is doing, that's what the evocative audit is doing alongside these photos you can't unsee. These are the biggest, most powerful tech companies.
00:51:47 Speaker_07
And here are iconic faces of women that I respect so much. And I mean, if you're getting Oprah's face when I showed Amazon misgendering Oprah, it's not like she doesn't have photos available, you know? kind of thing.
00:52:03 Speaker_07
So, the other thing I found with the evocative audit was being able to show and not just tell helped us get to the conversation.
00:52:12 Speaker_07
So, instead of like, oh, I'm not sure if this is really a problem or this, that, or the other, it's almost the mic drop. White mask, mic drop. "AI, Ain't I a Woman?", mic drop. Gender Shades. Okay, let's get to the conversation. This is real. This exists.
00:52:28 Speaker_07
You're not going to gaslight me into saying it's not a thing. But also, more importantly, it invited other people into the conversation. So I spent so much time trying to make the Gender Shades results as accessible as possible.
00:52:44 Speaker_07
So taking the time to do the video, the interactive website. So later on, and you see this in the film, the documentary Coded Bias, when I go to the Brooklyn tenants, and I'm working with people there,
00:52:58 Speaker_07
mainly women of color who've had the landlord install a facial recognition entry system. And so they're saying it feels like Fort Knox just to get into their house.
00:53:08 Speaker_07
They were saying thank you for the website, because it made the information so clear that you don't need to have a PhD from MIT to know that this is wrong, right? And so that was a
00:53:22 Speaker_07
thing that I kept struggling with, even when I was thinking about, do I really want to finish the PhD and so forth, the work that I was doing, what I delivered as a poet, that was already there without the MIT degree.
00:53:36 Speaker_07
And so it's not what enabled the poetry, but I was also very aware of the use of institutional power and the use of narrative power.
00:53:48 Speaker_07
And so I write all of this in my PhD, the ways in which I'm using credentials for institutional power to get into the halls of Congress, to get into EU Global Tech Panel, Davos, other spaces.
00:54:03 Speaker_07
But alongside that, I'm bringing the narrative power, right, to say these are the marginalized, these are the excoded, and they matter and they must be centered in these conversations.
00:54:22 Speaker_08
Okay, big reveal. Here we go. Wow.
00:54:35 Speaker_05
You guys kind of get a first look at a completely removed Copco No. 1. This is an area where there were no river sounds for over a hundred years.
00:54:45 Speaker_09
On this week's episode of Gastropod, we are telling a super exciting story about a river, but actually about fish.
00:54:52 Speaker_08
And in terms of fish, it's actually about one particular fish, about salmon. But it's really about something that's incredibly exciting that will help salmon. It's about the world's largest dam removal project ever. That's right.
00:55:04 Speaker_09
This episode, we are going to the Klamath River on the California-Oregon border. It was once one of the world's great salmon rivers. We wanted to see what happens when you take a river away from salmon and how to go about giving it back.
00:55:19 Speaker_08
Find Gastropod and subscribe wherever you get your podcasts.
00:55:25 Speaker_06
I just can't aptly tell you how grateful I am for your work and your ability to not just do the work but name the process of how the work is coming to you and getting to me. I know quantitative, hard, like what I would call, not nicely,
00:55:46 Speaker_06
"hardcore quantoids," would claim you as their own. But I'm like, no, she's one of us. She's a qualitative person. She's an ethnographer, a qualitative person for sure. And then I was talking to a friend, and she's like, Dr. Joy, she's quantitative to the bone.
00:56:02 Speaker_06
And what I think is so interesting is that the answer is yes, you are both qualitative and quantitative. You are institutional and narrative. You are poetry and code.
00:56:15 Speaker_07
daughter of art and science.
00:56:16 Speaker_06
A daughter of art and science and a pole vaulter.
00:56:23 Speaker_07
Only to show I'm stronger.
00:56:25 Speaker_06
I mean, but I think your pole vaulting is not just physical pole vaulting. I think you take a really hard, fast run at things and fly right over them and bring us along.
00:56:38 Speaker_06
I could work the pole vaulting metaphor for your work until you were in tears and begging me to stop.
00:56:43 Speaker_07
I mean, I tried in the book. I talked about the difference between bargazing and stargazing. And then I did snowboarding.
00:56:52 Speaker_06
The athlete in me said, just a little bit here, just a little bit there. I picked up the crumbs you were laying down. As an athlete, yeah, I get it. I try to leave something in there for everybody.
00:57:07 Speaker_06
Let me ask you this, how do you envision the future of AI in our daily lives with respect to human dignity and freedom?
00:57:20 Speaker_07
I have a poem about this, if you'll permit me to share it.
00:57:24 Speaker_06
Oh my God, yes, I would love it. It would mean everything. Yes.
00:57:27 Speaker_07
Okay, so it's in the section about the Brooklyn tenants. Let me see. So, intrepid poet, I think it's safe now that I have all this academic tech cred armor. You're in the door. So, all right, let's do a little poetry.
00:57:44 Speaker_06
You are the door, yeah.
00:57:46 Speaker_07
So then I had the opportunity to work with the Brooklyn tenants, and it was at a time in my research when, after the research had been attacked, I felt that the people who were closest to me academically
00:58:01 Speaker_07
didn't stand up for the research in the way I had hoped at that time. I was just like, is this really where I want to be?
00:58:09 Speaker_07
And when I got the opportunity to work with the Brooklyn tenants who were resisting the installation of a facial recognition entry system as the way to access their homes, I was so inspired by them.
00:58:23 Speaker_07
And I was like, oh, I'm not doing this for my committee or for my supervisor or for the institution. I'm doing it for the excoded. I'm doing it for people like the Brooklyn tenants. And so this is the poem I wrote for them.
00:58:38 Speaker_07
To the Brooklyn Tenants resisting and revealing the lie that we must accept the surrender of our faces, the harvesting of our data, the plunder of our traces. We celebrate your courage, no silence, no consent.
00:58:56 Speaker_07
You show the path to algorithmic justice requires a league, a sisterhood, a neighborhood, hallway gatherings, sharpies and posters, coalitions, petitions, testimonies, letters, research and podcast, dancing and music.
00:59:13 Speaker_07
Everyone playing a role to orchestrate change. To the Brooklyn tenants and freedom fighters around the world persisting and prevailing against algorithms of oppression, automating inequality through weapons of math destruction, we stand.
00:59:32 Speaker_07
with you, in gratitude. You demonstrate the people have a voice and a choice. When defiant melodies harmonize to elevate human life, dignity, and rights, the victory is ours. God, it's so good.
00:59:55 Speaker_07
And that's truly how I felt when I saw, oh my goodness, these aren't people with PhDs from MIT. These are people who saw a problem, educated themselves about it, reached out and resisted and resisted successfully.
01:00:10 Speaker_07
And for me, that was such an inspiring model to understand that the research I was doing could equip people on the front lines and that also people seeing themselves identified in me in various ways.
01:00:24 Speaker_07
People know whether you're the outsider because you're a woman or you're an immigrant or you're more of the artsy person in a very technical field, whatever it might be, and seeing that inspiring others, that gave me the motivation I needed to continue.
01:00:43 Speaker_06
Weapons of math destruction.
01:00:46 Speaker_07
Yes. Book by Cathy O'Neil. So that whole part is a bibliography. Can I nerd out a little bit?
01:00:51 Speaker_06
Yes.
01:00:52 Speaker_07
I mean, yes. I know. Go, go, go. Okay. Algorithms of Oppression, I already mentioned Dr. Safiya Noble. Automating Inequality, right? That's a book by Virginia Eubanks. Through Weapons of Math Destruction.
01:01:08 Speaker_07
That's a book by Dr. Cathy O'Neil. So that whole thing is a bibliography.
01:01:14 Speaker_06
Yeah, I mean, it is literally a bibliography. It is literally a shout-out.
01:01:19 Speaker_07
Yes. For those who know, you know. So all these gems are dropped throughout, for sure.
01:01:25 Speaker_07
And then, also, going back to Weapons of Math Destruction, when I was – I wasn't sure I wanted to do this research in the first place because, you know, calling out big tech could be dangerous, right? Yeah.
01:01:38 Speaker_07
I was on the fence, and I had a mentor of mine tell me that Cathy O'Neil was coming to do a book talk, literally on the same block I live on. So I went to that book talk. It was at the Harvard Book Store.
01:01:55 Speaker_07
And I saw this blue-haired, blue-eyed mathematician, you know, talking about how machines can be harmful and problematic. And until then, I felt I was a little alone in these observations.
01:02:08 Speaker_07
And so, I remember introducing myself and telling her about computer vision. She was kind of following what I was saying. But for me, I was like, oh, I'm not alone. If I'm going to be a gadfly, I might have an ally.
01:02:20 Speaker_07
I was like, okay, all right, we can do this. Weapons of Math Destruction was a really inspirational book for me. And later in the book, I talk about how Cathy O'Neil and I team up to do an algorithmic audit for Olay. She knows nothing about skincare.
01:02:39 Speaker_07
As an Olay ambassador, however, I am now better versed on skincare. That super serum is great, clinically proven, you know. But we had so much fun together. And then we're both musicians. Cathy's an amazing musician.
01:02:59 Speaker_07
She plays all the things, but she also plays fiddle. Her mom's also an amazing mathematician. So from time to time, she'll invite me over, and her mom will be solving the hardest math puzzles you've ever seen, correcting Cathy on the solution.
01:03:15 Speaker_07
I just love this. I love everything about this. And then she wrote this data science book from way back in the day, O'Neil and O'Neil, because both of her parents were mathematicians. And then she was a computer scientist.
01:03:31 Speaker_07
I guess they call us data scientists now. She's just the best. So you have these incredibly brilliant women. I had the opportunity to spend time with them, eat with them, chill with her sons, and then we make music together.
01:03:46 Speaker_07
So that's from that first time of seeing her at the Harvard Book Store, being so inspired, to now being able to call her a friend and drop her references in some of my poetry.
01:03:58 Speaker_06
It's beautiful. It's just, it's beautiful. You must be, in the minds of big tech, the scariest poet on the planet.
01:04:08 Speaker_07
I'm just a poet.
01:04:09 Speaker_06
What could a poet do? Oh, really? Do we want to get into the wars, the heartbreak, the murders caused by poets throughout time? I mean, I never underestimate the power of a poet. Come on. No, that's why they burn us.
01:04:25 Speaker_06
Okay, I have one question before we get to our rapid fire. I get a lot of texts and emails from friends of mine who are writers, fiction and nonfiction, that our books have been fed into the machines.
01:04:41 Speaker_06
I don't know exactly how that works, but I know my books have been fed. Do we just kind of kick back and say that's good, or what is our thinking about that? It doesn't feel good, but I don't know.
01:04:53 Speaker_06
I'm asking the mathematician, entrepreneur, and poet in you to braid an answer together that's heart, commerce, and IP protection, and let me know what you think.
01:05:06 Speaker_07
Sure. As a new member of the Authors Guild and also the National Association for Voice Actors, this topic came close, very close to home once I became a published author.
01:05:20 Speaker_07
And I started looking at the ways in which the work of so many authors had been used to train generative AI systems. And this is why I call them regurgitative AI systems,
01:05:35 Speaker_07
because oftentimes the best of what we see in AI that's impressive, right, is because it's been fine-tuned on some of the best of what's been made by human creators, be they visual artists, authors, filmmakers, and so forth.
01:05:54 Speaker_07
And so, for me, we are really looking at what you could view as a heist.
01:06:01 Speaker_07
You didn't know that your books were included in datasets used to train powerful generative AI systems, and you don't need to have any kind of tech background to know it's not fair for companies to get billions of dollars of investment to then sell products
01:06:20 Speaker_07
that are made using your work and argue fair use when it's copyrighted, and also when some of those outputs can almost be whole entire paragraphs or samplings of that work.
01:06:34 Speaker_07
And furthermore, now you have the ability for some of these systems to say, write this in the style of, let's say, let's make it a Brown style, or let's make it a Buolamwini style, or let's do it in the style of an artist.
01:06:49 Speaker_07
And so, you're seeing right now, lawsuit after lawsuit, right? Because of the ways in which AI systems that are called generative are truly regurgitative systems, and they're regurgitating the hard work and the IP of
01:07:11 Speaker_07
artists from all around the world, and we also have evidence of this.
01:07:17 Speaker_07
And so, for the book datasets that have been used to train some of these systems, we have the titles, we have the authors, so we can say, yes, actually, this author's work was used to train this particular system, and no, it's not fair.
01:07:34 Speaker_07
And so this is part of why we launched this campaign with the Algorithmic Justice League. So that's my day job. I like poetry. I like writing and things like that. I'm also the president and artist-in-chief of the Algorithmic Justice League.
01:07:50 Speaker_07
And we really fight for the X-coded, people who have been excluded or exploited by AI systems or AI companies. And in this case, when we're looking at the future of the creative class, we have to push back and demand creative rights.
01:08:08 Speaker_07
And so what do creative rights look like? Well, first of all, consent. Hello? Ask. Right? Yeah, basic. Basics. We're talking basics, you know?
01:08:21 Speaker_07
So it shouldn't be after the fact that you find out, oh, my book was used to train these systems, or my books, in your case. Compensation.
01:08:33 Speaker_07
And so it's not just acknowledging that the books were there, it's making sure that there is fair and adequate compensation for those whose work has been used in this way. I think that would be a form of redress.
01:08:47 Speaker_07
And then there's also this notion of control. And this part I find really fascinating because this is what I was grappling with when I recorded the audio book for Unmasking AI. I'm busy, you're busy.
01:09:02 Speaker_07
They said it would be four days in the studio, actually the same studio I'm in. I was like, maybe this is an opportunity for an AI voice clone, right? Yeah. And I thought that might be an option because I didn't understand that voice acting is an art.
01:09:18 Speaker_07
And it wasn't until I was working with the director, who's won, like, the Oscar version for voice acting and all of that, that she really helped me appreciate it. I thought I was going to be reading aloud. No, I was performing.
01:09:34 Speaker_07
And by the time I would get home every day around five, I was out by six. I was done. And it was that process that just made me see how easy it is to underestimate the art forms that you don't call your own.
01:09:49 Speaker_06
Oh my God. I mean, do you know that I can only do three and a half to four hours a day of that? It is so difficult to do that well.
01:09:59 Speaker_07
Absolutely. And then also the voice can only go so far. I made the mistake of having dairy, and because of that, my voice cracked. So I was also learning different things just about being a biological being, right?
01:10:14 Speaker_07
I also had an attitude. You know? I had an attitude. So there would be some parts of the book where I'm talking about insults. I think I was reading through the TED insults. So I was struggling each time I was reading it, right?
01:10:30 Speaker_07
And the director was so patient with me. She's like, I'm gonna repeat it. Just give me a second.
01:10:37 Speaker_06
You know? No, I wish I would have been in the recording thing next to you, because we could have had... Yeah, because I mean, I literally would start talking and then they'd be like, you have to do that again.
01:10:49 Speaker_06
I'm like, MFR, that's the hardest story in my adult life. I am not saying that again. And they're like, no, you got to go again. I'm like, mother trucker.
01:10:59 Speaker_07
Or you'd get close, but you'd transpose one word. I also learned I can't correctly pronounce many things, but I will play the English as a second language card. Okay, Oxford. I'm out of excuses here.
01:11:23 Speaker_07
No, but that was a whole fascinating experiment for me. I actually wrote this whole op-ed. It's not yet published, but it was just about how easy it is to underestimate the art forms that aren't your own. That's it. And so the control piece, right?
01:11:41 Speaker_07
Now that I know voice acting is a lot more than I thought it was, I would never actually want to use an AI voice clone for that type of work. But there might be other types of work where you decide you have the agency to say, hey,
01:11:57 Speaker_07
I want to experiment with AI in this way. Or maybe I'm using AI to do a thematic analysis of everything I've written on this particular topic.
01:12:06 Speaker_07
And so using it as a tool to, you know, boost collaboration or something of that nature, you should have the control to do that. So that's that third C. And that fourth C is credit. Again, this is why I keep calling it regurgitative AI.
01:12:21 Speaker_07
Let's credit the artists who make what seems breathtaking from AI possible.
01:12:28 Speaker_06
I'm just writing down these C's because they're very important to me.
01:12:31 Speaker_07
Yeah, AJL.org slash writers. You'll see the four Cs there. We have the Four Cs campaign. So AJL.org slash writers.
01:12:40 Speaker_06
I'll put that up, y'all, on brenebrown.com so you can find it. But I think consent, compensation, control, and credit is the entire way I've been brought up about work, period, as an academic. This is really important work. We'll link everything to it.
01:12:56 Speaker_06
I'm gonna sign up or whatever you do over there because I support this movement. It's so easy to underestimate the amount of work in art that you don't create.
01:13:04 Speaker_06
And that's like when you're walking through the Louvre and people ahead of you are like, oh, I could have done that. I'm like, dude, then where's yours? Show me your art. I'd like to see it too.
01:13:14 Speaker_07
You can regurgitate it when you hear the line, when you see the final creation, but you cannot reproduce the process of creation as it went through the lived experience of the artist who gifted you that work.
01:13:29 Speaker_06
And there we end. And we go right into rapid fire because that was beautiful. Okay. Are you ready? Ready. Fill in the blank for me. Vulnerability is? Unmasking truth. I can't cry after each of these. Get your shit together, Brene. Okay.
01:13:51 Speaker_06
One piece of leadership advice that you've been given that is so great you need to tell us about it or so shitty you need to warn us.
01:14:01 Speaker_07
This one is probably on the warning side. I was in an executive education course and one of the professors described leadership as the distribution of loss at a pace which people can handle.
01:14:17 Speaker_06
Oh my God.
01:14:19 Speaker_07
And I could see the perspective coming into a new organization as an outsider trying to bring in change.
01:14:26 Speaker_07
So they were talking about how often it happens when somebody is brought in and they try to change things too quickly without really understanding the stakeholders or the motivations or what people will perceive as loss.
01:14:40 Speaker_07
with that new thing they're bringing, right? So I got it from that standpoint, but it didn't sit well with me because, you know, I'm a storyteller, right? So, ooh, do we start with disincentives or do we paint a compelling vision? Right?
01:14:59 Speaker_07
So that one, I still, I find it useful in that it's thought-provoking. Okay, what's driving that particular perspective? And in which context does it make sense?
01:15:10 Speaker_07
And then I also found that it reaffirmed my approach, which is more joyful, as you might imagine.
01:15:18 Speaker_06
Yeah, you'll appreciate this. One of my favorite leadership quotes, and I think you'll appreciate it because it's the paradoxical piece. It's from James March, who was at Stanford. He said, leadership is poetry and plumbing.
01:15:30 Speaker_06
You have to be able to cast a vision that inspires and build systems that deliver.
01:15:36 Speaker_07
Isn't that good? Poetry and plumbing.
01:15:40 Speaker_06
Yeah, I kind of liked it.
01:15:41 Speaker_07
Yeah, I'm gonna have to write that one down.
01:15:44 Speaker_06
We'll send it to you. Thank you. Okay, you ready? You, Dr. Joy, are called to be very brave, but your fear is real. You can feel it in the back of your throat. You can taste the fear. What's the very first thing you do?
01:15:57 Speaker_07
I pray.
01:15:59 Speaker_06
I pray. Last TV show that you binged and loved?
01:16:07 Speaker_07
Ooh, Bridgerton. Oh, and also I got to meet Adjoa, who is, I was at the NAACP Awards. I was there for my situation and people were like, Dr. Joy, I was like, look, it's my fangirl moment. I gotta go. I gotta go meet some queens over here.
01:16:26 Speaker_07
And so season three is coming out, but they were there for Queen Charlotte. Oh, my gosh. So, I binged Bridgerton, then of course I had to watch all of Queen Charlotte as well. So, yeah, definitely excited for the new season to come out.
01:16:42 Speaker_07
I also watch a bunch of nature shows, but I don't binge them the way I binge Bridgerton. I gotta know what's next.
01:16:47 Speaker_06
Queen Charlotte.
01:16:49 Speaker_07
Also, really interesting with Bridgerton: when Google's Gemini AI came out, the text-to-image prompt system, I was describing it like Bridgerton, because you would put in something like the founding fathers,
01:17:05 Speaker_07
and they were showing Black men as founding fathers in the U.S., to which there was a public outcry: this is not an accurate representation. Then people put in Pope.
01:17:18 Speaker_07
And there was a Native American woman as the Pope. What is going on with this kind of representation?
01:17:25 Speaker_06
I was like, okay, so it's almost Hamilton, Bridgerton. I mean, Shonda Rhimes got ahold of Gemini.
01:17:31 Speaker_07
in that way. But what I found fascinating was that people who were used to being centered were marginalized. And so, they got the experience of this kind of reverse symbolic annihilation. And then also how quickly Google shut it down. Did they? They did.
01:17:48 Speaker_07
They took it offline, and people were calling it woke AI. These were actually evocative audits. These were examples of real-world evocative audits. But then I started reading and seeing the examples. For example, they did Nazis who were,
01:18:06 Speaker_07
I think, I don't know if they had them as Asians in their representation. I'll need to double-check.
01:18:12 Speaker_07
Okay, so these were not historically accurate, so it led to another kind of conversation about representation, particularly when you're talking about the past. And then it made me think, because so often AI is described as being a mirror.
01:18:27 Speaker_07
It's like, okay, it's biased, but we're also biased. But oftentimes what I see is AI is a kaleidoscope of distortion. And that's a bit of what we were seeing with Gemini as well. But I go on and on about these things.
01:18:42 Speaker_06
Okay, so I might have to send you a DM when we're in the middle of the new Bridgerton series because I'm into it. I'm such a fan. Okay, favorite movie?
01:18:51 Speaker_07
Favorite? Ooh, this one is hard. I'll say the Sound of Music because that's partially how I learned English. I would watch it almost every day when I was four and it was my on-ramp to the English language. So Sound of Music.
01:19:08 Speaker_07
My brother would come home, Sound of Music again!
01:19:11 Speaker_06
I do love the Sound of Music. A concert that you'll never forget?
01:19:17 Speaker_07
Ooh. I recently went to an Iniko concert. She does this song called Jericho, which has become kind of my theme song because she has this part, I'm your future, past, and present. I'm a fine line. I'm the missing link of this illusion.
01:19:35 Speaker_07
But she's talking about artificial intelligence in many different sorts of ways. It's just an incredible, ethereal-sounding song. And I first heard it a cappella. It was this viral TikTok sensation.
01:19:49 Speaker_07
And then I heard the full piece and I thought, oh my goodness, this is literally the theme song for Unmasking AI. So when she came to Cambridge, I got the meet and greet.
01:20:00 Speaker_07
I signed a book for her just to let her know how excited I was about the kind of music she's making. So that was an amazing concert to attend, and then to share that experience with friends as well. It was at the Sinclair.
01:20:16 Speaker_06
I love that, and we'll link to the song. Favorite meal. Favorite meal?
01:20:21 Speaker_07
Ooh. I just went to Ghana for the first time in 30 years, and I was well-fed by so many different family members. But my mother's older sister, she makes stews just the way my mom makes stews, and I hadn't tasted her cooking since I was maybe three.
01:20:42 Speaker_07
And so what I appreciated about that experience was how familiar it was. And then when I was in Ghana, I have so many cousins and so many people who look like me and my brother.
01:20:54 Speaker_07
Because growing up in the US, it was just the four of us, you know, our little unit. And I would always kind of feel a bit of a distance when people would talk about all their cousins.
01:21:04 Speaker_07
I'm like, oh, I wish I had cousins close by or aunties and uncles. And I do. And they're there, you know? So we had an amazing time together. But meals like that, or sitting next to my grandmother. Okay, so when my grandmother and I did
01:21:18 Speaker_07
the thing where you hold the Unmasking AI book, her face, her features match it so perfectly, it's uncanny. I was like, oh my goodness. And also the way her hands were, you could, you saw her hands were like.
01:21:34 Speaker_06
On the mask, yeah.
01:21:36 Speaker_07
It was like, so it was this, her hands were there and then it's her face and it's this perfect match of the face on the book. And I'm thinking, wow.
01:21:45 Speaker_06
That's powerful.
01:21:46 Speaker_07
The legacy is there.
01:21:49 Speaker_06
Yes. No better view than eating stew that tastes like home. I mean.
01:21:52 Speaker_07
Yes.
01:21:53 Speaker_06
Yeah. What is a snapshot of an ordinary moment in your life that gives you real joy?
01:22:00 Speaker_07
Probably sitting on my couch playing my guitar. You know, making up some song. I love words and playing around. So making up some random song about something that's happened. Yeah. Beautiful. Pull it out.
01:22:16 Speaker_06
And then one thing, last question, one thing you're deeply grateful for right now in your life?
01:22:22 Speaker_07
I'm so deeply grateful for my grandmother. Again, I mentioned this trip that was three decades, you know, in the making and to see the family she's created. My grandfather actually died before I was born.
01:22:40 Speaker_07
And I think she lost him when they were both in their fifties, right? And then to see the legacy of everything that has come through her. in her children and in her grandchildren. I'm just so grateful that I was able to see her.
01:22:57 Speaker_07
I'm looking forward to seeing her again. I call her Nana, and so I've been calling her Nana AI and Nana Robo, you know. So I'm feeling really grateful for my grandmother in particular, and for my lovely extended family.
01:23:14 Speaker_07
And also for these genes, because I met my dad's sisters who used to look after me when I was little, and I felt like I saw my future. And the future is bright. The future is bright. The skincare is tight. So I'm like, OK. I'm grateful for that.
01:23:30 Speaker_07
Grateful for these genes. Thank you, Mom and Dad.
01:23:33 Speaker_06
Yeah, they delivered. Unmasking AI is just one of the best books I've read. It helped me so much. I felt walked through it by you. I didn't feel alone when I was reading it.
01:23:45 Speaker_06
I felt like I had a guide, and I felt not just your mind but your heart, and that makes a big difference.
01:23:54 Speaker_07
I'm so glad to hear that because that was my hope. I wanted people to not feel like, oh, AI tech, this is a topic I can't engage in.
01:24:04 Speaker_07
I really believe if you have a face, you have a place in that conversation, and I wanted to be that friend to invite you into the conversation.
01:24:13 Speaker_07
So your nerd cousin that went to tech school but can still speak to people, I wanted to be that person for others.
01:24:20 Speaker_06
You felt like that person in there for me. I never lost sight of your heart the whole time. Thank you. Thank you. And I really appreciate you being on the podcast.
01:24:29 Speaker_07
Thank you so much for having me here.
01:24:42 Speaker_06
I told y'all. I told y'all that this was going to be an amazing conversation. I mean, I
01:24:50 Speaker_06
learned so much, not just about technology and AI and racial bias, but about myself and about what it means to engage with your whole brain, your right side, your left side, to be an artist and a coder, to be a poet and a mathematician.
01:25:09 Speaker_06
Dr. Joy is just a beautiful person doing, I think, life-saving work, given what we're looking at. This series has been
01:25:19 Speaker_06
just one of the most interesting and kind of hardest and scariest things that I've ever talked about and done, because I think AI is not coming, it's here.
01:25:29 Speaker_06
And the more we know and the more we can talk about it, the more we can make decisions informed by our values and our ethics. You'll find everything you need to know about Dr. Joy on brenebrown.com: her podcast, her Instagram, all of her talks.
01:25:47 Speaker_06
I'm going to do one final episode where I just talk about what I've learned and how it's changed, how I work and how I lead our organization.
01:25:54 Speaker_06
I really appreciate you being with me on this really weird walk through AI, technology, social media, all the things that are living beyond human scale. Y'all stay awkward, brave, and kind.
01:26:15 Speaker_06
Dare to Lead is produced by Brene Brown Education and Research Group. Music is by The Sufferers. Get new episodes as soon as they're published by following Dare to Lead on your favorite podcast app. We are part of the Vox Media Podcast Network.
01:26:29 Speaker_06
Discover more award-winning shows at podcast.voxmedia.com.