
Data and Diversity: AI transcript and summary - an episode of the Think with Google Podcast

· 19 min read

Go to PodExtra AI's episode page (Data and Diversity) to play and view complete AI-processed content: summary, mindmap, topics, takeaways, transcript, keywords and highlights.

Go to PodExtra AI's podcast page (Think with Google Podcast) to view the AI-processed content of all episodes of this podcast.

View full AI transcripts and summaries of all podcast episodes on the blog: Think with Google Podcast

Episode: Data and Diversity

Author: Think with Google / Gimlet Creative
Duration: 00:20:48

Episode Shownotes

Diversity matters to your bottom line. And now, machine learning is being used to prove just that. In this episode, we hear from the Geena Davis Institute about their collaboration with Google and USC to develop machine learning technology in order to analyze film and television for gender parity. We'll then talk with Google's Chief Marketing Officer, Lorraine Twohill, about how Google is developing their own practices around diversity, equity, and inclusion. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Full Transcript

00:00:00 Speaker_07
Imagine you're a famous actor. You're successful, you're known in the industry as being super smart, and you're a woman. So you've seen some things that feel unfair, like how you're often one of the few women cast in a film.

00:00:16 Speaker_07
You notice that the roles just aren't there. And because you're an A-lister, you think to yourself, I can do something about this.

00:00:24 Speaker_07
So you go to the studio head you know best, and you say, hey, I think we need to do something about getting more roles for women. And he says,

00:00:35 Speaker_05
Oh, no, that's been fixed. And they would name one female character that was in the entirety of the movie or the TV show. And she thought, you know what? I need to get data.

00:00:51 Speaker_07
Welcome to the Think with Google podcast. I'm Tess Vigeland with Gimlet Creative. In this show, we explore the future of marketing and share insights and perspectives to get you and your marketing team ahead of the curve.

00:01:07 Speaker_07
Today, we're talking about diversity and inclusion and how diversity-focused data can help your future customers feel seen. So let's back up to that theoretical actor we described earlier, only she's not theoretical.

00:01:28 Speaker_07
She's Academy Award winner Geena Davis from The Accidental Tourist and A League of Their Own. There's no crying in baseball. That Geena Davis. And this thing where Geena was told, oh no, that's been fixed. That really happened.

00:01:44 Speaker_07
She'd talked to studio heads after watching TV with her daughter one day. She noticed how few female characters there were and how sexist the roles tended to be.

00:01:54 Speaker_07
But after she was told the problem was fixed, she realized she was going to have to come back with proof that it wasn't.

00:02:01 Speaker_05
Our whole purpose, in Geena's initial vision, was to change the numbers.

00:02:06 Speaker_07
This is Madeline Di Nonno. She's the CEO of the organization that Geena Davis founded to work on this problem, the Geena Davis Institute. Madeline says Geena knew they had to find out what the numbers were before they could change them.

00:02:21 Speaker_07
But sitting down to watch every film from even the last decade would be untenably time consuming. It would take a prohibitive amount of staff.

00:02:31 Speaker_07
What they really needed was a method that either proved or disproved the hypothesis that there weren't many roles for women.

00:02:39 Speaker_05
It would be a great way to put analytics in front of decision makers to say, look, these are the facts now. How can we work collaboratively? How can we make change?

00:02:54 Speaker_07
So the Geena Davis Institute set out to look for experts in machine learning, a form of artificial intelligence. They wanted to use it to collect data and learn from it.

00:03:04 Speaker_07
And the Institute was particularly interested in machine learning that could do auditory and facial recognition.

00:03:11 Speaker_05
Geena and I walked in, met with them, a team of engineers, and we said, yeah, we want a thingy. And they, like, looked at us, like, dumbfounded, like, what? Well, we want a thingamajiggy that can extract gender in screen and speaking time.

00:03:27 Speaker_07
That's how the Geena Davis Institute, with the support of Google, developed GD-IQ, or the Geena Davis Inclusion Quotient, in collaboration with Shri Narayanan.

00:03:39 Speaker_03
My title, I have lots of titles, so I'm a professor at the University of Southern California. I'm a professor in a number of different fields.

00:03:47 Speaker_07
In the engineering school, I'm a professor in... Shri has a lot going on, but for the purposes of this story, he's the director of the Signal Analysis and Interpretation Lab at the University of Southern California.

00:03:59 Speaker_03
So I bring to bear a lot of different methodologies from linguistics and psychology and so on.

00:04:03 Speaker_07
Shri's lab took Google's machine learning technology and applied it to film. Now, here's how the software works. Shri pulls up a film. He showed me Hidden Figures, appropriately enough, a movie about mathematicians.

00:04:18 Speaker_07
It's silent, but as we're watching, these frames pop up over the characters' faces, kind of like when you're tagging someone in a photo on social media.

00:04:29 Speaker_03
The inspiration we had for this research was to see how much people are seen and heard on screen. So the boxes you're seeing, as the video rolls, it basically figures out where faces are. And when it detects a face, it just puts a box around it.

00:04:47 Speaker_03
Boxes are color-coded in this clip that we are watching together.

00:04:50 Speaker_07
Yeah, the men are in yellow and the females are in a green box.

00:04:55 Speaker_03
Exactly. And you also see the little legend that says it's male and female.

00:05:00 Speaker_07
The algorithm isn't perfect. It can be hard to determine who fits into the category of male or female. It's early days, though.

00:05:08 Speaker_07
Future iterations point toward being able to reach a much broader definition of gender, one that would include people who are outside the gender binary.

00:05:17 Speaker_03
A lot of these are social constructs, right? I need to point that out. And these are based on sort of how people view and label these things. And that itself is a very important question.

00:05:28 Speaker_03
Then we can use these examples to train these algorithms to do what one could do by just human annotation, sitting and watching these kinds of things.

00:05:39 Speaker_07
The machine learning is much faster and produces a spreadsheet that makes the data easy to analyze.
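
To make that concrete, here is a minimal sketch of the idea behind a tool like GD-IQ: detect faces frame by frame, tally screen time per predicted gender label, and write the totals out to a spreadsheet. This is not the actual GD-IQ code, which isn't shown in this episode; the detect_faces function below is a hypothetical stand-in for a real face-detection and gender-classification model, and the frame rate is assumed.

```python
import csv
from collections import defaultdict

FRAME_RATE = 24.0  # frames per second; assumed, not taken from the episode


def detect_faces(frame):
    """Hypothetical stand-in for a face-detection + gender-classification model.

    A real implementation would return a list of (bounding_box, label) pairs
    for one video frame, e.g. [((x, y, w, h), "female"), ...].
    """
    return []  # replace with a real detector


def tally_screen_time(frames):
    """Accumulate seconds of on-screen time per predicted gender label."""
    seconds = defaultdict(float)
    for frame in frames:
        for _box, label in detect_faces(frame):
            seconds[label] += 1.0 / FRAME_RATE
    return seconds


def write_report(seconds, path="screen_time.csv"):
    """Write the tallies to a spreadsheet-friendly CSV file."""
    total = sum(seconds.values()) or 1.0
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["label", "seconds_on_screen", "share_of_screen_time"])
        for label, secs in sorted(seconds.items()):
            writer.writerow([label, round(secs, 1), round(secs / total, 3)])
```

Speaking time could be tallied the same way from an audio diarization step, and the resulting CSV opens directly in any spreadsheet tool, which is the "easy to analyze" part described above.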

00:05:45 Speaker_05
We were presenting the facts versus opinion, because most people were of the opinion that there was an equal representation of male to female characters.

00:05:59 Speaker_07
Here's Madeline Di Nonno from the Geena Davis Institute again.

00:06:02 Speaker_05
No one knew until the first study that we did that, at that time, it was about a 3 to 1 ratio in film.

00:06:12 Speaker_07
She says gender parity has been achieved in TV and family films, but that advertising is still catching up. And, as we know, media have a tremendous impact on our culture. And targeted media is supposed to, and does, make money.

00:06:28 Speaker_01
diversity does in fact sell. And to the extent that you're not diversifying, our argument is you're leaving money on the table.

00:06:36 Speaker_07
That's Dean Darnell Hunt from UCLA. He's the creator of the Hollywood Diversity Report, an annual review of the diversity, or lack thereof, in film and TV.

00:06:47 Speaker_07
He says the data his team has collected shows that more representative media is better for business.

00:06:55 Speaker_01
African-Americans for generations have watched more TV than any other group, and now we know that Latinos buy per capita more movie tickets.

00:07:02 Speaker_01
So in fact, even though they're about 40% of the population, we argue they're probably closer to half of serious media consumers now.

00:07:10 Speaker_01
We've noted that for the top 10 films at the box office in terms of global sales, people of color have bought half the tickets.

00:07:17 Speaker_07
And yet, according to Darnell, Latinos only had 5.2% of all roles in all films, and Black people only had 9%. For businesses creating content, it seems pretty obvious what the benefit of diversity is for them.

00:07:33 Speaker_07
A more diverse product means a more diverse audience, which can lead to a larger audience, and that means more profit. But...

00:07:42 Speaker_01
People don't always act rationally, nor do organizations, because organizations are composed of people. It's not necessarily in the immediate interest of the white men who control the executive suites to shift the way they're doing everything. Why?

00:07:57 Speaker_01
Because those men, based on their experiences and background, probably aren't the best suited to tell the types of stories and the types of narratives that increasingly diverse audiences want to see.

00:08:07 Speaker_01
And what that means is they would then have to open up the door to allow other people to come in, women and people of color, who have had those experiences and can credibly tell those stories. But that means sharing their power.

00:08:20 Speaker_07
Even though it's entertainment, the impact it can have on the real world can be dramatic. Here's Madeline Di Nonno from the Geena Davis Institute again.

00:08:29 Speaker_05
What we have seen is the cause and effect and actually heard from women and girls how deeply they've been inspired by a fictional character that they have seen. And when you're the Geena Davis Institute, you can prove that.

00:08:48 Speaker_07
In addition to being famous and successful, Geena Davis is very good at archery, as in bows and arrows. She took it up a couple of decades ago, and starting in 2012, she noticed there were a lot more little girls taking up the sport.

00:09:02 Speaker_07
So she put her institute to work, figuring out why that was, and the data came back with two reasons.

00:09:10 Speaker_05
One was The Hunger Games and the other was Brave. Seven in 10 girls named Katniss and Brave as the reason why they took up archery. So they saw the movie and they bought a bow. It was immediate.

00:09:24 Speaker_05
That's how dramatic the impact could be for seeing something in the fictional world and how it then spills over into the real world.

00:09:37 Speaker_07
In fact, the machine learning tool used by the Geena Davis Institute isn't just a tool for counting gender in films. Google has applied the same tool to focused marketing campaigns.

00:09:49 Speaker_02
You have to reflect the values that your customers care about.

00:09:54 Speaker_07
That's Aman Govil, Global Head of Platforms and Tools for Think with Google.

00:09:59 Speaker_02
The reality is most people want to see themselves reflected in the creative work of the brands that they relate to. So it's not just about being present on screen, it's being present in a positive light that makes me feel good about myself.

00:10:17 Speaker_07
Aman and his team created the basics of the tool that the Geena Davis Institute used and Shri Narayanan further developed to better analyze movies for that all-important screen and speaking time.

00:10:28 Speaker_07
He says it's a tool that can directly influence marketing practices.

00:10:32 Speaker_02
If we're going to create modern marketing for the digital world, we need to have engineering teams and data science teams involved in that work. People can provide their lessons and we can learn from them and share them at scale.

00:10:48 Speaker_07
Coming up after the break, we'll dig deeper into how data and machine learning can help shift your marketing practices.

00:11:05 Speaker_00
You're listening to the Think with Google podcast brought to you by Google. At Think with Google, it's their mission to make marketers more knowledgeable by providing research, insights, and perspectives that change the way you do business.

00:11:19 Speaker_00
In this episode, we've been discussing why marketers should be making sure their customers feel seen.

00:11:24 Speaker_00
To learn more about what Think with Google is working on when it comes to DE&I, head to thinkwithgoogle.com slash inclusion, or subscribe to their email newsletter at thinkwithgoogle.com slash subscribe. Now, back to the show.

00:11:45 Speaker_07
Welcome back to the Think with Google podcast. We're talking about how data can help marketers fully understand how they represent their brand to diverse audiences.

00:11:56 Speaker_07
Because as we learned in the first half of the show, diversity matters to your bottom line. Google has a whole team devoted to making sure its marketing efforts reflect the diversity of its users.

00:12:10 Speaker_07
And it's using Google's own data to change daily practices in big and small ways. One of those ways centers around advertising campaigns. Mecca Williams works with Google partners like advertising agencies as a diversity, equity, and inclusion lead.

00:12:28 Speaker_04
There was a ride that was developed. It looked like a roller coaster. A ride? Yes.

00:12:36 Speaker_07
For the 2019 Consumer Electronics Show, Google was showing off its Google Assistant with not exactly a roller coaster, but a ride through one family's day. Hey Google, take me on the Google Assistant ride. It's cute.

00:12:51 Speaker_07
It shows you all the ways that the Google Assistant can help a family out. Calendars, directions, shopping. Mecca Williams saw the character options for the family when the ride was just a concept and still being developed.

00:13:03 Speaker_07
But what she saw left her with a few questions.

00:13:06 Speaker_04
The family was meant to be a black family. I was happy to see that they were a family of color. However, when we looked at the family, being a woman of color, what stood out to me in particular was the mom, who had really large hips.

00:13:23 Speaker_04
And I know that that's something that can be seen as stereotypical for a black woman to be really shapely. And it's not necessarily meant in a negative way, but it just felt like it was a cartoon image.

00:13:34 Speaker_04
And it just felt like we were just taking it a little bit too far.

00:13:40 Speaker_07
Mecca is one of the marketing volunteers who meet to review projects that have been submitted by other Google marketing teams. They're known as the Inclusive Marketing Consultants.

00:13:49 Speaker_07
The group gives feedback on things like stereotyping and lack of diversity in proposed marketing concepts. In the case of the ride, the group's feedback ensured the stereotypical depictions of the black family never made it out of the concept phase.

00:14:04 Speaker_07
But with such a massive volume of marketing materials at Google every year, this group can't realistically review every single thing. So that's where data and machine learning come in. They can help reveal trends that might otherwise go unnoticed.

00:14:21 Speaker_06
I've always been very focused on gender diversity as a senior woman in both the tech and advertising industries. This is Lorraine Twohill, chief marketing officer at Google.

00:14:31 Speaker_07
With almost 30 years in these industries, she knows a thing or two about pushing for gender equality.

00:14:37 Speaker_06
Our engineering teams had helped the Geena Davis Institute build the tool that they've been using to audit movies. So I knew about that and was fascinated by that. So I asked, would it be possible for us to use this to look at our work?

00:14:51 Speaker_07
And when Google used the tool on themselves, they discovered the same thing the Geena Davis Institute did. The representation of women in their creative was not on par with men. And even at that, the technology still doesn't account for everything.

00:15:07 Speaker_06
Over the past number of years, four or five years, I'm realizing that there's very different types of diversity, from racial diversity, gender diversity, socioeconomic diversity, accessibility, age, and that we needed to be more nuanced in our thinking and make sure we were solving for many different kinds of diversity.

00:15:22 Speaker_07
While machine learning offers a strong starting point, Google also supplements its findings with manual reviews to look at more nuanced issues.

00:15:31 Speaker_06
So on the one hand, for example, we made progress on having more African-Americans in our creative, but then when you looked at it more closely, we were featuring largely lighter skinned African-Americans.

00:15:41 Speaker_06
On the one hand, we made progress in terms of women in our creative, but when we looked at it more closely, they were younger than the men. So there's still work to do in closing those gaps and getting that right.

00:15:50 Speaker_06
And so it's really those nuances that make a huge difference.

00:15:57 Speaker_07
Another thing that makes a huge difference, Lorraine has discovered, is that diversity issues should not be addressed only at the end of a project.

00:16:06 Speaker_06
I would say one of the biggest mistakes we made was letting people get quite advanced in their work.

00:16:12 Speaker_06
Because when the work is quite advanced, no matter how much rich feedback you give people at that point, they're not going out to do an entire reshoot. Lorraine says while it may be obvious, it has to be said.

00:16:24 Speaker_06
For us in the early days, too much of the burden was being carried by my African-American and Latinx marketers.

00:16:30 Speaker_06
And it was only when we started to all care and have all of my leaders care about it and have everyone focus on their work that you actually make real progress without putting too much of a burden on minority groups in your team.

00:16:41 Speaker_06
Google has already seen the impact of this focus. One example is their Chromebook campaign. We introduced a new, dedicated, specific creative and media buy for the Latinx audience.

00:16:54 Speaker_06
And when we did that, the campaign performed better than it had without that. So we had clear evidence that there was a business value and business benefit in taking the time to create tailored creative for your audience with tailored media plans.

00:17:09 Speaker_07
Lorraine has been very forward about Google's push to do better on diversity and inclusion.

00:17:15 Speaker_07
She's even released two articles detailing how Google is changing its marketing habits internally and asking the company's external partners to get on board too.

00:17:25 Speaker_06
I realize that we have a sense of responsibility as a major player in the industry to show and lead the way and show that we're responsible and show that we care about this and by doing so encourage others to ask themselves the same questions and to do the same.

00:17:40 Speaker_04
Mecca Williams, whose job is to work with these external partners, is on the team spearheading this push. DE&I, diversity, equity and inclusion, it can be really sensitive. People want to know why you care what the makeup of their agency looks like.

00:17:55 Speaker_04
As long as they're doing really good work, it's like, why do we care?

00:17:59 Speaker_07
Mecca says this isn't about trying to push a uniform code on Google's partners, but more about getting those partners to reflect shared values.

00:18:08 Speaker_04
So it's not just about ensuring that we have one black person, one white person, one Latinx person. It's not really about that.

00:18:15 Speaker_04
It's just ensuring that we are reflecting any and all dimensions of diversity and ensuring that we're doing it in an equitable and fair way.

00:18:23 Speaker_07
And just as Google audited itself internally, they've asked their partners to do the same: gather data and create action plans.

00:18:31 Speaker_06
So it has to be ongoing. There's no one and done. Yes, you should do training. Yes, you should spend time with your agencies, understand their makeup and their plans.

00:18:38 Speaker_06
But you have to do regular check-ins and regular daily reminders, nudges, to make sure people don't forget. People are very often heads down in their work, they're in a hurry, and they revert to old habits.

00:18:51 Speaker_07
But Google isn't interested in shaming marketers into this. Instead, Google wants to plan with them in order to hold everyone to a higher standard.

00:19:01 Speaker_04
As a partner who helps tell our stories to all of our users, it's critically important that you're aligned with us and how we're thinking about inclusive marketing. We like to say that we're like an accountability partner, like a gym buddy.

00:19:13 Speaker_04
We're not a drill sergeant.

00:19:15 Speaker_07
And for Google, where data is so important, the transparent and clear benchmark that all this data provides allows marketers to chart their progress year after year. But ultimately, it's about more than numbers.

00:19:30 Speaker_06
You have to appeal to the heart as well as the head. At Google, we are often very rational, we use data, but it's not enough.

00:19:36 Speaker_06
You have to lead with the head and the data and the evidence, but you also have to lead with the heart and explain why this matters and why you care. Which is really the foundation of all great marketing.

00:19:59 Speaker_07
This season on the Think with Google podcast, we'll learn how Google uses insights from data to move the workplace forward, both in their own practices and in the marketing industry.

00:20:10 Speaker_07
The Think with Google podcast is brought to you by Google and Gimlet Creative. This episode was produced by Katie Shepard and Carrie Anne Thomas. James T. Green is our lead producer. Gabby Bulgarelli is our fact checker.

00:20:23 Speaker_07
We're edited by Rachel Ward and Andrea Bruce. Bumi Hidaka mixed this episode. Katherine Anderson is our technical director. Our theme is by Marcus Thorne Bagala. Additional music from Marmoset, Billy Libby, and So Wylie.

00:20:37 Speaker_07
You can find us on Spotify, Apple Podcasts, or wherever you listen. And if you like what you've heard, share with your friends and colleagues. We'll see you next week.