
Brené and Barrett reflect on the "Living Beyond Human Scale" Podcast Series: AI transcript and summary (episode of the podcast Dare to Lead with Brené Brown)



Episode: Brené and Barrett reflect on the "Living Beyond Human Scale" Podcast Series

Author: Vox Media Podcast Network
Duration: 00:44:46

Episode Shownotes

In this episode Brené and Barrett discuss their learnings on AI and social media and some of their favorite nuggets from each of the guests in the series. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Full Transcript

00:00:00 Speaker_01
Thumbtack presents the ins and outs of caring for your home. Out. Indecision, overthinking, second-guessing every choice you make. In. Plans and guides that make it easy to get home projects done. Out. Beige on beige on beige. In.

00:00:23 Speaker_01
Knowing what to do, when to do it, and who to hire. Start caring for your home with confidence. Download Thumbtack today.

00:00:34 Speaker_04
How much money do you make? My name is Vivian Tu, better known as your rich BFF and your favorite Wall Street girlie. My podcast, Net Worth & Chill, is back and better than ever for season two.

00:00:44 Speaker_04
We've got finance experts and your favorite celebs answering all those taboo money questions you've been too afraid or too embarrassed to ask. With new episodes dropping every Wednesday, you can watch or listen.

00:00:54 Speaker_04
Sit back and relax and get ready to Net Worth & Chill.

00:01:03 Speaker_05
Hi, everyone, I'm Brené Brown, and this is a crossover Dare to Lead/Unlocking Us episode. Hey, Barrett. Hi. I'm excited about this episode.

00:01:13 Speaker_05
We're coming to the end of an eight-episode series called Living Beyond Human Scale: The Possibilities, the Cost, and the Role of Community. And the whole kind of driver for me behind this series was trying to

00:01:33 Speaker_05
step maybe, I don't know, back or up and look at what's going on around us. It feels, you agree, it feels untethering?

00:01:41 Speaker_06
It does, yes, I agree.

00:01:42 Speaker_05
Yeah, I think the velocity of change, the social media shitshow, AI conversations in every nook and cranny of every organization we're doing work in. What stands out to you as the most disorienting, if anything is disorienting?

00:02:02 Speaker_06
I think it's interesting because between the podcast episodes that we've done and then the events that we've been doing over the last several weeks, I think it's just the uncertainty that everybody's in and not really understanding what's next.

00:02:17 Speaker_06
And then you said something the other day at an event that really has stuck in my head: that we're already not on solid ground and solid footing, so we don't have a real foundation to even start to build and understand what's next.

00:02:33 Speaker_05
Yeah, I mean, I think that's true.

00:02:34 Speaker_05
I think, by the way, if you haven't guessed yet, this episode is going to be just me and Barrett reflecting back on these episodes and what we've learned and kind of what our go-to plan is from what we've learned and what we're still confused about, which is the biggest category, maybe.

00:02:49 Speaker_05
I mean, I think this is the point. AI, social media, an election year, absolute devastation and violence and tragedy all over the world right now. And it's important to understand what's happening.

00:03:07 Speaker_05
And it's more than I think we're neurobiologically wired to handle. Right. And so the point that you just brought up is like, it's not like we're, Hey, shit's getting ready to get real. So make sure you're grounded.

00:03:20 Speaker_05
You're feeling good about yourself and well rested, well fed, well moved, you know, well connected. Let's go. It's we're on the ground and the whole room spinning up and down. And we're trying to get on our hands and knees and climb up.

00:03:33 Speaker_05
And then all the shit's coming at us at the same time. Cause we're not okay. People are not okay. Do you think that talking about the human scale metaphor or story is worth repeating? I did it on Esther's, the first podcast in the series.

00:03:47 Speaker_06
Yes, I totally do.

00:03:48 Speaker_05
Okay, so if you haven't listened to Esther's, the podcast with Esther, which we did live at South by Southwest, which kicked off this whole series, the idea came to me because maybe 10 or 15 years ago, I was getting my hair done and the stylist at the salon

00:04:07 Speaker_05
looked at me at some point, and I had like all those foils in my hair, and I had my phone against my ear talking to somebody at work, and then I was also on my laptop. And he goes, man, you were like shot out of a cannon. And I was like, dude,

00:04:24 Speaker_05
I'm working. Yeah, you do your focus. I'll do my focus. I got shit to do here. I'm at work. These appointments take 90 minutes to get to my natural color. And so like, you focus on you and I'll focus on me.

00:04:36 Speaker_05
And I just kind of looked at him and I'm like, I don't know what that means. I'm shot out of a cannon. Anyway, I got shit to do. And I'm on my laptop. I'm on my phone. And he came back and he goes, wow, just really living beyond human scale.

00:04:50 Speaker_05
And I was like, okay, enough is enough. What the hell? So I got off the phone, I closed my laptop and I said, what do you mean?

00:04:57 Speaker_05
Because I knew when he said you're living beyond human scale, what I looked like in that moment with a hundred foils in my hair and a phone tucked under my ear, my laptop open. I knew that I looked like a maniac.

00:05:12 Speaker_05
And so I was like, what do you mean living beyond human scale? And he said, well, I'm a private pilot. And he said, when you first get your pilot's license, It's really amazing. And I related to this because, as you know, I took flying lessons.

00:05:22 Speaker_05
I do know. In a two-seater Cessna when I was in high school as a way to bond with my father. And so he's like, you know, you take flying lessons and you're in these old two-seater planes. And it's really amazing because You take off and you're flying.

00:05:39 Speaker_05
As he said it, I could feel it. If the wind blows hard, the plane moves. If you throttle down, you can kind of feel it in your stomach. If you do this, you are in a plane that's built completely at human scale. It seats two people.

00:05:56 Speaker_05
It goes fast, but not super fast. He's like, you get tired of that. And so then you want to fly a jet. Well, I never got there because I never even did my solo on the two-seater human scale plane, let me just tell you.

00:06:08 Speaker_05
At some point, my flight instructor was like, I really can't in good conscience keep taking your money for these lessons because you have so many hours now and we've done all the stuff, you just got to go up by yourself.

00:06:17 Speaker_05
I was like, well, this is where our story ends because if you think if I'm going to get up in this damn thing by myself, you are not okay. So he said, then you get the jet.

00:06:31 Speaker_05
And he said, when you get the jet and you start flying the jet, you can't feel anything. And it's not built to human scale. It's going like beyond human scale fast. And now you're not in the moment at all.

00:06:42 Speaker_05
Now you're thinking ahead 30 seconds or a minute. You can't be in the human moment. Because if you're in the human moment, you die. You've got to think way out ahead. So you've disembodied

00:06:54 Speaker_05
and your body is in this moment, but your mind is in the next moment, and you're thinking ahead, and this is why very few people crash in those little planes, but in those big planes, he said, it's controlled flight into terrain, meaning as a pilot, you think you have control of the flight to the minute you're dead.

00:07:11 Speaker_05
You just fly into the side of the mountain, and you're like, oh, I wasn't far enough out thinking. And he said, so I think it feels like you look like you're living beyond human scale. And I was like, Why?

00:07:20 Speaker_05
Because I'm on my phone, my laptop, and I'm trying to get signals from outer space with the foils in my hair? Shut up. You don't know me. But did he? But he did know me. I never went back to that guy again. I was like, you got to leave me alone.

00:07:35 Speaker_05
But I thought about it ever since. So I wanted to do this series on living beyond human scale. We're disembodied. We're not tethered in our bodies, we're not grounded, and we're being swept away by change. And so that's what this was.

00:07:50 Speaker_05
So we started on Unlocking Us with Esther Perel, which is great. What were your big key takeaways from the podcast with Esther?

00:07:58 Speaker_06
Well, number one, I just thought it was really fun to be at South by Southwest for that conversation and to have the live audience. The conversation just about social media and comparison and the connection is not really real.

00:08:15 Speaker_06
And I love how you kind of frame it sometimes about social media is not a tool for connection.

00:08:20 Speaker_05
Yeah, it's a communication tool.

00:08:22 Speaker_06
Yeah. And so I think for me, that's where I kind of sat in the conversation with Esther.

00:08:27 Speaker_05
I think my big takeaway from her is when she, you know, the image for the podcast on social media said, I have a thousand friends, but no one to feed my cat. And it's exactly to your point.

00:08:37 Speaker_05
I think like that we have 10 or 15 or something million followers on social media. I do. Yeah. I know 20 of them. And then whatever 20 minus 15 million is, that number, 40% of them are just following, hoping that something really terrible happens to me.

00:08:57 Speaker_05
And they've got some good collective schadenfreude material. Yeah, that's true.

00:09:00 Speaker_06
That thing is true.

00:09:01 Speaker_05
Yeah. Yeah. Once you get to those numbers, you got like every week on Instagram, I get something that says, you have a hundred and something thousand followers that we believe are spam.

00:09:10 Speaker_05
I'm like, well, if they're nice, they can stay to balance the people who are actually real or who are hateful. But it's just, I don't know. I don't think we're built for it.

00:09:18 Speaker_06
And she framed it as, the AI is artificial intimacy. I loved that.

00:09:24 Speaker_05
Oh, yeah. It's artificial intimacy. Oh, yeah. Artificial intimacy. I love that too.

00:09:27 Speaker_06
Yeah, that was so good.

00:09:28 Speaker_05
And I think, I guess one of my big takeaways that I've thought of a lot is that attention is an undervalued form of love.

00:09:36 Speaker_06
Oh, yes.

00:09:38 Speaker_05
It's big. Yeah. I highly recommend y'all listen. I'm going to say that to all of them, though. Okay, then we talked to William Brady. And so Esther Perel is a therapist. Amazing live conversation. Funny, fun, and hard, and really about love. Yeah.

00:09:56 Speaker_06
It was definitely, there are a lot of truths in that.

00:10:00 Speaker_05
Then William Brady, who is a scientist's scientist, and he is so interesting because he studies moral outrage on social media and the importance and helpfulness of moral outrage. I mean, there's a lot of shit we should be absolutely outraged about.

00:10:13 Speaker_05
I mean, if you look at the treatment of some of the student protesters, if you look at what's going on in Gaza, if you look at what's going on in Sudan and Congo, I mean, like, if the response to that is not moral outrage, then I think something is wrong.

00:10:26 Speaker_05
What I learned from this, though,

00:10:29 Speaker_05
was helpful because this was the start of a thread for me of understanding, I mean, talking about an indictment of late-stage capitalism, like, what happens when you've got technology, including social media, that's built to

00:10:56 Speaker_05
leverage human vulnerability for commercial gain. That is my big takeaway from all of this, to be honest with you.

00:11:04 Speaker_05
What happens when moral outrage, if you study the construct of moral outrage, which I've mostly studied other people's research, I don't study it myself, but you understand that it does serve a purpose.

00:11:16 Speaker_05
It can be a catalyst for activism and change and wonderful things. It can also lead to dehumanization. Like for me, I just use the student protesters as an example of

00:11:26 Speaker_05
You know, the treatment of them by armed, basically military-looking police on campuses, to me, creates moral outrage in me. The treatment of Jewish students on campus also creates moral outrage in me.

00:11:42 Speaker_05
And so I think one of the problems is that with moral outrage also comes this self-righteousness that we become the monster we're trying to kill.

00:11:55 Speaker_06
Yeah, and I think one thing that was so interesting in this conversation for me was the common enemy intimacy, and this vicious cycle of connection that's not real, but oh, do we hate the same people, so you kind of find yourself on the extreme ends of the spectrum because you're so

00:12:18 Speaker_06
looking to belong to a group of people. And when Dr. Brady talked about that, I thought it was really powerful to find yourself just in this vicious cycle of looking for connection, but is that really where you're gonna find it?

00:12:33 Speaker_06
And is that really what you believe? Are you just so far in now to the connection that you think you have with this community?

00:12:38 Speaker_05
I mean, I do think that there are people in my life that I'm connected to genuinely. And there is room for disagreement in those relationships. That counterfeit connection around moral outrage is so powerful.

00:12:59 Speaker_05
And then the social learning of, oh, the more outraged I am. And then his study that shows, like, they looked at hundreds of thousands of tweets and these things and then talked to the people who turn out to be less pissed off and outraged.

00:13:13 Speaker_05
than what they appear in their social, in their avatars on social media, but they're trying to belong to a group of people.

00:13:20 Speaker_06
And the more outraged you are and the harder your comments are, the more the algorithm puts it in front of everyone.

00:13:28 Speaker_05
This is the bullshit part to me.

00:13:29 Speaker_06
Yeah.

00:13:30 Speaker_05
Because this is where commerce meets the exploitation of human people, because they need us to stay on these platforms to serve us ads. And the second we forget that, We're just fucked. Yeah.

00:13:47 Speaker_05
I mean, we really are because I saw this, we'll link to this in the podcast page. I want everybody to watch it. It like was life changing to me, sending it to my kids.

00:13:57 Speaker_05
There was a psychologist from Oxford talking about, are some mental illnesses now a hashtag? And what happens when the velocity of messaging around us says that if you have these three things, you have this diagnosis.

00:14:12 Speaker_06
Oh, man.

00:14:12 Speaker_05
If you have these four things, you're probably this. If you have these six, I mean, let me tell you, I'm not even kidding you. Like, I take those tests. I'm like, that's what I have. And look, I mean, look, I'm a social scientist. I'm like, oh my God.

00:14:26 Speaker_05
I know. I have five of these five things. This is my diagnosis for sure. And I'm just not getting the right medication. And if I do get the right medication, my whole life is going to be great.

00:14:38 Speaker_06
But in the next swipe, you'll exactly see what you need to take for that to be better. I know.

00:14:44 Speaker_05
And then the next swipe, it'll tell you, whatever you do, don't take that medication. I mean, you're like, I know.

00:14:52 Speaker_05
Yeah, so I think William Brady has a lot to teach us about polarization, about, again, if you're not outraged by some of the stuff you're seeing, you're not paying attention.

00:15:01 Speaker_05
But my whole new thing about advocacy and activism, it's an interesting place where that dissects with my, what's the word I'm looking for? Dissects? Intersects. Intersects. God dang, son of a sea cook. Yeah, that's it. Takes a village. But no, listen.

00:15:23 Speaker_05
Oh, yesterday. Was yesterday the 12th? Yeah, 28 years of sobriety yesterday. Oh, happy birthday. Thank you, thank you. This is what I think about activism now.

00:15:34 Speaker_05
Hey, friend, you take your inventory about your activism and I'll take my inventory about my activism. You have no idea what I'm doing. You have no idea what I'm giving.

00:15:43 Speaker_05
You have no idea who I'm talking to, who I'm writing to, who I'm calling every week. You don't know anything about what I'm doing. And if you're going to evaluate my activism from what I do on social, you'll have no understanding of me at all.

00:15:57 Speaker_05
And I believe the same thing about you. So I'll take my inventory on activism. You take your inventory. And my hope for everybody is that they are as comfortable with their inventory as I am with mine. Amen.

00:16:28 Speaker_10
Vox Creative. This is advertiser content from Zelle. When you picture an online scammer, what do you see?

00:16:38 Speaker_02
For the longest time, we have these images of somebody sitting, crouched over their computer with a hoodie on, just kind of typing away in the middle of the night. And honestly, that's not what it is anymore.

00:16:47 Speaker_10
That's Ian Mitchell, a banker-turned-fraud fighter. These days, online scams look more like crime syndicates than individual con artists. And they're making bank. Last year, scammers made off with more than $10 billion.

00:17:02 Speaker_02
It's mind-blowing to see the kind of infrastructure that's been built to facilitate scamming at scale. There are hundreds, if not thousands, of scam centers all around the world. These are very savvy business people. These are organized criminal rings.

00:17:17 Speaker_02
And so once we understand the magnitude of this problem, we can protect people better.

00:17:24 Speaker_10
One challenge that fraud fighters like Ian face is that scam victims sometimes feel too ashamed to discuss what happened to them. But Ian says one of our best defenses is simple. We need to talk to each other.

00:17:37 Speaker_02
We need to have those awkward conversations around what do you do if you have text messages you don't recognize? What do you do if you start getting asked to send information that's more sensitive? Even my own father fell victim to a, thank goodness,

00:17:50 Speaker_02
a smaller dollar scam, but he fell victim, and we have these conversations all the time. So we are all at risk, and we all need to work together to protect each other.

00:18:00 Speaker_10
Learn more about how to protect yourself at vox.com slash zelle. And when using digital payment platforms, remember to only send money to people you know and trust.

00:18:12 Speaker_03
You're okay, you're okay.

00:18:15 Speaker_09
Wolves and dogs are pretty closely related. They actually share 99.9% of their genetics. But even when they're just a few months old, even when they're raised by human scientists, wolves are pretty different from dogs.

00:18:31 Speaker_00
They start biting you in the ears when you're lying down, if you don't sit up fast enough. And you hear this wonderful noise, this little... And then they chomp you in the ear, and you're like, oh!

00:18:43 Speaker_09
And when they grow up, these differences get even bigger. Dogs are our friends. Wolves are hunters.

00:18:50 Speaker_00
— If I had a sore shoulder, I wouldn't go in with the adult wolves, even if I had raised them, because it could trigger their hunting behavior.

00:18:59 Speaker_09
— This week on Unexplainable, how did we get the nice, friendly dogs we know and love today from wolves? Follow Unexplainable wherever you listen for new episodes every Wednesday.

00:19:14 Speaker_05
Let's go to the next person, S. Craig Watkins. Oh my God, this was a good one too. Okay, y'all need to listen to him. You know, he has a joint appointment at the University of Texas at Austin, Hook 'em, and MIT.

00:19:27 Speaker_05
And I think the big takeaway for me is that when we start unleashing AI in vulnerable industries, like our vulnerable spaces, like policing, prisons, healthcare, And we're not careful.

00:19:46 Speaker_05
AI is just a machine that people put stuff inside of, and we'll be scaling injustice. And so the examples he gives about that are so powerful. And I love, where is that quote? He taught me the phrase, the alignment problem.

00:20:01 Speaker_05
I think the phrase is 20 years old, but the alignment problem with AI scientists is, are we building systems that are aligned with our values as a democratic society? And the answer right now is no.

00:20:11 Speaker_06
Yeah.

00:20:11 Speaker_05
What did you think about this?

00:20:13 Speaker_06
I mean, number one, I was so grateful that he spoke to us in terms that were so easy to understand.

00:20:20 Speaker_05
Yeah.

00:20:20 Speaker_06
Oh, yeah. He's so good. Yeah, he's so good. I love that he talked about the right people being at the table to build the AI systems. And I also just really loved, I mean, he really brought into focus for me AI is just what we make it.

00:20:38 Speaker_06
And we're going to screw it up or we're going to make it great.

00:20:41 Speaker_06
And the other thing is that he talked about that I thought was really disturbing kind of, but then at the same time hopeful because someone's talking about it is that we just don't have any guardrails yet. And what does that mean?

00:20:57 Speaker_05
Yes. His conversation with us was the first time I understood what the consequences are when tech moves faster than policy. I mean, he said, we're going to look back, right, in 10 years and be like, what the heck were we thinking?

00:21:15 Speaker_05
And this idea that when you're building AI, It's great to have the engineers and the computational mathematicians at the table, but you better have the ethicists, the people with lived experience, the liberal arts folks.

00:21:27 Speaker_05
Shout out to all the liberal arts people. Let's go. The social workers, the humanists, they better be at the table, too.

00:21:34 Speaker_06
Yeah, and I also just really love and give a shout out when he said, well, so far in this 30-minute conversation, you said scary six times or whatever he said.

00:21:44 Speaker_07
I was like, yeah, sure did.

00:21:46 Speaker_05
You're smiling from ear to ear because you got to talk to him offline a lot, like coordinating. He was so great, right? He was so great, yeah. So, we love that talk. So thank you to him too.

00:21:56 Speaker_05
Okay, so Jennifer Valentino-DeVries and Michael Keller, both New York Times, just award-winning journalists. They wrote a piece for the New York Times about young girl influencers on Instagram who are managed by their moms and then stalked by men.

00:22:15 Speaker_05
And this got a lot of comments and a lot of feedback across social. This was a hard conversation. It really was, yes.

00:22:22 Speaker_05
You know, and they're reporting and, you know, not everyone's going to geek out on their methodology, but as a researcher, I thought their methodology was just so elegant and sophisticated and amazing.

00:22:33 Speaker_05
I thought the article was really disturbing and scary and sad about parents running influencer accounts of young girls, not really often making a ton of money, but getting a lot of free stuff. And then.

00:22:50 Speaker_05
the men who follow those accounts and really send completely disturbing, lewd, inappropriate, sexualized comments and are scary.

00:23:02 Speaker_05
And it was interesting because I think we talked very much about how you're reading this and your focus is just like, what are these moms doing? What are these moms doing? And then the invisibility of the men and

00:23:18 Speaker_05
I think that men need to be held accountable. And I don't think there's there again, no guardrails, no policy, the platforms do exactly whatever they want.

00:23:28 Speaker_06
Yep. It was such an important piece to understand, but it was also kind of the underbelly of what social media can be. But it's also important to understand, you know, as moms of kids and understanding what social media is and what it isn't.

00:23:44 Speaker_06
And in the conversation, I thought it was really interesting that, because they talked to so many of the people they were interviewing, most of them started off in just, like, simple

00:24:00 Speaker_06
ways to really showcase what their kids are doing. And then all of a sudden, it just materialized into this monster thing almost that they couldn't shut down. And so I thought that was a really interesting aspect.

00:24:12 Speaker_05
And that's where I would like really pause and double click on the couldn't shut down. Yeah. You know, they can shut it down. They can shut it down, yes. But if you're listening and you're thinking, yeah, you can shut it down, well,

00:24:27 Speaker_05
Are you listening right now and social media is hurting you, but you're scrolling anyway?

00:24:30 Speaker_06
I know. Yeah.

00:24:31 Speaker_05
I mean, are you on social media?

00:24:34 Speaker_06
Yeah.

00:24:34 Speaker_05
Are you scrolling? Yeah. Does it hurt you?

00:24:36 Speaker_06
Yeah.

00:24:37 Speaker_05
Yeah.

00:24:38 Speaker_06
At 1.30 a.m. when I'm like, damn, I got to be up at 5. Yeah. Yeah. And you know, mentally, all of it.

00:24:44 Speaker_05
Yeah.

00:24:46 Speaker_06
And we asked somebody in our organization the other day, we had this conversation, and you asked them, why are you still on social media?

00:24:55 Speaker_06
And I can't remember what their response was, but it was something along the lines of, because this is how we all stay connected and communicate with each other.

00:25:02 Speaker_05
I don't want to miss anything. And I don't know, I got off social media for a year because I just the mental health of it as kind of a public figure.

00:25:13 Speaker_05
Literally, you know me probably better than anyone; you've known me since the moment they brought you and Ashley home. I mean, can you think of anyone who's less cut out to be public than me?

00:25:24 Speaker_06
No, I can't.

00:25:25 Speaker_05
I mean, I'm super introverted and I get my feelings hurt.

00:25:31 Speaker_07
Yeah.

00:25:32 Speaker_05
And I keep my feelers open, because that's what my research part of me likes to do. So I'm terrible at it. And every day I'm thinking, I just shut this whole thing down.

00:25:40 Speaker_06
I know. It's such a double-edged sword.

00:25:42 Speaker_05
Yeah.

00:25:43 Speaker_06
And this conversation was really helpful and just kind of, like, shined the light in some of the areas to pay attention to, I think. Yeah.

00:25:50 Speaker_05
It's a good conversation. But I even wrote in the caption, I think, on social media, ironically, that I got really judgy

00:26:01 Speaker_05
I came off that podcast saying, I'm going to lock all the men up forever, and then I'm going to get the moms into some deep therapy, and then I'm going to enact the rules that they have in France, no one under 18 on social media.

00:26:16 Speaker_05
And no, are you kidding, putting your kids' pictures up? That's it. We're done. I'm a closet dictator. Don't even say anything. I don't even know.

00:26:27 Speaker_06
I don't want to hear it. It was a really good conversation, as hard as it was. It was really good and helpful.

00:26:32 Speaker_05
Yeah. Okay, so then our crossover episode was Amy Webb, and she is incredible. What's an official thing that she says that she is?

00:26:44 Speaker_06
She says strategic foresight.

00:26:46 Speaker_05
Strategic foresight. Okay. I just think she's a badass.

00:26:48 Speaker_06
She's a total badass.

00:26:50 Speaker_05
Yeah. So this was the crossover episode where it was on Dare to Lead and Unlocking Us. If you could see the line at South by Southwest every year to get into her talk, and probably about 20% will get in, and the other 80% do not get in.

00:27:03 Speaker_05
And people are in line, if her talk is at 10, at 4 o'clock in the morning to get in. Because she'll say, here's what's coming. Here's where we are. This is how it's going to go. And it's amazing. And she's borderline prophetic in what she says.

00:27:20 Speaker_06
I love this conversation. I thought it was so good. And again, she has an incredible way to make really complicated things easy to understand and follow.

00:27:29 Speaker_06
And she had a really good metaphor in here that I'll let you explain about steering into the- Oh, yeah. Steering into the curve or to the ice.

00:27:38 Speaker_05
Steering into the ice patch. Yeah. She's like, where we are basically is when you're driving and you hit a patch of ice, your human reflex is to slam on the brakes and kind of steer out of it.

00:27:50 Speaker_05
And what you have to do, because she grew up in a place where there's a lot of ice, is you have to keep your foot off the brake and steer into it a little bit. And she said, that's the uncertainty we're in right now.

00:27:58 Speaker_05
Like, we're hitting some spinny, out-of-control ice stuff. And we've got to take a deep breath, maintain some composure, and steer into it a little bit until we can gain control of, you know,

00:28:14 Speaker_05
Really, I think the car in this metaphor is our nervous system, until we can do that. So I thought the metaphor was incredible. I also thought the information that we're in a technology super cycle, three big things, which is artificial intelligence,

00:28:32 Speaker_05
wearables. If you're listening, you're like, oh, my God, I'm not going to be doing that. I'm not going to work with a helmet on my head that casts my thoughts onto a wall. No, no, no. I've got wearables all over me right now. I've got an Oura Ring.

00:28:46 Speaker_05
These are wearables. And then the last one is biotechnology, and that's coming. That's here. Yeah. That's the thing about artificial intelligence that I think

00:28:56 Speaker_05
I'm not saying if you're on Facebook that you're old like me, but the Facebook comments were so decidedly different on these podcasts. They're like, I'm not going to do it. I'm not going to do artificial intelligence. I don't believe in that.

00:29:11 Speaker_05
We're losing our minds. It reminds me of Meemaw at the Piggly Wiggly. Everybody, every person in midlife has a line where they're like, I don't give a shit whether I'm old or not. I'm not doing this part.

00:29:26 Speaker_05
I think our grandmother Meemaw's was like, I'm not getting money out of a machine in the wall, and I'm not putting in any check. I'm not doing the ATM. But when she went to Piggly Wiggly, the grocery store, it's really big in Texas.

00:29:39 Speaker_05
I don't know where it is, but I can't imagine the people in the north would have a store called the Piggly Wiggly. But she went, and they were scanning groceries. $2.99, $3.99, 49 cents.

00:29:49 Speaker_05
And she was so appalled and she said, I just don't think it's fair to keep those people crouched up like that underneath this conveyor belt, looking up and reading those signs to people.

00:30:04 Speaker_05
We explained to her that it was like a computer reading a barcode. Do you have a line? Do you hear new things coming and you're like, I'm done with that now?

00:30:13 Speaker_06
I just think back to I don't even know how many years ago. It was probably like 16 years ago, and I remember sitting on my back patio with my teacher friend, and I was like, I'm never going to text. Like, why would people text? And she made me text.

00:30:30 Speaker_06
It was so funny. So I don't know my line right now. It's a really curvy line. But I tell you what I do hate is when my 14-year-old has to teach me something on my phone.

00:30:40 Speaker_05
I hate it.

00:30:41 Speaker_06
Yeah. But the other thing I do love about Amy's talk was the concept of Gen T.

00:30:47 Speaker_05
Oh, Generation Transition.

00:30:48 Speaker_06
Yeah, because I thought that was really powerful because all of us, no matter how old we are, are going to be in that.

00:30:54 Speaker_05
Oh, yeah. We're all in the Generation Transition for sure. That's funny.

00:30:56 Speaker_06
I skipped right over your line question. Did you notice? Yeah.

00:31:00 Speaker_05
Well, I can tell you this. She has agreed to text, but she doesn't answer them.

00:31:04 Speaker_06
That is not true. I am a good responder. Well, we should get Dad on the podcast one day and ask him who's the better responder. Never mind.

00:31:14 Speaker_05
Oh, yeah. You want to do that? Let's call him. Let's call him right now. You want to? Maybe 10 years ago. He calls me once a week and asks if you're dead. So, no. Okay. Then we talked to, love this conversation too, because at this point I was like,

00:31:32 Speaker_05
Not doom and gloom, but I was kind of going back and forth between, okay, I understand the cost, but what are the possibilities?

00:31:38 Speaker_06
But I will say, too, before you say what you're going to say, because I know where you're going, one thing I did love about Amy's talk and what she said is what an exciting time it is to enter the workforce. Oh, yeah. For young people.

00:31:50 Speaker_06
And I did think that that did feel really hopeful for me because we have a lot of really amazing young people coming in right now to the workforce, too. And I thought that was really important.

00:32:01 Speaker_06
And it'll hopefully piggyback on what you're getting ready to say.

00:32:05 Speaker_05
Oh, yeah. So, Lisa Gevelber. She talked about AI for Good and some of the things that are happening, how people are using AI. For example, lesson plans for teachers, which was incredible. Oh my gosh.

00:32:17 Speaker_06
I was thinking like when she said that people are putting in their lesson plans and they're like, give me four different reading levels, two languages. I was like, what the heck?

00:32:27 Speaker_06
I didn't even do lesson plans on my computer until the very end of my teaching career. I hand wrote them.

00:32:33 Speaker_05
Yeah, I mean, it's incredible, right? And plans for kids who have to miss, you know, two or three weeks of school because of illness or, you know, family emergencies, you know, just to say, feed in, this is what we've been doing.

00:32:45 Speaker_05
What's a great catch-up plan for someone in this grade with this reading level, you know, and this proficiency here? What's the best way to catch this?

00:32:53 Speaker_06
You know, like, that's... It was incredible. I was really shocked in her conversation that there was still such a digital divide. I had no idea when she talked about the work that they do to get people email addresses.

00:33:07 Speaker_06
They go into libraries and help people set up email addresses. I don't think I even understood. Number one, what all you need an email address for, because I think I just take it for granted.

00:33:18 Speaker_06
And number two, that there are people out there that don't have internet and don't have email addresses. And so I was surprised by that and so grateful for the work they're doing to catch everybody up in that area.

00:33:31 Speaker_05
Huge. And the role of libraries, especially as they're defunding them here left and right.

00:33:39 Speaker_06
I think your love for libraries and librarians has rubbed off on me, and so that was really important when she said that.

00:33:46 Speaker_05
Yeah, libraries are where people go for high-speed internet when they don't have it at home. And so when you defund a library, you can change the economic future of an entire family. So cut that shit out. So this was really good.

00:34:02 Speaker_05
So Lisa Gevelber for AI for Good. Also, Google Certificates was a conversation here. And I just read an article about it. Google offers certifications in technology that you don't have to have a four-year degree to do.

00:34:19 Speaker_05
And they're online and asynchronous, meaning you can be busting ass doing some job, waiting tables. I always refer to that because that's what I did for so many years. But during the day or whatever, and then take these classes at night or vice versa.

00:34:31 Speaker_05
So they're super flexible classes. At the end, you're certified to do a tech job, and you're certified from Google. And these folks are making six figures and changing families' trajectories.

00:34:41 Speaker_06
I mean, it's incredible.

00:34:42 Speaker_05
Yeah.

00:34:44 Speaker_06
Yeah, I did too. The other piece I really loved in Lisa's conversation was the human in the loop. And I think it just ties back to other conversations about the humanness of the AI systems and the technology that we're building.

00:34:54 Speaker_05
Yeah, keep a human in the loop. It's interesting what people said. From the top, as Esther Perel said, AI is artificial intimacy.

00:35:06 Speaker_05
Craig Watkins said AI should be augmented intelligence, because it should be augmenting human intelligence, not its own. And then we have Dr. Joy, our final like...

00:35:19 Speaker_06
What a way to end.

00:35:20 Speaker_05
I love this. One of our colleagues sent me a text. Well, first of all, three or four of the people who work here, from the youngest to the most senior, texted me. One of the youngest people texted and said, oh my God, Dr. Joy.

00:35:33 Speaker_05
And then one of the most senior people said, Dr. Joy's career making me rethink all my life choices.

00:36:51 Speaker_05
the founder of the Algorithmic Justice League. She's an MIT researcher, an artist, a poet, a Rhodes Scholar, a Fulbright Fellow, a recipient of the Technological Innovation Award from the Martin Luther King Jr. Center.

00:37:04 Speaker_05
I mean, she really walked me through it.

00:37:11 Speaker_06
She is so incredible and, well, she's so funny and she's so human. God, yes. I have a new mantra because of y'all's conversation, and I'm like, well, if she can do those push-ups, I can do those push-ups.

00:37:27 Speaker_06
If she can play basketball, I can play basketball. So I'm like, if she can be a Fulbright scholar, so can I. I just love the possibilities that come along with listening to Dr. Joy. It's incredible.

00:37:40 Speaker_05
Yeah, and how she's dedicated her life to looking at AI and technology. And some people call AI generative intelligence or generative learning. And she's like, no, they're regurgitating. It's regurgitating because they've been fed your books.

00:37:57 Speaker_07
Yeah.

00:37:57 Speaker_05
And they're regurgitating that stuff. And so she really talks about equity and justice and consent. And yeah, she's just the daughter of a scientist and an artist, bringing poetry and coding into a justice lens. Amazing.

00:38:16 Speaker_06
Yeah. I mean, her study on Gender Shades, I could not even wrap my head around how companies were missing the mark on being able to identify people based on the color of their skin and the shades of their skin.

00:38:32 Speaker_06
I was in awe of her approach to what she was doing and a little mortified that we weren't doing better.

00:38:42 Speaker_05
Yeah, no, yeah. I mean, what a group of people. Oh my gosh, I feel like we're so lucky. Yeah, I feel like we're so lucky too. It's a job to be able to talk to these folks and learn. So let me ask you this, I'm gonna put you on the spot. Okay, great.

00:38:56 Speaker_05
What did you learn that's changed how you're thinking about your 14-year-old daughter's social media use?

00:39:05 Speaker_06
Well, that's not the question I thought you were going to ask. You know, I think we're at a really weird time. So she's 14, and some friends don't have phones yet. Some friends have phones, but no social media.

00:39:20 Speaker_06
And what's interesting is they feel like it's so critical to belong, for belonging. And so, you know, we have a rule that we go through it together. And anytime I want to say like, hey, let's check in. We all had this rule for our kids.

00:39:37 Speaker_06
I think it started with you. So thank you for that. But when they started on social media, they had to start by only following the aunties and cousins. And so I think it was a really helpful, good little start to navigate Instagram.

00:39:50 Speaker_06
And I just this week let her get Snapchat. And I mean, it's fun to watch her. And like, I love how she, what's the word I'm looking for? The style she has on Instagram. Oh, got it. Yeah, yeah, yeah. Like developing an aesthetic that's hers.

00:40:06 Speaker_05
Yeah, yeah, yeah.

00:40:07 Speaker_06
And I love that. It's so fun. And it's private. So just FYI, that was like, we were on lockdown on all the accounts. Oh, yeah, same. But I think it's like,

00:40:15 Speaker_06
Having to be really careful, it's such a great opportunity for trust building with us, because she had to sign a contract, number one, like, here are all the rules for social media.

00:40:26 Speaker_06
Here are the things that you, if you do these, you'll lose social media. And we also have a three-strike rule around the house.

00:40:33 Speaker_06
Like, if she's not being helpful or not getting her chores done, strike one, strike two, strike three, you're off social media. So, I mean, I think

00:40:41 Speaker_06
It's helpful in a lot of ways, and I think we've also had to have conversations about people only sharing the best of themselves on social media, and to watch out, because not everybody has the best of every day.

00:40:53 Speaker_05
Yeah. I think it's interesting and, you know, hard. I think it's hard. My kids are older. They teach me a lot, you know?

00:41:03 Speaker_05
So they're, you know, I think in one of the podcasts, I talk about my two buttons that I use when I'm scrolling through all of the experts. If you have these five things, then you might have this.

00:41:13 Speaker_05
If you have these four things, oh, did you, you know. And then the ones that feel really vulnerable to me, that make me super hateful, were like, oh, if you have a parent with dementia, you can not have dementia if you eat, you know?

00:41:23 Speaker_06
Oh my gosh, yes.

00:41:23 Speaker_05
Do you, like, you see all those? Like, they must have known that Mom had that. That kind of grifty expert, like, who are you?

00:41:34 Speaker_06
I know. There's like zero credentials for some of these people talking about really important things.

00:41:40 Speaker_05
Yeah. And that's really, I think it's why I went off social media.

00:41:45 Speaker_07
Yeah.

00:41:45 Speaker_05
Because I was scrolling through that one day and I just saw, I was so heartbroken because it was kind of toward the end of mom's life. And it was, you know, are you taking care of someone with dementia? You should do these three things.

00:41:55 Speaker_05
If you're taking, whatever you do, don't do these two things. It was the same thing as that. And I was getting, like, overwhelmed. And then I came up in the feed. And they had clipped something that made me seem so certain and so definite.

00:42:08 Speaker_05
Because on social media, that's why we're really careful about we have to review clips now, because they cut out the part where I say, geez, I don't really know, or I don't study that, or I'm unsure, or I'm struggling that with myself.

00:42:20 Speaker_05
And then they leave the definitive part. And then I'm like, I can't be another one of these people contributing to this bullshit. And so that's why I went off for a year, because I was just like, I can't do that. Because the more I understand,

00:42:33 Speaker_05
the less I know, for sure. And so I think that part's hard. Grateful for everyone who was part of the series, for sure.

00:42:41 Speaker_06
Yeah. And the one last thing that I would just add is it's been fun to be back out on the road and do all these big events.

00:42:47 Speaker_06
And I think what's been so interesting is just how AI is the topic of so many conversations in organizations and big conferences and big HR conferences that we've done lately. And I think it was so helpful. to hear all these different perspectives.

00:43:08 Speaker_06
There were some overlaps in how people are thinking, but the different perspectives, especially when we hear what's going on inside of organizations, I think it was really helpful to have this series right now in such a transformational time in the world.

00:43:26 Speaker_05
I mean, I agree. And I think sitting on top of that, which we haven't really talked about publicly, is the fact that I'm in the middle, maybe 70% of the way into understanding the skill set that we're going to need to navigate this at work.

00:43:39 Speaker_05
And so looking at leadership and AI for non-coding people, what skills are we going to need to survive the change that's coming?

00:43:47 Speaker_05
And so to have this podcast series on top of that research that we're doing right now, and then on top of being out talking to people, it's been exciting. I agree. Yeah. I think people will be surprised by the skill set when we get there.

00:44:03 Speaker_05
I can say this, you don't have to know Python or R or those things.

00:44:06 Speaker_05
It's not about coding skills, but it is about probably some neuroplasticity and really being in our bodies, understanding how much our nervous system can take and not take, and how to regulate emotion. Those are probably going to be big parts of it.

00:44:23 Speaker_05
All right, well, thanks for doing this summary with me. It was fun.

00:44:25 Speaker_06
Oh, yeah, you're so welcome.

00:44:27 Speaker_05
So all the notes from this podcast will be on brenebrown.com under this podcast page, the episode page. And then we'll be back for another series in a couple of months.

00:44:40 Speaker_05
And I think it's been hard to find the podcast because we're trying series instead of weekly episodes on both podcasts. I just can't do that and do my research and my work. And so we're trying, like, time-limited series.

00:44:52 Speaker_05
I think it's been hard for people to find them because they've been across Dare to Lead and Unlocking Us. And so we're gonna work on that.

00:45:01 Speaker_06
Yeah, I'm excited. I think it's an exciting time.

00:45:07 Speaker_05
All right, y'all, stay awkward, brave, and kind. Bye. Bye. Dare to Lead is produced by Brené Brown Education and Research Group. Music is by The Suffers.

00:45:22 Speaker_05
Get new episodes as soon as they're published by following Dare to Lead on your favorite podcast app. We are part of the Vox Media Podcast Network. Discover more award-winning shows at podcast.voxmedia.com.
