Futurist Amy Webb on What's Coming (and What's Here) AI transcript and summary - episode of podcast Dare to Lead with Brené Brown
Episode: Futurist Amy Webb on What's Coming (and What's Here)
Author: Vox Media Podcast Network
Duration: 01:14:16
Episode Shownotes
Quantitative futurist Amy Webb talks to us about the three technologies that make up the "super cycle" that we're all living through right now: artificial intelligence, wearable devices, and biotechnology, and why, despite the unnerving change, we still need to do some serious future planning.
Full Transcript
00:00:04 Speaker_06
Hi, everyone, this is Brené Brown, and welcome to, it's a mashup, it's a collab. It's between Dare to Lead and Unlocking Us. We're gonna drop this podcast. I so desperately want you to hear it. We're dropping it in both feeds. It's the crossover.
00:00:19 Speaker_06
It's like when, I don't know, I'm probably gonna get my parents laughing at me already, y'all. But back in the day, it was always such a big deal when Fred and Wilma Flintstone showed up on Scooby-Doo,
00:00:35 Speaker_06
or one of the characters did a crossover with the other, this is that. And I'm so glad that you're here for it.
00:00:42 Speaker_06
It's the fifth episode in a series that we're doing about, I call it, living beyond human scale, the possibilities, the cost, and the role of community.
00:00:51 Speaker_06
And when I say living beyond human scale, I go back to this quote by Jon Kabat-Zinn, you'll hear it in the podcast too,
00:00:58 Speaker_06
His definition of overwhelm, which I think is so beautiful, is the world and my life are unfolding at a speed that my nervous system and my psyche can't manage.
00:01:10 Speaker_06
And to me, living beyond human scale is everything from social media, AI, 24-hour news, bringing us really traumatic, violent stories from every corner of the world right now. And so the point of this series for me
00:01:29 Speaker_06
really, to be honest with you, it's super personal, is to try to get to the place where we can actually inhale. Just, okay, like, what's happening? What do I need to understand to be less afraid?
00:01:43 Speaker_06
We're using, you know, if you listen to the podcast with S. Craig Watkins, I hope you do, like he is a scholar who studies AI at UT Austin, and he's got a joint appointment at MIT as well.
00:01:55 Speaker_06
And he called me out halfway through the podcast and said, you know, this is not unusual, but you've said scary like 20 times. And so this is a podcast series about things that seem very scary to me.
00:02:09 Speaker_06
And if there's anything I've learned over the past 25 years of my work, it's turn toward what's scary and look it in the eye and try to understand it. And you'll see my podcast guest today, Amy Webb, you're going to love this conversation.
00:02:23 Speaker_06
She said that it's like driving on ice right now. Intuitively, you want to slam on the brake and steer out of the slide, but you've got to stay off the brake and turn into the ice.
00:02:32 Speaker_06
And so I think that's really a great metaphor for what this podcast series is about. We are not socially, biologically, cognitively, and spiritually wired for some of the shit going down right now. Probably most of it. So that's the series.
00:02:47 Speaker_06
Esther Perel, William Brady, S. Craig Watkins, Pulitzer Prize-winning New York Times journalist Jennifer Valentino-DeVries, and Michael Keller. Just really trying to understand.
00:03:09 Speaker_01
Thumbtack presents the ins and outs of caring for your home. Out. Uncertainty. Self-doubt. Stressing about not knowing where to start. In. Plans and guides that make it easy to get home projects done. Out. Word art. Sorry, Live Laugh Lovers. In.
00:03:31 Speaker_01
Knowing what to do, when to do it, and who to hire. Start caring for your home with confidence. Download Thumbtack today.
00:03:42 Speaker_06
So let me tell you a little bit about who I'm talking to today. She's got a job that if I didn't have this job and I wasn't an Episcopalian priest, singer, songwriter, this would be what I would do.
00:03:53 Speaker_06
Amy Webb is the founder and CEO of the Future Today Institute. She is an expert in strategic foresight.
00:04:00 Speaker_06
She has pioneered a unique kind of quantitative modeling approach and data-driven foresight methodology that really helps us understand what's happening, what's going to happen, why, how it's happening.
00:04:14 Speaker_06
She uses all this data along with her colleagues to identify kind of white spaces, opportunities, and threats early enough for action, which I think is really critical. They develop predictive scenarios along with executable strategy.
00:04:30 Speaker_06
They work with all the big companies, global leaders. She's also a professor of strategic foresight at NYU Stern School of Business. She's a visiting fellow at Oxford University's school of business.
00:04:43 Speaker_06
And she's actually regarded as one of the most important voices on the future of technology, with specializations in both AI and synthetic biology. Her whole bio will be on the website page, so brenebrown.com will list all the links.
00:04:57 Speaker_06
I'm going to do a lot of referencing to her talk at South by Southwest. Y'all should see this freaking chaos and mayhem when she's at South by Southwest.
00:05:04 Speaker_06
People literally get in line at four and five in the morning to get into her talk because every year at South by in Austin, she releases kind of tech trends for the year.
00:05:14 Speaker_06
So I thought, well, we're not all at South by and I only understood about 60% of what she was saying. So I asked her, Hey, talk to us like we don't know what's happening. And she did. It's really human and you gotta listen.
00:05:30 Speaker_06
If you've got a job, you've got to understand this. If you're living in the world, you've got to understand this. If you've got a partner or kids, you've got to understand this. If you're taking care of parents, you've got to understand this.
00:05:41 Speaker_06
We've got to just turn into the ice together. Let's go. Amy Webb, welcome to, well, this is a crossover episode, so you're going to drop in both the Unlocking Us feed and the Dare to Lead feed, so welcome to both podcasts.
00:06:03 Speaker_04
I feel very honored. Thank you. This is very, very exciting. You know what? Let me tell you why. Secretly, I would love to time travel, and I feel like I'm now in a way like doing that by appearing in two places at once. It's kind of quantum.
00:06:17 Speaker_04
You kind of are. It's a quantum me.
00:06:20 Speaker_06
You're messing with the time-space continuum like Phineas and Ferb.
00:06:24 Speaker_04
Yeah. I love Phineas and Ferb. It's a great show. It's a great show.
00:06:26 Speaker_06
I do too. I always talk about it because Adam Grant, Simon Sinek, and I do a traveling podcast, and it's always like, I'm Candace and they're Phineas and Ferb. I'm like the older sister that, yeah.
00:06:39 Speaker_06
Okay, so let me start by saying that I am completely enamored, obsessed with your work and have been for a long time. I love what you do, I think it's crazy and scientific and crazy.
00:06:54 Speaker_04
Well, I have been called crazy before, but from you, I take that as a good compliment.
00:07:00 Speaker_06
Oh, it's such a good compliment. It's so fascinating. Before we get into it, because I'm excited to get into it, tell us your story. Where were you born? Where'd you grow up?
00:07:10 Speaker_04
Sure. Who were you? Who was little Amy? Who was I? So I typically tell people I'm from Chicago, but that's a total lie. I'm from Northwest Indiana, in a tiny sliver of Lake County,
00:07:23 Speaker_04
that was blue in a state full of red, and my parents moved there because it was affordable. My dad's family, his parents, were subsistence farmers, and so there just wasn't a lot of money.
00:07:41 Speaker_04
My mom's family immigrated, so education was strong on that side of the family, but generally speaking, everybody was sort of a first. Uh, so I grew up in very, very, very modest circumstances. My mom was a teacher, a fourth-grade teacher.
00:07:57 Speaker_04
My dad managed retail outlets, stores and things. He was brilliant and graduated from high school I think when he was 15 and probably could have gotten a full ride to a very prestigious college if that had been an option in his family.
00:08:15 Speaker_04
But he had to go to work. So that's where I started. And I think the thing with my parents who were very supportive of my sister and I was really like, put your head down and just work hard and hard work and education will get you through.
00:08:32 Speaker_04
So what were you like in high school? I went to a public school. There were 600 kids in my class. So it was enormous. I was voted most likely to succeed by the faculty.
00:08:47 Speaker_04
and voted most likely to be assassinated as president by my fellow students. Okay, I love this. I can see it. I was on the speech and debate team. Yeah, of course. Speech and debate was basically my life.
00:09:04 Speaker_04
I also had this separate track, which was music, which I didn't want to do anything with professionally, but I enjoyed it and I was good at it. That's how I got into college. But speech and debate was really my life. What kind of music?
00:09:19 Speaker_04
I started piano when I was four. Actually, I started classical guitar when I was four or five. My dad's whole family is very musical. Then piano, then I picked up clarinet in sixth grade and basically all woodwinds but double reeds in high school.
00:09:33 Speaker_04
I just was really good at it and I had a full ride to a very good music school. Given my circumstances, it did not matter that I wanted to go to Georgetown. My parents said, look, you're going to go where we can afford.
00:09:49 Speaker_04
And that means you're going to go study music. So I enrolled.
00:09:53 Speaker_04
I actually got into the graduate studio for clarinet as a freshman and on the side started taking classes in game theory and economics, which is really what I wanted to study, and tried to make it all work. And it all came crashing down.
00:10:06 Speaker_04
By the end of that year, I was miserable and everything was horrible and I dropped out and then re-enrolled and put myself through school.
00:10:15 Speaker_06
I actually see the music in your work. It would definitely be jazz based on your work, but I definitely see some kind of pattern and rhythm in your work. So I'm not surprised about the music for some weird reason.
00:10:33 Speaker_04
Well, I mean, you've studied so much and researched so much about behaviors and thinking and how our brains work. I think you're right. The things that I tend to be good at all relate back to pattern recognition.
00:10:46 Speaker_04
And I kind of, we were joking at the beginning about the craziness. I really had a hard time in school because teachers didn't know what to do with me. I was a kid who had interest in several different things.
00:10:57 Speaker_04
And I happened, you know, I was like good enough in several different things. And I remember I wasn't like disruptive in class, but there was some amount of like, hey, pick a thing. And I didn't want to pick a thing.
00:11:09 Speaker_04
I liked doing a lot of different things because I saw the connections between them. And I think you're right. Music, math, languages, that type of thinking requires pattern recognition.
00:11:19 Speaker_04
The dark side of pattern recognition is that it can also cause a lot of mental distress, because you start seeing things as more catastrophic than they really are.
00:11:31 Speaker_06
We could talk about that for a long time because I think connecting the seemingly unconnectable feels like the definition of my job and it's definitely got a downside.
00:11:40 Speaker_06
I thought something was wrong with me when I was young because I could predictably see the connection between emotions and behavior and what was getting ready to happen.
00:11:49 Speaker_06
And I felt like, oh, if Harry Potter had been popular then, I would have thought, oh, I'm just waiting for my letter to Hogwarts because I'm clearly not a muggle. Something's going on.
00:11:59 Speaker_04
Well, that could be a pretty lonely place to occupy, though. And I'm at the end of my 40s now. When I was in middle school and high school, mental health was not a thing that anybody talked about.
00:12:09 Speaker_04
And so what I learned much later in life is that I have obsessive compulsive disorder that I've been managing for a long time.
00:12:18 Speaker_04
The same part of my brain that lights up and helps me see patterns and do all the stuff that I do is the same part of my brain that, when I'm under duress, makes me very, very good at seeing
00:12:32 Speaker_04
absolute worst possible scenarios. I'm great on risk and forecasting, but when you're 14 years old and you're still developing, you don't have the tools to cope with that.
00:12:44 Speaker_04
So there were a couple of times that I, I remember the first time I got a D. I was a straight A student. I would outwork and outstudy everybody, but I got a D once and I had a breakdown and I just remember being in the guidance counselor's office.
00:12:59 Speaker_04
She was just trying to help. Nobody knew what to do with me. She was trying to help. She brought my parents in for a conference, gave me a special light bulb, and was like, this light bulb will somehow help you.
00:13:10 Speaker_04
And somebody in the room might have been like, hey, maybe this kid needs to see somebody, because maybe this is something else. But that wasn't talked about, I guess, 30 years ago. And thankfully, it is now.
00:13:25 Speaker_06
Yeah, for all the pattern finders out there, just know A, you're not alone, and B, there's a cost to it.
00:13:31 Speaker_06
I think I'm learning at this stage of my life, late, I'm in my 50s, that the cost for me is a level of hypervigilance because I'm seeing a consideration set differently than other people are seeing, and I'm seeing if A, then B, and then C is a possibility, and shit.
00:13:46 Speaker_04
That would probably make you a very good candidate for strategic foresight, the field that I'm in. But it can be very, very challenging, especially if you don't understand why you're able to do those things.
00:13:58 Speaker_06
It's funny that you say that, because when I was getting my PhD, I introduced the idea to one of the faculty members I worked with that I wanted to study this thing called environmental scanning. I was told there was no such thing.
00:14:11 Speaker_04
Oh, there very much is.
00:14:13 Speaker_04
That's the old way of doing what's called horizon scanning, which I guess you already know, but for the people who don't, the sort of older version of strategic foresight, which is the field that I occupy, was called horizon scanning.
00:14:27 Speaker_04
So this is like looking at news stories, talking to people, sort of scanning the stuff that's out there in a broad way and trying to extrapolate insights from that.
00:14:38 Speaker_06
So interesting. Yeah. So I thought it would be a good place to use my pattern making, but because it didn't exist for the people in my college, I moved on, and I'm glad I did.
00:14:46 Speaker_06
But I think the thing about your work, so tell us what you do, what it means and where you CEO.
00:14:57 Speaker_04
So I am a futurist, which is a term that I really, really dislike. because it sounds like a silly made up job title. And to be fair, I get it. But also to be fair, like at some point accountant was a silly, stupid, you know, made up sounding job title.
00:15:16 Speaker_04
For sure. My field is strategic foresight. So the purpose of strategic foresight is to use data and build models to look for patterns, to see change and plausible outcomes. And most of my work is with the world's largest companies and governments.
00:15:36 Speaker_04
I do a little work in Hollywood, but for the most part, this is where do we play? How do we win? And what is it going to take to be resilient? There are a lot of people who work in this space that say there is no way to predict the future.
00:15:48 Speaker_04
I have a quantitative background. Eventually it's where I got to in college. I disagree with that. I think you can very much predict plausible futures, but that's really not the point of the work. The work is not to be prepared for everything.
00:16:04 Speaker_04
it is to be prepared for anything. And that's a tricky distinction for leaders to understand. So really, what I do and what my organization does is, you know, work with companies to help them figure out where they make their billion-dollar investment.
00:16:21 Speaker_04
AI is happening. What are they supposed to do? What's the strategy? What's the long-term play? So it's a lot of that. so we can get to tangible outcomes.
00:16:29 Speaker_04
But along the way, it is also helping leaders reframe their positioning in how they think about change and being comfortable sitting with ambiguity, which is a challenging thing for most people to do, to sort of sit with uncertainty.
00:16:45 Speaker_04
And to be able to have a very clear vision of what their future should look like, but be agile in how they get there. So that's it. And there was a third question about CEO. What is, was the question like, what does it mean to be a CEO?
00:16:59 Speaker_06
No, where do you, where do you CEO? Like, what is your company? Where are you located?
00:17:04 Speaker_04
So we are physically based in New York City. I'm the chief executive officer. We're tiny. We're tiny, but mighty. There's only 20 of us and we really only do one thing, but we do this thing, I think, better than everybody else.
00:17:17 Speaker_04
And we also care deeply about
00:17:20 Speaker_04
helping others learn more about it, because from our point of view, the more companies, the more governments, the more individual people that learn the core concepts of strategic foresight, the better it's going to be for everybody, because it'll mean we all stop making stupid short-term decisions.
00:17:36 Speaker_06
Okay, so you are very much like the Taylor Swift of South by Southwest. I have to tell you. Without a squad. I guess I have a tiny squad. Oh, you have a squad. You have a total squad. And let me tell you this.
00:17:51 Speaker_06
Do not get, I mean just do not get between you and the Brazilians who attend South by Southwest.
00:18:00 Speaker_04
It is a lot. I have been going to South by for I think 20 years. I have a wonderful relationship with South by Southwest and it's my, it really is my favorite time of the year. And I think over the years I have become part of what happens there.
00:18:14 Speaker_04
And yeah, a lot of people show up. I felt really bad this year. People waited. I think the line started four hours early. So I think there were people there at 6 a.m. because I started at 10.
00:18:25 Speaker_06
Oh yeah, they were camping out like it was for the new iPhone or for a Taylor Swift ticket. I thought it was amazing, actually.
00:18:35 Speaker_06
And we called your team and said, you know, Amy and Brené are going to talk on the podcast and she really can't get there at 5:45 because she's got a gig later in the day. So camping out's not her style. Is there any way that we can get a copy?
00:18:49 Speaker_06
And so luckily, we will link to your South by Southwest talk for everyone. And I just am imploring people to watch it. And I want to go through some of the things that I learned from your South by talk this year.
00:19:03 Speaker_06
So the talk every year is kind of an unveiling of the 2024 strategic foresights for the year, the trends, what you're seeing, what you're learning, what you think we need to be thinking about.
00:19:15 Speaker_04
Right. Every year for many, many years, we launch our annual Tech Trends Report, but it's really more than just specific tech trends or science, it's society, it's like basically the year ahead or the years ahead.
00:19:26 Speaker_06
Yeah, and it's everything. It ranged from like ethics to face computers, like it was just everywhere.
00:19:34 Speaker_06
What I'd like to do in the time we spend together right now is go through some of those findings for an audience that's probably not a South by Southwest audience. And that's not for just the Unlocking Us folks.
00:19:50 Speaker_06
For some weird-ass reason, my travel schedule has been really crazy for the last couple of weeks.
00:19:56 Speaker_06
And I've probably been in front of 12,000 to 15,000 people at different large conferences, mostly doing fireside chats with CEOs, with professionals. This is B2B, kind of; these are business leadership folks. And there's a real reservation
00:20:16 Speaker_06
for people in business who understand that they're almost, I don't even know what word to use. I think you talked about FOMO. I've really not ever seen CEOs in the position I'm seeing them right now.
00:20:32 Speaker_06
And I've been inside of organizations doing leadership and org development work for 20 years. There's a scarcity mentality about, I don't understand AI. What is it? And we need a sophisticated 45-page strategy tomorrow.
00:20:46 Speaker_04
Yeah. Do you see that? Absolutely. We work with enormous companies, a big part of the Fortune 50, world leaders. And mostly now, as the CEO, I'm not doing hands-on client work, but I do work with a handful of CEOs every year directly.
00:21:02 Speaker_04
But we have this horizontal view into what's happening at all organizations across different industries. And you're absolutely right. I think there's this perception that everything is happening and changing quickly.
00:21:14 Speaker_04
AI is yesterday, and we've somehow got to get caught up. And there's a general sense of anxiety, not just about technology, but we've got economic uncertainty. There's geopolitical instability, for real, for the first time in a while.
00:21:30 Speaker_04
And people are concerned, rightfully so. And so leaders are sort of at this point making decisions based out of fear or FOMO, which is the fear of missing out. And both of those cases are really, really bad. I'll share one quick story with you.
00:21:46 Speaker_04
So I think you and I are both out there speaking quite a bit. I was at an event at the end of the summer in the Hamptons, which is not a place that I usually hang out. But I'm there. And there was an event that was happening where I was speaking.
00:22:02 Speaker_04
After that, there was a party at a beautiful home, and I get cornered by a venture capitalist who comes up to me and wants to talk to me about AI. And because he's a venture capitalist, he didn't have any actual questions about AI.
00:22:14 Speaker_04
He just wanted to tell me what he thought about AI. And his big, deep thought was that AI is going to become more powerful than humans. We're going to become like slaves.
00:22:25 Speaker_04
And he was exuberant telling me this dystopian future where our overlords are AI systems, which I thought was incredibly bizarre. And then I get away from him. There's a CEO of a bank, large bank, who wants to also ask me questions about AI.
00:22:42 Speaker_04
But in his case, he was like, how quickly can we get AI working to eliminate staff? So basically, reduce headcount. Fast, fast, fast, right?
00:22:51 Speaker_05
Oh, Jesus.
00:22:52 Speaker_04
To improve our bottom line. And the reason that I bring this up is because the conversations could not have been more surreal. The problem is that the situation is very real. So let me say this.
00:23:08 Speaker_04
Yes, we are dealing with heightened volatility, uncertainty, complexity, and ambiguity. Those are collectively known as VUCA forces. They're always present, but they are all heightened right now in a way that they haven't been before.
00:23:22 Speaker_04
So if you're feeling uncertain, that is a correct and normal feeling to have. It's also the most complex operational environment that I've seen for two decades.
00:23:32 Speaker_04
If you're an executive, if you're part of the C-suite, you got there because you're a great manager, and now suddenly you're supposed to be an expert in blockchain or metaverse or, this year, AI, which is unfair.
00:23:44 Speaker_04
I mean, these are complex technologies. So what's needed is a longer-term perspective and some sense of, like, things are not happening as fast as everybody thinks that they're happening.
00:23:59 Speaker_04
AI is a technology that has been in some form of development for a century. That's for real. It's a long horizon technology. I know that it feels like suddenly everything changed overnight and it didn't.
00:24:12 Speaker_04
What changed is your perception of what's been happening all along. But there's this collective realization that kind of all snowballed and happened at once when OpenAI introduced its chatbot. And that's really what set the ball in motion.
00:24:28 Speaker_04
So at South By, I was trying to make everybody feel a little bit better that that collective anxiety they're all feeling and that, Brené, you're sensing when you're having these fireside chats with CEOs, it is palpable, it is pervasive.
00:24:42 Speaker_04
I think it's normal and it's okay to have those feelings, but at some point we have to set the feelings aside and like get back to logic and reason.
00:24:52 Speaker_06
Weirdly and coincidentally, I started my research, kind of my long-term qualitative research that I'm still doing now, probably five or six months after 9-11.
00:25:02 Speaker_06
And I see two camps of people today when it comes to kind of the tech trends: scramblers and ostriches. And so there's either people scrambling to leverage it without even understanding what it is, and that's dangerous.
00:25:22 Speaker_06
And then there's people saying to themselves, and maybe this is on the Unlocking Us side, why do I need to know about this? What difference does it make? I don't care.
00:25:31 Speaker_06
The one thing that both kind of groups have in common, and it's a continuum, it's not a binary, it's a continuum, but the one thing that both groups have in common that I've not been able to convey in my meetings with people is that everyone's waiting for the ball to settle and the growth and change to stop so they can get their hands around it.
00:25:56 Speaker_06
I don't think that's happening. Is that going to happen?
00:26:00 Speaker_04
No, and again, I think we have this sensation that novelty is the new normal. And so- Wow, say that again. Novelty is the new normal.
00:26:11 Speaker_06
Yeah.
00:26:11 Speaker_04
And that was true pre-COVID because we were in a different era of tech acceleration. And at that point it was social media and video and moving to streaming and things like that. But especially post-COVID,
00:26:25 Speaker_04
it just feels like everything is new all the time. And so it does weigh on you pretty heavily and it can lead to indecision, it can lead to entropy. So the way around that is, and you, unfortunately you may not get this metaphor.
00:26:45 Speaker_04
Where you're at in Texas, do you get ice and snow? And probably not.
00:26:50 Speaker_06
No.
00:26:50 Speaker_04
Okay. But give it to me, because I get it. Okay. Okay. So where I grew up, we had terrible wind, terrible cold. And when you learn how to drive, you have to learn how to drive on ice.
00:27:02 Speaker_04
I mean, I'm assuming you still have to, but you had to be able to prove that you could drive on ice when I was a kid. So what you learn how to do is to do the opposite of what it feels like you should do.
00:27:15 Speaker_04
So if you're driving along and you hit an icy patch, everything in your body is telling you, slam on the brake. And that is your instinct.
00:27:24 Speaker_04
And if you break down why, what's happening is you're instinctively drawn to if A, then B. If I slam my foot on the brake right now, I will stop. And this sensation of spinning out of control will cease.
00:27:43 Speaker_04
But physics, of course, tells us that that is not what you can do. Because when you slam your foot on the brake, you don't stop the car.
00:27:51 Speaker_04
In order for that equation, if A, then B, to work out, you would somehow have to know the exact gradient of the road, the tread on your tires. You would have to know exactly how much ice there is.
00:28:02 Speaker_04
You would have to have an omniscient viewpoint on every single data point around you. So what you learn how to do is steer into the slide, which is another way of saying, like, embrace the uncertainty.
00:28:17 Speaker_04
So you whip your steering wheel into the direction that you're sliding, and then as you start to move in a different direction, you whip it again.
00:28:25 Speaker_04
And basically, you are slowing down time and you are reducing uncertainty by making a ton of tiny decisions. while remaining as calm as you possibly can.
00:28:38 Speaker_04
That's what everybody needs to be doing right now, whether we're talking about artificial intelligence and gen AI or the thing that's coming next, which is going to make everybody even more concerned, and that's generative biology.
00:28:52 Speaker_04
And that also will impact every single industry sector, every business, every CEO. You have to learn how to steer into the slide because there is no if A,
00:29:03 Speaker_04
then B, or if-this-then-that equation that works out at this stage in human history. There are too many codependencies and too many variables that you're never going to have total control over.
00:29:16 Speaker_06
Damn. I do not like that feeling. Let me tell you, we get ice. I'm looking at Barrett. How many times do we get ice a year? Twice, maybe? Yeah, twice.
00:29:25 Speaker_06
And everybody knows don't leave because Texans are in big trucks, and the minute they slide, they slam on the brakes and whip their wheel the other way. And every time there's ice, there's a 50-car pileup on every one of our freeways.
00:29:41 Speaker_04
But I think that's kind of what's happening. Again, my point of view on this is that there's a 50-car pileup right now happening in business.
00:29:48 Speaker_04
I'm really concerned about the amount of capital flowing into AI, the amount of capital flowing into startups that have yet to prove out long-term profitability and commercialization potential.
00:30:01 Speaker_04
And the reason that I get worried is because we've already been through one AI winter, and that was between 1974 and 1980 when
00:30:10 Speaker_04
A bunch of AI pioneers made these huge promises, none of which were technically feasible, but the promise was so exciting, and they painted these really wonderful concrete images for people, and everything collapsed.
00:30:25 Speaker_04
We can't afford a 50-car pileup where we are right now, not just with AI and technology, but just
00:30:32 Speaker_04
where we are generally speaking in terms of our politics and society and the decisions that we're making for the future, we've got to be willing to steer into the slide and to be comfortable with uncertainty.
00:30:45 Speaker_06
Yeah, and we're not. I mean, and now we're in my field, because I study vulnerability and uncertainty. We are just neurobiologically not good at it. Our nervous system tolerance for it, you train on that or you don't have it.
00:30:58 Speaker_06
So for someone listening that is, you know, they're not thinking about this professionally. They're thinking about their everyday life. They're like me in many ways.
00:31:08 Speaker_06
I mean, I have a big professional life, but I'm also up in the morning packing lunches, talking to my daughters in graduate school. My son's getting ready to go off to college. We're talking about what should he major in.
00:31:16 Speaker_06
We're talking about kids and social media. There's also a 50-car pileup around AI, families, kids, social. What do you see there?
00:31:31 Speaker_04
So I can sort of zoom in and then zoom out on this. I'll zoom in first. I would love that. So I have a 13-year-old. And people are very surprised when they find out that she does not have a phone. And she's in a wonderful school founded by five feminists.
00:31:51 Speaker_04
You'll appreciate this. She's in a school founded 175 years ago by five women who were pissed off that they couldn't get into Hopkins, because it was all men's.
00:32:01 Speaker_04
And so when they showed up trying to enroll, they were told you can't come in because of our dress code. And they were like, cool, cool, cool. So they went out. They had men's suits made. They show back up. And then they had them arrested for indecency.
00:32:15 Speaker_04
And they were like, go fuck yourselves. Am I allowed to say fuck? I would assume so.
00:32:19 Speaker_06
Yeah. In this context, there's no other word that would fit.
00:32:22 Speaker_04
So long story short, they wound up going out and getting advanced degrees in other countries that would have them.
00:32:29 Speaker_04
They got PhDs, came back, started an all-women school that they specifically designed to be significantly harder than Hopkins because their thought was by the time these women get out, they're gonna have to put their heads down and outwork and outthink everybody else in order to make their mark.
00:32:47 Speaker_04
So that's where my kid goes to school today. And that's amazing. It is amazing. The other thing that's amazing is that she's in seventh grade and she's one of four kids right now that does not have a phone.
00:32:59 Speaker_04
And in this school that is full of high performing families at the beginning of the year, the advisory teacher asked the group, you know, what does everybody want to be when they grow up? My daughter came home that day and was telling me this story.
00:33:12 Speaker_04
And she's like, half of the kids said TikTok influencer. She wants to be a lunar architect. She may change her mind a few times, but like she couldn't. And it's been a little bit of a challenge.
00:33:23 Speaker_04
We did get her a watch because we need to be able to find out when to pick her up from school, but she doesn't feel like she's missing out by not knowing the latest TikTok dance, not knowing the latest whatever.
00:33:35 Speaker_04
We've also shielded her so there's no pictures of her online. She's a very unique person because everybody else has submitted pictures of their kids. It's a tricky thing to talk about. Everybody makes different choices for different reasons.
00:33:51 Speaker_04
But I think that there is probably a lot of people out there now wishing that their parents hadn't spent the first 15 years of their lives posting pictures of them. Because there's no way to take that back.
00:34:04 Speaker_04
And going forward, again, I think we're moving into a place where better choices could and should have been made at the dawn of AI. AI requires data. And everybody's been pretty loosey-goosey about what data they submit.
00:34:20 Speaker_04
And it's going to be hard to roll all that back now. So I don't have a rosy outlook. Got it. So things can be bleak, but also things can be pretty great.
00:34:30 Speaker_04
So if we zoom out, the thing that I talked about at South by, one of the themes of the report, and just a ton of the research that we've been doing, shows that we've entered a technology super cycle.
00:34:41 Speaker_04
So, old-school economics: super cycles are these events that can last a few years or a decade. And it's an era of unprecedented productivity and usually economic development.
00:34:53 Speaker_04
And usually there's good outcomes on the other end, although it does require a transformation. And typically that gets started by some type of big innovation or development. So the steam engine results in the Industrial Revolution, right?
00:35:08 Speaker_04
That's the catalyst, then that's the super cycle. What's happening right now is that there's not just one technology, there's three.
00:35:16 Speaker_04
So there's artificial intelligence, there are all of these different wearable devices that are coming, and then there's biotechnology.
00:35:23 Speaker_04
They don't seem to be related, but they are, and each one of them plays off of the other, which means that we've entered this technology super cycle. So the good part of this, and there's good and bad to that, but the good part is,
00:35:36 Speaker_04
I think there's a lot of kids out there who probably have varied interests or who have been told they have to focus just on one thing and that's it. I think we're entering this era where varied interests are a good thing.
00:35:50 Speaker_04
And if you're somebody who's interested in biology and robotics, you are going to have a bright career ahead of you because there's going to be a ton of opportunity, and lots of new types of jobs and lots of new types of work.
00:36:03 Speaker_04
So there's risk as well, but the positive note is that it's going to be a cool time to enter the workforce because there's going to be a lot of new types of jobs out there that we haven't had before.
00:36:27 Speaker_02
Vox Creative. This is advertiser content from Zelle. When you picture an online scammer, what do you see?
00:36:36 Speaker_03
For the longest time, we have these images of somebody sitting, crouched over their computer with a hoodie on, just kind of typing away in the middle of the night. And honestly, that's not what it is anymore.
00:36:46 Speaker_02
That's Ian Mitchell, a banker-turned-fraud fighter. These days, online scams look more like crime syndicates than individual con artists. And they're making bank. Last year, scammers made off with more than $10 billion.
00:37:01 Speaker_03
It's mind-blowing to see the kind of infrastructure that's been built to facilitate scamming at scale. There are hundreds, if not thousands, of scam centers all around the world. These are very savvy business people. These are organized criminal rings.
00:37:16 Speaker_03
And so once we understand the magnitude of this problem, we can protect people better.
00:37:22 Speaker_02
One challenge that fraud fighters like Ian face is that scam victims sometimes feel too ashamed to discuss what happened to them. But Ian says one of our best defenses is simple. We need to talk to each other.
00:37:36 Speaker_03
We need to have those awkward conversations around what do you do if you have text messages you don't recognize? What do you do if you start getting asked to send information that's more sensitive? Even my own father fell victim to a, thank goodness,
00:37:48 Speaker_03
a smaller dollar scam, but he fell victim and we have these conversations all the time. So we are all at risk and we all need to work together to protect each other.
00:37:59 Speaker_02
Learn more about how to protect yourself at vox.com slash zelle. And when using digital payment platforms, remember to only send money to people you know and trust.
00:38:11 Speaker_06
I think these are two great takeaways. One is this is going to be an exciting, super-cycle-charged job market. We don't even know what the possibilities are. And two, all of these things that we're building require data.
00:38:28 Speaker_06
Make choices about what you're going to share from your life with these big machines that need data.
00:38:37 Speaker_04
Yeah.
00:38:37 Speaker_06
Because your kids' pictures will be data. I wear an Oura ring, so I'm sure that my heart rate is data and my heart rate variability is data. My wearable is data for the machine. Is that true?
00:38:52 Speaker_04
That's true. And I wear a Garmin watch because I'm a competitive cyclist and I need the data. So I think the trade-off here is being smart and savvy about how you're sharing and what you're cloaking.
00:39:02 Speaker_04
So having some amount of data literacy is good for teenagers and families. Yes. But businesses need data literacy, too. It's amazing how many leaders we work with who are just not aware of what data their company is creating or whatever else.
00:39:19 Speaker_04
I know this all feels confusing and challenging.
00:39:24 Speaker_04
I wrote a book in 2019 called The Big Nine, and it's all about what I thought at that point was going to be the future of AI, and it turns out most of what's in there turned out to be true for good and for bad.
00:39:36 Speaker_04
But the second chapter of that book is a quick and dirty history of AI and why it matters. So if you're somebody who's like, I wish I could do the thing like Keanu Reeves in The Matrix where, like, I know jujitsu. Yeah, yeah, yeah.
00:39:49 Speaker_04
It's a dense chapter, but even though it's actually a book about the future, that chapter won that book several history awards. So if you just read that, like Keanu Reeves, you will know jujitsu, but for AI.
00:40:03 Speaker_06
And I think that's really important. I'll tell you why. When I was working with a bunch of HR folks this week, I said, you know, I think your time is here.
00:40:10 Speaker_06
I think about Linda Hill at Harvard who said the biggest challenge to digital transformation is not technology, it's people. I think the same will be true with all these new technologies.
00:40:22 Speaker_06
I did an interview for this podcast series that you're a part of, as my big crossover, with S. Craig Watkins, who is on an AI team at UT Austin and has a joint appointment with MIT.
00:40:33 Speaker_06
And one of the things that he said that I thought was really important is: it is no longer acceptable for the people at the table who are building AI to just be engineers and computational mathematicians.
00:40:46 Speaker_06
We need folks from human resources, ethicists, humanists, liberal arts folks, domain expertise, lived experience. That's especially true when we're unleashing these things in vulnerable areas like policing and criminal justice.
00:41:02 Speaker_04
So you want to hear a really funny story?
00:41:04 Speaker_06
I do, always.
00:41:04 Speaker_04
I mean, you may not think it's funny. It's not like funny haha, it's like funny seriously. So the term artificial intelligence was first coined in 1956, and that's when some academics got together at Dartmouth College for two months.
00:41:22 Speaker_04
So a guy named Marvin Minsky, another guy named John McCarthy, they were like, You know, everybody that they had been talking to had been teetering around these ideas about machines that could think.
00:41:33 Speaker_04
Alan Turing had written a couple of very influential papers. They were like, let's get everybody together for two months. And they made this list, this list of all the best and brightest that they could imagine.
00:41:46 Speaker_04
So in addition to tech people, they did kind of the same thing. Let's get an anthropologist. Let's get a linguist. Let's get a psychologist. Let's get all these amazing people together.
00:41:56 Speaker_04
And so for two months, they gathered, coined the term artificial intelligence, and for the first time described this future in which machines might think just like humans do.
00:42:09 Speaker_04
Now, what's interesting about this group of people, Brené, is that as they made this exhaustive list, everybody that they could imagine, the smartest people around, ask me how many women there were. I don't want to. There was one. It was Minsky's wife.
00:42:29 Speaker_04
Her job was to come and walk the dog and generally take care of the men as needed. So there were no women, there were no people of color. And I bring that up because you would think that whatever, 2024 minus 1956, we've been at this for a long time.
00:42:49 Speaker_04
You would think that things would be different today. In the room, I'm still one of the only women, if not the only woman in the room, when conversations about the future of AI are happening. So again, I think that there is this new interest.
00:43:03 Speaker_04
I think that people are aware and they're willing to talk about this more. But when it comes down to it, ambition isn't the same thing as action.
00:43:12 Speaker_04
Ambition has to translate to action and action requires courage and sometimes making decisions and doing things that are politically unsavory inside of organizations.
00:43:24 Speaker_06
I absolutely see that every day and believe it's true and also believe, that's why I was telling this group of HR leaders, listen to the podcast series, get a couple of books, train yourself, understand what some of this means.
00:43:37 Speaker_06
Demand a seat at the table during these conversations. Know more than your leader, know more than your CEO. Ask hard questions. Say, well, that's really interesting. How are you dealing with the alignment issue? Become somewhat conversational.
00:43:52 Speaker_04
Yeah, I think that's great. I think it's a good point. You don't have to be an expert. No. You have to be conversant, which means you have to have a minimum amount of knowledge and then feel comfortable enough to be able to talk about it.
00:44:04 Speaker_06
And ask questions. You don't have to have answers because no one has answers.
00:44:07 Speaker_04
Yeah, yeah, totally.
00:44:09 Speaker_06
One thing that I was trying to quote you actually this week, and I was like trying to quote the super cycle. And I was like, but it's all these things happening at one time. And they're like, well, I don't understand.
00:44:19 Speaker_06
I was like, imagine if the steam engine, electricity, and the internet all happened at the same time. And they're like, no, no, no, they can't. That's too much change in too many different directions. Nothing would escape being touched.
00:44:36 Speaker_06
And I was like, I'm pretty sure that's what she's saying.
00:44:39 Speaker_04
Yeah, I mean, that is not how I said it, but you're not off. So each one of those examples is the other GPT. GPT, outside of AI, stands for general purpose technology. And that's the type of technology that, again,
00:44:55 Speaker_04
has the ability to totally transform how we live and work. So you're right, steam engine, electricity, internet, all of those are general purpose technologies. And what's happening right now is they're converging.
00:45:08 Speaker_04
So again, not a terrible thing, but it will increase this sort of complex operating environment for leaders. And so to your point earlier, we cannot afford ostriches with their heads in the ground. Or scramblers. Or scramblers.
00:45:26 Speaker_04
This is not a good time for that. We need leaders who are expansive thinkers, who are open-minded, who will allow their cherished beliefs to be challenged, and who are comfortable leading through uncertainty.
00:45:41 Speaker_06
Okay, so the three things that you said are kind of converging is AI, what I call the interconnectedness of all things. How do you phrase it and what does it mean?
00:45:51 Speaker_04
So there's AI, there's this sort of connected ecosystem of things. So generally speaking, that means wearable devices. So your Oura rings, your smartwatches. Pretty soon, face computers.
00:46:05 Speaker_04
So that's what I would call Apple's Vision Pro, you know, these things that are coming to market. Meta has some stuff. They are wearable face computers much more than they are smart glasses. So it's all that stuff. But it's also connected cars.
00:46:19 Speaker_04
I sleep in an AI-powered bed. That's true. There's factory warehouses that are full of sensors. So it's all of these physical devices that are collecting enormous amounts of data, for good purposes, to help us do whatever. So that's the second set.
00:46:34 Speaker_04
And then the third set is biotechnology.
00:46:37 Speaker_06
Okay, so I use ChatGPT-4 a lot, and when I use it, I will say, I'm like... Can I ask, what do you use it for? Well, I will tell you, but I'm old. Let me tell you what old is for me. I remember my grandmother at the Piggly Wiggly in San Antonio.
00:46:53 Speaker_06
She was so upset one day, and I said, you know, what's wrong, Memaw? And she said, I feel so bad for those people having to crouch down underneath the checkout stand. And I said, I don't know what you're talking about.
00:47:08 Speaker_06
And she said, well, now when I go, they run a little thing over, and there's people underneath there reading the price tag saying $3.99, $4.99, $6.99. I was like, those aren't people. That's a computer. And she said, no, no, no, they're talking.
00:47:23 Speaker_06
And I said, it's a computer reading a barcode. And that's where she stopped. And then I remember she stopped there and she stopped with the ATM. She's like, there's no way in hell I'm gonna put money in or do anything there.
00:47:37 Speaker_06
So for me, I can tell that I'm scared because I use it to thematically analyze large amounts of qualitative data, survey data, for trends and those kinds of things and even outliers, but I still manually code everything to make sure that it's right.
00:47:57 Speaker_06
And what I'm learning very quickly is it's as good as my prompts and no better. So you have to almost, in my mind, you almost have to have some research prowess to even ask the right thing if you want a thematic analysis. But that's why I'm using it.
00:48:17 Speaker_04
It's so helpful. Yeah, I would say that's kind of true writ large. So beyond playing with it and asking it to write a haiku in the style of Moby or something, Moby probably writes haikus anyways. That probably wasn't a good example. I bet Moby does.
00:48:31 Speaker_04
Yeah, that's so good. But yeah, it's a tool. It's a tool just like Microsoft Excel is a tool. And most people don't know this, but Excel has more than 500 functions. And like,
00:48:45 Speaker_04
I think fewer than 10 probably get used with any regularity by the vast majority of people.
00:48:51 Speaker_04
So either that means Microsoft significantly over-engineered this software product, which there's some of that, or you just have to know how to use the tool in order to exploit the tool.
00:49:02 Speaker_04
So ChatGPT and Claude and Perplexity and there's like Gemini, they're all tools. And they themselves are not all of AI. AI is this umbrella term that encompasses many different technologies.
00:49:17 Speaker_04
But you have to know how to write a prompt, which is basically a little request in a window, in order to make the best use of it. And people are learning how to do that.
00:49:29 Speaker_04
I've also seen some companies listing jobs for prompt engineers that would make you rethink your own education, because the prompt engineer jobs are like $200,000 a year or more. So would you say that that's text to concrete?
00:49:48 Speaker_04
So right now, you have to know what to ask in order to get something out in return. That's not nonsensical. What we are evolving toward is, and there's a lot of technical reasons for this that I won't bog the conversation down with.
00:50:04 Speaker_04
But what's coming next is requiring fewer specific prompts in order to get great results on the other end, which means you can start with a concept.
00:50:17 Speaker_04
You can start with a concept, brainstorm along with an AI, and at the end of it, get a concrete answer or a concrete framework or design or whatever it is that you might need, which is a huge difference from where we are today.
00:50:34 Speaker_06
I have to ask this question. I'm scared to frame it this way. Are you saying that I would engage in thought partnership with AI?
00:50:42 Speaker_04
So yes, and that's good and bad. So the good part is... No shit.
00:50:47 Speaker_06
That doesn't sound good at all.
00:50:49 Speaker_04
Well, here's what's good about it. It's hard to shut my brain off. It's not hard. I understand how to do it. I just choose not to because I keep thinking new things. I was talking to some bankers.
00:51:01 Speaker_04
I promise you I spend time with people other than bankers and VCs. Not that there's anything wrong with bankers and VCs. I was going to say, okay, come on.
00:51:10 Speaker_04
But I don't know, I was starting to get worried that I'm hearing people talk about something that sounds a whole lot like a credit default swap, and it's giving me the heebie-jeebies that the housing market could collapse again because everybody's trying to make a quick buck.
00:51:22 Speaker_04
So I was like, OK, isn't there some other way to get investors' money fast on something that feels like you're distributing risk? Like, what could be an alternative? So I actually started with this general concept.
00:51:37 Speaker_04
And I was thinking through different pools of something that has value. Like, what's a pool of a thing that has value? And I wound up with genomes. And so I'm like, OK.
00:51:47 Speaker_04
Genomes, if you have a whole bunch of people's genomes, that's actually very valuable, as is the individual genome, because there's a lot of customers for that.
00:51:57 Speaker_04
A pharmaceutical company would want that, a university would want that, a beauty company might want that, so they could test things out and develop new things and whatever.
00:52:07 Speaker_04
And I start going through all these different permutations, and what I come up with is a genome-backed security. I have a whole plan for that. I know how it would work. I have yet to sell any bank on it. It's a crazy idea.
00:52:21 Speaker_04
I just think it's awesome and I want somebody to do it. I don't think I personally would make money on it. At any rate, it's viable. It's plausible.
00:52:31 Speaker_04
And I never would have been able to come up with that on my own because I needed somebody, I needed something to stretch me. And usually that's my husband, who is not a futurist. He's an eye doctor, but he's really great at asking crazy questions.
00:52:44 Speaker_04
And so sometimes he'll just keep digging and digging and digging. And it just keeps getting my brain to open more and more and more. But I, you know,
00:52:52 Speaker_04
He's seeing patients at two o'clock in the afternoon, and sometimes that's when I want to have the discussion. So AI can be super beneficial for that: starting with a concept, getting to a concrete framework on the other end.
00:53:05 Speaker_04
The flip side, which is the bad part of this, is that I guess technically you could use the same process to invent new bioweapons or new ways to cause a political collapse.
00:53:16 Speaker_04
The other thing I just found out, and I'm sure you're vigilant for this also, so somebody has made a GPT out of me, and I didn't give my consent. And right now it's in Italian. There's a couple of others in English.
00:53:29 Speaker_04
So I don't love the fact that I'm dispensing advice. Because they took all of this. I've written four books. I've written all these articles for Harvard Business Review. So there's quite a bit out there. Lots of speeches.
00:53:41 Speaker_04
You know, I don't have the ability to control that. And it's not just about commercialization. I don't want somebody to think that the AI version of me is dispensing bad advice or was a bad AI thought partner. I mean, no, I have that.
00:53:56 Speaker_06
I hate her haircut. Everything she's saying is like my work adjacent.
00:54:03 Speaker_04
Yeah.
00:54:04 Speaker_06
It's all the words in the wrong order.
00:54:06 Speaker_04
Yeah. It's like a funhouse mirror.
00:54:08 Speaker_06
It's a funhouse mirror. Oh my God, that's such a good analogy. There are like 15 or 20 books by me that I didn't write.
00:54:15 Speaker_04
Yeah. Oh, that sucks.
00:54:18 Speaker_06
Yeah, and Kara Swisher had the same thing when her new book came out. I interviewed her in Chicago for her book tour. And there's like this fun house mirror picture of her in her Ray-Bans. And they have fed my books to the machines.
00:54:32 Speaker_04
Yeah. One of the databases that trained GPT-4 is a database of, I think, 140,000 books, something like that, out of all the books that have ever been written. All of my books are in it. All of your books are in it.
00:54:44 Speaker_04
There's millions of books published a year, so it's a select few, relatively speaking. But I'm assuming nobody was like, hey, Brené, is it cool with you if we train AI on how you think? I was waiting for them to ask for my address for my check.
00:55:12 Speaker_06
Tell me two things I want to understand.
00:55:16 Speaker_06
I tell myself a story, until you just fucking shred it during South by Southwest, that all of the AI is in a Fort Knox of AI, controlled by super ethical people, and that if anything goes wrong, there is an accountability chain
00:55:36 Speaker_06
that is going to be much different than the non-existent social media bullshit of, we're just a platform. So I need you to assure me that all of the AI is locked down. I need you to assure me that there is an accountability chain.
00:55:55 Speaker_06
And I need you to assure me that policy that is aligned with our democratic values is gonna get out ahead of technology.
00:56:04 Speaker_04
Where do you wanna start? Sure, I feel like people listen to you because they want to feel good about, they wanna feel confident about the future. I feel like I can't give them that right now. Can you give me an accountability chain? I cannot.
00:56:21 Speaker_04
There is no accountability chain right now. And there hasn't been, and there continues not to be. Not in this country, not in any country. Wait, Europe has to have one. They're so strict on everything.
00:56:34 Speaker_04
So, to be fair, Europe has recently passed a sweeping set of regulations and policies that have yet to go into enforcement, and we'll see if, how, and when they do. And Europe is a collection of different countries.
00:56:48 Speaker_04
AI isn't geographically bound, so I'm not entirely sure what this is going to look like. I think they have the best intentions, but ultimately regulation is a look backwards. It's a reaction to something that's already happened.
00:56:59 Speaker_04
My feeling on this is very controversial, but I don't think regulation works for AI. I don't think the way that most governments deal with trying to guide or control technology and science works when it comes to AI.
00:57:17 Speaker_04
Because AI is new, new-ish, and the way that we use artificial intelligence is not the way that people thought we would. So I think we need a totally different plan, and that is to incentivize companies to make better choices.
00:57:33 Speaker_04
And I think the way we do that is to make it so that they can make enormous sums of money. So this is the part that rubs people the wrong way. If it's regulation, then that's a punitive response.
00:57:47 Speaker_04
And in almost every country, the legal system is going to take over, and they're going to try and fight it. Like, the tech companies will sue, right? Or, if they don't sue, they'll fight it through policy. Everything will wind up in litigation.
00:57:59 Speaker_04
But nothing concrete will change right now.
00:58:02 Speaker_06
And right now, is it fair to say the default is that they're actually incentivized to do it without care?
00:58:10 Speaker_04
Yes, that's right, because this is about speed. It's about getting to market as fast as possible. And we're talking about AI, and I think most people are translating that to, like, ChatGPT. But all of these AI systems that most of you are familiar with
00:58:26 Speaker_04
connect to the cloud, and they require specialized hardware systems that most companies don't have sitting in a back office somewhere, which means that they're all tethering themselves to AWS and Azure and Google.
00:58:39 Speaker_04
I'm not saying any of this is inherently bad. I'm just saying that our current system incentivizes the wrong choices. So if we want to incentivize the right choices, then we have to just acknowledge where we're at.
00:58:52 Speaker_04
And the thing that will make people make different decisions is making it so that they earn more revenue.
00:58:58 Speaker_06
I mean, you're speaking their language. I mean, you're meeting people where they are and understanding what their incentives are. It may be controversial, but it's just smart.
00:59:05 Speaker_04
It's not popular, but I think we can create a pathway for companies to make boatloads of money in a way that doesn't have negative consequences on the other end, like a potential market crash. The only way to do that is through transparency: making that black box transparent, making the data traceable, protecting IP while making the decision-making
00:59:32 Speaker_04
process open, revealing what got used to train the AI through audit trails. There are lots and lots and lots of things that can be done that will protect competition, but will incentivize everybody to make smarter decisions.
00:59:45 Speaker_04
I believe that is the only way forward. And for me to say that, given what I do for a living, that's a pretty strong statement. Normally, I would say there's several paths forward, but I really think this is the one.
00:59:55 Speaker_06
Okay, so there is no chain of accountability, just like with social media. We're just gonna get a whole bunch of, we're just the platform, we're just the platform. We need to incentivize ethical AI.
01:00:09 Speaker_04
I mean, I would like for there to be an accountability chain. I'm saying that there just isn't one, structurally.
01:00:17 Speaker_06
What's one thing you wish we all understood about the super cycle before we go to the rapid fires?
01:00:25 Speaker_04
I think, given what I'm seeing, what I have been seeing and what I'm starting to see happening more right now, is that with all of this disruption and change happening so fast, there's this sense that there's no use thinking about the future in any concrete way.
01:00:40 Speaker_04
I'm not bullshitting when I say, I think 50 years from now, possibly less, people are going to look back at us and everybody alive today as the group of people that lived through the great transition. Gen T. Gen T. Right.
01:01:00 Speaker_04
Everybody talks about Gen Z and Gen Alpha. We are collectively Gen T. And to me, that's more important
01:01:07 Speaker_04
It's us, and I don't know how long this transition is going to last. In a way this is mind-bending and exciting, but it also means everybody alive today is going to go through an unprecedented amount of change, faster than we are probably capable of managing.
01:01:28 Speaker_04
So now the reaction is gonna be to think very short term and to not be a futurist, but to be a nowist, right? To make decisions that feel comfortable right now. That is gonna turn out to be a mistake.
01:01:43 Speaker_04
So the trick is, I guess, to be like Kahneman and engage both sides of the mind, think fast and slow. Yes. But you gotta do both at the same time. So you have to be willing
01:01:55 Speaker_04
to think through the next order impacts of your decisions and what things might look like, even when those things are scary, and also tie that future back to today. Every company right now is shortening their planning cycles to extremes.
01:02:12 Speaker_04
This is the absolute worst time to be doing that. So you have to be willing to do long-term, three-year, five-year corporate planning cycles. You have to do long-term scenarios. You have to do that visioning work.
01:02:26 Speaker_04
And you have to recalibrate it as the super cycle spins up, or I can assure you, you will get left behind. So if I can encourage everybody to do one thing, it's to just recognize and acknowledge this is the situation that we are all in together.
01:02:43 Speaker_04
And everybody means every leader, every HR department, every C-suite, every company, every individual. So we're all in the same boat. It is not an excuse not to do long-term planning. And there are a lot of resources out there to do that.
01:03:00 Speaker_04
I've got books that explain it. You've got books to help people think about it. But you've got to do it. You can't put it off anymore.
01:03:07 Speaker_06
Yeah, this reminds me of that quote by Jon Kabat-Zinn, who says, you know, he defines overwhelm as the world unfolding faster than our nervous system or psyche can handle. There's a skill set we need.
01:03:20 Speaker_06
Like, you know, when I was talking to a group of CHROs and CEOs last week, I said, you know, they're like, what's the skill set? Is it Python? Is it R? And I said, no, no, no, no. And I see your face. People can't see your face.
01:03:31 Speaker_06
Like, oh, God, if you're just looking for Python. I said the skill set is deep thinking, critical thinking, anticipatory thinking, and the ability to manage paradox. Can you straddle binaries? Can you hold the tension of two things that can be true?
01:03:49 Speaker_06
It's changing every day, but we have to long-term plan. Generation transition. Yeah. In every way. Okay, you ready for the rapid fire? Let's transition to it. She's got jokes too, y'all. She's the whole package.
01:04:05 Speaker_04
I do. I'm multifaceted.
01:04:06 Speaker_06
Yeah. I still see the music. Fill in the blank for me. Vulnerability is? Authenticity. What is one piece of leadership advice that you've been given that is so remarkable you need to share it with us or so shitty that you need to warn us?
01:04:26 Speaker_04
I've got so many thoughts. I think I originally got this from Julie Sweet, who's the CEO of Accenture, when we were talking
01:04:33 Speaker_04
a year ago, half a year ago, there are companies that have a learn-it-all culture and companies that have a know-it-all culture. Oh.
01:04:42 Speaker_04
And the know-it-all cultures tend to be the ones that have, you know, they don't always recognize it, but they're always problematic.
01:04:50 Speaker_04
And the learn-it-all cultures, it's harder to get that going, but once you're there, and it takes very courageous leadership. Yeah, tons. But once you're in a learn-it-all culture, everything changes and you sort of become more resilient.
01:05:04 Speaker_06
That's amazing, because we have a daring leader assessment, and in the factor analysis, one of the first things everything ladders up to is a knower or learner mindset. Okay.
01:05:19 Speaker_06
What is your best leadership quality?
01:05:21 Speaker_04
Yeah. I mean, the sort of leading-by-example thing. I value hard work. Productive work, not working for the sake of working. I just, I really value hard, good work, and I am willing to do it. I'm the chief executive of my company now.
01:05:39 Speaker_04
I guess I don't have to do that in certain ways, but I, we have a very strong company culture. And I think that is partially, I would hope because I lead by example and my example is like, let's get down to work.
01:05:53 Speaker_04
But I don't know, maybe talk to my staff. I would assume, I would hope, that that's the case.
01:06:00 Speaker_06
I think that's the case. I love that. Okay. I've got one more leadership and then we're going to, because you're a crossover, you have to do rapid fire from the leadership questions and the fun questions.
01:06:10 Speaker_06
Not that the leadership questions are not fun. But what is the hard leadership lesson that you have to keep learning and unlearning and relearning, and the universe will just not stop putting it in front of you?
01:06:25 Speaker_04
That nothing is ever gonna go as fast as I want it to go. I'm constantly frustrated that everything is going more slowly than I want it to.
01:06:37 Speaker_04
But again, these things take time when you've got a lot of people involved, and we all have a lot of people involved in everything. But I'm just, I'm really impatient.
01:06:46 Speaker_04
And that can be a good thing, because it means my drive will keep that engine moving forward, but it can also lead to sometimes bad decisions, as we all know.
01:06:56 Speaker_06
I relate to this so much, but I will tell you, I used to ask myself when I was doing the Dare to Lead research and I was looking at the data, is it harder for someone to unlearn action bias, or is it harder to teach a sense of urgency?
01:07:14 Speaker_06
And I actually think it's harder to teach someone a sense of urgency.
01:07:18 Speaker_04
I was going to say the same thing. I think that's right. In my world, without a sense of urgency, no action gets taken on the future. It's hard to feel a sense of urgency about the future, right?
01:07:29 Speaker_04
And you deal with leaders who either get it or they don't. They can either cultivate that in their staff or they can't, but at least they understand on a fundamental level why that's important. Yeah. Okay, fun.
01:07:44 Speaker_04
Last TV show you binged and loved? I watch as much TV as humanly possible. So I would say Bodies. I thought Bodies was amazing.
01:07:52 Speaker_04
It's a sci-fi series on, I honestly don't remember which streaming channel it was at this point, but it was about a series of crimes that were interconnected, and it involves time travel. It was a cool story. Yeah.
01:08:03 Speaker_06
You're revealing your nerdiness by the second.
01:08:06 Speaker_04
Well, I was like, do I say Bodies or do I say Vanderpump Rules, which I also very much watch religiously. Favorite movie of all time? Dr. Strangelove. Uh, Fred, it's great. I actually have an annual viewing. I watch it once a year. Okay.
01:08:25 Speaker_04
A concert you'll never forget. So the concert that I will never forget is George Michael on his last tour. And it was amazing. It was the one with the giant colorful stage. He was incredible.
01:08:38 Speaker_04
And I think most people who know me know that I mostly listen to metal, like Soundgarden and stuff like that. But my first and true love is George Michael.
01:08:50 Speaker_06
Okay, a snapshot of an ordinary moment in your life that gives you real joy.
01:08:55 Speaker_04
A family dinner. We didn't do it before COVID, and then COVID happened, nobody was traveling, and we started having family dinner every night. And it is the thing that I look forward to every day.
01:09:07 Speaker_04
One of us cooks, and even if we don't cook, we get food somewhere, but the three of us sit, no devices, we just talk to each other about what happened, or more often than not, like what's happening in the world, and it's awesome.
01:09:19 Speaker_04
God, I love family dinner.
01:09:21 Speaker_06
Okay, what's one thing you're really grateful for right now?
01:09:24 Speaker_04
I mean, it's kind of a big thing, but I'm grateful for my health and my mental health and the fact that I have choices, and I have agency, and I have a strong will, and I've got people around me who are supportive.
01:09:41 Speaker_04
I just feel so lucky for all of that. Some of that's cultivated, but I think it's also just, I'm just grateful, grateful for all of that.
01:09:50 Speaker_06
bigger than I ever thought it would be at my age. I don't know that there's much more to be grateful for. I mean, wow. It's really beautiful. Thanks. Well, I am grateful for you and your work and your time with us. I'm a metalhead too.
01:10:03 Speaker_06
So I appreciate that. This was a fun conversation and thank you for the work you do. It's so important and you bring so much insight to it, but also humanity. And not all futurists do that, I have to say. So thank you for that.
01:10:20 Speaker_06
It's a really incredible combination.
01:10:22 Speaker_04
Thank you. And thank you for all the work that you do in all of your conversations. We all learn from you every day. So that's what it's all about, right?
01:10:31 Speaker_06
Yeah, I think so. Just community. Gen T transitioning together. That's right. To hell and back, possibly. All right. Thanks, Amy.
01:10:43 Speaker_04
Thank you.
01:10:54 Speaker_06
Alright, what did y'all think? I don't like the turning into the ice feeling, like I hate that feeling. But I will tell you that I love knowing that people like Amy, with a strong ethical background, are looking into it and teaching us.
01:11:09 Speaker_06
You can learn more about the episode along with all the show notes on brenebrown.com. We'll definitely link to Amy's South by Southwest talk, all of her books, her website, where you can get the report.
01:11:19 Speaker_06
I think the report, Barrett, is the report like a thousand pages this year? Yeah, it's a thousand pages this year. I'm going to stick with the podcast and the South by talk to avoid more information than my nervous system can handle.
01:11:33 Speaker_06
We normally have transcripts within three to five days of the episode going live. And just a reminder, this is really fun: we have a growing newsletter list, because we're doing some fun stuff on it.
01:11:43 Speaker_06
We have a monthly newsletter, and we got a lot of people on that. And now we're doing a weekly digest of just some fun things that happened that week, what I'm reading, listening to, thinking about. We've got some surveys embedded in it.
01:11:54 Speaker_06
What do you want to hear about? So you can go to brenebrown.com and join either or both of our newsletters. I'm really doing a lot of that exploration with y'all in that format more than anywhere else right now. So, giving everything a shot.
01:12:09 Speaker_06
I'm beta testing ways of being. Okay. Stay awkward, brave, and kind. Steer into the ice. Dare to Lead is produced by Brené Brown Education and Research Group. Music is by The Suffers.
01:12:30 Speaker_06
Get new episodes as soon as they're published by following Dare to Lead on your favorite podcast app. We are part of the Vox Media Podcast Network. Discover more award-winning shows at podcast.voxmedia.com.