
Indigenous AI: Revolution or Colonizer Bullsh*t? AI transcript and summary, an episode of the podcast All My Relations Podcast





Episode: Indigenous AI: Revolution or Colonizer Bullsh*t?


Author: Matika Wilbur & Temryss Lane
Duration: 00:43:11

Episode Shownotes

In this thought-provoking episode, we sit down with Dr. Keolu Fox (Kanaka Maoli) to explore the environmental impacts of artificial intelligence (AI) and what it means for Indigenous data sovereignty. From the energy-hungry servers behind our everyday Googling to the broader implications of AI on Indigenous knowledge systems, we ask: Can AI be done better? Can contemporary Native communities live in harmony with AI, or is it just another tool of colonization? Dr. Fox breaks down the risks, opportunities, and what Indigenous-led AI could look like. If you've ever wondered how technology intersects with sustainability, sovereignty, and cultural preservation, this is the episode for you. Tune in to join the conversation and rethink what AI could mean for the future of Indigenous innovation.

Big thank-yous to Dr. Keolu Fox and the Indigenous Futures Institute. Editing and all the things by Teo Shantz. Episode artwork by Ciara Sana. Film work by Francisco Sánchez.

Support the show: follow us on Instagram @amrpodcast, or support our work on Patreon. Show notes are published on our website, Allmyrelationspodcast.com. Matika's book Project 562: Changing the Way We See Native America is available now! T'igwicid and Hyshqe for being on this journey with us.

Full Transcript

00:00:00 Speaker_04
Welcome back, relatives, to another episode of All My Relations. I'm Matika Wilbur. I'm from the Swinomish and Tulalip tribes. I'm the creator of Project 562. I'm a mama, a wifey. I'm the founder of this space that we're recording in today, Tidelands.

00:00:15 Speaker_04
I'm really happy to be here with Ms. Temryss Lane.

00:00:19 Speaker_05
Oh, I'm so happy to be here again for another episode of All My Relations. Hi, everyone. I'm Temryss Lane Khalitya from Lummi Nation. I have the pleasure of working in communications with tribes and Native-led nonprofits and being a storyteller.

00:00:39 Speaker_05
And today I'm really excited about this conversation because we're moving into a new era of storytelling.

00:00:47 Speaker_04
We are here with Dr. Keolu Fox, the co-founder of the Native Biodata Consortium, a nonprofit research institute led by Indigenous scientists and tribal members.

00:00:58 Speaker_04
He is also a professor at the University of California, San Diego, where he is co-founder and co-director of the UC San Diego Indigenous Futures Institute, and the first Kanaka Maoli to receive a doctorate in genome sciences.

00:01:15 Speaker_04
His work focuses on the connection between data as a resource and the emerging value of genomic health data from indigenous communities. He's also a TED fellow, like me.

00:01:28 Speaker_04
And at TED 2024, he spoke alongside professional surfer and scientist, the sexy-ass Cliff Kapono. Welcome to the show, Dr. Fox.

00:01:38 Speaker_02
Oh, man. He'll fucking love that. He will. Yeah, yeah. Aloha. Thanks for having me.

00:02:02 Speaker_03
relations.

00:02:06 Speaker_02
I'm originally from Kohala, which is on the Big Island, at the northernmost tip. And now I'm at UCSD as a professor. Yeah, it's nice. Yeah, just recently had a baby.

00:02:18 Speaker_05
Welcome to Parenthood. Yeah.

00:02:21 Speaker_00
It's wild.

00:02:21 Speaker_05
It's amazing and wild, right? Dr. Keolu Fox is the world's first Native Hawaiian genome scientist. What is a genome scientist?

00:02:33 Speaker_02
Everybody's familiar with genetics, right, like heredity, genealogy, you're half your mommy, half your daddy, and we just work with a lot of different tools to understand those things.

00:02:46 Speaker_02
Most of it's through a medical lens, so type 2 diabetes, things that plague indigenous people across the board, heart disease, cancer. We're trying to predict and prevent them using different tools like genome sequencing.

00:03:00 Speaker_02
It's just mapping heredity and mapping our genealogy using these kind of molecular tools. So that's the reading. And then there's the writing. So that's like identifying a typo and removing the typo to cure or like alleviate sickle cell disease.

00:03:21 Speaker_02
That's something recently that happened in the field.

00:03:23 Speaker_05
How did you wind up in that field?

00:03:26 Speaker_02
Oh man, honestly, I was like a very bad student and I was on the brink of just like dropping out. I think I had a 1.1 GPA and that's like really hard to do.

00:03:38 Speaker_02
And yeah, and I just ended up in this class and I had good mentorship from somebody that was really cool. And I learned about people that are, you know, adapted to high elevation and Nepal and Tibet and

00:03:53 Speaker_02
all kinds of incredible things, like why do we have ABO blood groups? And then once you kind of like get cued into it a little bit on the surface level, it's just a natural curiosity, I think.

00:04:05 Speaker_04
We spoke a little bit about genomics with Dr. Kim TallBear.

00:04:08 Speaker_02
Oh, alright. Auntie Kim was on here.

00:04:10 Speaker_04
We asked her, can a DNA test make me Native American? Right. Right, because of her book that studies, you know, how white scientists have studied indigenous people. I find it so fascinating. What do you think? Can a DNA test make you Native Hawaiian?

00:04:25 Speaker_02
Wow, yeah, we should ask, what's her name, Elizabeth Warren?

00:04:31 Speaker_02
I personally don't think, I think that like blood quantum is a tool that has been used to separate our people from natural resources, including land, probably with Auntie Kim, in that sense, it's like, Nobody gets to tell us who we are.

00:04:50 Speaker_02
It's more like, does your community claim you?

00:04:53 Speaker_05
And where are you accountable to?

00:04:55 Speaker_02
What land are you accountable to? What people? Exactly. Are you really a community member? Do they claim you? Do you know your language? Do you know your history? Those things, I think, are far more important.

00:05:05 Speaker_02
Now, will your genome and your genealogy contribute to how you're going to die? Yes. If your ancestors were exposed to nuclear radiation, higher chance of getting cancer, yes. We always talk about how the land is our ancestor.

00:05:22 Speaker_02
I think that's like ubiquitous through indigenous cultures. Like Mauna Kea has shaped our genomes over time. It has shaped us

00:05:31 Speaker_02
in these really beautiful ways, but it also probably contributes to various things like, you know, our susceptibility to developing certain diseases. But it's a scientific fact to say, that's my ancestor, right?

00:05:45 Speaker_02
So I think that's an interesting distinction and point. But it's complicated, and you know, on this one, I wouldn't get in Kim's way on this one. You know? Yeah, yeah.

00:05:54 Speaker_04
Auntie Kim said, you know, we are our relations. You know, we're defined by our relationships to one another. And that cannot be defined in a scientific blood test.

00:06:09 Speaker_04
But, you know, there are many tribes across the nation that are using DNA tests, including my own. Really?

00:06:17 Speaker_05
To determine blood quantum?

00:06:19 Speaker_04
Paternity.

00:06:21 Speaker_01
Ooh. Interesting.

00:06:22 Speaker_04
So for tribal members that are born, there's a few requirements. You have to first live within the tribal boundaries for two years at some point in your life.

00:06:33 Speaker_04
And then if you're not the mother and you're the father, you do have to do a DNA test to prove paternity.

00:06:38 Speaker_02
People often do lie about their identity, as we know, with people that claim that they're a part of a community. Right. I'm not going to name specific groups. We're aware.

00:06:51 Speaker_00
Yeah, yeah, yeah.

00:06:53 Speaker_02
For sure, for sure. And I think in Hawaii, we have our own fact-checking systems. I don't know. But if people were bullshitting, I think we would know.

00:07:04 Speaker_04
Well, today we're talking about artificial intelligence in Indian country. Yes. This episode is thinking through how Indian country and ourselves as indigenous people are interacting with AI. And Temryss really wanted to do this episode.

00:07:17 Speaker_04
Tell us why, Temryss. Yeah.

00:07:19 Speaker_05
You know, as a communicator, I work for a communication strategy firm. And we support tribes and Native organizations. And we do a lot of other work.

00:07:31 Speaker_05
But there's this moment where if we don't start to participate or understand AI, how are we using it as strategists, as communicators, as storytellers? And what are the ethics around it? And I read this in an article that you wrote.

00:07:58 Speaker_05
The article was... Opportunity and Risk: Artificial Intelligence and Indian Country, and it was in the Tribal College Journal.

00:08:07 Speaker_05
You first start off by saying, today AI refers to solutions demonstrating new and profound capabilities, sparking optimism for the role this technology might play in reshaping and elevating society.

00:08:22 Speaker_05
I would love for you to just share your interests, your understanding of where we are at right now in relationship to AI.

00:08:36 Speaker_02
There's a whole lot to this. Right now, it's like we're making a bargain, I think. It's a fool's bargain really.

00:08:49 Speaker_02
I'm pessimistic about many things when it comes to artificial intelligence, but I want to like take us back a peg just to think about, like, if we go under the hood and we look at the scripting languages that people use to create the connective tissue that is AI, a lot of the languages we're using to communicate in those spaces are not

00:09:13 Speaker_02
indigenous languages.

00:09:15 Speaker_04
The programs that are being used to do the coding. It's colonizer language.

00:09:20 Speaker_02
Totally. It restricts the way we can express ourselves. Like if we think about it as a creative medium and actually communicating our consciousness in a digital space, we're totally restricted. There's like one language, maybe, and it's called art.

00:09:37 Speaker_02
It's made by a Maori guy. As a Hawaiian person, it pains me to say this. No, I'm teasing. And he's great. He's epic. The guy's a legend. He created this language for data visualization, which you would probably love as artists, both you guys.

00:09:55 Speaker_02
But it's limited. So I want to get us to really think about that. As indigenous people, as indigenous futurists, that's a limited medium. It's like we're limited to black and white, while Western-dominated colonial enterprise has a full palette.

00:10:14 Speaker_02
And that's... highly problematic and biased. So I'm into technological independence. I'm into building out community-oriented strategies and solutions. And I think it's important to note, too, like we're in Seattle.

00:10:32 Speaker_02
This is where, this is what we call ground zero. This is Amazon Web Services headquarters. This is where the first data centers and the architecture of those data centers were engineered and thought up

00:10:45 Speaker_02
You know, it's really important to reflect on that too. Some historical context on this. Before the year 2018, all of the largest companies in the world were fucking our people over left and right. Pipelines, oil. You get the picture.

00:11:00 Speaker_02
They're major petroleum-based fossil fuel companies. These are the largest companies in the world. Getting that oil out of the ground is harmful.

00:11:07 Speaker_02
Communities that are created around it are full of domestic violence and- Missing and murdered indigenous women. Yep, directly related to this, direct correlation. Then you have what happens when you run it through a combustion engine.

00:11:21 Speaker_02
It's changing the density of our actual atmosphere. You know, it's a major driver of climate change. And I think the vast majority of the world has focused on CO2 as a variable.

00:11:34 Speaker_02
But the thing that's kind of slipped by the wayside, that's obvious that we're all participating in every day, this emergence of data centers as a major contributor.

00:11:45 Speaker_02
And cloud computation is poised to surpass the fossil fuel industry as the number one contributor because of the heat emissions from data centers, because of the amount of water that it takes to cool data centers.

00:12:01 Speaker_02
So we're dealing with, um, yeah, it's major. We're dealing with this, like we have a dual addiction to both oil and data. And then when I tell you, who are the largest companies in the world now in 2024?

00:12:14 Speaker_02
Well, they're Amazon Web Services, Google, NVIDIA, like these companies that really are oriented towards this like data, data based economy. Right. Well, I'm going to tell you what my grandma told me. You can't have something for nothing.

00:12:32 Speaker_02
You can't produce and process that information without releasing heat; that's the second law of thermodynamics. So you've been focusing on CO2 when you should have been focusing on CO2 and heat.

00:12:47 Speaker_02
So, and nobody's holding these companies accountable. So it's highly problematic. So if we as indigenous communities in there, you know,

00:12:55 Speaker_02
want to get involved and you want indigenous data sovereignty, and you want to build your own servers or cloud or data center, then you should start with understanding what your relationship is to land.

00:13:07 Speaker_02
What, what are my wind resources that exist here? What about water access land? You should start by saying, how do I create renewable systems that offset this?

00:13:19 Speaker_04
You say that the data industry requires approximately 200 terawatt hours annually. Yes.

00:13:25 Speaker_04
With the combination of phones and personal computers and AI, heat emissions are poised to surpass the fossil fuel industry as the number one contributor to the climate crisis in our lifetimes.
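The 200 terawatt-hour figure quoted here can be put in perspective with quick arithmetic. The annual figure is the episode's; converting it to a continuous power draw is just illustrative unit conversion, not an additional source.

```python
# Back-of-envelope conversion of the episode's "approximately 200
# terawatt hours annually" figure into a continuous power draw.
# Only the 200 TWh number comes from the episode; the rest is arithmetic.

HOURS_PER_YEAR = 365 * 24            # 8760 hours

annual_twh = 200                      # TWh per year, figure quoted above
annual_wh = annual_twh * 1e12         # convert TWh to watt-hours

# Average power needed around the clock to consume that much energy.
average_power_gw = annual_wh / HOURS_PER_YEAR / 1e9

print(f"Average continuous draw: {average_power_gw:.1f} GW")
```

That works out to roughly 23 gigawatts running continuously, which is the scale of dozens of large power plants.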

00:13:33 Speaker_02
In our lifetimes.

00:13:34 Speaker_04
In our lifetimes. It's what we're doing. That's right. We're participating.

00:13:37 Speaker_00
Correct.

00:13:38 Speaker_04
And so what does it mean then for us to be land-based people and for us to go to Standing Rock and protest and for us to stand up against the Keystone Pipeline and to stand up for the Arctic and Line 3 and to fight for the salmon and to try to fight for land resources and land back and to believe fundamentally in all these ideas as sovereigntists, as indigenous people?

00:14:01 Speaker_04
What does it mean then in the larger context to be complicit in this ever-growing climate crisis through what we're doing on our phones, most of us without knowing?

00:14:12 Speaker_05
Oh, unknowingly, yeah.

00:14:13 Speaker_04
Yeah, unknowingly.

00:14:14 Speaker_02
Profound relationship you just carved up for us. There's a relationship between the things you query on Google and your participation and habits in the digital world and understanding that there are consequences.

00:14:33 Speaker_02
Because it's really easy for all of us to point fingers at AWS and be like, Jeff Bezos and his bald head and da-da-da, you know, and that's not helpful.

00:14:41 Speaker_04
That's what I say.

00:14:42 Speaker_02
Yeah, yeah, yeah.

00:14:43 Speaker_04
It's Jeff Bezos and his bald head.

00:14:45 Speaker_02
Yeah, yeah. It's not helpful because, you know, we're all willing and able participants. and we're cooking ourselves, like literally cooking ourselves.

00:14:55 Speaker_05
Yeah, November has been the hottest month on record by far.

00:14:59 Speaker_02
It's important to think about certain things that we do every day and what the consequences of them are.

00:15:07 Speaker_02
I was getting mad at my cousin the other day because she took like seven pictures of her breakfast and all of it automatically is uploaded to the cloud.

00:15:16 Speaker_02
Let's talk about some other things too because you're an extremely talented artist and photographer, right? I think you probably feel a certain way about AI art because it's not authentic. Okay, so Midjourney, let's just pick on them for a minute.

00:15:29 Speaker_02
That's the one where I could put in a picture and be like, I don't know, make a picture of some stupid-ish whatever, and then it'll generate the image, right?

00:15:39 Speaker_02
And I could choose it and say, oh, do it like Studio Ghibli or do it like a comic book or whatever. And it outputs the image. The consequence of that is approximately one full iPhone 15 charge. One image generated.
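To see what the "one image, one full iPhone 15 charge" equivalence adds up to at scale: the per-image claim is the guest's; the battery capacity used below (about 12.98 Wh) is a commonly cited teardown figure, and the daily image volume is a made-up number purely for illustration.

```python
# Rough scale of the "one generated image = one full iPhone 15 charge"
# claim. Battery capacity and image volume are assumptions, not figures
# from the episode.

IPHONE_15_BATTERY_WH = 12.98     # assumed battery capacity, watt-hours

images_per_day = 1_000_000       # hypothetical generation volume
daily_kwh = images_per_day * IPHONE_15_BATTERY_WH / 1000

print(f"{images_per_day:,} images/day is roughly {daily_kwh:,.0f} kWh/day")
```

Under those assumptions, a million generated images a day would be on the order of 13,000 kWh, comparable to the daily electricity use of hundreds of homes.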

00:15:57 Speaker_02
People have to understand what the relationship is. It's not, you're not just making some throwaway piece of art. There's a consequence to it.

00:16:03 Speaker_04
There's a resource requirement for computation, for artificial intelligence, for ChatGPT, for Googling information, for playing around on it, for just trying something new.

00:16:14 Speaker_02
Bro, think about this. We talked about this on the phone. OK, so now you know every time you plug in something into Google, it says like the AI overview.

00:16:23 Speaker_05
Which is the first thing that now pops up when you Google anything.

00:16:26 Speaker_02
Correct. Yes. So think about that. I didn't ask them to do that.

00:16:29 Speaker_05
No.

00:16:31 Speaker_02
They did that of their own volition. It means that every question that's answered is queried on a GPU, a graphical processing unit made by Nvidia somewhere, which spits out that information back to me. So that means anytime somebody queries anything.

00:16:47 Speaker_02
It's doing this next level overhead of AI and heat emissions. And then we calculated that to about the equivalent of 30 major data centers per day. We're just producing so much information every day, exponentially. There is a tax.

00:17:07 Speaker_02
There is a negative effect or impact that that's having on the Aina, the Moana, the Earth. A lot of it is really held and processed under three major companies.

00:17:23 Speaker_02
We've surpassed the state of capitalism and we've moved to something called technofeudalism, and we're all just

00:17:30 Speaker_02
participating in this system, whether it's like a gig economy and you have to make your dollar bills as a, you know, an Uber driver or delivering food on Uber Eats, or like it's grim, it's a new type of slavery.

00:17:43 Speaker_02
You're like locked into this, this total digital apparatus. I know that's super dystopian, but that's the reality.

00:17:50 Speaker_02
And part of the magic of it is, is I could separate you and yourself and your wellbeing and your participation in this from what its environmental impact is. Like I encourage people that listen to this, there's probably a mine near you. Go visit it.

00:18:07 Speaker_02
See how fucked up it is to watch people pillage the earth for natural resources. You know, where are people disposing of waste? Where is that going?

00:18:17 Speaker_02
I think we have to understand that like in order to process this amount of information, if you were to tell me, okay, Keolu, show me a more like corrupt or unsustainable or harmful for the environment supply chain than that of creating an integrated circuit.

00:18:43 Speaker_02
I couldn't find one. And it starts with quartz and sand that is refined and chemically processed over and over and over again in the most harmful ways that you can imagine to get really beautiful, perfect glass.

00:19:01 Speaker_02
and it goes all the way into acquiring the circuitry necessary with things like cobalt and lithium batteries, things that are used to power many of the components and mechanisms, which requires mining deep into the earth in places like the Copper Belt in the Congo.

00:19:22 Speaker_02
The colonial just headlock

00:19:25 Speaker_02
that is going on to acquire all of these resources from all of these different places on planet earth to bring them together so that they're designed in Cupertino and assembled in China where people are doing backflips off the Foxconn assembly plant committing suicide every day.

00:19:43 Speaker_02
It's like you got to understand what it takes to get an iPhone into your hands. Okay? That's the first part. Just layers upon layers upon layers of the most extractive forms of capitalism that exists on this planet.

00:20:00 Speaker_02
If anything, like we have perfected that trajectory and it's just growing exponentially. So people have to understand these components come from somewhere, you know, it's not just like it,

00:20:15 Speaker_04
It didn't magically appear in a beautiful white box.

00:20:18 Speaker_02
Somebody's homeland. Right. It comes from somewhere, whether it's native land. This is based on your participation and addiction to data. Whether it's text messages, Instagram, pornography. Video games. Video games. I'm just saying, go play outside.

00:20:36 Speaker_02
Go surf. Go do what your ancestors did.

00:20:40 Speaker_05
Yeah. So this is your sign. If you've been trying to find a reason to stop scrolling and doom scrolling, stop now, because there's a tax. There's a tax not only on your person and your spirit.

00:20:54 Speaker_04
So the data that we're developing, we're talking about the phone, the pictures we're taking, the information we're Googling, the 12,000 emails that I haven't deleted.

00:21:05 Speaker_02
Exactly. And when I don't email people back and they're like, man, you didn't email me back. I'm like, hey, look, it's better for the environment.

00:21:16 Speaker_00
That's mine now.

00:21:17 Speaker_04
You are the environmentalist Indian, okay? I love it. Temryss, did you know that we're on this handy-dandy platform called Patreon? What? Tell me more.

00:21:55 Speaker_04
Well, for as little as $1 a month, you can subscribe to our Patreon at patreon.com slash allmyrelationspodcast. And really, it's Patreon that makes it possible for us to do this work. So t'igwicid, our hands are raised.

00:22:09 Speaker_04
Thank you to all our Patreon subscribers. You really are making this work possible.

00:22:18 Speaker_05
I imagine that people have an array, a spectrum of experience using, collaborating with AI. And I would love to hear from you what your perspective is on ways people can engage, or should, or avoid. What are your thoughts?

00:22:41 Speaker_02
One thing that's really, really important is to think about the term, it's a catch-all term,

00:22:47 Speaker_02
When you hear the term AI, artificial intelligence, I think the most immediate association would be with something like ChatGPT, but that's just one version of machine intelligence. So you have these different branches.

00:23:01 Speaker_02
So if you think about it, it's like AI. And then within AI, there's machine learning. And then deep learning is a type of machine learning, right? All forms of machine intelligence. Okay? Some of them are biased. Some of them are unbiased.

00:23:17 Speaker_02
Some of them just have massive amounts of data that are pumped in and they identify trends immediately. And a lot of these models are only as good as the data that's included. So the diversity, the size of the data sets.

00:23:32 Speaker_02
And again, back in the back of our minds, we should be thinking more data, more heat. Right? That's the relationship. More data, more heat. You'll also hear terms like LLMs, large language models. You'll hear small language models.

00:23:49 Speaker_02
These are other kind of classes of algorithms that fall under that kind of deep learning approach. You'll hear the term neural net. It's like, imagine you're trying to design lines of code that recapitulate what a neuron does.
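The "lines of code that recapitulate what a neuron does" can be sketched very compactly. This is a generic single artificial neuron (weighted sum, bias, sigmoid activation); the weights and inputs below are made-up placeholders, and a real neural net stacks thousands or millions of these units.

```python
import math

# A single artificial neuron: a weighted sum of inputs plus a bias,
# squashed through an activation function. The numbers are arbitrary
# placeholders for illustration only.

def neuron(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))   # sigmoid activation, output in (0, 1)

out = neuron(inputs=[0.5, 0.2], weights=[0.8, -0.4], bias=0.1)
print(round(out, 3))
```

Training a network means nudging those weights until the outputs match the data, which is where the "more data, more heat" relationship comes from: every nudge is computation.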

00:24:08 Speaker_02
Okay, so that's kind of like the class of things. Now, historically, there's this guy named Alan Turing. You ever heard of him? It's like one of my favorite DEI diversity stories ever. So he was famously gay, super gay.

00:24:23 Speaker_02
This was, I think, during the 40s, I want to say, or something. He was designing algorithms for the English military.

00:24:31 Speaker_02
And while he was designing algorithms for the English military, he said, I wonder if I can differentiate between this algorithm being human or not. Right. So the Turing test is asking a series of questions to an AI.

00:24:54 Speaker_02
to see if it gives you the proper responses and you can differentiate if it has human consciousness, sentience, whatever the term you want to use to describe it. And that came from his own internalization and hiding that he was a gay man.

00:25:08 Speaker_02
amongst his colleagues. That's what a lot of people view as the birth of a lot of modern AI. I'm not even drunk, but that's the drunk history.

00:25:19 Speaker_02
But in commercial and capitalist settings, AI is the most overhyped, overpromised term that's used to say that I'm going to do some process from A to B. So you're not like wrong or dumb if you think it's confusing.

00:25:46 Speaker_02
Because I think even people that deal with these technologies every day are looking at it and saying, wow, they just really overhyped that. And it's going to be really hard to deliver the promises that they just, you know.

00:26:02 Speaker_02
So for the general public and everybody who's listening, that is exactly how everybody feels every day.

00:26:07 Speaker_05
for myself, or for you, personalize this. How do you use AI?

00:26:13 Speaker_05
How can people like, I am a brand-new AI ChatGPT user, and really it is the only way that I can, in my mind, ethically, without knowing more, engage, is say, okay, I'm gonna write this thing, it's not perfect, I have 15 minutes before my next meeting,

00:26:35 Speaker_05
And I'm just, it's an email for a pitch and I want it to be perfect. So I'll say, okay, here's the email, I wrote it, just tighten this up, put it in there, it tightens it up, great grammar, stronger language, perfect. And I'll edit where I need to.

00:26:49 Speaker_05
And that's just like one really vague example. But I would like to hear from you, your opinion on engaging.

00:26:56 Speaker_02
I think that's a perfectly fine application and use of that.

00:27:00 Speaker_02
It's like you jotted together your ideas and you plugged it into that algorithm and it output this like perfect document that saved you an hour and increased your efficiency and your productivity to a level that's inhuman.

00:27:17 Speaker_02
Again, there's that energy relationship, more energy, more heat. You're processing that memo, email, presentation, letter of recommendation, whatever it is, through a processor. It goes through the query bar.

00:27:37 Speaker_02
It is iteratively checking that information against and processing it through this LLM. And through that, it brings you a more polished version of it. So what are the consequences environmentally?

00:27:51 Speaker_02
Well, we know that since ChatGPT was released and became publicly available to people, that Microsoft's water consumption increased 34%. Is it worth it? I guess is my question to people who are using it. It's like, you're more productive.

00:28:06 Speaker_02
Would you have gotten fired if there was a typo in that note? No, you wouldn't. I think we're all thinking it kind of feels a little dishonest. It kind of feels a little bit like cheating.

00:28:15 Speaker_02
There's like a certain level of like, we're questioning our own morals about our participating in it because it's not your voice. It's not you anymore.

00:28:23 Speaker_02
Now also it's literally like removing that layer of humanity and humanness and those beautiful imperfections that exist in the way we communicate. And it is superficial. It can only bring you so far. It's not profound. It's not going to be memorable.

00:28:41 Speaker_02
The point is to make it unmemorable.

00:28:45 Speaker_05
Say more.

00:28:46 Speaker_02
It's like you're removing the imperfection from something so that it can transition up the ladder. It's not something that's going to last long. You're intentionally making an email that's unmemorable to be like, I checked the box, right?

00:29:03 Speaker_02
Yeah, it's perfect. People don't remember perfect.

00:29:05 Speaker_05
No.

00:29:06 Speaker_02
So I want people to think about what that means. And so, so far as embedding it more and more into society, did it get you an extra hour to pick up your kids or whatever? Yes, it did. And that's dope, but it's also a compromise.

00:29:21 Speaker_02
in terms of, like, heat and water. And then the final product: you atrophy your brain in that direction too. So it's like, I can't expect to get barreled if I don't surf every day. Or you can't be a champion

00:29:37 Speaker_02
if you're not putting in the road work and sparring. It's just impossible. So how would you ever become a profound Pulitzer Prize winning writer or, I don't know, whatever thing we value if you're not putting in the work?

00:29:53 Speaker_04
Exercising the muscle. Energy and ecological impacts aside for a moment, are there ways that you can see AI providing opportunities or advantages to indigenous communities?

00:30:03 Speaker_02
Definitely.

00:30:05 Speaker_02
I think in, I'll just stick to the things that I know pretty well, like in the health space, I could tell you my first experience is scripting and working within the framework of what is now known as AI and machine intelligence was here in Seattle.

00:30:22 Speaker_02
And we were taking blood type data. So like, what's your blood type?

00:30:27 Speaker_05
I don't know. O positive.

00:30:29 Speaker_02
O positive. All right. So you don't know your blood type? Thank god. More people know their horoscope than their blood type.

00:30:37 Speaker_05
I don't know that either. OK. O positive.

00:30:42 Speaker_02
You're an outlier. But OK, so you're O positive, right? I'm A. You have a higher chance of developing cancer in your lifetime. I have a higher chance of developing cardiovascular disease.

00:30:53 Speaker_02
That's the number one and number two cause of death in America, OK? Well, when we dig into it, there's a higher resolution that you can determine people's blood type at the genetic level.

00:31:04 Speaker_02
So if you sequence someone's genome, you could like widen that window for the acceptance of like blood transfusions or organ transplants where you're like on a list trying to get a kidney.

00:31:15 Speaker_02
Well, now I could trick your immune system into taking this kidney. based on the compatibility for your genome with the compatibility for the organ, right?

00:31:25 Speaker_02
So that was all my work, PhD work, was on designing those type of algorithms to do that, to say, how tightly can we create a match?

00:31:35 Speaker_02
When we're talking about applying AI, it's like, yes, we would like to have tools that can make better matches, for example, for our people, or predict cancer in stage zero or one before it gets to stage four or five.

00:31:49 Speaker_02
But if we don't participate and create diverse data sets, then we can't do that.
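The donor-matching algorithms Dr. Fox describes work at sequence-level resolution and are far more sophisticated than anything shown here. As a toy sketch of the core idea, "how tightly can we create a match?", here is a hypothetical shared-allele score; the allele names and the scoring rule are invented for illustration, not his method.

```python
# Toy illustration of genetic compatibility scoring for transplant
# matching. Real matching (e.g. HLA typing from sequenced genomes) is
# far more involved; allele names and the score here are hypothetical.

def compatibility(recipient_alleles, donor_alleles):
    """Fraction of the recipient's distinct alleles also carried by the donor."""
    shared = set(recipient_alleles) & set(donor_alleles)
    return len(shared) / len(set(recipient_alleles))

recipient = ["A*01", "A*02", "B*07", "B*08"]
donor_1   = ["A*01", "A*02", "B*07", "B*44"]   # shares 3 of 4
donor_2   = ["A*03", "A*11", "B*35", "B*44"]   # shares none

print(compatibility(recipient, donor_1))
print(compatibility(recipient, donor_2))
```

The point he makes follows directly: a scorer like this can only rank donors whose variation is represented in the data, which is why diverse, community-controlled datasets matter.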

00:31:56 Speaker_02
Before we even get to the process of machine intelligence, we have to create trust within our communities, and we have to have a safe place to store that information and access that information.

00:32:08 Speaker_04
I want to talk a little bit about some solutions. You talked about, in your TED Talk, you guys are talking about, which I find so fascinating, right? Like you're talking about using plants to store data.

00:32:21 Speaker_02
Yeah.

00:32:22 Speaker_04
Can you explain that?

00:32:23 Speaker_02
I know. You guys are like fucking.

00:32:24 Speaker_04
It's like so crazy. Like, what are you talking about?

00:32:27 Speaker_02
Plant memory. Plant memory.

00:32:28 Speaker_00
I know.

00:32:28 Speaker_02
I know. Everybody's like, how much weed do they smoke?

00:32:30 Speaker_04
They're smoking so much weed.

00:32:32 Speaker_00
I know.

00:32:32 Speaker_02
Never look at the comments, dude. I went on the comments, and they were like, the dude on the left looks high. And I was like, come on, bro. Were you high?

00:32:42 Speaker_00
I cannot confirm or deny that I have been high since eighth grade.

00:32:49 Speaker_02
No, but I think it makes sense, though, because think about it from this point of view: what does a genome do but store data anyway? It's a data-storage representation of your mom and your dad recombined to make you.

00:33:07 Speaker_02
Right. It is actually the sum total of the lineages of all of the people before you. You know, I don't want to get too controversial here, but you are your ancestors; you represent them.

00:33:17 Speaker_02
And your genome is this data-storage kind of archive of that, which is super cool. And the same goes for plants, but they do something called photosynthesis.

00:33:30 Speaker_02
If you remember from biology class, they take CO2, essentially, and metabolize it into oxygen, which is great for the planet, which is the exact opposite relationship that our current data centers have, right? Or a lot of the harmful things like

00:33:47 Speaker_02
the fossil-fuel-driven industries and things like this. So if you can store data in a plant, and it's photosynthesizing, and you can access it at a later date, isn't that a good thing? It is a good thing. It's a living biological data center.
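The "plant memory" concept builds on DNA data storage, where binary data is written into nucleotide sequences. A minimal sketch of the basic encoding step, assuming a simple two-bits-per-base mapping (real schemes add error correction and avoid problematic sequences like long homopolymer runs), might look like:

```python
# Toy DNA data-storage codec: 2 bits per nucleotide (A=00, C=01, G=10, T=11).
BASES = "ACGT"

def encode(data: bytes) -> str:
    """Map each byte to four nucleotides, most significant bits first."""
    return "".join(
        BASES[(byte >> shift) & 0b11]
        for byte in data
        for shift in (6, 4, 2, 0)
    )

def decode(dna: str) -> bytes:
    """Reverse the mapping: every four bases become one byte."""
    out = bytearray()
    for i in range(0, len(dna), 4):
        byte = 0
        for base in dna[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

seq = encode(b"aloha")   # 'a' = 0x61 = 01 10 00 01 -> "CGAC", and so on
assert decode(seq) == b"aloha"
```

Getting such a sequence synthesized, written into a living plant genome, and read back out is the (much harder) rest of the idea.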

00:34:01 Speaker_02
For that TED Talk, we just kind of played around with the idea of colonialism. We talked about sugarcane as this colonial crop, but you could apply that idea to coral reefs or many, many other things.

00:34:14 Speaker_02
I mean, again, not to center surfing too much, but it's provocative for all the right reasons. It's 100% possible. All the technologies that you need to get to that point and position exist.

00:34:29 Speaker_02
And it's probably not as appealing to venture capitalists and whatnot, because it's good for the planet.

00:34:36 Speaker_05
It's like some indigenous futurism right there.

00:34:39 Speaker_04
I realize that, you know, you are in some ways fucking around. It's not realistic for me to plug a plant into my hard drive.

00:34:46 Speaker_02
Right.

00:34:46 Speaker_04
Right, into my USB-C. Not yet. Not yet. But in terms of dreaming of an indigenous future, in this article you write that charting an indigenous future will require a shift in our consciousness.

00:34:57 Speaker_04
We can optimize landscapes for exponential growth, profit, and eventually failure, or we can optimize for harmony and balance. To quote an ancient Hawaiian chief,

00:35:08 Speaker_02
He ali'i. Ali'i is a chief.

00:35:10 Speaker_04
He ali'i ka 'āina.

00:35:11 Speaker_02
Ka 'āina, yeah.

00:35:12 Speaker_04
The land is a chief. Humans are its servants.

00:35:15 Speaker_02
Right. Don't you think that's pretty common knowledge amongst all indigenous groups? It's like, we don't view the world hierarchically. If we did, we would be on the bottom of the hierarchy and not on the top in some human exceptionalism.

00:35:31 Speaker_02
And that's one of the biggest problems with AI today: that shitty value proposition and fool's bargain that you're going to tech your way out of this situation and use AI to solve climate change. What?

00:35:47 Speaker_02
In fact, it's the contrary. Like we're literally creating more harmful technologies that leave our planet in a worse position that require more resources that produce more heat.

00:36:01 Speaker_02
So yeah, we need to reassess what types of technologies can be linked together in a circle. Not that hard folks.

00:36:10 Speaker_04
So what does that look like in my own life? You know, how do I offset my own behavior? Do I plant more trees? Do I, like, legitimately just give up this phone, let it go, go back to using a map? Just a flip phone.

00:36:24 Speaker_04
Should I just have a flip phone? Should I just have a landline? Should I just stop using Google? Should I just stop putting shit on Instagram? Should we just not be doing this podcast at all? You know, how far down the scale do you go in your own life?

00:36:39 Speaker_04
Because if we're all complicit, if we're all contributing heat, and that is causing the melting of our planet, if we're cooking ourselves, then we want to do better. I want to be a land-based person. But I'm not going to become a farmer.

00:36:54 Speaker_04
I'm not unplugging. It's just the reality. We talk about this with Dallas.

00:36:58 Speaker_05
We won't be able to survive at this moment if we were to unplug. My career would be done. Everything I do. You'd have to get a horse.

00:37:06 Speaker_04
How would you get land? How would I get land? How would I buy a house? Okay, totally, totally.

00:37:13 Speaker_02
Okay, okay. I totally agree with you. No, no. We could go really dark.

00:37:17 Speaker_00
Let's not go too dark. But it is dark.

00:37:21 Speaker_02
It is dark, but it's like, okay, we have a pace we keep. I want everybody to remember: if every data center went down right now, you would be okay.

00:37:37 Speaker_05
We don't need it to breathe, we don't need it to drink water, we don't need it for food. You know, if the Tessie doesn't charge up, saddle up. Except how will Whole Foods deliver my groceries?

00:37:54 Speaker_04
You got a canoe.

00:37:59 Speaker_02
You're good. You got a fishing pole. You're good. You got a net.

00:38:01 Speaker_04
I couldn't even pull a canoe home right now.

00:38:03 Speaker_02
You would make it though. You would do it. I believe in you. Yeah, I believe in you.

00:38:07 Speaker_04
I'll come in.

00:38:08 Speaker_02
I'll get you there faster.

00:38:12 Speaker_05
I'll get you there faster.

00:38:14 Speaker_00
Take your athletics. Yes, there we go.

00:38:17 Speaker_02
But instead of some abrupt halt in modernity, we need to think of transitions. We have to think of intermediary states, which is like, what is our transition plan?

00:38:40 Speaker_02
You know, politicians never talk about this shit, because it's not sexy. They're not talking about the transition from oil to renewables or whatever it is.

00:38:49 Speaker_02
But I think that is a type of forecasting that we need to really invest a lot of time and energy within our communities. Because if you want independence, you can't just cold turkey this.

00:39:01 Speaker_02
You really have to think about what that transition plan is going to be like, and it's not fun and it's not sexy, and it requires a whole new level of creativity that I'm only now learning how to exercise and perform.

00:39:17 Speaker_02
So yes, if they were like, the data centers are going down, I would go through withdrawal and be like, oh shit. You know, what are we going to do? Right.

00:39:26 Speaker_02
But I think instead of panicking, we have to say, okay, it would be okay if we all focused on a slower level of productivity.

00:39:35 Speaker_05
and asked what is sustainable. At the pace we're moving, obviously it is not. So we say, what is sustainable? We can do those calculations, because we have a lot of really brilliant people on this planet.

00:39:48 Speaker_05
And then we work backwards from that.

00:39:50 Speaker_02
Absolutely. I mean, I look at Hawaii and we're a hot mess. Between the tourism industrial complex and the military industrial complex, it's like we have all these toxic relationships.

00:40:00 Speaker_02
We have to work with the next generation and be like, all right guys, where do you want to see Hawaii in the future? Do you want to carry around somebody's golf bags?

00:40:10 Speaker_02
Do you want to watch your mom sell her fucking ranch because the Airbnb next door is fucking up your property taxes? What type of relationship do you want with your land, based on the everyday decisions you make?

00:40:30 Speaker_02
What are you prioritizing in terms of your occupation and how you contribute to our lāhui, our community? I don't think that that's a priority a lot of the time, but I hope that it will be. Yeah.

00:40:45 Speaker_04
Hmm. You know, I'm going to stop using ChatGPT.

00:40:51 Speaker_02
You should. You should stop using it.

00:40:52 Speaker_04
I'm going to use my brain.

00:40:54 Speaker_00
Yeah. Use your brain.

00:40:54 Speaker_04
I'm going to use my brain, you know? Even if it takes me more time and it's imperfect.

00:41:05 Speaker_05
Well, we're going to tie this off in a pretty little bow for now, relatives. How wild to be in this moment. We've arrived in the future where we actually have to seriously consider these artificial intelligence realities.

00:41:22 Speaker_05
the intersections of our tech-dependent lives and the very real way it's impacting our environment, our sovereignty, our wellness, and our communities globally.

00:41:33 Speaker_04
I deeply appreciate this conversation. It has given me so much to consider and I really look forward to diving into this topic a bit more later on in the season. Speaking of later in the season, next week we'll be doing a part two of this episode.

00:41:48 Speaker_04
thinking through Indigenous data sovereignty with powerhouses like Dr. Dez, your former co-host, and Dr. Tahu Kukutai. Dr. Keolu Fox will also be returning.

00:41:59 Speaker_05
And we really want to thank Keolu for coming to speak with us. What a joy to be able to host these wonderful conversations and share them with each and every one of you. We hope you'll continue to follow along with Keolu's work.

00:42:15 Speaker_05
And who knows, maybe we'll see a regenerative data forest sooner than we think.

00:42:21 Speaker_04
Yeah, maybe here. You know, if you're here in Duwamish, Stohobe territory in the Seattle area, feel free to stop by Tidelands. We're open Wednesday through Sunday with our next big event happening on December 14th and 15th.

00:42:37 Speaker_04
We're hosting a holiday market with folks like Copper Canoe Woman, Kamis and KP of Black Belt Eagle Scout. My mom is going to make fry bread. It's going to be a good time.

00:42:50 Speaker_04
We're also hiring, you know, we're looking for more folks to join our team and you can learn more about that on our website. This is tidelands.com.

00:42:59 Speaker_00
Yeah. Yeah.

00:43:02 Speaker_04
Yeah. Thanks relatives. Thanks for being on this journey with us and we'll talk to you soon.