
AI: Policy, Bias, and Imperfections w/ Mutale Nkonde

Episode Transcript

Mutale Nkonde

Mutale Nkonde : I remember, a couple of years ago, somebody was asking me about Black Lives Matter in an interview, and I just happened to say, but they don't own the hashtag.

And they were like, they don't own the hashtag.

And I was like, there's no way to IP hashtags.

It was created on a public site, everything on that site belongs to the company.

Now, if Facebook decides that they're gonna sell advertising space against the BLM hashtag, which they definitely did, and say, if you want to target this type of consumer, it's going to cost you $2 million.

That money goes to Facebook.

We are the product because we're on social media.

The hashtag gets some clout.

People don't understand that.

George Garrastegui, Jr.

George Garrastegui, Jr.: What's up everyone?

Welcome to Works in Process.

The podcast about uncovering creative methodologies from people doing inspiring work.

I'm excited to announce that for the next three episodes, Works in Process will be in collaboration with Tech Circus as a media partner for the What If Summit this October.

What if design could change tomorrow?

What if organizations put DEI first? These are some of the things that will be discussed at this virtual summit, which brings together DEI experts who aim to create breakthroughs and promote inclusive futures for everyone.

During the lead-up to the What If Summit, Works in Process will talk to summit presenters and explore the methods behind their creative decisions, and ways to prioritize the principles of diversity, equity, and inclusion within organizations' teams and professional practices.

I'm your host, designer and educator George Garrastegui, Jr.

Join me as I continue to elevate the creative process by shifting the focus to how we work over what we produce.

On today's episode, I want to welcome Mutale Nkonde. Mutale is an artificial intelligence policy advisor who launched AI for the People, a strategic communications firm, by leading the advocacy for the introduction of the Algorithmic Accountability Act in the House of Representatives. Its language around integrating impact assessments into the product design process was integrated into Section 207(c) of the American Data Privacy and Protection Act of 2022.


In 2022, Nkonde testified before the House Energy and Commerce Committee in support of the Algorithmic Accountability Act (that's like a tongue twister), and started to develop partnerships across the HBCU system as a way to diversify the field of AI ethics.

And in 2023, Nkonde was invited to become a Policy Fellow at the Oxford Internet Institute in the United Kingdom.

There, she'll be building out her work with the UN Business and Technology Project.

Also, she was named one of Women Who Code's Top 100 Technologists to Watch in March of 2023.

And she's a content moderation advisor for TikTok. Mutale, how are you doing?

Mutale Nkonde

Mutale Nkonde : I'm doing fine.

I always, whenever somebody reads my bio, I'm always like, really?

I did all of that.

Right?

And so I'm a little embarrassed but good.

George Garrastegui, Jr.

George Garrastegui, Jr.: Don't be embarrassed.

That's it.

I was embarrassed by all the tongue twisters, rolling the A's in the Algorithmic Accountability Act.

Mutale Nkonde

Mutale Nkonde : Listen, I write it better than I say it.

So hats off to you.

George Garrastegui, Jr.

George Garrastegui, Jr.: Oh, I say it better than I write it.

And that's not even saying much.

So.

Well, thank you so much for being on here.

I really want to get into learning about your journey, obviously in policymaking and also AI bias.

But before we do that, and to clear our minds, I begin every episode with a simple set of icebreakers to get the conversation started.

Are you ready?

Ready?

All right, coffee or tea?

Neither.

Ooh, well, what if we had to pick one? Tea. Toast or bagel? Bagel. Analog or digital? Analog. AI or artificial intelligence? Privacy or accountability? Accountability. And then some quick word associations, right?

So the first thing you think of when you hear these words. Creativity? Determination, pursuit. Business?

Mutale Nkonde

Mutale Nkonde : Fun. Failure?

Never.

Community.

Love.

Education.

Curiosity.

Mistakes? Resilience. Skills? Build. History?

Wow.

Vital. Opportunity?

Always. Accessibility? Difficult.

George Garrastegui, Jr.

George Garrastegui, Jr.: Future?

Right.

And last but not least, process.

Continue.

Nice.

Nice.

Thank you.

You know, I always love to hear what people say.

It always goes from really funny to really serious to, you know, a lot in between.

So it just gets our brains working on a different synapse.

And I just want to thank you for that.

So now, as we start our conversation, I want to give my listeners a glimpse of how you were introduced to design and creativity.

And I call this section origin story.

So, are you ready?

I'm ready.

Where did you grow up, and were you creative as a kid,

Mutale Nkonde

Mutale Nkonde : I grew up in the United Kingdom.

I was born in Zambia, which is a country in southern Africa, and went to the UK as a three year old, with my parents who are both medical doctors.

And we were supposed to be there for three years.

And they have been there nearly 50 years.

And every year, they always talk about how much they hate the snow.

And they hate the cold.

So I didn't come to the United States until my late 20s, actually, and I came straight to New York City, where I've been ever since.

And I spent my childhood in the world of story, whether it was being a princess, or being a ballerina, or dreaming about what I would do after middle school, after high school, into university. I spent probably the majority of my time in between the dream and the execution.

When I was about 16, I watched an Oprah Winfrey Show that was airing in the UK.

And she was in conversation with Maya Angelou.

She said, luck is when preparation meets opportunity.

And I then dedicated my life to being the luckiest person I knew.

And so I've been preparing for my opportunities ever since.

George Garrastegui, Jr.

George Garrastegui, Jr.: Right?

I also think, like, chance favors the prepared mind, right?

Like, you know, if you're ready for it, something will happen.

So who, if any, were the biggest supporters of your creative career?

Mutale Nkonde

Mutale Nkonde : my children, because they have no choice.

I grew up in an extremely conservative environment, where you could be a doctor, or you could be an engineer, or you could be a lawyer, or you could be a bum on the street, there really wasn't anything else to do.

So to have somebody that was creative.

And prior to getting into tech, I had worked as a journalist.

And so to have somebody that was thinking that they want to write, and they want to interpret events for others and bear witness, which is why the history thing was really interesting to me, because I really want to document history.

That was one of the roles I saw journalists play, there wasn't a great deal of support, but there didn't need to be a great deal of support.

Because when you're in the act of preparation, you're actually a huge support to yourself.

George Garrastegui, Jr.

George Garrastegui, Jr.: So when was your first creative job and kind of how did you stumble into it?

Mutale Nkonde

Mutale Nkonde : My first creative job was a freelance column in a paper called The Voice, which is still running in the UK today.

And it was the only newspaper that was printed from the Black British experience.

I wrote one article, and it took a year to get published.

But I advocated for myself.

And that was really the beginning of seeing myself as important in a field that didn't see me as important.

And staying very true to that.

George Garrastegui, Jr.

George Garrastegui, Jr.: Wow, I mean, being dedicated that much for a year to get that article published.

That humbles me.

So when did you consider yourself a creative? Was it at that point?

Mutale Nkonde

Mutale Nkonde : probably in the last few weeks.

And what's really weird about it is I've made films, I've exhibited in film festivals, I've exhibited to actually the Smithsonian, a very short kind of film that I never create, considered myself as a creative until I was recently in a meeting in San Francisco with the UN Business and Technology Project, and we were speaking about women online and some work that they're doing along those lines.

And I was the last person to speak, and I was the first person to pose the question around the way different types of women are treated: women who are Black, women from a trans experience, you know, women who are non-English speaking, and how that showed up in online rhetoric. And to just watch the room kind of light up and go, we hadn't thought of that.

Tell us more.

And I was presenting some data.

And just watching that change in the conversation, a change in the room, that invitation for other voices to come forward was when I was like, I am a creative person.

George Garrastegui, Jr.

George Garrastegui, Jr.: It goes back to this when you talk about the ideas of stories, right, and this idea that to be able to say that, and then to have the room kind of all follow you.

Right.

Right.

Mutale Nkonde

Mutale Nkonde : Right.

And to do it in a way that isn't kind of getting up and saying, we have been speaking for so long and nobody has said this, but instead asking questions. Are we here to discuss human rights? I think, was my opening.

And they said, Yeah, this is a UN meeting.

And I said, well, what happens to people who are traditionally dehumanized? And then speaking about how that was the experience of many people in this country, many of whom identify as women. And can we include those people too?

And I think that it isn't just the story, but the way that you tell the story. I've picked up a bunch of mentees along the way, and when they tell me about their careers, I'm often saying things like, it is so much easier to catch bees with honey than it is with vinegar.

So where's the honey here?

Where's the fun?

Where's the thing?

That's good. You know, if you're around my age, you'll know Kelis: milkshake brings all the boys to the yard. Where's the milkshake?

You know, how can we do that?

How can we do that part?

George Garrastegui, Jr.

George Garrastegui, Jr.: Right?

Right?

How can we connect with something everybody kind of universally understands, instead of trying to scream and yell at them and get them to pay attention when it's a lot easier for us to connect on human similarities?

Mutale Nkonde

Mutale Nkonde : Right?

Right.

Oh,

George Garrastegui, Jr.

George Garrastegui, Jr.: that's amazing.

And also, I love to hear that you just said, like, a few weeks ago. Because I think it's a moment, when the moments come.

It's nice to have that right when, like you said, you see the room come with you.

And realize that it's all kind of coming together, you know, at this Apex point,

Mutale Nkonde

Mutale Nkonde : And that you've honed this skill. Because for me, I've been storytelling for 35 years. I've been honing that skill, I've been working towards the moment that happened. When was the meeting? Middle of June, and we're at the beginning of July.

So it's taken me 35 years of getting to that moment, and trying.

So that's 35 years of failing, that's 35 years of experimenting.

That's 35 years of being really like, we've got to do this, and realizing that no one's got to do anything, number one, and that my way is one of many ways.

So how do we use story?

How do we use connection?

How do we use humanity to bring us to this point.

George Garrastegui, Jr.

George Garrastegui, Jr.: And just knowing that you said it's taken you about 35 years of failing, of all of these things, is, I think, why I even have a podcast that talks about process. Because it is about those moments, right?

It's about that you've been working on this and honing this for that long to get to this point, to feel that way.

A lot of people think you get to that point early on.

Like you've said, it also takes a person to understand how long it may take to get to the place they need to be.

But the fact that, you know, that's where you're at now, I think, is amazing.

And the fact that you understand that it takes this journey of storytelling and honing to be able to say that. That only a couple of weeks ago, you felt this, right, this thing.

Mutale Nkonde

Mutale Nkonde : and you may not get there.

And that's okay, because the winds come along the journey.

George Garrastegui, Jr.

George Garrastegui, Jr.: Also, you may not even understand where you're at.

Right?

So sometimes you may not get there because you have a preconceived notion of what you're supposed to be doing versus just living in the moment.

Right.

So, I mean, that was beautiful.

So I like doing this kind of condensed version of finding out who you are.

Because really, we want to get to the heart of the conversation.

Right.

And we started talking a little bit about you know where you've been, but now I want to talk about where you're at now.

Right.

So at the top of the show, we mentioned that you're a policy advisor and that you launched AI for the People.

Right, so here's a two part question.

That's probably a biggie.

One, what is AI for the People? And also, can you give us some insight into what a policy advisor does?

Mutale Nkonde

Mutale Nkonde : Yeah.

So I'm going to start with the second question first.

Sure.

A policy advisor tells people what to do.

And one of the things that I realized very early on was that I loved bossing people around. I loved being able to decide who speaks, when, and how, which is why I made a great, like, film director, and I made a great film producer. I also really liked, like, structurally changing how people lived.

We live in a world that is governed by policies, practices, procedures, right? We live with rules. So whoever gets to shape those rules gets to decide how we live.

And, you know, we're speaking to each other a couple of days after the US Supreme Court has made a number of rulings that will change the way different people will live. Whether it's people who can be refused goods and services based on somebody else's beliefs, or whether it's applying for higher education and knowing that structural racial bias is not going to be taken into account when you look at the applications of kids who honestly have all the talent and need more support, because of the way structural biases have played out in their lives.

These are rules.

And so I think once I had left my career in journalism, and I'd had success there, and it was great, and I moved to the US.

And I was doing similar work.

But journalism in the US is so different to the type of work I was doing in the UK, I didn't really like it.

Kind of stumbling my way into the tech industry, it was really clear to me that as we were building products, we would use check sheets, we would use worksheets; there would be these rules that we had to follow to create the product.

And I really liked the idea of being one of the people that helps shape those rules.

So the AI piece just came in because I had been working in and around the tech industry.

I had, at one point, done some contract work for Black Girls Code, which is a nonprofit from, like, the 2010s.

And their idea, which was really groundbreaking at the time, was that one of the ways we could achieve economic justice in this country is if more people were involved in the tech industry, and they chose coding as an entry point.

In hindsight, was it the best entry point? That's probably a different podcast.

But it was an entry point, right?

It was somebody who had done the brave work of putting that forward.

And even within that environment, I was coming up against rules that just were not going to help this field get talent from across the population. It was rules.

And so that's kind of how I fell into it.

And undergirding that was the book Emergent Strategy by adrienne maree brown, where she talks about how anything that you're really supposed to do in life will reveal itself to you by a series of yeses and noes.

And those yeses and noes will move you organically into where you're meant to be. In the policy space, I was not getting jobs here, I was being thrown off teams there, I was being welcomed in this space and not welcomed in that one, and I kind of got there.

And then the first part of the question, AI for the people is a platform for that.

So we're a communications firm, we really want to work with global leading opinion formers who are thinking about how they are integrating advanced technology into their processes.

So that tends to be tech companies.

It could be other folk, but we start out with tech companies.

And then we will work with them to provide them with research, the research that they'll need to move on policies that we would like to see, and those are any policies that reduce algorithmic bias.

So algorithmic bias is just when a technology expresses the same levels of racial bias or gender bias or ableism as people would face outside.

A really good example of that is if you ever go through a TSA airport checkpoint and have to do the body scan: when you come out of the body scan, you will be directed to either a male TSA agent or a female TSA agent for further checks.

That designation is based on how the scan has identified you by gender. If you are somebody who's trans or intersex and visibly has both sexual organs, they don't know who to send you to. The way the machine is engineered is on a binary: people are either male or female.

That's not true.

That actually leaves out gender nonconforming people, it leaves out trans people.

So that's just one example of the ways that our technologies express bias, and they do it with racial bias and all the others we can talk about.

And yeah.
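Nkonde's scanner example amounts to a hard binary decision rule. The toy sketch below is purely illustrative; the function name, score convention, and 0.5 threshold are all invented, and this is not the actual TSA software. It just shows how a system built on a forced binary has no way to represent anyone in between:

```python
# Toy illustration of a forced-binary decision rule, loosely modeled on
# the scanner example from the conversation. Nothing here reflects real
# TSA software; the score convention and threshold are invented.

def route_passenger(scan_score: float) -> str:
    """Hypothetical scanner output: 0.0 matches the 'female' template,
    1.0 matches the 'male' template. A hard threshold forces every
    passenger into one of exactly two bins."""
    return "male agent" if scan_score >= 0.5 else "female agent"

# Readings near the midpoint, where the system is least certain,
# still get forced into a bin; there is no third outcome.
print(route_passenger(0.51))  # male agent
print(route_passenger(0.49))  # female agent
```

The point of the sketch is structural: because the function can only ever return one of two strings, anyone the binary model doesn't fit is misclassified by design, which is the bias being described.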

George Garrastegui, Jr.

George Garrastegui, Jr.: I mean, definitely, going into the idea of the biases and bringing it down, I think, to the simple form of that scanner at the airport, right?

Where it definitely leads us to a zero or a one, right? There's no in between, there is no flexibility.

And I think what a lot of people know, in their hearts is there's many shades, you know, there is no one or the other.

And a lot of people feel threatened by the idea that there isn't one or the other.

And like you mentioned, all the different rulings that have been happening are about people trying to hold on to that belief that you can only have things a certain way.

Mutale Nkonde

Mutale Nkonde : right?

And power. It really comes down to, and this is a lot of my academic work, it comes down to power. Who has power and who doesn't have power? And then how do we systematize that? How do we create rules that make that true? And I'm someone who absolutely rejects the idea that genius, that creativity, looks or sounds a certain way.

And for anyone that knows me, I'm famous for changing my hair every eight weeks.

So if you are listening to this podcast and you're not a Black woman, then you might be like, every eight weeks? And I'm like, girl, that's when I go get my hair did.

And I might feel like a platinum blonde one day, I may feel like black hair the next day, I may feel like an afro the next.

But these are all me. That does not change the fact that I am a leader in an emerging field. It doesn't.

Whereas I think that there are people who would very much like to push me and people like me out of the marketplace.

And I don't want to see that happen.

No,

George Garrastegui, Jr.

George Garrastegui, Jr.: no, definitely not.

And I love how you keep on going back to this idea of rules, right?

Because rules apparently mean structure and ways of doing things and the idea of one, not necessarily abiding by the rules, but understanding that there are systems in place, there's ways of doing things.

And Tech has rules.

Design has rules, right?

But also, rules are meant to be broken, rules are meant to be adjusted, rules are meant to be circumvented.

So the understanding that you're like, I like playing within the rules, gives me this understanding that you like using the system to break down the system?

Mutale Nkonde

Mutale Nkonde : I do.

And I think I look at rules critically.

And my question is always, why not?

You are telling me, the only way that we can integrate AI systems into the economy is if we give up our privacy.

My question is why?

Why not?

Why do we have to give up privacy?

Why can't this be something that's additive, right?

When the wheel was developed, we did not kill every single horse in the world because we had wheels and cars. No, we integrated the wheel into our transportation system.

Why not?

We can also develop technologies and decide that they're not good for humanity.

We have the technology of the nuclear bomb, which we didn't use after the Second World War because of what it did to those people in Hiroshima.

Why not?

Why can't we do this now, right?

We are just coming out of the submarine disaster, which was terrible, terrible.

Those people lost their lives in that way.

But as somebody who is a technical designer, as somebody who is interested in rules, I was like, where were the regulations for that? Who is accountable for this?

Come to find out that the designers decided that they did not want rules, because they wanted to get to the bottom of the ocean.

And they did get to the bottom of the ocean, but not in the way that they had thought.

So rules can play both roles. You can break them to make them better, but they can also safeguard all of us, which is what we want, ultimately.

George Garrastegui, Jr.

George Garrastegui, Jr.: Right.

Right.

Exactly.

And I think that is part of the point: understanding where things lie so that you can work to best navigate them, right?

I love the fact that you're talking about, you know, the bomb, and how we've chosen to go away from that because we understand the implications of it. That doesn't mean people stop working on it. Because, as we know, everybody's trying to get to that nuclear kind of thing.

So it's still a progression.

But it's like a universal understanding.

And I think when something is so brand new, like what we're dealing with right now, the new and shiny, it's always this uncharted territory, where we see how far we can push it, and then maybe we push it too far.

And now we're dealing with the ramifications of not understanding what we put out there in the world. We just put it out, and then we were like, oh, crap, the Matrix is about to happen.

And we're about to become batteries.

Mutale Nkonde

Mutale Nkonde : And I think that that really happened for us in 2016, not just in the United States with our election and the way that AI systems could be manipulated to interfere with our democracy.

But across the world, right? We saw Bolsonaro, we were seeing Le Pen in France, like, across the world. Because we were all using social media in elections, in competitive elections, we were able to see the manipulation of that and realize that rather than people choosing their governments, platforms like Facebook or Twitter, or these other AI-mediated spaces, were able to kind of change our reality.

And I think it's just gone on since then; certainly 2016, for people within the research community, was a big flashpoint.

But it wasn't until 100 million people downloaded ChatGPT that we started to get this bigger public conversation, because all of a sudden, we saw that there was going to be another mass adoption.

And we want to make sure that this time, it didn't break our societies, but it actually made our societies better.

And that's where I really see the work of AI for the people being important.

And that's really where I think that we can intervene, we can provide intervention.

George Garrastegui, Jr.

George Garrastegui, Jr.: And I mean, when you think about it, right, and you think about, like you mentioned, the introduction of just how much influence social media platforms had in 2016, with the algorithms.

And it's almost really interesting that, I think, not enough people think that algorithms are AI. They just think they're a computer program, which AI is, but, you know, one that organizes and outputs mathematical data. We never used the term AI, but we used algorithms.

We looked at that one thing one way, and now we're looking at it again, because now AI is the hot word, right?

But it's still dealing with algorithmic things.

And I wanted to ask you, right, when we're talking about this idea of bias: I know we think about bias as a very binary thing, tied to, like, race. And then your example of gender bias, or things like that, starts to come up, and you realize bias is obviously a broader term.

But with this social media, do you think the algorithms are really social-media-driven bias? Or, like, are we thinking about biases in the ChatGPTs, the DALL-Es, the Midjourneys?

You know, all these other outputs of AI, like you said, that people are now starting to get into?

Are they different?

Are they the same?

Mutale Nkonde

Mutale Nkonde : It's exactly the same.

All algorithms work the same across the system, whether it's in a search engine, whether it's in a tool deciding how long someone should be in jail, or whether they should get housing, or what content they should get on their, I'm gonna say, For You page, because I'm in TikTok land, but I think there could be For You pages on all socials now. I'm not sure.

These are algorithmically mediated decisions.

And I think one of the ways that we've done ourselves a disservice, certainly in the research community, is not introducing people to the language earlier and helping bring people along.

So when I first got into the tech industry, everybody was talking about big data.

Now we talk about data; it's exactly the same as big data.

When I first got in, people were talking about machine learning.

We still talk about machine learning, but we don't link it to AI.

When we started to talk about algorithms, we did not add in the next part of the sentence, which is that algorithms drive AI technologies.

And now we're talking about AI.

And each point, people think it's this new thing.

I was just in a conversation prior to you.

And somebody asked me a question about quantum computing.

And I think that they were just, like, expecting me to go, oh my God, what's that, and fall back and die.

And I was like, No, quantum is just a faster processing speed.

All quantum computing is going to do is make that thing go faster.

Do I think that's a good thing?

And I'm like, oh no, that's a terrible thing. We can't even get the stuff that we're doing now right, and now you want to make it faster?

No, we need to actually start to understand what's happening at base.

And I'm hoping that now that people are, like, ChatGPT-crazy, they will start to lean in and understand a little bit more that if you have a smartphone, you've been using AI for as long as you've had that phone.

George Garrastegui, Jr.

George Garrastegui, Jr.: right?

That's why it's smart.


When you look at the biases in something like OpenAI's ChatGPT, or algorithmic biases, what are the big things that you're noticing that we, as maybe consumers and drivers of this, need to be most concerned about?

Mutale Nkonde

Mutale Nkonde : I'm going to talk about ChatGPT-3 specifically, because honestly, I have been too busy talking about ChatGPT to use it very much. So I haven't used 4 yet.

One of the things that ChatGPT-3 did was it could not talk about the influence Black women have had in society. It could talk about Black women; it could give you, like, the Wikipedia: so-and-so did such and such. But very specifically, I was in a conversation where somebody had asked ChatGPT-3 how Mahalia Jackson, who is a famous gospel singer, was influenced by Bessie Smith, and ChatGPT could not answer that.

Now.

Bessie Smith is the first pop star that we ever had in the United States.

She was an African American woman who lived in the 1920s.

She was the first person ever to be signed to Columbia Records.

And she has influenced over 100 people, including but not limited to Elvis Presley, Janis Joplin, the late great Tina Turner, and on and on and on.

But because I was asking a question about power, asking about influence, ChatGPT did not have that information.

What that let me know was that in the way ChatGPT is being trained, like the corpus, the underlying data, there probably aren't a lot of stories in that data about how people who are not male, people who are not white, people who don't speak English, have influenced human history and culture.

And that's not ChatGPT-3's fault.

There's really nothing the engineers can do about that.

The great thing about ChatGPT-3 is that it is the first large language model to have 175 billion inputs. And what that means is that they have 175 billion unique pieces of data.

But as they are processing that data, the patterns of how Black women are powerful, influential, and add to the culture were missing.

And that, to me, is a form of bias as well.
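The point about missing patterns in the training corpus can be illustrated with a deliberately tiny sketch. The three-sentence corpus below is invented for illustration, and real language models learn statistical patterns rather than counting keyword co-occurrences, but the underlying limitation is the same: an association that never appears in the data cannot be surfaced.

```python
# Minimal sketch: a model can only surface associations that are present
# in its training data. The "corpus" here is invented for illustration.
corpus = [
    "elvis presley influenced rock and roll",
    "the beatles influenced popular music",
    "bessie smith sang the blues",  # present, but never framed as an influence
]

def influence_mentions(name: str) -> int:
    """Count corpus sentences that mention the name alongside 'influenced'."""
    return sum(1 for sentence in corpus
               if name in sentence and "influenced" in sentence)

print(influence_mentions("elvis presley"))  # 1
print(influence_mentions("bessie smith"))   # 0: the pattern is absent
```

However large the real corpus is, the same logic applies: if stories framing Bessie Smith as an influence were rarely written down, the model has no pattern to draw on.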

So it's not as obvious as misgendering people, I think that's really obvious.

Or, there was a bail instrument being used in Florida a couple of years ago, the COMPAS algorithm, that was giving Black men longer prison sentences than their white counterparts.

That's really obvious. But I guess bias also exists in the way that we decide who's influential, who isn't influential, who's great, who's not great.

And in that same article, I was saying that if we were training that same model to identify people who won Record of the Year as the best artists of our time, Beyoncé would not show up in that dataset, because she's never won Record of the Year.

And the interesting thing about her never winning Record of the Year at the Grammys is when they asked Grammy voters why they never voted for her to win it.

A large number came back and just said we think she wins too much.

Her work is excellent.

We think she wins too much.

Now, tell me that is not their own bias speaking, because that would never be said of Mick Jagger. That would never be said of the Beatles. That would never be said of, you know, pick your great white rock star.

George Garrastegui, Jr.

George Garrastegui, Jr.: So, you know, when you're talking about the underlying nature of AI, or algorithms, and the training and the 175 billion inputs of data: is that because something like ChatGPT is scraping what already exists and kind of pulling from that? So isn't the larger flaw of society that we don't have enough information? Like, there's not contextual information in the way that somebody can say, this is the influence of Bessie Smith on Mahalia Jackson, so that ChatGPT can read that and be like, oh, I understand that. Or is it supposed to make its own leaps?

Mutale Nkonde

Mutale Nkonde : So, it is artificial. It's not going to be able to make its own leaps. And I think right-sizing what we even expect from something that is artificial has to be the first thing that we do.

I also think it's not even necessarily more data for me, because we curate data in particular ways, right.

So if you are looking at data from a lens in which women are not valuable, then they're not going to show up in your data set.

And we see this a lot in the scholarship around civil rights, where I'm in a graduate program at the moment, and one of my professors had to say there were women in the civil rights movement, who were more powerful, more prolific, more important than Martin Luther King, but because of sexism within that movement, they're not in the archive.

So we can't reference them.

But I as your professor, as your teacher, I want you to know, I don't want you to leave my classroom, assuming that they weren't there.

And ChatGPT isn't going to be able to make that primary data, because the way that we program is with existing information that is crafted in a very, very particular way.

So I'm starting a new research project; it's looking at white supremacy on the internet, and I'm writing about it.

And one of the things I've written very recently is white supremacists have been on the internet since the early 80s.

That means when we are scraping the internet for anything, in the bedrock is KKK literature, is Aryan Nations literature.

So we have to assume that when we're creating patterns and associations using AI, there has to be some type of human moderator, to fact check that.

But even in the humans we choose, it's not clear to me, in the OpenAI team, that there would have been anybody on that team who would have known who Mahalia Jackson or Bessie Smith are.

And so they still slipped through the cracks.

George Garrastegui, Jr.

George Garrastegui, Jr.: right?

So obviously, you know, there's always going to be the conversation of who's involved, right? Like, if they don't even have the right people, or the proper amount of people, to just kind of say, let's look at this at a base level, to include a lot of different datasets in who we're looking at, then we know they're not even going to be bringing up Mahalia Jackson or Bessie Smith.

Mutale Nkonde

Mutale Nkonde : But to them, ChatGPT did exactly what it was supposed to do.

They released the product in the wild to see how it would actually be used.

And then they can come back with that data and say, guess what, we didn't know 100 million people were going to download this, but they did; we need to have a much more expansive view of how we're going to be using this technology.

George Garrastegui, Jr.

George Garrastegui, Jr.: Right.

And I think that's where it becomes very interesting, because of the way you talked about the professor kind of giving you this insight, right, that there were more influential women who were part of the civil rights movement that we don't know about.

Now, when your teacher's gone, or if that teacher doesn't write something about those people to help teach us, that information is gone with that person.

Right?

So how do we then start to get people to share that content?

Right?

Is it that teacher's responsibility to teach you, and then you expand the content, or?

Right, because otherwise, we know, in our cultures, oral history is kind of what we tend to do, right?

But it's never written down?

Right?

So it's, it's something that we share, and we pass along, but it's a story.

And it's things like that, but then, if it's not written, how does an algorithm or AI start to scrape and learn from it?

If it's something that's inherently within us, right?

So how do we start to teach this stuff which is so embedded in personal history and personal culture?

That, like you said, is almost counter to white supremacy, where, you know, it's embedded, right?

We're going to share this information.

We're going to make sure we just get everything out like that.

That to me is an interesting thing that we have to learn how to do to embed some of these stories.

Mutale Nkonde

Mutale Nkonde : I have so many thoughts about it.

And one of them has actually led me down the path of going to get a PhD so I can publish my dissertation.

I think, number one, books have to be written. Books have to be written, articles have to be written, podcasts have to be made.

So, like many graduate students in New York City, I commute to school.

I live in Brooklyn, my school is in Harlem.

And one of the things that he said to us that really stood out to me was, he was so frustrated with Spike Lee's Malcolm X movie. And I live in downtown Brooklyn near those production offices; I see Spike often and had been to a community screening for the 35th anniversary of the X movie.

And he pointed out that Yuri Kochiyama, a Japanese American woman who was very close to Malcolm, is actually the last person that ever held him.

So when he was shot at the Audubon Ballroom, she was there; she doesn't appear in the movie at all.

And not only does she not appear in the movie, a large part of that movie is when he was a hustler.

And there isn't as much of that movie about when he became a minister, when he fell out of favor with the Nation.

As he was moving towards King, there was no meeting between him and King, as he was moving out of the Nation.

And his complaint was, as we're developing this content that is going to feed future generations... If we look at the writers' strike right now, a big part of that is that they don't want their scripts fed into AI systems that will then generate scripts for Hollywood; they don't want their movies going into DALL-E, where the iconic shots that were made get recreated by AI systems.

But if we do not correct the record, then we're going to have the systems that we deserve.

And I wrote an article about this for Harvard Business Review in 2019.

And one of the things that I said was all AI is doing is highlighting the imperfections of our past.

And the only way to make these systems better is that we have to have a multi pronged strategy to improve these ills of society so that we can have the machines that we want rather than the machines that we deserve.

And I hold true to that.

George Garrastegui, Jr.

George Garrastegui, Jr.: I love the fact of that we have to be responsible to correct the record.

Right.

And I think this idea doesn't take the artistic license out of Spike Lee's, you know, Malcolm X, but it also needs to be said that, we know, obviously, moviemaking is storytelling.

And it's not always totally accurate, right?

Because there's all these things, but how do we have conversations and fill in the blanks?

Right?

So to expand that, so nobody takes it as fact, because it's not a documentary?

It's a movie.

Right?

Right.

So it's, from my perspective, like you said, it's talking about the hustler, you know, period versus his transition, which would be almost a secondary story.

I remember cutting school in 1992 to go see that movie.

I mean, it was

Mutale Nkonde

Mutale Nkonde : Malcolm X and not last, that

George Garrastegui, Jr.

George Garrastegui, Jr.: that whole last scene where all the people are just saying, I am Malcolm X,

Mutale Nkonde

Mutale Nkonde : X, and it's the next person and the next.

And then you learn these were the people that helped that movie get finished.

And I'm somebody who, in all my work, I work through the lens of popular culture.

And specifically, I always say it's black popular culture, but I've been living in New York, nearly 20 years now.

And I'm telling you, my friends from the Caribbean, my Afro-Latinos from the Caribbean, they be doing stuff too.

They be doing stuff too.

And so it's these Black and brown poor people that have created these cultures that have brought the world alive. As I look at AI as a space, as we do our policymaking work, particularly because my particular interest is the creative industries.

And what does it mean for IP?

What does it mean for creative culture, that's what I'm really kind of interested in.

And that's where AI for the People is doing a lot of their work.

It's because we don't want to lose that into these, I'm just going to call them whack formulas, where it's like this word over here and this word over there. And it's like, yeah, but hip hop was created in the Bronx because people didn't have electricity.

So they tapped into the line.

When mixing came in.

The reason that hip hop was even allowed in my house was that my mother and father had the beats and the melodies from the 70s.

And then I had the person speaking over it from the 90s.

And so we could connect, we could connect. If everything becomes standardized through these formulas, that's not going to happen again.

But I'm looking at these Supreme Court affirmative action decisions.

And I'm like, we are about to enter a Black renaissance like we have never seen. Black and brown kids in the streets that don't have access are going to create things that we have never imagined and never seen.

And if the tools are right, AI is going to really just help us through that.

And I, you know, I'm getting to be an old head, I want to be able to help people do that in a way that is safe for them.

They can maintain their privacy, and they can maintain the IP and make their money.

George Garrastegui, Jr.

George Garrastegui, Jr.: I mean, when we look at design and creativity, it's always born out of limitation, right, we are able as creatives to work around the limitations.

And that limitation is what allows us to advance, right, because like you said, there wasn't electricity.

So we stole it from the light post. We didn't have rhymes, we didn't have music, so we were able to take a break beat and extend it for four minutes.

So all of these things are the ability for us to learn how to make do with what we have.

And now a lot of creative tools are becoming so much more democratized.

Right?

Like you said, when they start creating things, it is the Black and brown creatives. The two-pronged part of this is they create the culture, but also, unfortunately, don't get credit for the culture, because somebody else co-opts it and then makes it more popular.

We look at TikTok dances, right, and all the people who you don't see, who created these dances, you know, aren't the ones that are getting money off of this. Or even Fortnite, a video game, putting dances into their video game and not actually acknowledging the original person who created the work, thinking that it's actually just free rein. I love the fact that we need to, as researchers, counteract this movement, right, not stifle it, but also bring in the other lens, bring in the other aspects, to paint a broader picture.

Because I think that's where it's going to be, in my opinion, and just listening to you talk about it.

That's where it's going to adapt, because it's no longer just, oh, it's fed into this system.

And like you said, ChatGPT is doing exactly what it's meant to do.

If there's nothing there, it can't make something.

Mutale Nkonde

Mutale Nkonde : And I think as well, I'm Generation X, so I'm the person that's like, we are the best.

There's not many of us.

But we know how to live out here in the street with these younger people that are all stressed out.

We're like, our parents didn't even feed us and we're fine.

You will be too. So I'm the generation, I will never forget watching my first Mase video.

And I saw Mase.

I saw shiny suits.

I saw Diddy, and I just knew. I was in the UK.

But I knew I was going to go to New York City. And people who don't know that video, it's because you are young.

I am not.

It's shot in Times Square.

So if you are from New York City, you're trying to find that shot. And Alicia Keys, like, did it a generation later; people have recreated that scene.

I am also old enough to know that people from that generation who I really liked, died poor and they died poor because of the contracts that they signed.

So, a couple of years ago... if you don't know DMX, New York rapper. Loved DMX.

Unfortunately, life happened, but he died poor. Craig Mack died poor.

These people that were kind of my generation when I was young died poor because of IP.

I'm now in a position where those videos aren't being made anymore, but I'm seeing them every day on TikTok.

I'm seeing them on Instagram, I'm seeing versions of them.

And I want to, through AI for the People and through our focus on policy, make sure that the people that are creating now not only own that, they can market it, that they don't lose it algorithmically.

The big thing.

I remember, a couple of years ago, somebody was asking me about black lives matter in an interview.

And I just happened to say, but they don't own the hashtag.

And they were like they don't own the hashtag.

And I was like, no, there is no rule to IP hashtags.

It was created on a public site.

Everything on that site belongs to the company.

Now, if Facebook decides that they're going to sell advertising space against the BLM hashtag, which they definitely did, and say, if you want to target this type of consumer, it's going to cost you $2 million.

That money goes to Facebook. We are the product because we're on social media; the hashtag gets the clout.

People don't understand that.

And I think I definitely would like them to understand that.

And I'm only using one example.

Right, I'd like them to understand that. I really like that the writers' union has understood that and made it one of their demands.

I really hope that any of us that are in creative industries do the same. Like I said, I'm pursuing a PhD because I want to turn my dissertation into a book. Long story short, I was approached to write a book, and I was like, okay, I can write a book. Then I was like, how can I make the most of this moment?

And I was like, well, if I apply for a PhD program, I not only get to draft the book, I get fancy people to tell me that I'm very smart.

But I can market that book as a PhD, which is different than marketing that book as a girl that's been on the internet, even if it's the same thing.

And so I'm writing that book, because I want it to be in the corpus.

But within my contract, I'm gonna make sure that there's stipulations around how this book can be repurposed by AI.

George Garrastegui, Jr.

George Garrastegui, Jr.: I mean, there's so many things.

But what I love the most, right, is just hearing how we as content creators, even with the hashtag, right, we don't own those things, and how AI for the People is trying to identify what those things are and how to build policies around them. Because we're making the companies money, but we're not, as the originators of the content; like you said, we are the product. So how do we start to get people to advocate for that type of understanding?

Like you mentioned, right?

DMX, Craig Mack, all these people who passed away, you know, unfortunately had bad contracts and didn't make all the money that they deserved.

When you think of "Flava in Ya Ear," it's like the soundtrack to a summer in, like, '93, '94. I'm just like, it is amazing.

Mutale Nkonde

Mutale Nkonde : As soon as we get off this podcast, I'm going to listen to "Flava in Ya Ear."

I'm going to listen to "Ruff Ryders' Anthem."

Unknown

Unknown: Don't, don't do.

Mutale Nkonde

Mutale Nkonde : And I live by Barclays. I live by Barclays when that funeral was going on.

And I mean, it was new Kanye, but he was like old Kanye for that moment.

George Garrastegui, Jr.

George Garrastegui, Jr.: I mean, it is so disheartening to learn that the things that we, I mean, it's in my blood, it's definitely the things that we connect to, and to find out that these people don't really get the props they deserved, or at least live as comfortably as the videos looked, and all of that.

So how do we start to, one, maybe advocate for this stuff, learn more about this, so we can be a little bit more prepared?

Obviously, we know education, schooling, and really the rules, right, understanding where these things come into play.

But as content creators, or people who start to create this stuff that people resonate with, what's one of the steps we should take to gain some autonomy over the content we create?

Mutale Nkonde

Mutale Nkonde : I always think about advice for creative people in three ways.

Number one, what is your purpose?

I think one of the things I've really loved about this conversation, and what will resonate with me, is how much you are pushing me to tell you why.

Right?

You're not being aggressive with it.

You're not, when I say that you're pushing me.

But just the way that your phraseology is, it's always like, so how did that happen? Why did that happen?

And I think that we have so much content creation that just goes on because people want to feel loved by lots of people they don't know.

And we need to kind of get away from that.

Because the like isn't going to help you long term, the clout that you build.

There is no real market for clout.

And I'm not saying that we should create for capitalistic gain.

But if you know that you're creating because you have this bigger vision. For example, I launched AI for the People, and the reason I say I launched that is that I really wanted to stop the exploitation of Black people and brown people via technology.

And the reason I wanted to stop that is that I found out so early in my own personal career, that technologies can express racism.

And the way I found that out was a friend of mine had been using Google pictures.

This is Google Pictures, I think it was before it was called Google Images.

It was in beta.

And it was a picture of two black people.

And they were labeled as apes.

And that was in 2015.

Artificial Intelligence isn't going to be able to know that people of color are dehumanized by being compared to animals.

It's not going to know that.

It isn't sentient, it doesn't have memory, it doesn't have history.

Therefore, there was something in the engineering process that was bringing this idea of eugenics to bear.

So for me, I knew that I wanted to interrupt that. That made it really easy for me to work for AI for the People for free back in the day; it made it very easy for me to have sleepless nights; it made it very easy for me to push through.

Because even in my own personal story, I built AI for the People while being a single mother to two kids, and us not really having it like that when we needed it.

But because I was like the people of the Bronx with hip hop, this is going to happen one way or another.

And then the research community caught up to me, and the people that caught up to me were the Harvards, the Stanfords, and they gave me the credibility that I needed to go forward.

And I eventually, I'm a writer, so I was very prolific.

I eventually got funding; shout out MacArthur Foundation, you're real ones.

And that's kind of what got me going.

But I had this bigger vision.

So I asked, What are you creating for?

The second thing is, what do you want this creation to do for you?

Do you want it to be you in the video with the Maserati and the shiny pants, and the yacht?

And what do you want it to do for you?

If you want these creations to be part of what sustains you, not just financially but spiritually, emotionally?

Then what safeguards do you need around that?

Because if you want it to do all of that, it becomes an asset.

And how do we treat assets?

And if the rules aren't there, are you going to be prepared to negotiate them? Samuel L. Jackson, for example, will never sign a contract that says that his image can be used artificially.

And the reason he doesn't want to do that is that he saw how the Fast and the Furious movie was able to be shot after the actor died.

And Samuel L. Jackson was just like, they're trying to draw actors in?

Oh, no, no, they won't be doing that to me.

So what do you want it to do?

Then I think the third thing is, you really need to lean into your education.

And if you don't have it formally, then you need to pursue it.

And you need to start thinking about the history of Black and brown creators: how were they treated?

What led to that?

And I think the conversation we've just had around DMX and Craig Mack, it doesn't have to be a deep history.

It doesn't have to be a deep history.

We know that they were exploited, and they were exploited, in those cases, by P. Diddy.

You know, they were just exploited by somebody else who wanted to be a capitalist in that region, and all of those things. And if we think about AI, look at Addison Rae, who is this white dancer on TikTok; she did the Renegade dance, she made it famous.

The dance was actually created by this Black girl in Georgia. Addison Rae got representation, she got Netflix deals, she got all of this, based on the fact that she was this white girl that was dancing on TikTok.

Would she have gotten those same props?

If she looked like me?

Unknown

Unknown: No.

Mic drop right there.

Mutale Nkonde

Mutale Nkonde : I mean, think about the millions of dollars that Facebook and Twitter made off advertising against that BLM hashtag.

George Garrastegui, Jr.

George Garrastegui, Jr.: Right.

And those people never saw a dime of it.

Mutale Nkonde

Mutale Nkonde : No, you know, and that may be a bad example, because they ended up, like, raising money in 2020.

But it's every hashtag that goes viral,

George Garrastegui, Jr.

George Garrastegui, Jr.: right?

And just an understanding of how large that hashtag was.

And to be like, if a social media organization is making money off of it, right, you have no rights to that, because the platform holds the content versus you.

Right?

I think that's one of the things that is so interesting, right?

Where people were like, I want free speech.

I want this.

I'm like, all these platforms are free because they're owning your rights and images.

If you had to pay for it, you wouldn't pay for it.

But then you'd own your own information.

Right.

So, like, there's a reason why it's free. For it to be free, we have to give up some... well, we don't have to.

But that's the contract. The contract we buy into is, we are going to be part of this so that we don't have to pay.

Mutale Nkonde

Mutale Nkonde : We're giving up our ideas, we're giving up our creativity, we're giving up.

Mark Zuckerberg doesn't post a thing.

You know why? His time is valuable.

He is going to get compensated.

George Garrastegui, Jr.

George Garrastegui, Jr.: Right.

But he doesn't need to, right?

When you think about all of these people, you know, he's also not totally egocentric like Elon Musk, who needs to be out there in the zeitgeist all the time.

Right.

But, you know, I think it's that he knows what he's doing, and has the right people doing the things that he thinks need to happen, and just rakes in the dough.

Right, which a lot of bigger players are doing; you don't really know most of the people in these big companies.

Mutale Nkonde

Mutale Nkonde : You don't know.

And I think AI for the people is kind of my answer to that, in the sense that I have this big interest in social media.

But rather than mini-blog about it on my own platforms, I can actually go in, roll up my sleeves, develop an organization that looks at some of these questions, that tries to create rules that provide opportunities for more people.

And it can sustain me, it can sustain my lifestyle, it can be my work.

And I think if you're going to be a content creator, you have to look at it through the lens of work.

George Garrastegui, Jr.

George Garrastegui, Jr.: So as we start to get towards the end, and I appreciate you understanding how I like to dig into why we do things. But part of this conversation, right, is you're going to be presenting at the What If conference being held virtually in October.

And I want to ask you a couple of questions that are more specific to what that conference is about, and potentially, you know, give us a sneak peek into what you may be discussing in October.

So one of the things that we need to think about is the prioritization of diversity, equity, and inclusion in organizations, right?

And obviously, you're thinking about it in policies, but how do you think organizations need to make sure they prioritize this in the way that they focus on different aspects of their business?

Mutale Nkonde

Mutale Nkonde : I think the first thing is that I reject this idea of diversity, equity, and inclusion being somewhere else, doing something else.

And I think that companies need to really think about it in terms of compliance.

And they really need to understand that if you want the most innovative product, goods, or service that is going to take the world by storm, then you need to make sure that you have a workforce that draws from every single part of the talent pool. And drawing from every single part of the talent pool is going to make you compliant with non-discrimination laws.

Because specifically, if you operate in the United States, your company could actually be fined for being non-compliant.

And we have really good examples of this. For example, Facebook, which was Meta's former name, was fined by the FTC some huge amount because of housing discrimination in its algorithm.

And what they had done was create an algorithm that enabled people that were renting houses to not show them to Black people; that was in violation of the Fair Housing Act.

And they got fined. Would that algorithm have been built, and gotten through their compliance team, if there had been people within that team with expertise in civil rights and civil rights law?

George Garrastegui, Jr.

George Garrastegui, Jr.: So with that, right, and obviously, we know that the inclusion of the proper people in these organizations helps make sure that some of these conversations that are being held are being done the more correct way. Is the landscape of DEI in the future shifting?

Do you think that it's going in a positive or a negative direction?

Mutale Nkonde

Mutale Nkonde : I think it's going to go in a completely negative direction for a while, because politically, people are not accepting of it.

I think that there is a lot of momentum for what people call anti woke.

I'm not sure what that means.

But I think we're going to see the types of firings that we're seeing in the movie industry. For example, here in the United States, there have been four diversity and inclusion heads that were all fired within two weeks of each other.

I do think that that's going to hurt the experience of the products, the goods, the services that those companies produce.

I do think that that is going to hurt share price. We have seen that in Twitter, for example: Elon Musk bought the platform in April of 2022 for $44 billion; he got rid of many of the people that were doing diversity and inclusion work; he got rid of their responsible AI team.

And the experience of the platform changed and people came off the platform.

And it was recently valued at $15 billion.

So you're talking about a roughly $30 billion drop in value.

And I think that his shareholders are going to be like, you need to have these people on, because they're making it safe for more people to be online.

And I think that once we get to that point, we can start right-sizing, because so much of what I've seen of DEI, specifically in the tech space, is around, like, workshops and books that are just about people being nice to each other in the interpersonal.

And they're not really about answering structural questions like, how do we search for bias in our own datasets?

And then what do we do about them?

George Garrastegui, Jr.

George Garrastegui, Jr.: Right?

It's a lot of times surface level.

A lot of times, also, it's not even including the right people in the room.

Can you give us a little bit of a sneak peek of what you're going to be talking about at What If?

Mutale Nkonde

Mutale Nkonde : I have not even started to plan it, which is not unusual for me.

But one of the things that I'm hoping to talk about... so I will have just been back from the Congressional Black Caucus, and I'm on a panel called AI for the Culture, which I thought was hilarious.

And they were like, we love the name of your organization, so we invited you. And I was like, oh, you have no idea. Also on that panel are going to be folks from Universal Music Group, because we very famously had the AI-generated Drake and The Weeknd song that was released, and in my opinion it sounded better than Drake or The Weeknd.

And so I think it's going to be based on whatever findings, I can come back on the music end.

And hopefully, I'll be able to play some of that song.

Also, I've just come back from a week with TikTok.

And we were looking at their algorithm and content moderation.

And I'm gonna figure out what I can say publicly about that, because they're doing some really interesting things with technical design that get to this idea that they literally want every kid in the world to be TikToking.

And they have to change the platform to do that.

George Garrastegui, Jr.

George Garrastegui, Jr.: Wow.

Well, I definitely can't wait to see what you present on that day, as we all tune in virtually.

And last but not least, where can our listeners find out more about you, AI for the People, and what we should be looking for in the future?

Mutale Nkonde

Mutale Nkonde : You can find me on LinkedIn. Every other social media platform, I don't know, because I was a big tweeter.

And now, I'm not really down with the KKK like that.

So I'm not trying to be on Twitter.

But definitely find me on LinkedIn. Coming up, you're gonna see a lot of conversation from us around culture and AI.

So in November, I'm going to Milan to be at the Vogue photo summit to talk about art and AI, and pictures and images and AI. And I'm doing this in October.

And I definitely am going to do some writing some journalism just around where we're going.

You know, I always say I want AI to be for the people.

That's why the organization is called what it is, I don't want to get rid of it.

I just want to make sure that every single person that interacts with this technology has at least some opportunity to benefit from it.

And not this kind of eat-or-be-eaten dynamic that we have right now.

George Garrastegui, Jr.

George Garrastegui, Jr.: Yeah, I mean, you know, what resonates biggest with me is the fact that, as content creators, you know, being able to control the content and use these tools to elevate what we do, to accentuate what we do, but not to replace what we do.

And a lot of the conversation that you're talking about, right is people think that, you know, AI can replicate what we do.

But people are so scared of that, because they think that as a learning model, like you said, it's sentient and able to connect dots.

And it's not.

And I think one of the big things is that we need to be here to connect those dots and fill in the blanks.

And I love hearing, you know, you stating AI for the People and really understanding what "artificial intelligence for the people" means, and how we can start to, I guess, learn to train the system better to include some of these things, which tend to be the emotional things that are connected to what these very surface-level things are.

Right, right, because you can't teach emotion.

Mutale Nkonde

Mutale Nkonde : Lastly, what would it look like to join a social media network?

And within the Terms of Service, say, if I create a hashtag that goes viral, I want a percentage of advertising?

You would make so many people rich on their couches.

Right?

And rightly so.

Right.

You know,

George Garrastegui, Jr.

George Garrastegui, Jr.: they are the content providers of why you're on the platform in the first place.

It's not the platform, it's the people.

No,

Mutale Nkonde

Mutale Nkonde : No, it's the people. So it's gotta be for the people.

I keep telling y'all it's gotta be for the people.

George Garrastegui, Jr.

George Garrastegui, Jr.: Of course.

And, of course, we need to end on that. Once again, Mutale, thank you so much for this chat.

I really appreciate it.

I really love the down and dirty Hip Hop references, I could do that all day.

And we're both of the same generation, where we just understand how this whole thing is shifting.

And I can't wait to hear what you come up with and your presentation.

And I understand that it's, you know, too early in the calendar year to figure out what you're doing.

But there are so many other things that will happen before then that will definitely influence your presentation that day.

But I can't wait to hear what you're going to do for the What If summit, October 3rd or 4th, I forget what day you're presenting. Check out my show notes to get a code for the event.

And hopefully you can also see Mutale and some of the other people that we'll have on the podcast during the two-day event.

But I appreciate the conversation, and learning so much more about deconstructing my own biases against AI, or my lack of understanding, thinking it's just this kind of fun-for-all tool, and really understanding the tech and the policies and the IP and things that go into it.

You know, as content creators, these are things we need to be more responsible for and start to own.

And I love the fact that you're trying to get us to that point, first with knowledge, and then by focusing on ways for it to be the shift that needs to happen.

So thank you so much for that.

I really appreciate this conversation.

Take care and this has been Works in Process.

Oh wow.

I just want to thank Mutale Nkonde for enlightening us on AI and the accountability and bias that will occur with new technologies.

It's really interesting how we can start to be the owners and controllers of our own technological destiny, and I want to see how AI for the People is going to help us envision that. If you want to learn more about the various projects, people, or organizations mentioned in our conversation, please check out the show notes on your podcast player or at our website, wip.show.

The Works in Process podcast is created by me, George Garrastegui, Jr.,

and the content transcriptions and research have been done by Or Szyflingier, and this episode has been edited by RJ Basilio.

You can find the Works in Process podcast on all media platforms, such as Apple, Spotify, Google, and more.

And if you liked the episode, feel free to give us a five star rating on Apple podcasts and Spotify.

And if you're feeling extra generous, write us a review.

It helps other people find the show.

And you know what, just subscribe on whatever platform you're listening to right now.

It's that easy.

Follow us on Instagram or LinkedIn to stay up to date on the new releases of episodes.

I really appreciate you taking the journey with me.

And I hope you enjoyed this conversation.

Until next time, remember your work is never final.

It's always a Works in Process.
