Perspektives

S3 E23

Artificial Intelligence, Data Privacy & The Future of Work with Dr. Calvin Lawrence

Episode Transcript

Speaker 1

It gets no better than this.

Speaker 2

You are now tuned in to Perspektives with Big Bank.

Speaker 3

Let's get straight to it.

Speaker 1

Welcome to Perspektives, man.

Speaker 3

Man.

Speaker 2

Today, I'm sitting down with a guy that I've been waiting to sit down with.

Man, I met him through my friend Grace, my sister Grace.

You know what I'm saying, doctor, Doctor D. Law, Doctor Calvin D.

Speaker 1

Lawrence.

Speaker 3

Right, yes, sir.

Speaker 2

Now, we had a conversation that was kind of crazy, and I want him...

Speaker 1

I brought him on to give it to y'all the way he gave it to me.

You ready to lay it out?

Speaker 2

How you feeling? First, I want to let you introduce yourself to my audience.

Speaker 1

Let the people know who you is.

Speaker 3

Yeah, definitely, definitely appreciate that.

Hey, look, I've been looking forward to this forum, to this vehicle.

Okay, I'm on a lot of different platforms, but those audiences are not really the audience that I seek to address.

Okay, So I'm excited to be here.

So thank you, bro.

So Yeah, definitely, Calvin D.

Lawrence.

I have a PhD in AI, So PhD in AI.

Yeah, so I'm one of the few in the field.

There might be another one.

I don't know who else has a PhD in AI, but I don't really get caught up so much in the academics of it.

But I'm an engineer, so I've spent the last twenty five years or so building real life systems around artificial intelligence.

So the people in the industry, you know, pick your favorite tech company, they kind of know who I am.

So I actually went back to get my PhD during the time I was writing a book.

So I've had my PhD now for a couple of years now.

But long story short, you know, I'm one of those guys who's been out there in the foxhole building these types of systems, like real AI systems.

Everybody know AI now, but these are like real machine learning systems that we've been building over the last twenty-five years or so.

So a lot of the stuff that we see now, it seems...

Speaker 2

New. It seems new. But y'all been seeing all this stuff, huh?

Speaker 3

I mean, I think that's the scary part.

That's kind of why I'm here, right. I kind of changed my perspective and my view from that corporate component of it, which, I mean, obviously I still do.

But really just talking back to the folk that I grew up with, like from the hood, from the places that I came out of.

I'm just seeing some of this technology impact them in some ways that I'm not really happy with.

Speaker 1

So what was the sit...

Speaker 2

Oh, look, before I get into that, I want to know: who were you before you started your journey, the AI journey, your PhD journey?

Speaker 1

Who were you before that?

Speaker 3

Well, man, you know, I got some age on me for sure.

So I've been in this space for a minute.

But you know, I grew up.

I grew up in the projects.

So I'm from a little small town called Everton, Georgia.

It's where I kind of grew up, and that was the place we moved to.

But it's right outside of Athens, Georgia, okay.

And so, I mean, I was the first person in my whole neighborhood to go to college, not just in my family, but within walking distance for miles.

So you know, it was a big thing back then when I went to college, which really kind of impacted my later years and kind of made me do what I'm doing right now.

It was that background piece. I grew up, probably like most of the folks on your podcast, with a single mom, an alcoholic dad; mom had four kids, working two, three jobs to make it happen.

I end up going to college, and you know, the rest of it is just a talk that I've been doing for the last twenty five years or so.

Speaker 1

So, coming from those circumstances, what drove you to technology?

Speaker 3

Though? I've always been a math guy.

I've never been really good at speaking, or really good at writing per se, which is kind of weird now, how life has kind of taken a three-hundred-and-sixty-degree turn.

But I've always been good at math.

So I touched my first computer in college, so I'm not gonna say that I kind of dreamt of being some computer tech whiz, because I ain't touched a computer till college.

And the reason I went I went into computer science was that somebody told me that you can make more money in computer science.

So that was the motivation behind doing it, and I got started there.

You asked the question earlier, kind of, what led into the AI stuff.

But I got my master's degree working for NASA in the mid nineties.

Wow, So I've been doing AI on a very heavy scale for some years and then obviously AI died out, but we were still doing it, but the normal person did not know it.

So now when you see AI, like everybody know it.

You can pull your plumber off the side of the road and ask him what AI is.

He will at least be able to tell you, at a high level, what it is.

But I've been in this technology space for longer than I care to acknowledge.

To be quite honest, I don't really know too much of anything else.

Speaker 2

Yeah, So when you first start going to school, what were you going to school for?

Speaker 3

Like, what was your first... So you know, back then you didn't really have to go to school for anything.

You just wanted to get out of wherever you came from.

So you just went to school.

I didn't even have a major in my first two years.

I was just going to school, going to class, knowing that if you drop out, you about to go back home, you about to go back to your mama's house, which is the projects, and you had to go back to the streets.

So I really didn't even have to have a major.

I just needed to get away from that.

It was, I think, maybe my junior year when somebody told me what I just said earlier: if you get into computer science, if you start playing with these computers, boy, you'll make a lot of money.

And it has turned out to be that way, to be quite honest, for sure.

Speaker 2

Like, so, what was the reason for AI in the beginning? Like, what was the first reason that people even wanted to make AI?

Speaker 3

Yeah, man, that's that's a really good question.

The military was using AI way before it became fashionable.

You kind of think about what AI is... you know, let me just start off with my definition of AI.

Now, you'll get all types of schools of thought from lots of so-called geniuses.

For a person like you and I, AI is simply the science of making a machine that can think, reason and learn like a human.

So it mimics human behavior.

So way back then, even when we first started it, it was still kind of for that reason.

Now, there were some issues.

It didn't look like it looks today.

Back then we had some issues with it, right, because AI needs three things in order to do well.

It needs high-speed, high-capacity hardware, and you know, back in the day, your phone today has more memory than the computer a big bank would be running on back then, so you really couldn't do a lot of the stuff you see today because of memory.

The other piece that it did not have was data.

In order to build any of these systems, you gotta have lots of data, and at that time we didn't have a lot of it: people weren't giving us data, and we didn't have any place to store it.

But now, as you know, like, we got too much data, everybody giving their data away.

Like, I'll let you follow me around for a whole month and get all my data if you give me something free. If you give me something free, then my privacy don't really matter.

So back then, you know, like in your mama's world even, privacy mattered. You ain't gonna have nobody following you around.

You wasn't going to allow people to basically track you.

Well, today they track you.

Speaker 1

Everybody's tracked.

Speaker 3

Everybody's being tracked and it's being monetized.

How do they monetize tracking? So think about this, right. I'll give you an example.

So I was working on a system with a major car dealership, right. So let's just say BMW, for the sake of conversation.

So BMW have inside of their computer, right, a GPS system.

So say you're living in New York, and you got, you know, you, your wife, and your three kids, and you're gonna drive down to Disney World from New York.

All right, so you put Orlando, Florida in your GPS system.

It gives you five routes: the shortest route, the quickest route, the route with fewer expressways.

You have all those routes.

But what if somebody was incentivized to get you to pick their route?

So what if McDonald's or Hilton Hotels go to BMW and say, hey, if you can get Bank to choose the route we want, we'll then send you, and Bank...

Speaker 2

So how they choose a route? You're saying, like, basically with the suggestions?

Speaker 3

Exactly. So really, this is a really good question. I'm glad we got into this.

Right, they're gonna send you... So what if Disney World tells you, Bank, if you pick the route I want you to pick, I'm gonna give you thirty-five off your Disney World tickets.

Would you do it?

Yeah, you'll do it, right? Most people will do it.

Well, that's what they want you to do.

So you picked that route.

Now, how do they monetize it? To your question: well, if you go a certain route, they know that you got three kids, you've given them the ages of your kids, and you know that from New York City to Disney in Orlando might be, what, a twenty-hour ride?

You know, after about seven hours, you're gonna be going by one.

But the route I had you pick had Hiltons all the way across it.

So now, when you get close to a Hilton, I'm about to send you something again to stop at the Hilton.

And how did it all start?

It all started because some AI algorithm gave you five options, and in the real world you could pick which one you want.

But now, no, I want you to pick the one that I want you to pick. So you monetize it.
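The route-steering setup he describes can be sketched in a few lines of code. This is purely illustrative: the route names, drive times, and sponsor fee below are invented, and no real navigation system exposes its ranking logic this way.

```python
# Toy model of the incentivized-routing story: several valid routes to
# Orlando, one of which carries a sponsor payment. All names, hours, and
# fees here are made up for illustration.

routes = [
    {"name": "shortest",          "hours": 18.0, "sponsor_fee": 0.0},
    {"name": "quickest",          "hours": 17.5, "sponsor_fee": 0.0},
    {"name": "fewer_expressways", "hours": 20.0, "sponsor_fee": 0.0},
    {"name": "hilton_corridor",   "hours": 19.0, "sponsor_fee": 50.0},  # paid placement
]

def neutral_pick(routes):
    """What the driver expects: the fastest route wins."""
    return min(routes, key=lambda r: r["hours"])

def monetized_pick(routes):
    """What the story describes: sponsor money outweighs extra driving time."""
    return max(routes, key=lambda r: r["sponsor_fee"] - r["hours"])

print(neutral_pick(routes)["name"])    # quickest
print(monetized_pick(routes)["name"])  # hilton_corridor
```

The point of the sketch is that both functions return a "recommended" route, and the driver can't tell from the recommendation alone which objective was being optimized.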

Look, this whole system really is about the monetization of AI.

I mean, that's the idea. We don't really see it that way.

That's kind of why I love platforms like yours, because at the end of the day, most of us we think AI is chat GPT.

Yeah, you ask most of us what AI is: ChatGPT.

The reality of it, for people like me who've been building these...

Speaker 1

That's the smallest, that's the littlest piece of it.

Speaker 3

It's like calling an app on your iPhone the iPhone. Yeah, it's not the iPhone, it's just an app.

Now, ChatGPT is not really even AI, if you really want to think about it.

It has none of the premises that I talked about earlier for AI.

But folks would have you think that it is.

And there's a reason why: they want to desensitize us as Black people.

And I'm not being a conspiracist at all.

This is like real, real stuff, and I know... it's interesting you say that, right. It's kind of weird. I'm like most other Black people who work in corporate, you know, those of us who came out of nothing. That was my competition, with the folks who... all of my friends who were selling drugs.

It was like my mom saying, if you go off and do X, you can make as much money as the people who are selling drugs.

And I personally always wanted to sell drugs.

Like, that's what motivated me. I'm just being honest.

I'm just being honest like that that those those guys were the people who dressed nice, who drove nice cars.

Speaker 1

They had to figure it out.

Speaker 3

Yeah, they had to figure it out.

But really, when you think about it, it's the same kind of stuff.

It was the same kind of supply-and-demand perspective.

So anyway, getting back to your question: you know, I go to college, and then I go and work in corporate. And in corporate, for the company, I can say this, right, and I will use the name here. I mean, I've worked many years for IBM, and IBM is a top tech company in the world. We've been around the longest, right. And at IBM, I'm what they call a distinguished engineer. Being a distinguished engineer at IBM, or at any tech company, it's not a big thing for me, but it's a big thing for them.

So let's just say IBM has three hundred thousand employees. Only a tiny fraction of those will possibly get to the highest level, which is called a distinguished engineer.

So out of three hundred thousand, let's say you got three hundred distinguished engineers or so, really it's point-one percent. And out of those three hundred distinguished engineers, it's only about ten Blacks in the whole world.

I'm one of those blacks.

So at the end of the day, you kind of look at... when I tell a story, they care. Like, if somebody's gonna get mad at you, if somebody's gonna hurt you, they gonna hurt me, because they know that I'm at that top level, and they know I've seen all this shit.

They know I've built it.

So it's funny.

When I wrote the book, man, I'm not a writer at all.

Like I told you, I don't really like to write that much.

But one of my white friends was writing a book on artificial intelligence about four years ago, and he asked me, would I be willing to write a chapter in his book about how AI impacts Black people. And it's the white guy who asked me.

And I said, yeah, I said I'll do it.

So I wrote the chapter in the book, and when he published it, the publisher was so excited about the chapter that I wrote, they asked him, can you get him to write a real book, a book that goes deeper into it?

Because, what was mind-blowing, I was telling very, very specific stories about people who were hurt, people that I knew, in every industry: policing, facial recognition, predictive policing, all of that.

So predictive policing, the basic idea of it, you definitely need to know this. I'm glad you asked that question.

Predictive policing is basically this system where police departments have an AI-based system that will predict where crime will appear, and it will also predict who will actually commit the crime.

So you live right here in Atlanta, and predictive policing is all over it.

It used to be a time where... That's fucked up? No, dude, it's just fucked up.

Speaker 1

You saying they got some shit guessing where you're gonna do it and who might do it?

Speaker 3

Yeah, dude. Like, dude, as part of my PhD research, right, my dissertation was on predictive policing and law enforcement in Black communities using AI.

Right.

So I studied one hundred and forty-five cities in the United States, and I studied three hundred and fifty law enforcement agencies in the United States.

These are just the ones I studied.

All of them are using predictive policing.

Now, I know, because I built a bunch of those systems.

So I know specifically. At the time, I did not understand what those systems were doing.

You know, I mentioned it earlier, right, to be quite honest, because somebody would ask the question, like: Calvin, why the hell...

Speaker 1

...did you do it?

Speaker 3

Like, what did you think you were doing?

Yeah. So, I want you to kind of empathize for a second, put yourself in my shoes for me.

So when we go... when I go... like most of us, we go and we build systems, we're not really thinking so much about the victim.

We're thinking about the person who's paying us.

They're just like somebody who's selling drugs.

You know, you ain't really thinking about what's happening.

You're not.

So for me, coming out of where I came from, I'm trying to make that five hundred thousand a year, because I'm at that level, right. So I'm building these systems, so I'm not necessarily thinking about the outcome. Normally, as a technologist, an engineer, we think about two things.

We think about what they call functional requirements.

That means somebody, say company A, comes in and tells me what he wants me to build.

Those are functional requirements. They're telling us specifically: I want this system to do these three things.

And everything else is non-functional requirements.

That means I want it to be secure, I want it to not crash, I want the app not to crash, I don't want nobody tapping into this bank app that you got and stealing your money. Those are the non-functional requirements. That's all we think about.

If anybody who sits in this seat, who's in my space, says something different, they lying, because you're not allowed to think about nothing else.

You're thinking about the person who paid you to build the app.

Speaker 2

So you're trying to build the best app, you're trying to build it in a way it don't crash, it don't... But you just engineering.

Speaker 3

Basically, exactly. And you get caught up in that, right, because you've been taught and trained to do that.

But then some stuff happened for me personally that changed the game.

And that's what made me take those risks that you just talked about. Because I'm telling you, yeah, you're right.

People are like, what? Not just white people, I mean people of color.

They're like, dude, you're crazy. Like, write a book?

Yeah, why would you write a book?

You're making that much money, why would you write a book.

But no, dude, it's real, because the people who sit on your sofa, in this seat, they're the ones at risk.

They're the ones at risk.

Right.

It used to be... I was listening to one of your podcasts, dude, and you was...

I think you was interviewing, uh...

Speaker 1

Little Woody last night.

Speaker 3

Yeah, last night? Man, that was crazy, because I didn't...

Speaker 1

Know it was the last night it said came out last night?

Speaker 3

Okay, cool. Yeah, I was watching it, and y'all mentioned, uh, Fani Willis.

I'm a huge fan of Fani Willis.

Hey, AI has made Fani Willis's job totally easier. Because it used to be a time... and I heard Little Woody, I heard, uh, Doug, I heard other folks who been on this platform, and I know people in my own hood, in my own area.

It used to be that if you didn't have an eyewitness, or you couldn't get somebody to snitch or to lie or to do something, then you really didn't have a case.

That ain't the case no more.

The reality of it is, I got a better eyewitness than Little Woody: those cameras.

And not just the cameras. I also have, not just a cam, but kind of a heuristic device on top of the camera.

So it's not just watching you, it's listening to you.

And we all know... the people who build it, like me, know that those cameras, like facial recognition, are a hundred times less likely to get you correct than they are a white person.

And that's how it's trained.

That's just... nobody...

Speaker 1

You saying they got it basically built for us.

Speaker 3

You know it, man. Look, man, the idea behind it, right... and you know, when this goes out, you're gonna have police stations, they gonna know, they gonna know me.

Even in this city they know, because they've seen me, I've managed teams. Their whole idea, right, is to solve crime faster by any means necessary.

So, right. So they had cameras in the neighborhood, and basically the idea behind the cameras is to determine where crime might happen and who might commit it.

So it's not necessarily to figure out who did it, per se.

But you've already been typecast.

It's kind of like, remember that definition I gave you earlier? I said AI is the science of making a machine that can think and learn like a human.

But what do humans do?

They already think you... like, they already...

You go to any neighborhood like mine, you're gonna see more police cars circling the street.

Prejudice, man. The whole deal behind it, right... Look, I wouldn't say... because somebody would argue the point, right, and I would argue the point: the camera's not prejudiced.

It's the people who build the systems, who program the systems.

Like I told you, like, dude, it ain't no lie to me.

I've managed teams of people and there was no Blacks. I was like the only Black.

I said in my book that I've been on hundreds and hundreds of teams inside and outside of corporations, and only one time have I been on a team with ten or more people where there was more than one Black person.

So you think about it: these systems that are being built, they're being built by people who, I wouldn't necessarily call them prejudiced.

They are people who don't have the same optics.

They don't have the same view.

I'm gonna give you a case in point.

You're gonna appreciate this one.

A company came to me, a sheriff's department. I won't tell you what city; you might be able to guess.

So this sheriff's department, they basically had... it was like beat down, didn't have enough space, you know, not enough beds.

They had gotten a lot of negative publicity in regards to the sheriff's department, the jails.

Okay, I'll let y'all guess whatever you're gonna guess.

Okay. So they come in and they say, hey, we're gonna do something real cool.

We're gonna let people out.

Every month we're gonna allow people to get out. We're gonna let three to five people every month go for early bond.

And we want to have an AI algorithm select who's gonna get out.

Follow me so far.

So as opposed to having some group of people say who's gonna get out, we're gonna let an algorithm do it. Real, real stuff.

Right.

So when we build an algorithm, it may take in lots of different parameters: what kind of crime you did, who got injured.

All of those things go into the training model to predict who might get out.

Let's just say, for instance, you basically have ten arrests. You've been arrested ten times.

So take a step back to the algorithm there: the higher your score, the less likely you are to get out.

So we score you based upon all those factors we talked about.

Okay, so the higher your score is, you go to the bottom of the line.

Right. Let's just say you come in and you got ten arrests. You've been arrested ten times.

And let's just say you got a white friend, and the white friend comes in at the same time you did, for the same crime.

He got three arrests.

So you got ten arrests, he got three arrests.

Now, the algorithm ain't smart enough to determine nothing other than what you program it to.

It just knows ten is higher than three.

Anybody know that.

So the algorithm says the person who got three arrests gets out quicker.

So the people who are building the algorithm... I'm like, okay, get ready to think about this now.

You got to have different optics, and I told you what the team kind of looks like: they're gonna be white, the programmers, right.

So the algorithm comes out and they let the guy out that has three arrests.

Makes sense, right?

Not so fast.

Shouldn't make sense to you. Because what if I got arrested ten times for protesting and they threw it out? So I got arrested, but I ain't get convicted.

So I got arrested, and each time... you know, as a brother, Black people get arrested for shit we ain't do all the time, excuse my language, in our neighborhoods, in our communities.

That's just what happens.

Like Woody was saying, they'll just arrest that brother just because they seen him. They think he did it, man.

But that's the whole mindset, right. So let's just get back to our story.

So, ten: he's been arrested ten times and he ain't got convicted; it got thrown out each time.

So ten arrests, thrown out ten times. What's the score? Zero, right?

My score should be zero. I got arrested, but I ain't get convicted.

And what if that person, that white guy, actually got convicted, and he did his crime, but he still got out?

How you get out before me?

Well, the reason he got out, and this is a real-life situation: it's not that AI is prejudiced, it's not that AI is bad, and it's not even the people who built it.

It's their perspective. Because the perspective of these ten white programmers really is, about a Black person, that when you get arrested, you probably did what you did. So in their mind, they built the same system.

Now, what they should have done: they should have said, it's not just the arrest data we're getting, we need the conviction data too.

Whereas the conviction data, it ain't at the sheriff's department, it's at the courthouse.

So you need to get your happy butt up from there, and you need to go over to the courthouse, and you need to compare arrests to convictions.

That's not happening.

So the reality of it is, these algorithms are scaled all the way across the world, and certainly across this country.

So with that same algorithm that I just told you about, that brother who got arrested ten times, he ain't about to get out.

He ain't about to get out, but he ain't really did nothing.

He ain't really did nothing.
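The scoring flaw in that story can be sketched in a few lines. This is a hypothetical illustration of the logic Dr. Lawrence describes, not the actual system: the names, counts, and one-factor scores are all invented, and real tools weigh many more inputs.

```python
# Hypothetical sketch of the early-bond ranking described above: one score
# counts raw arrests, the other counts convictions. Names and numbers are
# made up for illustration.

people = [
    {"name": "brother", "arrests": 10, "convictions": 0},  # all ten thrown out
    {"name": "friend",  "arrests": 3,  "convictions": 3},  # convicted each time
]

def release_order(people, score):
    """Rank candidates for early bond: the lowest score gets out first."""
    return sorted(people, key=score)

# Arrests-only model: ten beats three, so the brother goes to the back.
by_arrests = release_order(people, lambda p: p["arrests"])
print([p["name"] for p in by_arrests])      # ['friend', 'brother']

# Conviction-aware model: ten thrown-out arrests score zero.
by_convictions = release_order(people, lambda p: p["convictions"])
print([p["name"] for p in by_convictions])  # ['brother', 'friend']
```

Same two people, opposite ranking; the only difference is whether the score joins the sheriff's arrest records against the courthouse's conviction records.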

Any other questions?

I'm talking without you even asking the question.

But the reality of it is, it's a feedback loop.

You said it's a feedback loop, right. So this is the idea.

This is how it works.

So if, basically, all the police is in your neighborhood on the east side, the west side... say the police department... say Fulton County. We'll use Fulton County as an example. Let's just say it got a hundred cars for the night, from two o'clock to six a.m., and all a hundred is in your neighborhood and none is on the north side.

What's the chances of somebody getting arrested on the north side?

Zero. Zero.

It's common sense, ain't it?

Okay. So now, I get ready to go to that police department.

And now they want to use AI, because they don't have enough police, so they want to scale up, to use AI to cover more ground.

They want to be able to use AI to tell where the police should go, and how long they should go, and how many people. Like, that's the use of AI.

We want to predict where crime is gonna happen, and we're gonna predict who's the person.

We're gonna predict where the crime is gonna happen, what time it's gonna happen, who's gonna commit the crime. But also, we want to predict who the victim might be.

Is that crazy?

No, no, no, this is real, right. This is real stuff.

Right. Nobody's gonna argue with me on this.

You got police people who are gonna be listening to this.

They're not gonna argue.

That's how you use AI.

But this is the problem, right. In order to build that algorithm, the first thing I'm gonna ask them is: give me all of your data.

Give me all of your arrest records from the last five years. Because if you don't give me that, I can't build it, because you have to train AI on data.

Speaker 1

Yeah, you have to train it.

Speaker 3

So you're gonna give me your arrest records.

Speaker 1

So you basically gotta give it a memory?

Speaker 3

Yes. So if you give me all of your data, all of your arrest records: who's in that arrest record? What population is in that arrest record? You guessed it.

Okay, so what do you think AI is gonna predict?

Why would AI predict that a white person committed that crime when everybody in the data pool is Black?

So AI is gonna tell you exactly what you want to hear.

You just want AI to justify you. Yeah.

So you don't want to be the cop on the cover of the AJC or, you know...

So it's a justification system.

That's the reality.
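That feedback loop can be shown with a toy simulation. Everything here is invented (two neighborhoods, a hundred cars, one arrest per car): the point is only that a model trained on where arrests happened keeps sending cars there, and the north side never generates the data that would change the prediction.

```python
# Toy feedback loop: cars are allocated where past arrests happened, and
# arrests can only happen where cars are. Neighborhoods and numbers are
# made up for illustration.

arrests = {"east_side": 100, "north_side": 0}  # historical arrest records

def allocate_cars(history, total_cars=100):
    """The 'predictive' step: deploy cars in proportion to past arrests."""
    total = sum(history.values())
    return {hood: total_cars * n // total for hood, n in history.items()}

def patrol(cars):
    """Arrests only occur where cars actually are (say, one per car)."""
    return dict(cars)

# Run the loop for a few months: the deployment never changes.
for month in range(3):
    for hood, n in patrol(allocate_cars(arrests)).items():
        arrests[hood] += n

print(allocate_cars(arrests))  # {'east_side': 100, 'north_side': 0}
```

However long the loop runs, the north side stays at zero recorded arrests, so the "prediction" simply reproduces the original deployment decision.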

And that's why, when people start talking about "what is AI? It's ChatGPT," I'm like, you lost your mind?

What are you talking about?

Just like that?

Oh I use it too, I use the tool.

But my point is this, though: you also can relate to what I just told you, because of the people who are gonna be impacted by that.

But it ain't just policing, dude.

It's like, there's tools right now that's out there. Let me name-drop: COMPAS.

COMPAS is a tool that they use to predict bail, or to predict how long you're gonna be in jail, and your likelihood to reoffend.

Like, the reality is this.

If you got a judge sitting in that seat as a human, he's gonna do that same thing.

He's gonna be like, oh, you've been in here a lot of times, I'm gonna give you a higher sentence, because you have a higher likelihood to reoffend.

Okay. So tools like COMPAS come in, and now these courthouses are using AI to do what a human would do.

The problem is that that judge has some level of empathy.

That's the problem.

Speaker 2

They can see you in that... they can see you. And yeah, I see your record, I'm reading your record, but I'm seeing you.

You said AI can't. It don't know nothing but: he done been in here five times, ten arrests but no convictions.

Speaker 3

Man, I said this on my social media. I do this because I'm that kind of dude at points, like, I like to antagonize my peers.

So I'll do it on my LinkedIn profile. I got lots of friends who got PhDs from Harvard and Stanford and MIT, all of those different folks.

So I'll make a statement and I'll start a debate, because I want to bring them in.

And I asked the question, I said, about three months ago: AI is not capable of an original thought.

ChatGPT... anything that you get from ChatGPT is in the database.

It's like this water, it's like this bottle.

It's already in there.

It's not gonna create.

It might seem like it's writing that paper for you differently, but it ain't.

It's just basically taking stuff in there and rearranging it, getting it to a point where it just did your job for you.

So you're happy with it.

You happy with it: you wrote the paper, it redid it, blah blah blah.

That's how ChatGPT works.

So I said to this group of people, all of these distinguished professionals from all of these high-priced, lofty schools, I said: AI is not capable of an original thought.

And it goes back to what we talked about earlier.

That judge is capable of an original thought.

He looks at different things. He looks at your circumstance, he looks at blah blah blah, and he says, like, this dude was here, but now he's here. So I'm gonna give him one chance, because I've seen growth.

AI is not gonna do that. The AI that just replaced that judge, or replaced that parole board, is just gonna look at what it considers to be facts.

And the facts is what's in that database, what's in that record, what's in that data storage.

So I asked the question: is it capable?

And as you might imagine, lots of folks, they were up in arms, because they don't... what me and you are talking about here now, most of the scholars don't agree with it.

They don't think AI can be biased at all.

The white ones.

Now that's logical, freaking... They control bias, they control it, right? Because they're biased.

I say all the time, if AI.

That's why, no matter where I am man, I don't care if I'm on Good Morning America, I don't care if I'm on CNN, I don't care where I am.

I always start the interview off the same way I started this one.

AI is the science of making a machine that can think, reason, and learn like a human, and mimic a human.

So who is it mimicking?

If that's true, who is it mimicking?

Speaker 2

So you don't think AI can get smarter than a human?

Oh, that's a really good question.

I think ultimately, when you put enough data in it, fundamentally it can't.

Fundamentally can't.

But in fifteen years, twenty years, when the data pool is so big that you.

Speaker 3

Can crunch numbers, you can crunch the data, the trends of the data, where it will seemingly be smarter than a human.

The question is like, what is smarter? Like, if you kill twenty thousand people, if AI could do that, would it do it?

Would a human do that?

Yeah?

Some humans would kill twenty thousand people.

So it's not necessarily smarter.

It's just doing stuff outside the guardrails and outside the boundaries that a human might.

Speaker 2

Not do. Without input, without... that's what I was trying to ask you.

I'm saying, like, do you think not smart?

Do you think they.

Speaker 1

Can outsmart humans?

Speaker 2

Like, when they load up all that data, can the machine just take over?

Speaker 3

Yeah, yeah it can.

I mean, your calculator can outsmart you, right, because you can't think as fast as a calculator.

You put a thousand numbers in there and you hit add.

Because of the speed, because of the hardware and the memory and the logic, it will give you that answer.

It's not smarter than you.

It's just able to do something faster than you.

That's the whole idea behind why people want to use it.

Like any company, every company, trust me, every retail company, pick a company, they all want to use AI, and they want to use AI for two reasons.

For two reasons they want it.

AI makes them more efficient, makes them more efficient, meaning that AI can do stuff faster than a human.

I don't care what you do.

I don't care what your job is.

I don't care if you're an assistant or administrative assistant.

I don't care if you're a call center person.

I don't care if you're police. I don't care what you are.

AI can do that faster than you.

I can train it and program it to be faster.

So a person who owns a company, it's like, do I want to pay five Banks one hundred thousand dollars a year?

That's five hundred thousand.

I got to pay you all kinds of other stuff, you might get sick, got to pay your insurance. Or do I want to replace you with an algorithm?

So they call me in and they'll say, well, I'm not really replacing Bank.

I'm gonna just have Bank do another job.

Like, really? If Bank could do that other job, you would have had somebody else in that job.

No, you eventually about to replace him.

You just don't like the optics and how it sounds to say that you're replacing five people with an algorithm, But the reality of it is that's what everybody wants to do.

That's me too, that's absolutely that's because it makes sense, right, it makes sense.

But look, I'm gonna check you out.

I want you to think.

I want you to think.

I want you to see it differently.

I want you to look at it from a little different angle.

I want the fourteen-year-old, the fifteen-year-old, not the Bank that's here right now, but the fifteen-year-old.

He sees the world differently, sees the world differently.

Right. So I'll give you an analogy.

So I got called to a major retailer.

I won't use the name, but you would know it.

They're everywhere.

And they're in Arkansas.

That's where the headquarters at.

So I went in there. My team was there, building the system.

And they have this store-of-the-future concept that's gonna come out real soon.

In this store of the future, in this major retailer that's got offices and location retail spaces all over the country.

This store of the future basically doesn't have any people in it.

So no people. You go in there, you don't need an assistant, you don't need to ask no questions, and when you check out, there's no people there, there's no security in there, nobody to help you.

And you go in this store and you shop, and then somebody said, well, how are you gonna keep people from stealing?

Same way they do it at the airport.

When you go through there, they assess what you got on, they store what you got on, and when you come out, if you come out with anything different that you ain't paid for, you don't walk out into the street.

You walk into a staging area, and when you walk out with something else, that's when the police come and check you.

So you don't really need a person.

You don't need.

So my question to you would be, now, if you that company, you about to save a trillion dollars over the next twenty years.

You about to save so much money, because how many of us work in those Walmarts?

Huh? No, no, no. Now that's real, though, that's real, because we work in the Walmarts.

We cashiers, we stock shelves. People pay their kids' college tuition.

So I'm saying the fifteen year old, the fourteen year old Calvin, we can see the world like that.

That's empathy. There's empathy associated with this.

Most people are about to lose their jobs.

If you work at one of these retailers.

Now, look, it ain't just this retailer, because this is a copycat market.

If one retailer does it, then all of them are going to do it.

So if you're in retail right now, are you kind of happy with AI?

Yeah, kind of. You like using this tool called ChatGPT.

It's like a little toy for me.

It helps me.

It's an assistant, it's like an administrative assistant.

And I'm not in any sense suggesting to anybody on this podcast or in the Internet world, don't do it.

Look, I build AI apps right now.

I have an AI company, So I'm not suggesting at all that you shouldn't do these things.

I'm saying there are ways to do it.

There's ways to do it right.

It's this empathy component that one needs to consider, and not enough of us are considering it. We just look at, okay, well, okay. But now, if that's your mama that works at one of those retail places, that's your cousin, it's not like they get to leave there and go to another job.

No, some of those places, they've been working there for fifteen twenty years.

There's no other place that's gonna pay them what they make there.

But then the money comes and they say, we don't need you no more. And that's just an example, though, you know, an example of why all of us need to be more conscientious.

Speaker 1

People that look like us.

Speaker 2

What you think... what's your biggest fear with what AI gonna do to the people, to us?

You mean to black people just period, Like what's your biggest fear?

Like what made you be like.

Speaker 1

Fuck this shit?

No?

Speaker 3

I think about my own personal stories, man, my own personal stories, like you see here. I'll answer your question.

My biggest fear is that we're gonna be lulled to sleep, that we're gonna be put to sleep. Like I said earlier, when you think about our generation, my generation, your generation, we was kind of handed a lot because of civil rights.

People died for the right to vote, people died, right? So we came up in the civil rights era, and for me, it was, you know, the seventies, eighties, and we had what they call affirmative action.

Basically, any black person who went to college and say that they ain't benefit from affirmative action, they probably lying, because it's not just the money.

It was the opportunity that you had that your parents did not have.

Well, you think about it, we don't have affirmative action anymore.

That was like, we was gifted affirmative action, but we was asleep at the wheel, because we don't have affirmative action

Now.

So I'm giving another example.

Right. Let's just say, uh, students getting ready to go get into a college, and let's say they had one hundred slots to get in.

Typically the way it works is that you got one hundred slots to get into this university.

Pick your favorite university.

I won't name one... nah, Georgia Tech.

I said, I wasn't gonna name y'all, but let's say Georgia Tech.

This is just an example, by the way. So Georgia Tech got one hundred slots, and when affirmative action was in place, no matter what they thought, the law said that some percentage of those had to be black.

Right, So let's just say the first ninety comes in and they white.

They go to, you know, North Atlanta High, and this, you know, high-echelon type school.

After those ninety, affirmative action say that for Georgia Tech, the next ten has to be diverse, otherwise you're breaking the law.

Okay, that's the way it worked.

So if you was one of those ten, you can say, well, I deserved to get in. You did deserve to get in, because if you didn't have the right criteria, you wouldn't have got in.

But the reality of it is, no matter what Georgia Tech thought, ten percent of those needed to be people of color.

Let's take affirmative action away now, so now you got a hundred.

Still got that hundred, and it used to be these ten people who was on the admissions board.

But let's just say that story I was talking about earlier.

Georgia Tech don't want to pay those ten people anymore.

And they're like, Calvin, come in and build an algorithm to determine who gets in.

So now those ten people go away.

Now I have an AI algorithm that's determining who gets in. I can program that algorithm to ensure that one hundred percent of those people are white, easily, easily.

Look, I can help y'all here, and I can hurt y'all here.

Say you're trying to go to grad school, and you bring in your admission form, and I ask you what type of extracurricular things you've done.

Oh, I'm a part of a sorority.

I'm an aka.

Oops.

Oops. Now my filtering system know that you're black, even though you hadn't put that on your resume.

I don't have to ask you whether you're black or not.

I can derive it.

Speaker 1

Do we call them proxies, Vanessa?

Speaker 3

I can know that you're a female by what you buy from Target, if you buy lipstick.

See, that's how AI works.

It's about probabilities.

It's not about actuality.

It's about what's the chances of you being something?
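That probabilities point can be made concrete with a little Bayes' rule. This is a toy sketch, not anything from a real system: every number below is invented for illustration, and the sorority field stands in for any proxy.

```python
# Toy proxy inference: how a "neutral" application field can all but
# reveal a protected attribute. Every number here is invented for
# illustration; real systems mine correlations like this from data.

def proxy_probability(p_feature_given_group, p_group, p_feature):
    """Bayes' rule: P(group | feature)."""
    return p_feature_given_group * p_group / p_feature

# Hypothetical rates: say 40% of black applicants list a Divine Nine
# sorority or fraternity, 1% of other applicants do, and black
# applicants are 15% of the pool.
p_feature_given_black = 0.40
p_black = 0.15
p_feature = p_feature_given_black * p_black + 0.01 * (1 - p_black)

p_black_given_feature = proxy_probability(p_feature_given_black, p_black, p_feature)
print(round(p_black_given_feature, 2))  # 0.88
```

One correlated field moves the model from a 15 percent prior to near-certainty, and the form never asked about race.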

So getting back to my story, right, the school system, because this is important, because this happens. We look at it like, nah... no, this is real.

This is happening right today.

So you get ready to go in, and now AI says that, no, we're gonna be really fair here.

It's not about DEI.

All hundred of those are white, and they get in, and then we would say, man, how you gonna let your whole freshman class be white?

You know what Georgia Tech is gonna say, I ain't do it.

It's the algorithm, Like I ain't racist.

Hey, look Georgia Tech.

I'm not fighting.

I'm not saying that happens at Georgia Tech.

You could see it happen anywhere.

Just using them as an example.

That's real.

Like that's real.

People like, no, that's real?

Who are you talking to?

Talking to me, or you talking to somebody who say they know AI because they play with ChatGPT?

What I just told you about, that's not ChatGPT.

That's an algorithm that impacts people of color.

Speaker 1

Do you think most wars gonna be fought through AI?

Speaker 2

You know?

Speaker 3

I mean, certainly they're being fought right now, right? Drones is as close to AI as you can get.

Speaker 1

Right.

Speaker 3

It has all of the features that we talked about, and there's war right now.

I mean, you look at Hamas, you look at what's going on in Israel.

I mean, drones is a part of it.

Right. Almost every police department that you go to right now has a drone division.

You won't find one that doesn't, by the way. Pick your favorite one.

Speaker 2

It's like, we looking out for the police car and looking for who looking strange, but the mother right there in the sky looking straight at you.

Speaker 3

You think about it, man, Like I said earlier, like how we started off.

It used to be a different time.

I remember as a little boy growing up, not just a little boy, like if I go to my neighborhood right now, police would ride by and they would see you, and they would pull you over and ask you questions no matter what, trying to get you to say you saw something, bring out a hat, bring out a piece of shirt, you seen anybody wearing this, blah blah blah.

Uh.

That happened then, and it happens now.

They don't really need to happen anymore.

That don't really need to happen anymore, because now, with predictive policing, with facial recognition, with the cameras, with the detectors, you probably want to get out the streets right now.

I'm gonna be honest with you.

You need to think about something else, right because back in the day, it was it was kind of easy, like if you just yeah, somebody had to you said earlier, like you was talking about when you ever started to go back to Fanny, you said, I thought it was very very insightful.

To be quite honest, you said, Hey, it's Fanishs job to catch us in our job not to get caught.

That's that street talk. That's the way.

That is the police job.

You can't get mad at the police for trying to catch you.

Now, you can be mad at them for how they're doing their job and how they're targeting you, but it's your job not to get caught.

Well, what's your chances of not getting caught now?

So you know what I'm saying? Yeah, because now I don't need to go and talk to your neighbor and say, did they see you, or did they hear you?

But the problem with that is just this, though. This is the problem, right? It goes back to the facial recognition piece, the voice activation piece, like, uh, ShotSpotter. Like right now, have you ever heard of ShotSpotter?

Never?

Okay. So ShotSpotter is a policing technology that uses AI to detect gunshots.

So the idea behind it is, as opposed to police putting a camera on a pole in your neighborhood, say, let's just say Capitol Homes, they put ShotSpotter technology there.

There's a company that sells this technology.

It's a voice box, a small voice box, and what it does is pick up a shot. So you'll be anywhere, and as soon as it hears the shot, it's programmed to alert the police.

So facial recognition tells the police it's you.

Predictive policing tells the police this is where you most likely going to be, and ShotSpotter tells the police this is where the crime happened.

Because if you shoot in the air, ShotSpotter is gonna capture it.

Look, when I did my book, I interviewed a guy in Chicago.

His name is Robert Williams.

I'm probably gonna have him on. I'm sorry, his name is Michael Williams. We interviewed Robert Williams too, but Michael Williams is the one who was arrested.

He picked up one of his friends about one o'clock at night and was driving along, and the Chicago Police Department had ShotSpotter around, and somebody shot, and they arrested him.

They said that it was him who did the shooting.

But what they don't take into consideration is that that technology is about twenty percent accurate.

Twenty percent accurate.

They arrested him, they kept him in jail for a whole year.

He tells his own story.

He was suicidal.

It wasn't him.

He ain't do anything. Like, that's real, that ain't made up.

You can google him, Robert Williams, you can google him, you can bring him to your show.

He's gonna tell you like that's real.

That's real talk.

It's real talk.
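The twenty percent figure quoted above is worth a quick back-of-the-envelope pass. The accuracy number comes from the conversation; the monthly alert volume below is invented just to show the scale.

```python
# If a gunshot alert is right only 20% of the time, most dispatches
# land on people who did nothing. The 20% figure is the one quoted in
# the conversation; the alert volume is hypothetical.

accuracy = 0.20
alerts_per_month = 500                       # invented volume for one city
true_hits = int(alerts_per_month * accuracy)
false_alarms = alerts_per_month - true_hits
print(true_hits, false_alarms)  # 100 400
```

Four out of five alerts send police toward somebody who did nothing, which is how a story like Michael Williams's happens.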

Right. So we've just been talking about one industry here, right? Policing.

But I could say the same thing about getting a loan, getting a loan, like in my book.

I'm not trying to plug the book at all.

It's not my point.

So I wrote this book called Hidden in White Sight: How AI Empowers and Deepens Systemic Racism.

So it flipped things on their head.

It just got people all disrupted, crazy.

But guess what, So I wrote the book.

The book came out in twenty twenty three.

So I finished writing it in twenty twenty two.

Everything I'm talking to you about today is in the book. So you think about it.

That was way before we started talking about AI.

We weren't talking about AI in the general public in twenty twenty-two, period.

But this book specifically.

Now, when I wrote the book, I had actually had the book launch right down the street, Right down the street.

CNN came to the book launch.

C SPAN came to the book launch.

Why would they come to the book launch?

I was nobody.

They read the book, they knew.

But see, the thing is, when I wrote the book, I didn't write it as a tech book.

It's not a tech book at all.

It's a regular book for you.

It's telling you stuff that's going to happen to you with this AI, and the powers that be... I wouldn't say they didn't want me to, but obviously they probably would have preferred I not write this book.

But the reality of it is that there's lots of reasons why I wrote it.

Because of everybody I talked about in the book. I didn't just talk hypothetical kind of stuff, AI techie kind of stuff.

No, I told real stories with real people, with people that I know.

Let me give you another example in another industry, because we've been beating up on policing.

Let's just say getting a house loan.

So I have a friend, her and her husband. Her husband went to Morehouse and she went to Spelman, and they had a condo in Manhattan, New York.

They was moving to Georgia, had already found a house there, and they applied for a loan.

Everybody was pretty excited.

You know, they had gotten preapproved for the loan, and like anything, when you get ready to get a loan, you kind of know whether you're going to be approved or not.

It's only like four things they gonna check: what's your credit score, do you work, what's your debt ratio? Like, basic things.

Those are the things a human underwriter can tell you.

Do well on those four things, you probably gonna get the loan.

So they had already gotten pre approved.

So they come down, and about a week before they get ready to close, she gets a call saying, hey, there's something going on with your loan.

You've been flagged by an algorithm.

She said, you've been flagged by an algorithm.

Now, she knows me, she knows I'm in this space.

So she picks the phone up.

We don't know nothing about no algorithm.

Now mind you, this is like twenty twenty. We don't know nothing about no algorithm.

So they called me up.

I was like, man, I remember working on this with this bank, on this application.

I was like, you need to find out exactly why they say you didn't get approved, because by that time they had told her that she had basically missed one of the criteria.

Yeah, right. And so they told her the criterion she missed was that she hadn't been on her job long enough.

She's working for the state.

She's working for the state of New York, and she came out here.

That don't make any sense.

So we knew immediately that that wasn't the real reason.

But check this out, though, right? See, a human underwriter, a human underwriter only has the capacity of being able to assess about four or five things. But not my algorithm.

My algorithm can take in thousands of parameters, and I can assess you.

Because the bank they want to reduce their risk.

They want to reduce their risk.
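The gap between those two underwriters can be sketched in a few lines. This is hypothetical code, not any bank's actual model: the four human checks are the ones named in the story, and the extra field, thresholds, and weights are all invented to show how one bought-in data point can swing a score.

```python
# Contrast from the story: a human underwriter checks a handful of
# criteria; an algorithmic scorer can weigh arbitrarily many features,
# including ones the applicant never volunteered. All field names,
# thresholds, and weights here are hypothetical.

def human_underwrite(app):
    # The four basic checks a human has time for.
    return (app["credit_score"] >= 680
            and app["employed"]
            and app["debt_ratio"] <= 0.43
            and app["down_payment"] >= 0.05)

def model_score(app, weights):
    # A linear risk score over however many features the bank can buy.
    # Note: bools count as 0/1, so flag fields feed straight in.
    return sum(weights.get(k, 0.0) * v
               for k, v in app.items()
               if isinstance(v, (int, float)))

applicant = {"credit_score": 720, "employed": True, "debt_ratio": 0.30,
             "down_payment": 0.10, "household_felony_record": True}

print(human_underwrite(applicant))                 # True: passes all four checks

weights = {"credit_score": 0.001, "household_felony_record": -2.0}
print(round(model_score(applicant, weights), 2))   # -1.28: one extra field flips it
```

Drop the invented felony field and the same score comes out positive; nothing in the model ever names race or family, which is exactly the "don't be mad at me" defense the bank gets to make.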

So the bank can basically say, huh, who was living in the house with you when you was there?

Let's just say your brother who just got out of prison, was living with you there.

You think the bank care about that?

Yeah, you got a felon in your house.

But should that matter?

It shouldn't.

But does it matter?

In their case?

It did, because the brother had gotten out of jail and was living with them, even though they didn't put it on the application.

Then you might ask, a good question would be, how did they find out about it?

Real easy?

The banks got relationships with the Census Bureau.

All of these companies got relationships.

So if you took the Census, I know who's in your house, And the bank is saying, don't be mad at me.

Well, in our neighborhood, in our communities, I don't know about you, but at my mama's, I don't care what my brother or my sister done.

They always can come back home and stay with mama.

You get out of jail, you come back home, stay with mama.

So who do that rule hurt?

Speaker 1

It will hurt us. Everything, everything you name, it hurt us.

Speaker 3

So my point is this, man. I might go and build an algorithm tomorrow, but I'm gonna be more sensitive to that when that bank calls me. I'm not gonna be one of those.

For sure, I do.

For sure I do. Like, I wouldn't be doing this podcast otherwise. I'm still in this business.

No, I'm telling them that you can do these things right.

You just need to look at it from the perspective of the victim.

Speaker 2

What if they tell you, nah, I want my shit the way they got it, leaning towards this?

Speaker 3

No, it's interesting. You answered your own question on why I sit here now and why I resigned.

I own my own company now. I decide what I want to do and what I don't want to do.

When you work for a corporation? No, you ain't turning nothing down. What do you mean, turning something down?

You don't get you don't get that option.

You don't get that option.

But you know what I learned though, this is the craziest thing.

I didn't realize it, but my mom would always tell me as a little boy, she's like, boy, you can make a million dollars selling rolls on the side of the street.

You don't have to be nobody.

You don't have to do nothing you don't want to do.

And that resonates now, because, man, it's so true. I can make as much money as I want.

Like, there's always somebody who's like, we want you. Some company is calling me, and some I turn down.

I'm still saying these same things and I'm still making more money than I made before.

Speaker 1

Yeah, it ain't affect you in that way.

Speaker 3

It's not gonna affect anybody. Being morally strong adds to you. In fact, it don't take away from you.

Look, man, the reality of it is, and I ain't calling nobody out, I ain't calling nobody out, but it is a plantation mentality. Corporate is a plantation.

Look at me, right, I grew up.

Look, I grew up in the projects.

Got a big old house, drive any kind of car I want, got you know, money in the bank.

You're not gonna risk that, so you kind of like on a plantation in a sense, because there's certain things you can't say and certain things you can't do, because if you do, you lose all of the stuff.

You know.

A friend told me this.

Man he said, uh, and it was recently, it was within this past year.

He said, dude, he said, I'm proud of you, man.

I thought it was weird.

It's like, grown men talking about, you proud of me? Like, what are you talking about?

And he said this to me, he said, he said, you know, I read the book.

He said, you broke away.

You did something that very few people do.

He said that most of us who grew up kind of like I grew up, went into corporate, made a lot of money, you know, doing different things.

We're afraid.

That's his fear.

And he was like... and I was like, man, what you mean, afraid?

I ain't afraid of nothing, you know. I ain't even say nothing.

I said the word like, no, and he said, no.

He said, fear is, he said, being afraid that you're going to lose everything you've worked so hard to get.

And it hit me like a ton of bricks.

It's true, right? Because, you know, as a black man, you can't make mistakes.

The minute you make a mistake, you get cut off at the ankles.

So you almost gotta be, in this space, squeaky clean.

You ain't going to jail, blah blah blah, he said, Being afraid that you're gonna lose everything that you've got.

He said, if that one don't get you, the other one sure will get you, he said: being afraid that you're not gonna get all the things that you want in the future.

Like I like the idea that I can just go to Amazon and just bring out my phone and click on it.

I don't care what the cost is, just clicking.

I like how that feel.

Like most people in my family they don't have that same feeling.

So it was harder for me to do what we're talking about doing here than it would be for a normal person because all of that stuff that I've attained came from this world that I've created.

But at the end of the day, you know, I firmly believe that freedom is the reward you get for telling the truth.

Speaker 2

That's facts, that's facts. Right, what's the name of the book one more time before we get out of here?

Speaker 3

Hidden in White Sight: How AI Empowers and Deepens Systemic Racism. You gonna bring me one?

Hey man, I meant to bring you one, and I left it on the table.

I left it on the table, But I'm gonna bring it back.

I got a feeling that you're not gonna be chopping.

Speaker 1

Out for sure.

Speaker 2

For sure, because we ain't even... we're gonna touch on it, because I'm coming on your podcast.

Speaker 1

We're gonna touch on how it affects the music too.

Speaker 3

Oh man, that's that piece there, dude, that piece there.

Man, record companies, they call me all the time.

Matter of fact, I was in LA not too long ago.

This stuff is big there because if you really think about it, content is key.

Content is key.

Like, if you're in the record industry, you need to be concerned.

Speaker 1

For sure, we're gonna touch on it. Tell people how to tap in with you.

Speaker 3

Hey man.

They can.

They can get me at, uh, Calvin D Lawrence dot com, and then from there they can get all of the social media components.

Speaker 2

You work your own social media? A little bit of it, yeah, I work it, yeah, man. Hey man, well, you probably can help me with something on that.

Hey, make sure y'all tapped in with the man, and go lock in with the Big Facts Network.

Speaker 1

Until next time.

Speaker 2

Another epic episode of Perspectives with Big Bank.

Speaker 1

Follow on Instagram at Big Bank at Yo Yo Yo
