
Exposing Pegasus: How the State Spies on You | John Scott-Railton

Episode Transcript

You are sideways to the interests of a state right now.

There's like a reasonable chance that something is on your device.

And the thing is, you won't know.

Nothing you see, nothing you do, no flickering screen, no sudden drain of battery, no warning sign, no link to click, no attachment to open.

You're just compromised.

There's nothing behavioral that you can do to protect yourself.

What scares me about this conversation as we apply it to the world of Bitcoin is that many different players in this ecosystem, I think, are going to discover many of those same incentive structures if they haven't already.

What they're actually asking for is something that changes the structure of the internet.

And you get a situation where you're going to need a passport to speak and a passport to post and a passport to listen.

If you build systems that allow for control and access, the temptation is just too great.

Welcome.

It's very good to have you here, John.

This is a show I'm really excited about.

I do so many Bitcoin shows.

I love Bitcoin.

I love talking about it.

But stepping outside the box into these tangential issues, and I do think this is tangential in a lot of ways, is always pretty refreshing.

So I'm glad to do this one.

We met a couple of years ago.

We did.

I've been kind of following your work loosely since then, and you're a very interesting person.

So you were one of the team that in 2016, I believe, discovered Pegasus, or exposed Pegasus.

Yeah.

Do you want to start by maybe explaining what Pegasus actually is, and then what happened in 2016?

So think about your phone in your pocket right now.

It contains a large part of your external brain.

It wasn't always so, right?

It was only like 15 years ago that we didn't really have a pants computer that we carried around with us that contained all this sensitive information.

Well, as mobile technology has proliferated, governmental desire for gaining access to that technology has exploded.

And one of the many ways that governments do that is something that we call mercenary spyware.

So this is the ability, silently, covertly, to infect your phone and turn it into a spy in your pocket.

Activate the camera, the microphone, read your contacts, listen to your Signal calls, read your encrypted messages, look at your photographs.

Anything that you can do on your phone, it can do.

And some things you can't, like silently making the phone a hot mic to bug a room, recording video, accessing your cloud accounts.

Pegasus is one of those technologies.

There are many.

Pegasus is in many ways the most notorious and the most well-known, partly because of its market success and selling to lots of different governments, and because we keep finding it.

And did that spyware, where did it come from?

So Pegasus originates from a company called NSO Group, which flies different name flags, but we'll call them NSO Group for the purposes of this conversation.

They're an Israeli company, and they took, I would say, a set of technologies and skills that came out of Unit 8200, which is Israel's sort of NSA-equivalent military entity, and other parts of Israel's military intelligence complex, and turned it into a profit-making product.

People call Pegasus software, but it's, I think, more effective to think about it as a service.

Basically, imagine you're a government.

You've got a leader in his 80s, doesn't want to leave.

You're worried about your governmental stability.

There's a bubbling opposition somewhere.

What are you going to do about this?

Well, you want to monitor that opposition.

So you go to NSO Group and you say, listen, I'd like to buy Pegasus.

I want to get on these people's phones.

And the contract that you'll get, it's so interesting.

It's like DRM for spyware.

You get a number, not of seats, like with Microsoft Word, but concurrent infections.

So you might have like a 20 concurrent infection contract.

And then for the period of a year, NSO is going to basically guarantee you, look, for this year, we promise that we'll do our best to make sure that you can hack iPhones, that you can hack Androids.

And then it's kind of up to you how you use that.

Some governments that are just like afloat in cash will buy like 100 concurrent licenses, big numbers.

Others that are like tightwads will get their 20.

And then they will just grind on those licenses.

It's like hack like 20 people in the morning, finish those infections, 20 people in the afternoon, do it again the next day, right?
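To make the mechanic concrete, here is a toy sketch of that contract structure in Python. The slot count and target names are hypothetical; the point is just that the license caps concurrent infections, not total targets, so a frugal operator cycles victims through the slots, as described above.

```python
class ConcurrentInfectionPool:
    """Toy model of the 'concurrent infection' licensing mechanic described
    above. The operator buys N slots; to hack target N+1 they must first
    finish (release) an existing infection. Purely illustrative."""

    def __init__(self, slots: int):
        self.slots = slots    # e.g. a hypothetical 20-concurrent-infection contract
        self.active = set()   # targets currently infected

    def infect(self, target: str) -> bool:
        if len(self.active) >= self.slots:
            return False      # no free slot: finish an infection first
        self.active.add(target)
        return True

    def finish(self, target: str) -> None:
        # Sweep the data, end the infection, free the slot for the next target.
        self.active.discard(target)


pool = ConcurrentInfectionPool(slots=20)
morning_batch = [f"target-{i}" for i in range(20)]  # hypothetical targets
assert all(pool.infect(t) for t in morning_batch)   # 20 slots fill up
assert not pool.infect("target-20")                 # the 21st is refused
pool.finish("target-0")                             # finish one infection...
assert pool.infect("target-20")                     # ...and the slot is reusable
```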

And when they infect a phone, can they essentially download everything they want from that phone and move on to the next one?

They can.

And in fact, there are these cases that we see.

So when we do forensic analysis of devices, we can often see numbers of infections, right?

So like on this date, your phone was infected.

On this date, your phone was infected.

And there are journalists, for example, who earn the ire of certain governments.

Their phones might have had like 20, 30, 40 infections over the course of a year or two.

So they're just checking in periodically, sweeping data, moving in.

It's like your top up, right?

You know, like the state's just gonna go poke by and look in your underwear drawer.

You know, it's a Monday, let's go check it out.

And this, like you said, this came out of an Israeli company, but they're selling this to basically every government around the world.

Is that right?

So there's a whole ecosystem of these players.

And I think we can talk about NSO kind of like as a stand-in for a lot of them.

There are certain restrictions that the Israeli government places on offensive technology sold from Israel.

And so there are certain states that they're not going to sell to, like Iran, for example, or maybe Russia, because maybe the Israeli government sees a diplomatic risk in doing that deal.

but they sell to a lot of governments.

And the consequence of this has been the massive proliferation of this capability to governments that previously had no such technical capability.

So think about it like this.

You know Neapolitan ice cream?

Of course.

Strawberry, vanilla, chocolate.

Do you like Neapolitan ice cream?

No, I'm not really a fan.

Me neither.

But it's seared into my memory, because as a kid, I hated getting my hair cut.

My mother cut my hair.

And to try to distract me, she would put a bowl of ice cream in front of me.

There's always Neapolitan.

So when I think Neapolitan, I think of sort of hair in the ice cream, you know, tears, terrible things.

I just missed the days of hackers.

But let's- We're all getting there.

So let's think about these three flavors.

So strawberry, governments that have a deep STEM pipeline.

They've got cryptographers.

They've got mathematicians.

They can develop their own technologies for doing this highly sophisticated hacking.

They can develop exploits.

They can do this.

Then vanilla are governments that don't have that pipeline, that don't have those mature security services, but they've got a checkbook.

Chocolate is like pariah states.

Syria would have been an example, right?

Where they can't necessarily go to the open market.

They can't go to Israel and say, hey, we'd like to buy your best toy.

And so they'll find other kind of clever cobbled together, my cousin knows computers, ways of doing hacking.

But whatever the flavor, the root is like to your phone, to your webcam.

And what's so interesting about this problem set is that when states, especially sort of like states that don't have a STEM pipeline, suddenly get this technology, they're like technically punching way above their weight.

And dissidents, activists, and politicians are not ready for it, right?

One day, their government had a bumbling security service that couldn't get its act together and was shot through with corruption.

The next day, right, like, you know, their local intelligence officer is like sitting on their phone.

It's a crazy change.

Okay, so who are the people that are abusing this technology the most?

Like, is this the kind of behemoth countries like the USA, or is it authoritarian regimes trying to fight dissidents in their own country?

Yeah.

The answer is like, yes.

Everyone.

And different countries have different trajectories with this technology.

But I think a lot of people listening to this will have heard of Pegasus.

And probably in the back of their mind, they associate it either with a prominent case, like the murder of Jamal Khashoggi, or with the idea of spyware being sold to dictators who then abuse it.

The truth is there are two distinct piles of abuse cases.

One set comes from dictators who predictably use this technology to monitor their opposition, or like the then president of Panama, monitoring his mistress too, and maybe his business rivals.

Why not, right?

The second category is democracies or democracies on paper, teetering on the edge of authoritarianism that acquire this technology.

And the temptation to abuse it is enormous.

Think about it.

And think about what we know from history, right?

When a state gets a secret surveillance power, they will often abuse it.

It's a matter of time.

And so what scares me about Pegasus and similar spyware is not just, oh man, the dictators of the world can flex their power far beyond the constraints of their geography, right?

But that democratic societies are also at risk from this technology.

They're at risk because this temptation to have a Stasi-like capability is just way too big.

Way too powerful.

Way too powerful.

And this is kind of a hard question, I think, to answer.

Is like, where is the line in terms of state surveillance being not necessarily okay, but acceptable?

Like nation states are going to try and monitor and track criminals and like terrorists, whoever that might be.

But when does that creep over the line to be they're spying on all citizens just in case you're a terrorist or a criminal?

Just in case.

You're asking a really good question.

Now, if you look at the marketing materials for NSO or similar categories of spyware, we've been doing work on this spyware for more than a decade.

And what we find is that the marketing is, they call it lawful intercept software, right?

The premise being, well, if a police service is doing it, it's lawful.

The implication is, well, police need to be technologically enabled to chase bad actors down deep all the way to Hades, right?

So it's an intuitive thing for people.

The challenge with the technology is that it often arrives into countries that don't have anything like the legal oversight mechanisms, judicial mechanisms, warrant systems, to ensure that people in that country, like their rights, their choices about the kinds of power that they want to give their government are being respected.

Moreover, because the industry is itself like just like an absolute morass and a pig maw of secrecy, people don't usually know when their governments are using this technology until we, my brilliant colleagues, other people who are working in our space, investigate and we find abuses.

That's a problem.

I believe firmly that citizens need to know the constraints, the limits on the power of the state.

How long are the state's digital arms?

And technology like Pegasus means that those arms are often way longer than people realize.

with huge implications for their freedom.

Bitcoin is absolutely ripping and in every bull market, there's always a new wave of investors and with it, a flood of new companies, new products and new promises.

But if you've been around long enough, you've seen how this story ends for a lot of them.

Some cut corners, take risks with your money or just disappear.

That's why when it comes to buying Bitcoin, the only exchange I recommend is River.

They deeply care about doing things right for their clients and are built to last with security and transparency at their core.

With River, you have peace of mind knowing all their Bitcoin is held in multi-sig cold storage, and it's the only Bitcoin-only exchange in the US with proof of reserves.

There really is no better place to buy Bitcoin, so to open an account today, head over to river.com forward slash WBD and earn up to $100 in Bitcoin when you buy.

That's river.com forward slash WBD.

What if you could lower your tax bill and stack Bitcoin at the same time?

Well, by mining Bitcoin with Blockware, you can.

New tax guidelines from the Big Beautiful Bill allow American miners to write off 100% of the cost of their mining hardware in a single tax year.

That's right, 100% write-off.

If you have 100k in capital gains or income, you can purchase 100k of miners and offset it entirely.

Blockware's mining as a service enables you to start mining Bitcoin right now without lifting a finger.

Blockware handles everything from securing the miners to sourcing low-cost power to configuring the mining pool, they do it all.

You get to stack Bitcoin at a discount every single day while also saving big come tax season.

Get started today by going to mining.blockwaresolutions.com forward slash WBD and for every hosted miner purchased you get one week of free hosting and electricity.

Of course none of this is tax advice, speak with Blockware to learn more at mining.blockwaresolutions.com forward slash WBD.

This episode is brought to you by the massive legends IREN, the largest Nasdaq-listed Bitcoin miner using 100% renewable energy.

IREN are not just powering the Bitcoin network, they're also providing cutting-edge computing resources for AI, all backed by renewable energy.

We've been working with their founders, Dan and Will, for quite some time now and have been really impressed with their values, especially their commitment to local communities and sustainable computing power.

So whether you're interested in mining Bitcoin or harnessing AI compute power, IREN is setting the standard.

Visit iren.com to learn more, which is I-R-E-N dot com.

Can I, this is a little bit of a tangent, but I'm sure you followed the Tucker Carlson story.

I don't know when this was, maybe like a year ago.

I think it was when he was going out to Russia to interview Putin.

And he claims that his signal account was hacked.

I personally highly doubt that.

I assumed at the time that that was actually that he had Pegasus on his phone.

Do you know what the likely scenario that was?

It's an interesting question.

So one caveat, which is I can't speak about cases that my colleagues or I haven't looked at.

I don't know what's going on there.

But what I can tell you is governments everywhere have an appetite for using this kind of technology.

And if there's one thing we know, it's that you, especially if you're a prominent person, may have multiple governments that are interested in you.

So we talk about proliferation, like a term from the monitoring of the proliferation of arms.

right?

Like, it's a bad thing.

Like, if sophisticated weapon systems go everywhere, conflicts are bloodier, and there are more of them.

It's like an axiom.

I think the same is true for sophisticated surveillance powers.

And my view about cases like what Tucker Carlson claimed and others have said is that if you are sideways to the interests of a state right now, and that state is technologically enabled with this kind of powerful stuff, there's like a reasonable chance that something is on your device.

And the thing is, you won't know. The absurdity of the scenario right now is there's no commercial tool you can buy that will protect you from this.

There's no third-party app that you can put on your phone that will scan it and tell you with high confidence, you're clean or you're not clean.

And so you have a scenario where people are walking around, with the awareness and sometimes without it, that somebody could just be in my shit at any time. That's dangerous.

It's dangerous to thought.

It's dangerous to the ties that bind us together, our ability to safely share what we think, our private explorations of knowledge and ideas on the internet. And ultimately, it leads to self-censorship, and it's not like an abstraction.

So the movie The Dissident, by Bryan Fogel, really good movie. That movie had a huge, huge challenge getting picked up by American distributors.

Fogel was just coming off an Oscar for another movie that he'd done.

Why?

Well, it told the story of Jamal Khashoggi getting hacked with Pegasus.

Excuse me, people around Jamal Khashoggi getting hacked with Pegasus.

We don't know the status of Jamal's phone because it's with, they say, the Turkish authorities, but everybody around him seems to have been targeted.

His wife, his fiancée, his friends, his colleagues.

Why didn't that movie get picked up?

They don't want that guy now, then.

Well, I think movie executives were themselves terrified by the story of Jeff Bezos, right?

The idea that your business, too, could be rifled through by Saudi Arabia, even if you're sitting pretty in a studio in Southern California enjoying the clement breezes of Malibu, right?

You still ran the risk of your personal world being dumped out on the, you know, shiny aluminum table of a dictator's security services halfway across the globe and then have that used against you.

This stuff has huge global implications because of that idea that they can sort of project power.

We talked about proliferation a second ago.

Like we view the proliferation of ballistic missiles as generally a bad thing, right?

Because of their range in part and because of the things that they can carry.

Well, spyware has like infinite range, right?

This is terrifying.

It used to be that if you were a dissident, like in Egypt, for example, you could seek the protections of geography.

You could, if you get out of that country, you could go somewhere.

You go to the United States and you could be reasonably safe.

You weren't going to have 15 Egyptian security officers in ill-fitting trench coats and polyester suits following you around in like, you know, Memphis, Tennessee.

Yeah.

Couldn't happen.

That's not true anymore, right?

You have this thought in the back of your mind, if you're a dissident, those- You can always be touched.

They, and if not me, it's my spouse.

It's my friends.

It's the global village of people who I'm in touch with.

That's a terrifying and dangerous reality.

So in that reality, is the only solution, if you're like a high-flying tech executive, if you're like a politically exposed person, you're a journalist who's like writing about whatever, to just not have a phone?

Except that's impractical.

And so the reality is everybody's got a phone.

Most people don't know like the status of their phone, but they think about it somewhere, whether it's a pebble in their mental shoe or a general sense of foreboding.

So we check hundreds, thousands of people a year.

We screen them for spyware like Pegasus, Paragon's Graphite, Predator, all kinds of things.
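As a rough illustration of what a screening like that involves, here is a minimal sketch. The Citizen Lab's actual pipeline is not public; this follows the same broad idea as open tools such as Amnesty International's Mobile Verification Toolkit: pull artifacts off a device backup and match them against published indicators of compromise. Every indicator value below is an invented placeholder, not real spyware infrastructure.

```python
# Minimal consensual-screening sketch: match device artifacts against known
# indicators of compromise (IOCs). All IOC values are invented placeholders.

KNOWN_BAD_DOMAINS = {"fake-exploit-cdn.example", "update-check.example"}
KNOWN_BAD_PROCESSES = {"fakehelperd", "fakeaccountd"}  # hypothetical names

def scan_artifacts(visited_domains, running_processes):
    """Return IOC hits. An empty list means 'clean for what we know to
    look for', not 'clean', exactly the caveat discussed below."""
    hits = [("domain", d) for d in visited_domains if d in KNOWN_BAD_DOMAINS]
    hits += [("process", p) for p in running_processes if p in KNOWN_BAD_PROCESSES]
    return hits

# Artifacts extracted from a hypothetical phone backup:
print(scan_artifacts(
    visited_domains=["news-site.example", "update-check.example"],
    running_processes=["launchd", "fakeaccountd"],
))
# -> [('domain', 'update-check.example'), ('process', 'fakeaccountd')]
```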

Something really interesting happens when you check a person's phone for spyware and you give them the result.

I thought when I first started doing like large-scale checks with my colleagues, you know, I have these incredibly brilliant colleagues.

And one of my colleagues, Bill, developed an amazing technique that we automate for checking phones, Androids and iPhones, for mercenary spyware like Pegasus.

I thought, okay, the super majority of people we check are going to be negative.

They're going to be clean for the things that we check for, according to what we know, right?

Like it's very possible that there are things we don't know to look for.

And I thought, well, people are going to feel like their time is wasted, right?

I'm wasting people's time.

I've convinced them to get checked, right?

We got a line of people and I'm giving them, in my mindset, the bad news that they wasted their time.

Most people like getting a result and they like getting a clear result because they have carried with them, whether they realize it or not, some baggage bubbling in the lower bits of their consciousness or way up front.

And suddenly we can say, look, now you have knowledge.

Now, the really crazy thing is I was equally afraid of having a scenario where I would have to tell a person the bad news, which I do every year again and again and again, whether it's a phone call or it's an in-person conversation.

I have to tell them, listen, you didn't know it at the time, but your digital world was not your own.

There was somebody in bed with you when you were talking with your spouse.

There was somebody in the room when you were taking a shit.

There was somebody watching as you were thinking about your finances or your health or somebody looking at you as you were trying to understand the challenges of your adolescent kid, right?

It is very traumatic to receive that news.

But there was this one moment.

So I was doing an investigation with my colleagues into the abuse of Pegasus spyware in Togo, small country in Africa.

If Africa is like an ice cream cone with two balls, Togo is sort of on the second ball.

It's kind of over here.

Very, very small little notch of a country, French speaking.

And I had to call up a bishop, a Catholic bishop, and let him know that he had been targeted with Pegasus.

I'd never called a bishop before.

What's that conversation like?

So I call him up, introduce myself.

I give him this result.

It's this sort of long pause.

And I don't know what to do with myself in this moment, feeling everything.

And then he thanks me.

And he says, I'll never forget what he says.

He says, thank you for bringing me this truth. In a dictatorship, we don't have a lot of it.

And it's like being in a place without a lot of oxygen.

You just gave me a breath of truth.

You gave me a breath of oxygen.

Thank you.

That's super powerful.

It's funny, when you sort of started that, I thought you were going to say something different.

I thought you were going to say, when people receive the result and it's negative, they would be disappointed in the sense that maybe I'm not pushing hard enough.

Like I thought they would be watching me.

Some people are.

And there's like a category.

I think I would be.

If I should find out.

Should we?

Can you find out?

So maybe I would love to do that.

Can we do it now?

We can do it right now.

How long does it take?

10 minutes, 15 minutes.

Let's do it.

Should we do it?

Should we break and do it?

And then we can see what happens.

Here's what we're going to do.

We're going to check you and we're going to have to turn these things off.

Yeah.

Is that because what you do is...

So the challenge, it's like doing a rapid test for, like, I don't know, strep throat.

The difference is that like strep can read all the scientific papers, right?

It would be as if strep was like looking at how the test works.

It's like, all right, I got to change myself a little bit, right?

So there's some chance that somebody from NSO, hi guys, is watching this conversation, hoping that I will emit a little signal that lets them know the ways that we're checking.

So we should turn things off for a minute.

So we can do a check, have the process bubbling along, and then maybe a little bit later, get your result.

All right, let's do it.

Let's break it.

All right.

So we were in- The audio works- You've got your thing running away.

We're going to find out- Chugging along.

If Pegasus is on my phone.

Answers are about ready.

I seriously doubt I have Pegasus on my phone, but I'm excited to find out.

I don't think I'm the kind of person that anyone would want to target.

Let me just do the weirdest thing and be the one asking you a question for a second, which is, I think, part of the value of this for people who are watching: most of us go around every day without the knowledge of what's going on in our phone.

Yeah.

And maybe they're wondering right now, like, is Pegasus on my phone?

And if you're watching this, like take a minute to think about what you're feeling, but what are you feeling right now as you're thinking about this?

Like, what has this triggered in you?

Honestly, I think I'm probably treating it quite flippantly because I'm very confident it's not going to be on there.

Like, I don't think I'm interesting enough to anyone for it to be on there.

But that might turn out to be a really naive take.

But maybe that comes down to, like, who are the people being targeted?

Like, would it ever be just like a relatively, like, just a normal person?

Absolutely.

And I've talked to a lot of them.

Here's the thing.

So it's like dating in your 20s in reverse.

So when I was in my 20s trying to date, I was often saying, it's not you, it's me. Spyware is the other way around: it's not me, it's you. You are as interesting and as likely to be targeted as the most interesting person that you know.

Why is that?

Well, think about it like this.

If I'm a government and I want to catch you in unguarded moments, I want to know what you think potentially, right?

I've got a set of questions about you.

Maybe you're being careful, but are you enforcing that carefulness on everyone around you?

No, it's impossible.

And so we would call this off-center targeting.

It may be that you are at your most unguarded when talking to the people around you.

I think that would probably be true for almost everyone.

Exactly.

Because we exist in communities and in groups.

One of the biggest challenges that I think people have when they get interested in privacy and safety is they think about it in this frame of my phone.

How locked down is my phone?

But every conversation happens with another person or a group of people, how locked down are they, right?

You can't really ask them to be as obsessively locked down as, you know, the Bitcoin Core woman who's got, you know, she's got GrapheneOS and she's like only, you know, she's compiling everything herself and she's verifying all the builds, right?

Like that's not reasonable for most people.

And the economics of spyware benefits from that fact.

There's a second category of risk from this technology, which is it keeps showing up being used by actors that didn't develop it.

What do I mean?

So twice now that we know about, thanks to some research from Google's threat analysis team, exploits, which are the techniques used to put spyware on a device, have shown up being used by actors that didn't develop them.

Yeah.

So very sophisticated exploits.

We call them in this case zero-click, which means that there's no interaction required by the victim.

It's just like one minute you're chugging along, the next minute your phone is hacked.

Oh, so I had the complete misunderstanding about this.

Oh, let's talk about this.

Because I thought this was coming from, like, dubious emails that you accidentally click on, and that's how Pegasus actually got on your phone.

Let's pull back for a second.

So back in the day, it did.

Back in the day, the way that you would infect somebody with Pegasus, typically, was by sending them a text message.

So in the earliest days, our earliest investigations, the biggest one that we did, me and a group of my colleagues and then collaborators at three different Mexican civil society organizations, organizations working with dissidents and journalists, organizations working with the families of people who've been disappeared by cartels.

We found that the Mexican government had been an extensive abuser of Pegasus spyware.

How did they infect people?

Well, they would send them text messages.

Well, if you're sending a person a text message and you're a hacker, you really want them to click. And if you're a government, you know a lot about them.

So the messages would be like hyper-personalized.

Maybe it's like, hey, your name just showed up in a news article.

Check this out, right?

Or a message appearing to come from the phone company about your phone balance.

You're running out, right?

Or, hey, name, your daughter, correctly named, just got in a traffic accident.

I'm just letting you know she's in this hospital.

Here's the map point to it, right?

That shit happened.

Or, yo, name, I just saw your wife having sex with somebody.

You're not going to believe this.

Here's the video, right?

Very personalized, very deeply enmeshed.

That was true for years.

And for years, the way that we would find Pegasus was actually by looking for the targeting, right?

Now, when they would infect a phone successfully, they would often delete the message.

But the infections didn't always work.

They weren't always careful about it.

And so we would find kind of like, you know, yeah, you find like the burglar has left his skeleton key in the door, right?

But starting in like the end of the teens, the late 2010s, NSO started selling zero-click capability.

So this means you're just chugging along and then your phone is compromised.

Nothing you see, nothing you do, no flickering screen, no sudden drain of battery, no warning sign, no link to click, no attachment to open.

You're just compromised.

And do we know how they actually do that?

We do because we keep catching them despite their claims about being untraceable.

So in 2019, we worked on a very big case where NSO was providing a technology that allowed for hacking people through WhatsApp.

In this case, it was through like a missed video call.

And the first case that we really came across was a lawyer who was representing other Pegasus victims.

And he was like having these weird...

half dreams where he would like wake up in the middle of the night and there'd be like a notification on his phone, like missed call.

And he'd be like, all right, I'll like figure it out in the morning.

And then he'd wake up and the notification would be gone.

And he probably thought, you know, maybe I'm going bananas, right?

Like I've got to, you know, tone down the ambient dose here.

It turned out that NSO had acquired a very sophisticated exploit and done a lot of development to work on it.

And this allowed them to use WhatsApp as the means of entry onto the device.

And then they would put the spyware onto the device.

So what they figured out was like, you know, when a phone connects to another phone on a WhatsApp call, right, there's a bunch of stuff happening.

It's like a handshake, like a modem tone for those who are that old.

You could put all kinds of other stuff into that communication that would be the leading wedge of infecting a device. And what we've seen time and time again is that companies like NSO and others look to popular chat programs, iMessage, WhatsApp, FaceTime, calendar sync, things that have a sort of a cloud component and a discovery component where devices are a little vulnerable with each other, right?

They're exposing something.

They've got to do something.

They've got to pass an avatar of the caller to the other phone.

They've got to do something during that call sequence that they can then target for developing sophisticated means to put spyware on a device.
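For the technically curious, here is a schematic of the bug class being described: parsing code that trusts a length field supplied by the remote caller. The 2019 WhatsApp case was publicly tracked as CVE-2019-3568, described as a buffer overflow in the app's VoIP stack triggered by crafted packets during call setup; the Python below is a toy illustration of that general class, with a made-up packet layout, not the actual vulnerability or any exploit.

```python
FIXED_BUF = 512  # space the receiver set aside for caller metadata (made up)

def parse_call_setup_naive(packet: bytes) -> bytes:
    # The bug class: trust the length the remote caller declared.
    declared_len = int.from_bytes(packet[:2], "big")
    buf = bytearray(FIXED_BUF)
    # Python silently grows the buffer here; C's memcpy would write past it.
    buf[:declared_len] = packet[2:2 + declared_len]
    return bytes(buf)

def parse_call_setup_checked(packet: bytes) -> bytes:
    declared_len = int.from_bytes(packet[:2], "big")
    payload = packet[2:]
    # The missing check: never copy more than you received or reserved.
    if declared_len > len(payload) or declared_len > FIXED_BUF:
        raise ValueError("declared length exceeds payload or buffer")
    buf = bytearray(FIXED_BUF)
    buf[:declared_len] = payload[:declared_len]
    return bytes(buf)

# A hostile call-setup packet: declares 4096 bytes against a 512-byte buffer.
evil = (4096).to_bytes(2, "big") + b"A" * 4096
print(len(parse_call_setup_naive(evil)))  # 4096: data landed beyond the 512 reserved
try:
    parse_call_setup_checked(evil)
except ValueError as err:
    print("rejected:", err)               # the bounds check stops it
```

In a memory-safe language the oversized copy just resizes the buffer or raises; in the C and C++ that real VoIP stacks are written in, the same logic corrupts adjacent memory, which is what gives an attacker a beachhead to implant spyware.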

But it means that there's nothing behavioral that you can do to protect yourself, which is a bananas reality.

And it's why this stuff is so concerning.

It really is like push button.

I'm in your closet, in your underwear drawer, in your business.

Yeah.

So, I mean, that's terrifying.

And you have checked hundreds, thousands, however many devices.

Have you ever been targeted by any of these things?

So there's sort of an interesting thing that's going on.

Because you must be a massive thorn in their side.

They'll know about you for sure.

Yes, we're more than a pebble in their shoe.

Me and my very talented colleagues.

I'm going to tell you a story.

2019, a colleague of mine, Bahar, gets a guy reaching out to him and is like, hey, I would like your help coming up with a project to provide financial solutions to unbanked refugees, which is a noble idea.

Bahar, refugee from Syria himself, was interested.

So he takes a meeting with this guy, goes to a fancy hotel to have a meal.

But in the course of that conversation, the guy starts asking some really weird questions.

What's your father's name?

What's your mother's name?

Tell me about your religion, right?

All this like weird stuff, no investor is asking those questions.

Bahar's backstory is that he had been imprisoned and tortured by the Syrian regime before he successfully fled. He knew what it was like talking to an intelligence operative.

He smelled it.

He could feel it.

And so he listened, excused himself, and called me and said, John, I think I was just in a meeting with a spy. So we immediately started digging into this guy's identity and we discovered he was a ghost. The name wasn't real, the profile picture, the company.

None of these things seemed to really exist.

So who had targeted my colleague?

Well, we were about ready to have a story published about this.

The thinking was, you've got to protect others from whatever this is.

It's going to be a story, like a blind story, like researcher targeted by unknown person.

And then I think it was the day that the story was going to get published.

I get an email in my inbox.

And it's like, John, you've used this technology to do aerial mapping, which was something I was doing for PhD research a long time ago.

I am an investor in Africa.

I'd like to use your technology, which was kite aerial photography, which was cool before drones, right?

I like to use this technology to map my, like, big investment.

Would you like to have a meeting?

I was like, shit, okay.

You know who else is being targeted?

It's me.

Maybe there's something you can do about this.

Let's run this back on these people and figure out who they are.

So we played along.

We arranged the meeting and we went to it.

You want to hear what happened?

Of course I want to hear what happened.

So I'm not a spy and I don't know how spies think, but I've read a few books and I realized I had to, along with my talented colleagues and help from some journalists, think like one and had to be a version of myself that would be compelling to this person.

I can't give up the game, right?

Like, I'm like an academic.

How am I supposed to deal with this scenario?

Like, I'm not made for this.

So I made like a necktie camera.

Because I was like, I can't go buy something, right?

Because like, he'll recognize it, right?

If I have like a spy pen or something, I got to record this.

So I ended up working with a group of journalists from the Associated Press.

And we're like, okay, let's play this along and have a meeting.

So we did.

The Peninsula Hotel, a swank hotel in New York City.

Come to this meeting.

And there's this guy, the investor, right?

And you'd made your own tie camera.

I had modified a nanny cam into a necktie that I bought.

I love that.

Because I was like, well, I have to make my own thing.

But the problem, the hilariousness of this is, I broke, on the taxi ride over, I broke the camera.

I separated it from the hole that I had made in the necktie.

Luckily, I was bringing a group of journalists with me to the restaurant.

They were already gonna be there covertly monitoring.

We were gonna be recording.

I had like recorders on my body.

So we were able to capture this whole meeting.

The transcript's online, if anybody's curious.

So we go to this meeting, and it's an hour of like get to know you and this and that.

But the guy, a little bit like pieces of bad driving, keeps deviating from the kind of like friendly get to know you thing into a couple of different conversational paths.

The first thing that he does, his first conversational gambit, is to try to get me to say something racist, which is bananas.

Almost like the first thing out of his mouth is to try to goad me into saying something really racist and offensive about how Africans speak French.

I'm not going to say it.

Racist and offensive.

And I'm thinking like, okay, I have to like play it cool.

Can't, you know, play into this thing.

But like, wow, this guy is clearly thinking like, he wants this conversation at any given moment.

If it has to stop, he wants to have his stuff, right?

He'll be recording me too.

Oh, he was trying to get it really early in the conversation.

Yeah, I mean, like, you know, you get your first bit early and then you try to develop more things, right?

So, and the hilarious part is the guy was so, in his own way, like bumbling, he had index cards in front of him.

And there were three colors, green, yellow, and red.

And when he was asking me mundane questions, he was like working from the green index cards.

And it was like, so tell me more about this.

And then he'd like cut in with a yellow card, right?

And then like, you know, into the red cards.

And the red cards, like the stuff that he was trying to ask about, he was trying to discredit our work.

And when he wasn't trying to do that, he was trying to find out information about our work on, like, the case of Jamal Khashoggi.

He was trying to get sensitive information about the lab.

And I would sort of like have to give him little conversational gambits.

Like I'd be like, well, you know, how's life at the lab?

Well, you know, we've got a lot of drama.

Oh, drama, he tells me.

I love drama, tell me more, right?

And then I have to sort of, chastely, pull it back.

So we played along like this for like an hour and a half.

The whole time he's been encouraging me with alcohol, like, try the cognac, right?

Like I don't drink, which he didn't know, which definitely put him at a disadvantage.

And eventually I get like a frantic text message from the journalists who I'm like studiously trying not to look at, right?

He hasn't noticed either.

They're like, our batteries are running low.

Like we've got to wind this thing up.

And what I didn't know at the time was they were like on a journalist budget and all they could afford at that restaurant was like one fish cocktail.

So like two guys sitting at a booth behind me, like eating a fish cocktail.

For two hours.

Like an hour and a half, right?

Sipping their ice water.

So we wrap it up.

I try to distract him and I get him looking out the window.

I'm like, you know, kites in Africa.

Like, well, if you just look out the window over here, right.

And he then turns back to discover a journalist and a cameraman, like in his face.

And of course it's like game over for this guy, right.

It's panic.

He like, the journalist is like, so what are you doing?

He's like, I know what I'm doing.

Right.

And then he sort of panics.

He asks, you know, did you get permission to film this?

And then he like almost knocks over a chair in his haste to leave the restaurant, but he'd forgotten to pay the bill.

So the guy has to turn on his heel, come back into the restaurant, followed by a reporter, a cameraman, me with my GoPro that I've pulled out of my pocket, right?

All peppering him with questions like, who are you? Do you work for, you know, this organization called Black Cube, which we'll talk about in a second.

And so the poor guy has to pay the bill.

And then he hides in a back room in the restaurant, closes the door.

And it's like, okay, we've done as much as we can here.

Like we can't pester this guy further.

What we didn't know, but learned shortly thereafter from the news reporting, was that the guy was an ex-Israeli intelligence official and that he was working for Black Cube, which is a private intelligence firm that, a little bit like NSO, performs at being, like, super competent, super secretive, right?

And we know this in part because one of the private investigators that his team had subcontracted, who was also, we learned later, running surveillance on the meeting, became a whistleblower.

Told his story.

And eventually becomes somebody I know and has like told me the whole story.

It's also in publications.

It's in a book by Ronan Farrow.

This stuff is all public knowledge now, which is why I can say it without fearing a defamation lawsuit.

And is that video online?

It is.

Can I put it in this interview?

You have my permission.

Perfect.

Let's go.

I witnessed a foreign private intelligence agency running around New York City as if it's some spy playground.

If I'm getting the geolocation of your cell phone and you're not willingly providing that, then we're breaking some laws somewhere, right?

I'm a licensed private investigator.

He was a subcontractor for a private intelligence firm called Black Cube.

Black Cube is an organization that's a mercenary in the spy business.

An elite Israeli private intelligence agency.

Harvey Weinstein hired Black Cube to help him investigate his enemies, the women who were accusing him of rape, and the journalists investigating the situation, especially Ronan Farrow and Jodi Kantor, who both won the Pulitzer Prize for their reporting on Weinstein.

I didn't know who the client was.

There was always a mystery.

We're going to follow Jodi Kantor, a New York Times reporter.

And he explained to me that we're interested in her sources.

Here's a heads up.

We're going to follow another reporter, Ronan Farrow.

Harvey Weinstein had sources.

He had rats in all of the media.

Specifically had us do surveillance.

He told me that they're going to use the geolocation feature to find out where Ronan is.

Ronan's phone was in this area between, you know, for at least a two-hour time period.

The point of this tracking is to kill the story, to suppress journalism.

Once Igor figured out what he was doing, that he was helping Harvey Weinstein, he had real concerns about that.

I felt like we're doing something that's very unpatriotic by following journalists to find out who their sources are.

He sort of independently decided he was going to come forward with this.

I called Ronan Farrow and I said, I thought, I've got to tell him that he's being followed, that he was followed.

I was in the crosshairs of, frankly, an insane international espionage operation.

Black Cube wanted to give him a polygraph exam to see if he was, for instance, working with Ronan Farrow, which he was.

And I totally freaked out.

I didn't know what to do.

That whole Weinstein case was done under my license.

You know, I'm the one who should be scared.

If you're already self-custody of Bitcoin, you know the deal with hardware wallets.

Complex setups, clumsy interfaces, and a seed phrase that can be lost, stolen, or forgotten.

Well, BitKey fixes that.

BitKey is a multi-sig hardware wallet built by the team behind Square and Cash App.

It packs a cryptographic recovery system and built-in inheritance feature into an intuitive, easy-to-use wallet with no seed phrase to sweat over.

It's simple, secure self-custody without the stress.

And Time named BitKey one of the best inventions of 2024.

Get 20% off at bitkey.world when you use code WBD.

That's B-I-T-K-E-Y dot world and use code WBD.

One of the things that keeps me up at night is the idea of a critical error with my Bitcoin cold storage.

This is where AnchorWatch comes in.

With AnchorWatch, your Bitcoin is insured with your own A-plus rated Lloyd's of London insurance policy, and all Bitcoin is held in their time-locked multi-sig vaults.

So you have the peace of mind knowing your Bitcoin is fully insured while not giving up custody.

So whether you're worried about inheritance planning, wrench attacks, natural disasters, or just your own mistakes, you're fully protected by AnchorWatch.

Rates for fully insured custody start as low as 0.55% and are available for individual and commercial customers located in the US.

Speak to AnchorWatch today for a quote and for more details about your security options and coverage.

Visit anchorwatch.com today.

That is anchorwatch.com.

Do you wish you could access cash without selling your Bitcoin?

Well, Ledn makes that possible.

Ledn are the global leader in Bitcoin-backed lending, and since 2018, they've issued over $9 billion in loans with a perfect record of protecting client assets.

With Ledn, you get full custody loans with no credit checks, no monthly repayments, just easy access to dollars without selling a single sat.

As of July 1st, Ledn is Bitcoin only, meaning they exclusively offer Bitcoin-backed loans with all collateral held by Ledn directly or their funding partners.

Your Bitcoin is never lent out to generate interest.

I recently took out a loan with Ledn and the whole process couldn't have been easier.

It took me less than 15 minutes to go through the application and in just a few hours I had the dollars in my account.

It was super smooth.

So if you need cash but you don't want to sell Bitcoin, head over to ledn.io forward slash WBD and you'll get 0.25% off your first loan.

That's ledn.io forward slash WBD.

That's a wild story.

It took a while.

Feel free to cut as much of this out as you'd like.

I don't want to cut any of that out.

That was brilliant.

The footage is there.

So this is 2019.

Obviously, long time has passed since then.

I'm sure you are pretty considered about how you travel the world, how you live your life.

Yeah, that's right.

Has there been anything since then?

So I don't want to say too much about more recent things, but we're obviously extremely vigilant to this.

But there is another kind of protection.

Before 2021, the only problem that companies like NSO had was like researchers like us blowing the whistle.

Since that time, a bunch of things have happened.

NSO got itself sanctioned by the U.S. government.

Suddenly, we weren't its only problem.

And there's now what I like to call an accountability ecosystem of, like, dozens of organizations that do legal work, that do advocacy, that do research, that do technical work, all investigating the mercenary spyware world. And so whereas some years ago we might have been, like, a narrow point of failure, we now have a protection in this amazing kind of network, because suddenly, like, take one out, there are others.

Sorry, remind me the name of the company in Israel?

Black Cube, not at all a shady name.

No, sorry, the one that developed Pegasus.

NSO Group.

The NSO Group.

So you're saying they were sanctioned by the US, but haven't, you sent me an article a few days ago.

I did.

Haven't they just renewed a contract with them or something?

Let's talk about that.

Yeah, let's.

You tell me what's going on.

So for like, so I started working on this topic in like 2011, 2012, long ago.

Did a detour to work at Google for a bit, came back into civil society full time.

During that time, a relatively small group of people, some very brilliant colleagues of mine, our director, Ron Deibert, we have been shouting, and it feels like shouting into the wind about this problem set.

And it hadn't really had the impact that we wanted, which is conferring some kind of, like, major protection.

Partly because a lot of governments like the existence of this industry, or at least their security services do.

Because just like arms dealers, right?

They're useful.

If a government, you know, would come to like the US or I don't know, Germany and be like, hey, we want to cooperate with you.

Like, can you give us your sexy tools?

And of course the US is like, no, no, no.

We can't do that.

But we know a guy, right?

We'll put you in touch.

And maybe they'll do it, right?

And Israel managed to use spyware as a diplomatic tool. So Haaretz, the Israeli publication, used the term spyware diplomacy, right?

A lot of governments want this stuff.

Seems like an oxymoron.

And for the most part, like the security voices had the ear of politicians.

And what they were sort of saying to politicians, lawmakers was like, look, we're doing secret spooky business.

It's in the net interest of us and our country that things stay secret and spooky.

But something happens.

And this is not at all a big surprise.

So if you take a pie chart, take a core sample of who's being hacked with Pegasus on a Wednesday, right?

Well, who's in that pie?

Obviously, there are the cases that we work on in our mandate: activists, dissidents, journalists, political voices, artists, truth-tellers, whistleblowers.

But a major slice of the pie, maybe the biggest slice of the pie, is government on government.

Of course it is, right?

If you're a government, even if you acquire this technology, which is marketed... so, like, the marketing frame and the justificatory frame of these companies is like, we're here to help you solve serious crime and track terrorists.

But the open secret is that this is an espionage technology, and the primary business that governments make with this is hacking other governments.

Now, governments are going to hack.

They're going to hack each other, right?

And I guess it's almost like- I don't play a violin for those governments, but it pisses governments off if they discover it. The major inflection point was the U.S. government realizing that their diplomatic personnel were getting hacked in spades, and not just the U.S. government.

We made the phone call to the U.K. government to be like, by the way, we found evidence of an infection on the networks of Number 10 Downing Street, right?

And suddenly it became clear that government's willingness to just totally ignore the problem as a problem and treat it as like a secret thing was going to have consequences for them too.

Yeah, because if someone has it, they may as well have it too.

And basically, it's like any other game theoretic problem, right?

Like you can only, in this case, like you can only allow this industry to proliferate so far before the pee in the pool begins to fill the rest of the pool, right?

Like before we're all swimming in it.

And suddenly governments were seeing a security and a national security problem, not just a benefit.

And that was like the major inflection moment.

Now, the question that you asked me was like, so fast forward to today, like what's with this news about like the US government doing business with spyware companies?

Well, during the past administration, there was, I think, a very strong, clear-eyed awareness of the national security risks that the proliferation of this technology posed.

And the truth is like America's got pretty good, you know, this from Ed Snowden and others.

America's got pretty good skills, right?

Like in theory, America doesn't need the existence of these mercenary players the way that like, if you're Togo, you might, because you're not going to develop that in-house.

And so it's not entirely surprising to me that the U.S. would have seen this problem set as like, okay, this is not in our interest to have NSO being a cowboy, selling this technology to all these governments who are going to hack us, right?

Like, we don't like this.

And moreover, right, it's going to misalign with the foreign policy objectives of the United States.

Like the U.S. doesn't want, you know, democracy activists having all of the work that they've done eroded.

What has changed?

Well, if you're a spyware company, what's the big prize?

What's the market prize for you?

It's not selling to dictatorships.

Selling to the US government.

Selling to the US government.

Biggest possible client and a big brother of protection, right?

How do you deal with the fact that you're pushing right at the edges of the law, right?

You're buying exploits potentially from like Chinese hacking groups that are also linked to the Chinese government.

or maybe you're connected to really shady characters.

Maybe you're doing cross-jurisdictional business that could be considered like money laundering, structuring, right?

You're living in the grays.

You need protection.

And so NSO Group and others have paid untold amounts of money to lobbyists, to formers in government, to try to convince parts of the US government, like put their arm around them and be like, listen, we're friends, right?

Buy our stuff.

Well, it didn't really work out that way for NSO.

because of this sanction.

But in 2019, something else happened.

A company called Paragon, another P, right, was founded also in Israel with some American venture capital backing.

Two founders or co-founders- Was this Palantir backing?

It was like, there's a whole ecosystem of private equity that I think kind of enjoys the sexiness of like pushing at edgy surveillance things.

Like they kind of like it, right?

It's like Walter Mitty, but for like espionage and surveillance.

So who are, like, you know, among Paragon's co-founders and co-founding board members? Former Israeli prime minister Ehud Barak; Ehud Schneorson, former head of Unit 8200.

Big names.

And their whole pitch was, we're going to be stealthier than NSO.

Like, we're not going to get caught.

Unlike those NSO people who keep getting caught by Citizen Lab, right?

Like, government, are you tired of getting your shit discovered again and again and again, hundreds of times, right?

Like, don't worry.

We've got a cooler, a lighter touch technology.

And by the way, that means it's less invasive, more likely to comply with your laws.

And we're ethical, right?

Buy our ethical mercenary spyware.

So Paragon made a big push with that narrative during the era when NSO was in the biggest reputational soup and their fortunes and valuation were falling.

Like NSO's debt lost like 80 cents on the dollar after a string of things, including U.S. government action.

Like, all of the advocacy work that had happened didn't much touch it.

Entity listing, having it publicly disclosed that they hacked the U.S. government, that was, like, game over, right?

And, like, people lost money.

Now, Paragon's narrative is, like, we're the solution to this, right?

All the good stuff, none of the bad.

And that narrative sort of worked, I think, on the U.S. government, because a part of the government wanted to believe that there was, like, a way for this industry to exist and be shaped towards better.

What happens?

Well, towards the end of the Biden era, somehow, somewhere in the Department of Homeland Security and ICE, Immigration and Customs Enforcement, there is a deal done with Paragon, which is like manna from heaven for Paragon, presumably, right?

Like, okay, we get our kind of wedge in.

And by that time, the US government had an executive order on spyware that required review for national security, counterintelligence, human rights issues.

So when the US government learns about this, they're like, okay, stop work order on this contract.

We're going to have to seriously review this.

Now, we've talked about security problems.

We've talked about human rights abuses.

But there's also this counterintelligence problem, which is like, if you're using spyware developed by foreign developers, well, that's a huge problem, right?

The US Air Force would not field a Chinese stealth fighter for obvious reasons, right?

There might be a Chinese spy sitting in the backseat.

Similarly with spyware, if you're using technology developed by somebody else, you have to assume that at minimum, they have a special insight in maybe how to find it.

But there's, of course, more, which is if that company is plugged into the US government, there's going to be information flowing back and forth.

Even more concerningly, they're selling to multiple governments, which means that multiple governments may have a special insight into the things that the US government is doing with its spyware deployment, with this Paragon spyware called Graphite.

Now, all of those seem like pretty strong reasons to do like a scorched earth counterintelligence assessment.

And it seemed like Paragon was like, you know, wallowing in this, but then something changed.

We just got an announcement that the stop work order was lifted on that Paragon Graphite contract with ICE.

We don't know this because they told us.

We know this because a journalist named Jack Poulson spotted the stop work order lift in the federal contracting database.

Now, the real question here is, is this kind of technology, like, does it align with values that Americans would recognize?

Is it dangerous to American values?

Do Americans believe that there should be this, like, secret, unaccountable, hidden surveillance technology?

I think many Americans, if they kind of had it laid out, would be like, this feels dangerous.

This feels like it doesn't align with the Constitution as I understand it.

And so, obviously, we've spent the vast majority of this now talking about Pegasus and Paragon.

Can we take it a little further into sort of general state of surveillance?

Because the question I would have is, how do you view the new technologies, things like AI, in terms of surveillance?

I'm fucking terrified.

Let's start with the thing that I know best, which is my little world.

Me and my colleagues tracking spyware and surveillance and stuff.

Well, okay.

So if you hack a person's phone, you're doing it because you want data from that phone, right?

Maybe you're doing that top up that we were talking about earlier.

You need to move a bunch of data from the phone.

And then you're going to go analyze it somewhere else, right?

So it would be like, I've got to, you know, photocopy every page of every document in your house, and then I've got to somehow sneak that bag out of the window, right?

Well, with the arrival of on-device AI and sort of private enclave cloud AI, I think there's a real possibility that now an attacker just needs to talk to that AI and be like, hey, go find every instance of this word, or go find only the communications with this person, go find only the videos featuring this person's voice.

And suddenly, instead of having to export like a couple hundred megabytes of stuff, which might be detectable by some vigilant network monitors, I'm sipping like 12 kilobytes, right, tiny little bits of information.

Or I could just say like, hey, sit on there and monitor, and let me know when Danny reaches out and talks to this person.

So I think it has immediate potential to change how difficult it is to move stuff and how easy it is to remain undetected.

I think that's kind of only the beginning.

Processing the information, understanding it, it's going to get worse from there.
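To make that scale shift concrete, here is a toy sketch in Python. Everything in it is hypothetical: the keyword filter is a stand-in for querying an on-device AI model, and the byte counts just illustrate why the "sip" is so much harder for a network monitor to spot than the bulk export.

```python
# A toy illustration of the two exfiltration patterns described above.
# All names are hypothetical; the keyword search stands in for asking
# an on-device model a question.

def bulk_exfiltrate(messages: list[str]) -> int:
    """Old pattern: copy the whole corpus off the device."""
    payload = "\n".join(messages).encode()
    return len(payload)  # whole-corpus size, noticeable on the wire

def query_exfiltrate(messages: list[str], keyword: str) -> int:
    """New pattern: ask a local model, ship only the answer."""
    hits = [m for m in messages if keyword in m.lower()]
    payload = "\n".join(hits).encode()
    return len(payload)  # a handful of matches, kilobytes at most

corpus = ["lunch at noon?"] * 100_000 + ["meet the source at the cafe"]
print(bulk_exfiltrate(corpus))             # roughly 1.5 MB even in this toy corpus
print(query_exfiltrate(corpus, "source"))  # 27 bytes
```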

We can pull out the lens a little bit further out of spyware as well.

One of the great challenges that you have if you sell spyware is like having reliable working exploits, right?

You've got to have these.

And these are like, they cost millions of dollars.

They're highly sophisticated code at this point, right?

Part of the consequences of our work and efforts by Apple and others has been to drive up the cost.

Now, exploit discovery involves a lot of work.

Some of that work can be done, I think, by AI.

Sure, you can use it for defense as well and finding stuff, but, like, the balance-

Because it's just great at parsing massive amounts of data, is that-

It is really good at that, and at trying lots of things in lots of different ways.

And that's intersecting with other realities, like the ability to virtualize different kinds of cell phone operating systems.

So just there's a real danger there.

And there's a danger that the problem expands out from the kinds of very sophisticated spyware that we find and we're working on it and we're talking about here.

The second kind of like layer of this problem set as I see it is with tech, it's like everything new is actually old.

And people now, many of them have invited a new chat window into their lives where they talk to somebody else's computer and that computer gives them answers.

Now we could talk for a long time about the sort of concerns about what that means for like the safety of your cognition, right?

Your ability to really know what you feel and know which thoughts are yours, and are you going to get one-shotted by a system designed to butter you up and keep you chatting with it?

But the other component of this is that those systems are also getting a unique visibility into people's lives and thoughts and the things that they're doing.

It's a huge data flow.

Totally.

And that is like back to your analogy of the burglar in the house, it's almost like you're handing the burglar whatever's in your house out the window yourself.

So the first point you made there is interesting.

I've not heard that.

The second point was really the thing that I have been thinking about in the sense of, like, are we just giving up all this massive amounts of personal information to corporates?

And do we go into a world of sort of corporate surveillance, the corporate panopticon?

We're there.

We live in a world of corporate surveillance.

And what is absolutely infuriating is that the corporate surveillance machinery was designed to sell you stuff, to better understand behavior, to better sell advertisements to sell you stuff, right?

Like, turtles all the way down.

But of course, that whole machinery is like catnip for an intelligence service.

Ten years ago, if you wanted massive data collection on people, you had to build that system.

Now, you go to advertising exchanges, you get that data, you get that analytic data, like, you want to know where people are, right?

Like, all of that data flow that was once also the purview only of the states that built it is now available to anybody that can pay for it.

And that means that many of the legal protections that were like sort of carefully honed in any given like country about what governments can collect, how they can collect it, how they can retain it, who has to review it.

Like none of that matters if you can just go use your credit card and get like, you know, 80% of the places that Danny went with his phone from like a shady, you know, third party data broker.

Right.

And even more scary, a bunch of those data brokers will sell to like China or other adversarial regimes.

And so if you're a government now, you don't just have to worry about like foreign spies following your shit.

You've got to think that like, you're actually like all of your people, all of your actions, like it's super legible.

That extends all the way down to you and me, but it's fiendishly difficult to trace.

Like with spyware, okay, I find spyware on your phone, right?

You're gonna get your answer later in the conversation.

Maybe we can figure out who did it.

And maybe we can trace some lines to that.

But with data brokers, like, how would you know that it was data broker X?

Like, what is the chain of the six intermediaries through which your data flowed that eventually resulted in it being used against you?

That's just a black bag of who knows what.

It's a black bag of badness.

And what we have created is like, it's just like a million honeypots for a thousand bears.

All of them just stuffed with every possible kind of personal data.

And I am deeply personally angry that so many very talented people built these systems designed to monetize behavior without thinking for a second that they were creating a parallel structure of control, a parallel structure of monitoring that is now with us.

Now, if you take that, I would say, increasingly mature structure, and you bolt it into the world of AI and the new ways that AI chat systems are understanding people, you start fusing that data, you get an absolutely terrifying degree of understanding of people.

What scares me, like pulling out from the sort of initial privacy question, like your stuff is being exposed, is this.

I believe in the core of my being that there's a category of questions that governments should never be permitted to ask about their population.

I 100% agree.

They should never be permitted.

And unfortunately, in too many societies, friction was the only thing preventing that from happening.

Your data, data about you, has less friction than you trying to make a financial transaction.

That's bananas, right?

So when it comes to this, like big tech corporate surveillance, what do you put it down to?

Like the people running these businesses, like is it, are they evil?

Is it ignorance?

Is it just profit-driven, ignore the consequences?

Like what is it?

Cash rules everything around me, C.R.E.A.M., right?

Like you look at each-

Shout out to that.

Every organization that is trying to do stuff and that winds up collecting data at some point is going to have this conversation.

Oh man, we could have another revenue stream.

Like, oh, we're helping people solve this problem.

We're connecting different bank accounts, right?

Like Plaid, we're connecting this bank to that bank.

We're helping you get your check deposited, right?

So it's like, oh man, we have a second parallel monetizable flow of data.

Our shareholders are going to love this.

A version of that is repeated over and over again in company after company.

And it isn't necessarily framed as privacy violating.

It can be framed as helping prevent fraud, right?

Transaction analytics, right?

Quality of experience.

These things are always put that way.

They're always in the same way that the online safety stuff is framed at like age assurance, right?

Like saving the children.

And I think a lot of good, well-meaning people, good-hearted people got themselves into corporate structures where they built like 85% of the privacy-destroying chainsaw.

And then suddenly they started seeing a chain getting attached to it, and they're like, oh, fuck, right?

And many of them left, or they became disillusioned.

Some of them volunteer and work at Citizen Lab, right?

But we got here through the god Profit and through a million KPIs.

What scares me about this conversation as we apply it to the world of Bitcoin is that many different players in this ecosystem, I think, are going to discover many of those same incentive structures if they haven't already.

And it's going to be happening everywhere a little bit all at once.

It's like arsenic poisoning by eating a lot of a certain kind of food, right?

No one bite is the bad one, but the sum total is trouble.

And I'm really curious what you mean about that.

Is that the fact that there's kind of like KYC choke points anywhere you want to go and buy Bitcoin?

Is it the fact that like anyone now working on privacy software has to fear going to jail?

And like if Bitcoin can't have any privacy tools on top of it, like does that then become another sort of surveillance panopticon?

The thing about surveillance panoptica, right, is that I think there is something structural right now that means that like a version of this thing is going to keep repeating itself in every business structure.

It's kind of like there is a certain crystalline structure that protects privacy, but there are other versions of that.

And if those crystals are introduced, then suddenly, or over time, the crystalline structure pivots, and it becomes not just, like, you know, potentially resistant to privacy and rights, but actually, like, the fastest rails to violate those things.

What got me into this whole world, I was doing something totally different, and the Arab Spring happened and I saw governments trying to suppress speech.

So I got into this by building projects to get information out during internet shutdowns in Egypt and then in Libya.

And the big personal excitement that I experienced was, oh my God, technology has ended the historic asymmetry between people and the powerful and the ability to push information out.

Holy cow, right?

Suddenly, you don't need to take over a TV station to have a protest movement and to tell people in your country what's going on.

You don't need to persuade a foreign journalist to talk about your story or get your quote.

You can broadcast it.

But what social media platforms had not done, they had not reduced the historic and abiding asymmetries between people and governments of power and of risk and of resources.

And the other shoes started dropping even during the Arab Spring as countries realized, uh-uh.

The solution is not to turn off all the tech to do internet shutdowns the way that Egypt and Libya did.

The solution is to do what Syria and so many others did.

Keep the tech on, but start surveilling, right?

Give people the feeling of the freedom to express and then slowly find and pick people off.

Versions of that keep happening all around us.

The craziest thing about that as well is I think when people hear these stories, you assume that that's going to be the stuff that China's doing.

And I'm sure China are doing it.

But it's also happening in the UK now.

It's happening across Europe.

The Online Safety Act that passed in the UK is terrifying.

And again, as you said earlier, it's framed in this way that it's to protect kids online.

And there'll be a lot of people who don't pay close attention to this, that will just believe that narrative and think that that's just a good thing by default.

But maybe you can talk to the actual risks to stuff like this.

Don't listen to what they're telling you.

What they're telling you is you need to protect kids online.

And you and me and everybody else, opinion polls show, right?

Like seven out of 10 people do believe that kids online face risks and have some appetite for a better solution.

I think a lot of people for different reasons have some version of that belief.

So where do you go with that?

Well, parents often want a big red button that they can press that will just make the bad stuff go away from their kids online.

Unfortunately, politicians for whom everything is a nail with their big sledgehammer are like, okay, we have a solution.

Let's just put a bouncer at the door.

How intuitive?

He just wants to look at your ID.

You just flash him your ID.

That is the bouncer fallacy.

What they're actually asking for is something that changes the structure of the internet.

What they're saying is, you want to go to the club, you show your bouncer your ID, he makes a copy of it.

Maybe he's also going to check your bank account.

And then everybody you interact with is also going to receive a copy of your ID.

Where are they going to store it?

In their sheds out back.

You're creating a situation where people are suddenly forced, law-abiding people who just to interact with the internet, to dox themselves, to KYC for speech.

And I firmly believe bad things come from this model.

And in fact, the model has already got problems.

So Drew Harwell, this journalist at the Washington Post, did this really cool bit of data analysis.

He looked at web traffic to adult platforms after the Online Safety Act had its switch turned on.

What did he find?

Did you find this?

I've not seen this, so I'm going to try and guess.

What happens?

Let's say there are two categories of online platforms.

There are rule-abiding platforms and there are platforms that don't care.

What happens to the traffic of those two platforms?

They go to the ones that don't care.

And even the people going to the rule-abiding platforms, I would assume traffic seemingly drops from those countries, but really people are using VPNs and ways to get into them anyway.

The harder the government squeezes on the internet, the faster and more effectively people get around it, which is why we talked a minute ago about opinion polls.

Do you know there's another number in UK opinion polls about the Online Safety Act, which is most people don't think it's going to work.

Like only like 30-some percent of Brits actually think that this thing will work.

So it's a paradox.

Most people want something.

Most people know it won't work.

Well, that's a great example of a scenario that highlights really that like the answers being provided are wrong.

I can't think why, because, like, you don't have to have an expert tell you that it won't work.

So what happened?

It drove traffic en masse, like 3x the traffic, to platforms that didn't comply, which there will always be, because they will be outside of the jurisdiction. And almost certainly that traffic would have otherwise gone to complying platforms, so it suppressed their traffic.

And like you say, it drove people to use VPNs and led to this sort of absurd scenario of a British official going online and being like, can we all please not use a VPN, which is the best viral marketing for VPNs I've ever seen in my life.

100% and VPNs went to like the top of every app store chart.

And like really the sort of second order consequence of that is people just have better online privacy as a result, regardless of what websites they're trying to access.

For now, but follow the follow-up.

What's going to happen?

Well, what scares me about the Online Safety Act and the versions of it that we see in the United States, in specific US states that are doing their own, like, age verification-

Florida have something similar, I think.

You got a bunch of different US states, like dozens, that have done something on this. And whether it's, like, ID, or scan your face, or age gating, right, like there are elements of the same thing. Well, you're told that this is about protecting kids from harm. But if you look at the definitions, they're often really vague.

Like the harmful stuff, pretty vaguely defined, which gives governments a huge amount of ability to define things into censorship.

But it goes further, right?

What problem do big platforms face in this scenario?

Well, the Online Safety Act has this feature where some clever boffins sat down and they're like, okay, you know what?

We're going to solve the free speech problem too.

So the Online Safety Act has penalties if you show people, you show kids bad content, legal but harmful content, but it also has penalties in theory if you over-censor.

But here's the truth, right?

Like, if you tell a major online platform, we're going to dock 10% of your global revenue if you mess this up and show somebody something harmful, they're not going to call a constitutional lawyer to be like, how can we push right up to the edge of this?

They're going to just start over-complying.

So Alec Muffett-

Well, but that was- Sorry to interrupt, but that was the cool thing about what 4chan did.

because they basically said, just fuck you.

Like, we're an American company.

You can't impart this law on us.

Yeah.

Well, so the British government has faced off with this problem, which is like, what do you do?

Like, you know what the analogy is like?

It's kind of like the analogy to taking your shoes off and like, you know, putting your liquids in a little bag.

Like, it's security theater.

It's safety theater.

And the problem is what ends up happening is the rules that result in the mass transfer of personal data, the mass, like, secret census of online desire that is being created and given to governments, that stuff is going to stick around even as everyone's going to know it doesn't work, right?

It always ratchets like this.

Getting back to this question for a second of, like, the implications.

So Alec Muffett, who writes thoughtfully about age verification stuff, his point is, like, platforms historically, the way that they've sort of categorized content is kind of like may contain peanuts, right?

Like could be harmful, but like what's harmful?

It's like might be something that would bother people if they view it in their office, right?

NSFW, right?

They're not designed to do these sort of careful categorizations of each kind of speech in ways that maximally protect people's freedom of speech.

So of course, when platforms are told to do this, they're gonna over-categorize.

They're gonna apply this sledgehammer.

It's not a scalpel, it's a big sledgehammer.

And the end result is people seeking health information, people collecting flags, people struggling with depression, people wanting help quitting smoking.

All their communities are going to get turned off, too.

And you get a situation where you're going to need a passport to speak and a passport to post and a passport to listen.

That is bananas.

And as it doesn't work, the other thing that I think governments are going to do is they're going to say, OK, well, we can't solve the problem of these noncompliant platforms.

Right?

So what do we do?

We need a great British firewall.

We need to start blocking at the gateway to our little island.

We need to start blocking VPNs because just too many bad things are happening with these technologies.

A lot of people are going to quickly draw that conclusion.

And if they start implementing it, it's not just going to be lessons learned from China.

It's going to be like parallel evolution of some of the same structures of control.

And let me tell you my big suspicion.

I don't think that everybody who promotes these things is a prude.

I think many of them, many lawmakers, are exercised by concerns, legitimate concerns about children.

We can talk about maybe better ways to address this.

But in the U.S., some of the groups that are bankrolling this stuff, they just want to end things that they don't like.

They want to stop people from viewing adult content, but also other kinds of content that they deem societally harmful.

Well, what is that?

That's a bunch of prudish, often religiously driven censors trying to censor everybody.

This is very bad.

It doesn't belong in democracies.

And so what are the things that people can actually...

In fact, before I ask that question, because the part of this that we've not necessarily spoken about yet is the risk to end-to-end encryption.

So I don't know the exact state of this as of right now, but I know as part of this act, they were trying to have a backdoor that they were trying to call not a backdoor.

In the sense that if you're using the Signal Protocol to communicate, they wanted, I believe, to verify a message before it was sent on device to make sure it didn't contain anything harmful or hate speech or however they deem it before you actually send that message.

So that is not end-to-end encryption.

If you have to premise what you're saying with this is not a backdoor, it's a backdoor.

Yeah.

So client-side scanning is this concept that people might send harmful things that the state can't control across the platform.

So how do you deal with that?

Well, you push a bunch of rule sets to a device, and that device is then going to censor the stuff, prevent it from going, and maybe also send a little notification to the powers that be that you tried to send it.

China has also experimented with and implemented all sorts of versions of this in chat apps.

A bunch of Chinese chat apps will have rule sets that they download of words you can't share, right?

So the UK is treading where others perhaps have trodden before.

The problem with that system is that even if it is implemented for the purposes of blocking content that you or I might regard as morally reprehensible and is harmful in its creation and in its sharing, all the government needs to do once it has that system in place is to push some new rules to that system.

And suddenly it is an on-the-device speech monitoring system, right?
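A minimal sketch of that pattern, assuming a design where the operator pushes an updatable blocklist to the device; every name in it is hypothetical, and it is not any real app's code:

```python
# A toy model of client-side scanning: the outgoing message is checked
# on-device, before encryption, against a remotely updatable rule set.

blocklist = {"badword"}  # rule set downloaded from the operator

def notify_authorities(text: str) -> None:
    print("flagged:", text)  # the optional "little notification"

def deliver(text: str) -> None:
    print("sent:", text)

def send_message(text: str) -> bool:
    """Scan an outgoing message before it is ever encrypted."""
    if any(term in text.lower() for term in blocklist):
        notify_authorities(text)
        return False  # message silently blocked
    deliver(text)
    return True

send_message("hello there")                        # goes through
blocklist.add("winnie the pooh")                   # one pushed update later...
send_message("look at this winnie the pooh meme")  # blocked and reported
```

The point of the sketch is the last two lines: once the scanning machinery exists, expanding what it censors is a rule-set update, not a new capability.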

I can just tell it, well, actually, you know what else bothers me is a certain kind of flag.

You know what else bothers me is like a certain kind of meme, right?

China does this, right?

Winnie the Pooh memes, got to block them.

And-

Starmer memes will be banned next.

You got to protect the children from those memes.

This is a serious business.

And the problem with that kind of structure, and this is like, Look, I've only worked on this issue and related privacy issues for 14 years.

That's a pretty small slice of time.

Our director, Ron Deibert, brilliant guy who founded the Citizen Lab back around 2000, looking way into the future when he founded us.

We remain reliably independent of government and corporate pressure and funding.

The view is, and I think he would tell you this, I don't want to speak for him, but he would tell you this: if you build systems that allow for control and access, the temptation is just too great.

And everything that we know from every other surveillance technology, in fact, every historical example of secret surveillance capabilities created is that they get abused.

Nowhere more so than when the state is pretty confident that they can surveil without the citizens knowing that it's happening.

I think it's almost analogous to the idea that is talked about a lot in Bitcoin.

that if you give the state the ability to press a big red button and print more money, that's too powerful for them not to press.

And it's the same thing here.

Like, during the entire Cold War, people lived in fear of a big red button.

We don't want to be in a position where people have to fear the pressing of a big red button, either for them specifically, because it's something they said, did, thought, or shared privately or in public, or for speech generally.

And I think what we've learned from the excesses of the past few years is that platforms are really bad at playing a role of speech police.

They don't want to be speech police either.

And like, it's going to sound controversial, but like this stuff harms platforms too.

They're not designed for this.

They shouldn't be asked to do it.

Now, the Online Safety Act has like, you know, categories of platform, three different tiers, depending on how big the platform is, how big the user base is, how big the potential for harm is.

But ultimately this stuff harms innovators too.

And it harms people that are trying to run communities that are helpful.

And I think fundamentally, like, it creates a...

Like, you don't want a scenario where the people overseeing the speech world are the equivalent of the teacher monitoring the detention room to make sure that people aren't talking to each other.

And that's where we're headed.

It's funny, like, we talked earlier about the risk of these megacorporations just harvesting data and doing corporate surveillance.

But in some ways, are they the potential answer to this?

in the sense that if Meta and Apple, I mean, Signal have already said, if the UK implement these laws, they're just going to pull out.

But there's so much more weight behind a company like Apple and Meta doing that.

Do you think that is one of the ways that we can actually fight this?

I am a pragmatist in a lot of ways.

And I think that we benefit when there are certain alliances or alignments with interests that might otherwise be strange bedfellows, right?

Like it's kind of remarkable that there was like a national security and human rights problem with mercenary spyware.

But those things change when political realities change.

Those things change when realities change for platforms.

It wouldn't take much for a big platform to not be an ally.

And I think when it comes to our privacy, like we have been massively victimized by most of the platforms that we've brought into our world, right?

As the Russian saying goes, the only free cheese is in the mousetrap.

And every day, the mousetrap is just hammering us harder than, you know, like a self-flagellating, you know, medieval attempt to get rid of the plague.

And what I see though, is that like where there's potential is decentralization, systems with less permission, systems that have a robustness and that aren't susceptible to somebody sitting down with their leadership and being like, hey, look, your other product line, which is providing services to our government, whether it's a cloud service, whether it's a messaging platform, whether it's an AI platform, you want to keep those contracts?

We're going to need certain things from you.

The problem is many big companies are big enough and shareholder driven enough that they cannot really be trusted as allies in the long term.

In the same way that most politicians really can't be trusted as allies in the long term.

So I've, again, met you a few times throughout the years.

You're a very positive person.

And despite that this has been one giant black pill, where is the kind of hope here?

Because I know you, Citizen Lab, are at Oslo Freedom Forum.

You've been to these HRF events.

Is the answer in Freedom Tech, things like Nostr and these new platforms that are being built?

I think, like, being in a really stifling speech submarine, right, like a big breath of oxygen goes a long way. I believe that certain kinds of freedom tech are that breath of oxygen, and I think when people start breathing it, they feel the difference. I am most optimistic both about the growth of some of those technologies and also about what happens when they get mainlined into popular things that get used by everybody. Think about it like this.

My personal view, one of the biggest developments in like the 20 teens was when Google decided to turn HTTPS encryption on for all of their user base.

Because suddenly government went from literally reading all of your emails to having to try to figure out a new way.

But just like that, a switch was flipped.

I'm a great believer in that kind of switch.

Well, what was one of the other big switches that was flipped in the teens?

I would say WhatsApp implementing end-to-end encryption, right?

Even if they did it kind of shitty?

The implementation of end-to-end encryption that they did is using the same ciphers that Signal uses.

My issue with it, and tell me if I'm wrong here, is if someone has cloud backup on their chats, does that not remove the end-to-end encryption for everyone who's communicating with them?

Indeed, the great challenge with many of the most popular devices and platforms is around backup and encrypted backup.

And it is extremely meaningful to me that WhatsApp now provides backup encryption.

Okay, I didn't realize they'd done that.

Apple provides backup encryption for iCloud.

Because it used to be the case that I think part of why states were so comfortable with iPhones being pretty solidly encrypted was that they could just go get iCloud stuff on the back end.

I see the efforts that big players make often to incorporate these technologies.

There's often a bit of a workaround somewhere in there.

And I think that that is partly because it's hard to do everything at once.

Sometimes, pragmatically, it ends up functioning a little bit like a pressure release valve for governmental pressure.

But nevertheless, on balance, it still massively increases privacy.

Because, for example, let's go back to a time period where people had WhatsApp encryption but were backing up to Google Drive.

If the state wanted the contents of your Google Drive, they'd have to go to Google.

They would have to prepare a judicial request.

They'd come to Google.

Google would review it, decide whether or not it appeared to comply with rules.

They would sort of look at the case, and then they would respond.

That's a huge layer of friction and oversight.

That's people in the middle.

The truth before that was there was nobody in the middle.

The state could just monitor from the wire.

And so there's still tremendous net benefit with friction, but it's incomplete.

So this is like, don't let perfect be the enemy of good kind of thing.

And the problem that we have as a community is we have to be very clear about what wins look like, but just as clear about the fact that we're incrementing.

And one of the problems, again and again, is the sort of heroic individual model of privacy and security, where someone is like, you know what?

This tech, this encrypted messenger is the only one that I really trust.

And you have to come to me and use my thing if you want to talk to me.

I'm guilty of that.

And honestly, it's like being with the most insufferable privacy vegan you've ever met.

Oh, no, I did that with my family.

I said, if they want pictures of my daughter, then they have to be on Signal.

Well, Signal is a good balance.

I think that there's a lot of room for incrementing, right?

And for creating network effects.

I think it's a totally legit thing to do as a son or a daughter to be like, mom and dad, I would love to send you pictures of my family, but it's going to have to be with some encryption, right?

That is healthy.

What's unhealthy is when people are like, I reject all popular platforms and only do this one thing.

Because the problem is the enemy of political organizing, right?

What do we care about in society?

People talking, communicating, sharing ideas, interexchange, intercambio, right?

That stuff, for it to work, requires that there not be too much entry cost friction.

And so what I worry about is that the privacy world, and it's, like, you know, understandable but also exhausting, winds up shitting on everything other than the orthodox perfection.

But the truth is, so often, they're not really doing the perfection thing themselves all the time either.

It's just a bit of a performance.

It's hard, and your friends aren't going to be there talking with you.

So I feel like if we're looking at the problem set, and this is my pragmatic frame, and others might disagree, there's a huge value in large-scale big increments.

and there's huge danger in large-scale friction reduction in surveillance.

Those two things are kind of, in many cases, like the biggest fights.

What I'm excited about is at these ends, there are developers and others working to develop tools to their standards, right?

Let a lot of flowers grow.

Some of those are going to work.

Some of those are going to become really popular.

They're going to be network effects like Signal.

Others are not.

The trick is, so we were talking earlier about like, who's going to get targeted by Pegasus?

Here's the truth of the matter.

We don't know who tomorrow's targets are going to be.

They don't know themselves because they may not have made a choice that puts them into the like target line of a government.

They have not yet decided to step out and speak their truth or speak up against something that they see is wrong or share a thought that bothers somebody.

They don't know, and we don't know who they are.

We have to be designing our technology so that those people will already have, on their devices, tech that is like 80% of the way there, and then they can make some small changes.

Because once a person has made that choice, it's too late to sit them down and be like, okay, you need to behave like a total spy.

You got to use all this sophisticated technology.

And none of your friends are there, right?

Like they raised their voice and they were dangerous because they raised their voice and people listened.

You can't suddenly tell them to turn off their voice for the sake of being safer online.

You have to work with them and balance with them, which means that a little bit like trimming a bonsai, if you trim the branches and you trim the roots and you bend it and you water it all at the same time, you kill it.

You do one thing at a time.

And the same is true for user behavior with security.

So I'm a huge believer that user experience and ease of entry, which means no scolding, has to be as friction-free as possible.

It's more consequential in many ways that a popular app has an on-ramp for a privacy and freedom technology than that there exists somewhere carefully honed, polished pearl of perfection.

And that is why Signal is perfect.

In my opinion, I think it's the best messaging app because it just feels like anything else.

And you know you have the additional security and privacy.

Can I make two suggestions to your viewers about how to use Signal in a little bit more of a secure way?

Absolutely.

Okay.

So here's what you need to do.

Go into your Signal settings and check out privacy.

I'm going to do this as you said.

Let's do this.

Let's see how I do.

Go into Signal settings and privacy.

Yeah.

Okay.

Yep.

Now choose advanced.

And tell me what you see on there.

Censorship circumvention.

What else?

Proxy, always relay calls, show status icon, allow from anyone.

Okay.

What I want you to do is turn on always relay calls.

On.

It's already on.

Good man.

Do you know what it's doing for you?

I don't.

So when I call you on Signal, I install Signal, you install Signal, we just make a call like that.

It's a peer-to-peer communication, right?

Which means that our devices are talking to each other across the network, which means that the network may know that we're having a signal call.

They can't hear what we're talking about.

They can't see what we're messaging about.

But the fact of that call is known to various parts of the network.

That is a really interesting piece of metadata.

Call relay bumps your call through servers that Signal controls. It's still encrypted, but that means that the network in this place where we're sitting, all it knows is that you're having a call that looks like a Signal call.

They can't see the IP address of your correspondent, and your correspondent can't see your IP address either.

Major improvement.
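For the curious, a toy sketch of the difference, with made-up addresses; it reduces real call routing down to the one question of whose address your packets visibly flow to:

```python
# Illustrative only: what a local network observer learns in each call
# mode. Addresses and names here are hypothetical, and real call routing
# is simplified to a single question: who do my packets go to?

def visible_remote_address(always_relay: bool) -> str:
    correspondent_ip = "203.0.113.42"  # the other party's real IP (hypothetical)
    relay = "relay.signal.example"     # a Signal-operated relay (stand-in name)
    # Peer-to-peer: packets flow handset to handset, so each side's
    # network sees the other party's address. Relayed: both handsets
    # only ever exchange packets with the relay.
    return relay if always_relay else correspondent_ip

print(visible_remote_address(False))  # 203.0.113.42 -> correspondent's IP is exposed
print(visible_remote_address(True))   # relay.signal.example -> only "some Signal call"
```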

Now, I want you to go to your WhatsApp.

Okay.

Is that the only setting in there that's the...

No, actually, should we do another Signal setting change?

OK, so the other thing that I want you to do is I want you to turn on disappearing messages by default.

It's already on at four weeks.

Great.

Here's why.

Remember, we were talking earlier about how governments do top ups with surveillance, right?

They're like, you know, they're a little penny pinching with their surveillance technology.

Same holds true for all kinds of other hacking.

If your device doesn't have old chats on it, a hacker can't get them either.

Disappearing messages at four weeks is a great default mechanism.

Set it once and don't think about it anymore, so that your phone is not carrying five years' worth of your life's interesting conversations.
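A toy sketch of what that default does to data at rest, assuming a four-week window; the message corpus here is invented for illustration:

```python
# Disappearing messages shrink what a hacker can steal: anything older
# than the retention window is already gone before an attacker arrives.
from datetime import datetime, timedelta

RETENTION = timedelta(weeks=4)

def still_on_device(messages, now):
    """Keep only messages younger than the retention window."""
    cutoff = now - RETENTION
    return [(ts, body) for ts, body in messages if ts >= cutoff]

now = datetime.now()
five_years = [(now - timedelta(days=d), f"message {d}") for d in range(5 * 365)]
print(len(five_years))                        # 1825 messages at risk
print(len(still_on_device(five_years, now)))  # 29: roughly four weeks' worth
```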

I do turn it off on some of my chats, though, because I want to have the history.

Exactly.

And the model should be turn it off as needed, not turn it on when paranoid.

Yeah.

You should be default concerned.

Okay.

WhatsApp.

Let's go to our WhatsApp.

This one will be worse because I don't use WhatsApp very much.

Okay.

So I probably never played with the settings.

I want you to do exactly the same thing.

Go to privacy.

Okay.

And go to, I think it's advanced.

Yep.

And do you see protect my IP address and calls?

That's off on this one.

You turn that bad boy on.

Okay.

Same deal.

Now, people should understand there are structural privacy differences. And obviously, turn on disappearing messages by default on WhatsApp too.

Yeah, I think you have it already.

Okay.

People need to understand there are structural differences in the privacy that you have when you use different kinds of platforms.

So, Meta will know more about you as a WhatsApp user than Signal will know about you as a Signal user. So act accordingly.

But what I like about both of these settings is, especially with Signal, you're now way more secure and you're more private.

This is really important.

And it took like 30 seconds, right?

Should we talk about another thing that people should do?

Please.

OK.

What kind of phone is that?

iPhone or Android?

Are you comfortable saying it on the air?

iPhone.

OK.

Have you ever heard of lockdown mode?

Yes.

I actually saw a thread on Twitter.

I think I might have done this.

Let's see.

Lockdown mode.

So go to privacy and security.

Got it.

Go all the way to the bottom.

You should see lockdown mode.

Yeah.

It's not on.

Okay.

So let's talk about what lockdown mode is.

So in 2021, November 2021 was a shitty month for NSO Group.

Not only did they get dinged by being put on the entity list by the U.S. Commerce Department, but Apple also notified a whole bunch of people that had been hacked through Apple services with Pegasus, and they sued.

Very bad situation for NSO.

Apple also began the process of rolling out lockdown mode.

Well, what's lockdown mode?

There are ways of taking a regular device and turning it into a much more secure, hard-to-hack device.

They come with some trade-offs.

Apple watched how NSO Group and other governmental actors were hacking people's phones through Apple services and through different settings, default settings.

And it was like, okay, we can come up with a list of changes that you can make that price out whole categories, right, whole fields' worth of attacks.

How cool is that?

The thing is, some of those things will introduce a bit of user friction.

What we don't want, and I'm now guessing, because I don't know the internals of Apple's thinking here, but if I'm Apple, what I don't want is to suddenly push out a high security setting to everybody, And then people have like a bad experience of friction and their next purchase is an Android.

We don't want that.

And is the friction here, you have to save your private keys for it?

No.

The lockdown, if you...

So go to turn it on.

Okay.

And what does the first screen show you?

Turn on Lockdown Mode.

Turn on and restart.

So click on that.

Okay.

We need some elevator music while this happens.

Okay, let's see.

Oh, did we restart?

Yeah.

Oh, okay.

It's going to turn on and restart.

So let me then tell you what else would have happened.

So Apple also warns you as a user that this is an extreme security mode and it's not intended for most people because they're worried, I think, I speculate, that people turn that thing on, forget that they've turned it on and they have a bad experience, right?

Like a regular person who's not facing those threats.

And if you're like a big company, you've got to worry about that sort of thing.

If you're watching this and you care about privacy, turn it on.

Have the experience of breathing a little bit of oxygen. It's not a silver bullet, but, like, a much harder to hack device, right?

Like, and I can tell you that empirically, like a much harder to hack iPhone.

It's very cool.

And for the longest time, if you were an Android user and you asked me this question, people always ask me this question, what's more secure, like an Android or an iPhone, right?

Everybody has their own view.

My colleagues will have different views, but unfortunately, no lockdown analog existed for Android.

That has just changed.

Google has rolled out advanced protection for Android, which is like a version of lockdown mode.

It has some other really interesting features.

So where lockdown mode is partially working by, you know, preventing people outside of your contacts on FaceTime from calling you and other stuff that's sort of, like, the royal road for attacks.

Android's advanced protection, as we understand it, has some other cool features, like allowing you to securely put logs from your device into a secure, protected cloud that only you have access to, which means that if you're an attacker and you hack this device, in theory, right?

Like a log of that is being immutably kept somewhere, right?

Like your private ledger.
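For readers curious about the general technique, here is a minimal sketch of tamper-evident logging via hash chaining. It illustrates the idea of an immutable, verifiable log; it is not Google's actual implementation:

```python
# Tamper-evident logging: each record is chained to the hash of the one
# before it, so editing or deleting any entry breaks every later link.
import hashlib

def append(log, entry: str) -> None:
    prev = log[-1]["hash"] if log else "0" * 64
    digest = hashlib.sha256((prev + entry).encode()).hexdigest()
    log.append({"entry": entry, "hash": digest})

def verify(log) -> bool:
    """Recompute the chain; any mismatch means the log was altered."""
    prev = "0" * 64
    for record in log:
        expected = hashlib.sha256((prev + record["entry"]).encode()).hexdigest()
        if record["hash"] != expected:
            return False
        prev = record["hash"]
    return True

log = []
append(log, "suspicious privileged process started")
append(log, "unexpected network connection")
print(verify(log))                    # True
log[0]["entry"] = "nothing happened"  # the attacker tries to clean up
print(verify(log))                    # False: the cover-up itself is evidence
```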

And so even with all the effort that they may make to try to clean up and hide later, there may still be some evidence. Now, for now, these are sort of in the realm of the hypothetical, will this make it easier for defenders to find stuff. But it definitely increases the risk factor for attackers, because if you're a scammer, it's a numbers game, right?

If you're doing hacking a large number of people for crypto stuff, it's a numbers game.

And you're not using a really fancy, you're not paying a fortune to do your hacking.

If you're using one of these sophisticated pieces of technology, getting caught is like game over.

It's really, really bad. And, like, all the customers fate-share, right?

So if it stops working for, you know, Hungary, the same exploit is also going to stop working for everybody else. They get caught and patched.

So they're always trying to hide.

This cat-and-mouse thing.

Yeah, and the risks to them are much higher.

Yeah, and the risks to them are much higher And so if you as a user can do things or if developers can do things that are more likely to generate alerts or things that are hard to get rid of, then you've actually changed some of the cost calculus about whether it makes sense to hack a person or whether it will work.

The other cool thing about lockdown mode is it may break certain kinds of exploit effort in ways that a user would then see.

And so there's sort of dual layers of potential protection in some of these things.

None of these are silver bullets, but they're all very interesting.

Well, I am now a little more secure.

So thank you.

A little bit more.

I could literally talk to you all day, John.

This has been really good.

We are already quite late for dinner, so we should probably wrap this up.

But there is one thing to do.

There is one thing to do.

Let's find out.

So, um...

Do we need to turn the cameras off while we do this part?

Well, what we can do is I can go off scene for a minute.

Sure.

And do something.

Should we do that?

Yeah.

Am I lucky enough to be a part of the Pegasus crew?

So, Danny, do you consent to me telling you your results on a podcast viewed by a bunch of people?

Yeah.

So the good news is we didn't find signs of the kinds of things that we check for.

I need to push harder.

The good news is probably that means that there's a bunch of stuff that never happened on your device. The caveats, of course, are the things we don't know to look for, right?

There are things that maybe we're not able to check for with this particular methodology.

So, you know, known unknowns, unknown unknowns.

But it's like the equivalent of getting, like, you know, a strep rapid test, a COVID rapid test, for your device.

I wish it were the case that there were some app that everybody could have access to that would do a check at that level of fidelity and give you an answer.

The problem is if that existed, it would stop working the next day because they would know exactly how to get around it.

When you look at contracts from like mercenary surveillance providers, sometimes they get leaked.

You'll see like a document.

The document is like a list of like 30 antiviruses, each with a little green check mark.

Like, don't worry, not detected by any of this.

Right?

And so the challenge as researchers is always you want to check widely, but not burn the ways that you're checking.

And then you hope, and I kind of like, let's end on this thought.

So we've talked a lot about tech and a lot about privacy.

We haven't talked too much about victims as individuals, but I'm going to tell you something really interesting that to me is maybe the most meaningful thing.

The real heroes in this story are the people that got targeted with spyware and that got checked.

Why?

Because they were the canaries in the coal mine that led to discoveries that have increased the security of that device and every other device around us.

Billions of devices in the world have received security improvements thanks to individuals: a Saudi women's driving activist, a UAE human rights defender named Ahmed Mansoor, who is now a prisoner of conscience. Individuals who bravely chose to get checked and to consent to have their stuff shared. These are the heroes in this story.

And we are all safer thanks to them and their participation.

In many ways, we're just the vehicles for those people.

And I really want to highlight this.

Like, I'm a researcher.

I work with a team of very smart people.

But the motor of our research is collaboration.

Nowhere more so than with regional and local organizations around the world, scrappy people who work with us and do the legwork to get people checked and to get people screened.

So to tie the buckle on this one, it is a remarkable story of the script getting flipped on these scary, powerful companies that what caused them to lose billions, what caused them to have huge trouble, in some cases to fold, was one humble activist somewhere.

That is amazing.

You asked me earlier about hope and optimism.

That's my optimism motor.

I mean, what an amazing way to close out the show.

I mean, I'm very grateful for this conversation.

I've really enjoyed it.

I think it's a really important message.

Obviously, a very different show for me.

We're normally just talking about Bitcoin and macroeconomics.

But I'm glad you made the time to do this.

So yeah, I appreciate it, John.

Thank you.

And I just want to thank everyone who has contributed to our research, everyone who's collaborated, all the people who have helped us along the way.

Thank you to them.

This has been awesome.

Thanks, man.

Let's go have dinner.

Good God.
