#2448 - Andrew Doyle


Andrew Doyle

2 appearances

Andrew Doyle is a British comedian, playwright, journalist, and political satirist, and the creator of the fictitious character Titania McGrath. The new book "Woke: A Guide to Social Justice" by Titania McGrath is now available: https://amzn.to/36X2GoG

Timestamps

0:13 Woke authoritarianism and free speech: UK vs US, hate-speech laws, and arrests over posts

9:59 UK speech policing vs US free speech protections; memes, "banter ban," and incitement standards

19:58 Political narratives, lawfare, and free speech: from Trump/Steele dossier to media trust and X/Grok investigations


Transcript

0:00

Joe Rogan Podcast, check it out!

0:03

The Joe Rogan Experience.

0:05

Train by day, Joe Rogan Podcast by night, all day!

0:09

Yes, Andrew.

0:13

Hello.

0:14

Good to see you, brother.

0:15

Good to see you, too.

0:15

It has been, you said, six years almost to the day.

0:19

Almost to the day.

0:19

The last time.

0:20

Lots changed.

0:21

Right before everything went crazy.

0:23

That's it.

0:24

Right before.

0:24

Yeah, the whole world sort of shifted after that.

0:26

Because everything went kooky around March, right?

0:28

Yeah, so it was February 2020, and then we have COVID, and then we have, you

0:32

know, we've

0:32

had Trump in between of that, we had BLM.

0:34

That summer of 2020, everything just exploded.

0:37

Yeah.

0:37

Yeah, and then everything shifted.

0:40

And then you wrote a book.

0:41

I wrote a book.

0:42

It's called The End of Woke: How the Culture War Went Too Far and What to

0:46

Expect from the

0:47

Counter-Revolution.

0:48

Isn't that how it always goes, though?

0:50

It goes like we go too far, and then we overcorrect, and we become Nazis.

0:54

That's it.

0:56

Exactly.

0:57

Or it's the opposite.

0:59

We go socialist.

1:00

It's a big pendulum.

1:02

I get that.

1:03

It sort of goes back and forth.

1:03

I mean, I was trying to, in that book, I'm trying to make the point that what

1:07

woke was,

1:07

was like a kind of the latest manifestation of a kind of innate authoritarian

1:12

impulse.

1:13

I think human beings are, by default, quite inclined towards just shutting

1:17

people up if

1:18

they don't like them.

1:19

Yeah.

1:19

Just imposing their authority.

1:21

And so woke, well, I mean, a lot of people are annoyed that I've called it The

1:24

End of Woke.

1:25

I'm not saying it's all over, let's just go home, forget about it.

1:27

It's still going on.

1:27

But the point about it is that in its current manifestation, things are

1:31

changing now so rapidly.

1:33

We are moving into some sort of new phase.

1:35

And that authoritarianism, which we've associated with the left, might come up

1:39

from the right.

1:40

It could come up from anywhere.

1:41

It's what you say about the pendulum.

1:42

So you just have to be kind of vigilant about it.

1:44

I don't think we were vigilant.

1:45

I think that's why woke happened.

1:46

We weren't vigilant against this prospect that, you know, authoritarianism

1:50

could emerge in

1:51

what we thought was a free society.

1:53

Well, authoritarianism, it snuck in through a sheep costume.

2:00

Yeah, a wolf in a sheep's costume.

2:02

Yeah, it was a costume of being more inclusive, being more open-minded, being a

2:08

better society,

2:10

being kinder.

2:11

You know, it led to, you know, child trans surgeries, led to chaos.

2:16

It led to like a lot of like really fucking freaky things that you'd have never

2:20

expected.

2:21

People saying that the First Amendment's not important.

2:24

What's more important is protecting people.

2:26

Well, that was the key, wasn't it?

2:28

The point was that the way it worked was that it was gulling people through

2:31

language that

2:32

sounded really sweet and kittenish and fluffy.

2:35

You know, things like equity.

2:36

Well, that sounds a lot like equality, doesn't it?

2:39

Right.

2:39

It doesn't mean equality.

2:40

It means treating people unequally to ensure equal outcomes according to group

2:44

identity.

2:45

That's a very different thing.

2:46

You say you're talking about let's make everything inclusive.

2:49

But what you really mean is let's exclude anyone who disagrees with what we've

2:52

got to say.

2:53

So you're using language to mean the exact opposite.

2:56

They say gender affirming care.

2:58

Do they mean that?

2:59

Or do they mean affirming what is effectively a pseudoscientific belief among

3:03

vulnerable people?

3:04

So it's all about misusing language because most people, I think, or I like to

3:08

think, are pretty decent.

3:10

Most people want to be kind and want to be fair.

3:13

And when you hear these activists saying be kind, be compassionate or else,

3:17

right, you know,

3:18

you kind of think, OK, well, maybe their intentions are good,

3:22

but also they're pretty scary.

3:24

I mean, there's a weird, there was a weird thing with the woke thing,

3:27

which was that on the one hand, it proclaimed to be this sort of great,

3:31

virtuous, kind, progressive right side of history.

3:34

How often did you hear that phrase?

3:36

And at the same time, they're like dangerous dogs.

3:39

Like, I better not piss them off.

3:41

I better not say the wrong thing in the workplace because they'll destroy you.

3:46

Well, I always find that the more preposterous the idea is and the less

3:51

capable it is of standing up to scrutiny,

3:54

the more violent the enforcement of that idea will be because you cannot combat

3:59

that.

4:00

You can't defend that idea with logic.

4:02

So you have to defend it with fear and force and just shouting people down.

4:07

And that's what we saw.

4:08

And that's – it's a natural impulse of human beings.

4:12

Absolutely.

4:12

When you're arguing with a kid, you know, when you're a kid and you're arguing

4:15

with a kid and you say something,

4:16

you don't even know – you just go, shut the fuck up.

4:18

You just start scaring them.

4:20

So why is it, though, that some countries and some societies seem to protect

4:23

themselves better than others against that impulse?

4:27

And I feel at the moment that the UK is kind of failing where America is to a

4:31

degree succeeding,

4:33

not obviously in all ways, but when it comes to the idea of freedom and free

4:37

speech.

4:37

Like, I think the UK has pretty far – has pretty fallen to the kind of –

4:41

the woke insistence that you need to control people's language

4:44

so that you can create this perfect society which can never come anyway.

4:48

Well, I think it's been co-opted.

4:50

I think whatever organic version of that emerges naturally from society where

4:56

people – where there's an overcorrection.

4:58

I think in the UK, because you guys don't have free speech laws because it's

5:02

just different over there.

5:04

Yeah.

5:04

You can get away with a lot of crazy shit.

5:06

Like, first of all, like, we should explain what we're talking about.

5:10

More than 12,000 people have been arrested in the UK in the past year for

5:16

social media posts.

5:17

And if you read some of those social media posts, they're not even remotely

5:22

terrifying.

5:23

It's not like I'm going to grab a knife and go cut the head off of every

5:26

immigrant I see.

5:28

Like, hey, buddy, maybe we should lock this guy up and evaluate him.

5:31

He sounds like a crazy person.

5:32

Like, no, the immigrants are coming into this fucking country and creating all

5:35

this crime.

5:36

Knock on the door.

5:37

Yeah.

5:37

You're going to jail.

5:38

I worry that Americans think we're mad sometimes.

5:41

We do.

5:42

Yeah, do you?

5:42

We do now, yeah.

5:43

We think you've lost it.

5:44

Yeah.

5:45

We also think something happened where your leaders are intentionally trying to

5:50

tank your country.

5:51

It seems like they're trying to bring in as many migrants as possible, cater to

5:57

them, not to the British people,

6:00

and do it openly so that everyone knows what they're doing and then create

6:04

chaos on the streets because of it.

6:06

Yeah.

6:07

I mean, people have a phrase for that, anarcho-tyranny, you know, where you

6:09

punish people who aren't breaking the law.

6:11

Yeah.

6:11

But you protect those who are.

6:12

Right.

6:13

And I think with the, I mean, I don't know the extent that Americans know the,

6:17

I mean, the stat you quoted,

6:18

that came from the Times newspaper in London, which had a freedom of

6:22

information request to the police,

6:24

found out that it's 12,000 a year on average.

6:26

So that's like 30 a day, not just being investigated or looked into, but being

6:29

arrested.

6:30

But that's only over the last few years; if you go back further, it's only like 1,000 or 500.

6:36

It was 3,000 last time we spoke, back in 2020.

6:40

Was it really?

6:40

Yeah, it was.

6:41

Back then?

6:42

Yeah.

6:42

Oh, my God.

6:43

So we already had that problem.

6:45

I mean, we already didn't know it was that many.

6:46

That's crazy.

6:47

Even back then?

6:48

It was already really high.

6:49

I mean, we had stuff like the old stories of like, there was that guy in 2010

6:53

who made a joke online about,

6:55

he was at Doncaster Airport in the UK.

6:56

He said, oh, if this queue doesn't hurry up, I'm going to blow up the airport.

6:59

Just a stupid, funny tweet.

7:01

He went all the way to court.

7:03

That was a full trial.

7:04

So these laws, and I think what happens with this stuff is people don't realize

7:09

how long this has been embedded

7:10

in the UK, we have hate speech laws that are encoded in a number of different

7:15

legislations.

7:16

We have a thing called the Public Order Act.

7:18

We have a thing called Malicious Communications Act.

7:20

That's from 1988.

7:20

We have Communications Act from 2003.

7:24

And all of these things criminalize.

7:26

I tell you, I kid you not, the language in the statute books is if it's grossly

7:30

offensive.

7:30

That's the phrase.

7:32

If you post something that is grossly offensive, you can go to court, you can

7:35

be prosecuted.

7:36

But, you know, I find...

7:38

So subjective.

7:38

Well, that's it.

7:39

What does that even mean?

7:40

I find laws against free speech to be grossly offensive.

7:43

So should the British state be arrested?

7:46

I don't know.

7:46

And there's one, I think it's in the Malicious Communications Act, where it

7:51

talks about needless

7:52

anxiety.

7:53

Causing needless anxiety can get you arrested.

7:56

And you think that's not a thing.

7:59

I can give you a specific example of that.

8:00

Do you smoke cigars?

8:01

I have once.

8:03

My friend Winston Marshall...

8:04

Do you want one?

8:05

I worry that if I try it, I'll cough and I'll look really wimpish and pathetic.

8:10

And it won't be good for your arguments?

8:12

It will backfire.

8:13

I tell you, it'll undermine everything.

8:14

It'd be like I'm sitting here with a paper hat on at Christmas, undermining all

8:17

of my key

8:18

points.

8:19

Right.

8:19

See, I like the flavor and I like being around smokers, because my grandmother

8:23

used to chain

8:24

smoke around me.

8:25

So it's kind of...

8:26

Oh, boy.

8:26

Well, she's Northern Irish, you know.

8:28

It's the way they do.

8:29

She used to give me whiskey when I was three to calm me down, you know.

8:31

Oh, wow.

8:32

It's that sort of family.

8:33

That's an old thing they used to do with kids.

8:35

They just put it in their...

8:36

Babies.

8:36

They put it in their mouth.

8:37

It worked.

8:38

Like they would dip their finger in whiskey and rub it on the inside of a kid's

8:41

mouth.

8:42

If you're struggling with a child, get it drunk.

8:44

That's how you...

8:45

It's old Northern Irish wisdom.

8:47

I don't think you should scoff at it.

8:49

It's a good thing.

8:50

But I'd be more than happy to...

8:52

And it's grossly offensive.

8:52

It's grossly offensive.

8:54

The example I was going to give was this guy called Darren Brady.

8:59

And this sounds made up.

9:01

And whenever I tell people this, it sounds made up.

9:03

He posted a meme.

9:04

I don't know if you saw this meme where it was the four Progress Pride flags.

9:08

You know that it's got the crazy triangles and stuff in it.

9:11

Uh-huh.

9:11

You put them all together and they become a swastika.

9:13

Exactly that.

9:14

Right?

9:14

And that was going everywhere.

9:15

And he posted it.

9:15

And there's a video of him being arrested, put in handcuffs.

9:18

He's an army veteran, by the way.

9:19

Right?

9:19

Put in handcuffs by the police.

9:21

And the policeman says in the video, you caused someone anxiety.

9:25

So the actual language from the law is being used for this rearrangement of the...

9:30

And you know what?

9:31

That's quite a good satirical point that he was making.

9:34

It wasn't even his meme.

9:35

He was just retweeting a meme.

9:37

But even if it was some horrible, offensive thing, who cares?

9:40

How is that offensive?

9:41

Well, I guess...

9:43

I mean, well, you can find...

9:44

That's the problem.

9:44

You can find anything offensive.

9:46

You could find anything grossly offensive if you're extremely sensitive.

9:51

You could.

9:52

But wasn't there a point to that?

9:54

I mean, he was kind of saying that the LGBTQIA plus movement has become quite

9:58

authoritarian.

9:59

Yeah.

10:00

He's not saying they're actual Nazis.

10:02

And he's saying, oh, isn't it quite funny that when you put them together, it

10:04

looks like a swastika?

10:06

The idea that you get handcuffed for that, to me, is crazy.

10:08

Especially for a retweet.

10:09

That's crazy.

10:10

Yeah, yeah.

10:11

That's crazy.

10:11

It's retweets.

10:12

It's tweets.

10:13

It's posts.

10:14

We've had...

10:15

Memes are the big ones.

10:16

So there was a guy called Lee Joseph Dunn who went to prison for eight weeks.

10:20

That was last year, I think, for three memes that he posted.

10:23

Eight weeks?

10:24

Eight weeks in prison.

10:25

Again, I'll tell you what the most offensive of the three memes was.

10:31

And you can tell me whether you think it was worth prison time.

10:33

He put a picture of some immigrants with knives.

10:38

And underneath it said, coming to a town near you.

10:40

And that was it.

10:41

So I don't know if you think that's worth prison time.

10:43

That's the most offensive one?

10:45

Of the three, that's the most.

10:46

What's the least offensive one?

10:47

I can't remember what the other two were.

10:49

Because I remember I looked at them.

10:50

I thought, well, that's not even worth...

10:52

That's not even worth thinking about.

10:53

But this one was the one that really...

10:55

Because they say, in England, you're stirring up hatred against minorities

10:59

through the spreading of the meme.

11:01

Right.

11:02

You know, but that's clearly not sufficient.

11:05

You know, and I think in the US, you have far more protections.

11:08

I wonder whether it's to do with the fact that in the US, you have the First

11:11

Amendment.

11:12

Like, you have something codified that says, you can say what you want.

11:16

We've never had that.

11:18

It's very important.

11:19

And it didn't seem important 20 years ago or 30 years ago.

11:23

Because no one ever looked at England as being that kind of a country that

11:26

would just put people...

11:28

Well, obviously, this was all pre-social media.

11:31

Yeah, yeah.

11:31

And England has always been a fairly polite society.

11:34

Yes.

11:35

But the thing is, like, now pub talk has become illegal, right?

11:39

Yeah.

11:40

Like, if you say something offensive in a pub, you're subject to be arrested.

11:44

And they're asking people to turn people in.

11:46

There's a thing called the banter ban, which the Labour government was trying

11:50

to put in.

11:50

Here's the logic of the banter ban.

11:52

I've forgotten about this, but now you've mentioned it.

11:55

They wanted to introduce this law so that, for instance, if you're working in a

11:59

bar or a pub

12:00

and you overhear someone who says something against your protected

12:03

characteristic.

12:04

Say you're a gay barman and someone says, oh, I don't like the gays or

12:07

something.

12:08

And you overhear it.

12:09

Your employer has a duty to protect you from that kind of hate speech, that

12:12

kind of harm.

12:13

So, therefore, there's going to be a blanket ban on speech, on certain kinds of

12:18

speech within the pub, right?

12:20

I would say the guy who's eavesdropping, he's the problem, right?

12:22

You shouldn't be listening in on other people's conversations.

12:24

So, that's a real thing.

12:27

Yes.

12:27

And I guess it all comes down to this view, which I think is completely wrong,

12:31

that words and violence are the same thing,

12:33

that words can create a more violent society, that there's a direct causal link

12:39

between the stuff that people say

12:40

and the stuff that people say online to how people behave in the real world.

12:44

And I think you guys have got it right because you've got the Brandenburg test.

12:47

Do you know about the test for incitement to violence in the U.S.?

12:50

No, what is that?

12:51

It's basically a test that was established, I think, back in the 60s.

12:55

It was a KKK leader called Clarence Brandenburg who was prosecuted for incitement

12:58

to violence.

12:59

And the test that was established since that precedent was that any words that

13:04

can be prosecuted as incitement to violence,

13:06

they have to be intended to cause violence, likely to cause violence, and the

13:11

violence must be imminent.

13:12

And if you satisfy that threshold, you can be prosecuted in the U.S. for incitement

13:17

to violence.

13:18

So it would be like kind of imagine a demagogue surrounded by all his fans

13:21

whipping up a frenzy

13:22

and then pointing to a guy on the front row and saying, kill him now.

13:25

That would qualify for the Brandenburg test.

13:27

But in the U.K., because we don't have that test, all we've got is whether

13:32

people found it offensive.

13:34

That's the difference of the threshold.

13:36

So it's a massive difference between what the U.S. has and what the U.K. has.

13:40

Massive.

13:40

It's insane.

13:41

I mean, to give the most obvious recent example, because I don't know if people

13:45

know about this,

13:46

there's a woman called Lucy Connolly in the U.K.

13:48

I don't know if this was reported over here at all.

13:51

Do you remember we had all these riots last year during the summer against

13:54

hotels which were housing asylum seekers

13:57

and people were setting fire to them?

13:59

There were genuinely racist stuff going on during those riots.

14:02

And this was off the back of a guy who'd murdered a bunch of little girls in a

14:06

dance class.

14:08

And there were rumors going around that this was an asylum seeker, right?

14:11

And this one woman, Lucy Connolly, a mother who'd lost

14:15

her own child.

14:16

She's very sensitive about the idea of loss of kids.

14:19

She tweeted in a fit of anger, go and burn down all the hotels for all I care.

14:24

If that makes me racist, so be it.

14:26

And take the government with you.

14:27

Something like that.

14:29

And she deleted it within a couple of hours.

14:31

She went out, walked her dog, she deleted it.

14:32

She thought, that's not me, that's not who I am.

14:35

Deleted it.

14:36

Police came, went to court, sentenced to 31 months in prison for that swiftly

14:42

deleted tweet.

14:43

And she served over a year.

14:45

Oh my God.

14:47

Now, I'm not saying the tweet was nice, right?

14:49

The tweet was a horrible tweet.

14:50

And she says it was a horrible tweet.

14:52

That's why she deleted it.

14:53

But because we don't have that Brandenburg test, we don't have a test for incitement

14:57

to violence.

14:57

Because the key is that tweet, there was no way it could have incited anything. She was a nobody –

15:01

but you know, she wasn't someone with influence.

15:03

She didn't have many followers.

15:06

No one was going to read that and go and act upon it.

15:09

And if they did, that would be on them, right?

15:12

Because this is a myth.

15:14

This myth that people act on cue to what they read online isn't real.

15:18

It influences people for sure.

15:20

But at what point are you required to have sovereignty over your own mind and

15:27

your own actions?

15:29

Yeah.

15:29

Well, I think what it does is it raises the temperature, particularly when

15:32

political leaders do it.

15:33

Right, but when political, but my point is like, it's not going to incite you

15:37

to violence.

15:38

It's not going to incite me to violence.

15:41

So who are we talking about?

15:42

This is part of the thing is like they're protecting the dumbest members of

15:46

society.

15:47

This is like the thing about banning, you know, crazy talk online.

15:52

If you're talking about witches or, you know, whatever it is, flat earth.

15:56

Like we have to stop misinformation.

15:58

From who?

15:59

It's not working on you, right?

16:01

You don't believe it.

16:02

So who are we protecting?

16:04

We're protecting the dumbest people.

16:05

Also, aren't you kind of letting them off?

16:07

Like if someone goes and commits an act of violence and said, oh, I did it

16:10

because someone told me to do it.

16:11

Aren't you kind of letting them off the hook?

16:13

Right.

16:14

Exactly.

16:14

And sort of displacing the blame.

16:16

You know, it's like that guy who shot John Lennon who said Catcher in the Rye

16:20

made him do it.

16:21

Reading the book Catcher.

16:22

Are we now blaming J.D. Salinger?

16:24

Right.

16:24

For the murder of John Lennon?

16:25

It was John Lennon, wasn't it?

16:26

I think he did.

16:27

So do you – I think the safest approach is to say people are responsible for

16:31

their own actions.

16:32

I think the best that you could say is when political leaders and people with

16:36

clout say things like that and sort of say, you know, it's fine to go out and

16:40

commit violence.

16:42

I think what they do is they create a kind of imprimatur of approval.

16:45

They create this kind of sense that if you do it, the people in charge will

16:49

have your back.

16:50

If you do it, it's okay.

16:51

Well, this was the argument with Trump for January 6th.

16:55

Right, right.

16:55

And that's why the BBC edited his speech to make it look as if that's what he

16:59

was saying.

17:00

You saw that clip, right?

17:01

Oh, my God.

17:02

It's fucking crazy.

17:03

I mean, I've been saying for a long time the BBC has a real – like what I

17:06

will say in the BBC's defense is they've always been pretty good at being party

17:10

politically neutral.

17:12

Like they will interrogate someone in the right and someone in the left in a

17:16

pretty neutral way.

17:17

They don't – I think they do a pretty good – I know people will be annoyed

17:19

at me for saying that, but I think they do.

17:21

But I think in terms of the ideology, the woke ideology, they got captured.

17:25

They have a thing at the BBC called the LGBT desk, or they had it up until

17:29

recently, which could veto any news story, which meant that any story that was

17:33

slightly critical of trans activism or anything like that just didn't get

17:36

reported.

17:37

So I'm not surprised that the BBC –

17:40

They gave them veto power?

17:41

They gave them veto power, yeah.

17:42

That's crazy.

17:43

This all came out in a report, quite a recent report just a few months ago,

17:46

which led to the resignation of Tim Davie, the director general.

17:48

And he resigned ostensibly because of that Trump clip, which, by the way, that

17:52

wasn't the first time they did it.

17:54

There was another clip about a year before in a different program that did the

17:58

same thing.

17:59

Took the clip, re-edited it, and made it look like he had said something he

18:03

absolutely had not said.

18:05

So I think the BBC quite obviously has an ideological bias, if not a party

18:10

political bias.

18:12

But that's more than a bias.

18:14

Well, it's misleading, right?

18:15

Yeah.

18:15

It's completely deceptive.

18:17

You're editing something and change – I mean, they took out a giant chunk of

18:22

his speech.

18:23

Yeah.

18:23

This episode is brought to you by 1-800-Flowers.com.

18:27

Valentine's Day is coming up.

18:30

It always sneaks up on people.

18:31

If you want an easy way to absolutely crush it this year, this is it.

18:37

1-800-Flowers-Roses.

18:39

They're bigger and actually last.

18:41

Plus, they back it with a seven-day freshness guarantee so you can feel

18:45

confident that you're sending the best.

18:48

Here's the deal.

18:49

Right now, they've got this double blooms offer.

18:52

You buy one dozen roses, they double it to two dozen for free.

18:56

No catch.

18:57

Same price, way bigger statement.

19:00

They also do same-day delivery nationwide.

19:03

So even if you waited longer than you should have, you're still good.

19:07

This is one of those rare situations where doing something big is actually easy.

19:12

Go to 1-800-Flowers.com slash Rogan to get the double blooms offer.

19:18

Buy one dozen, they double it to two dozen roses free.

19:22

That's 1-800-Flowers.com slash Rogan.

19:26

I forget how many minutes it was.

19:28

They leapt like 45 minutes or something.

19:30

So he said –

19:31

Something crazy like that.

19:32

Yeah, he said – it made him look like he was saying go and commit the riot.

19:35

Yeah, exactly.

19:36

And instead, he was tongue-in-cheek talking about the very fine senators.

19:42

They're doing a great job, the senators and congresspeople.

19:46

Yeah.

19:46

Said all this other stuff.

19:47

It's so weird.

19:49

And then said you have to fight like hell to keep your country.

19:51

I mean, no offense, but you can find daft stuff that Trump says pretty easily,

19:54

right?

19:55

You don't need to edit that stuff down.

19:58

Well, it's because they had an opportunity to – like what we were saying

20:02

before – earlier, we were talking before the show.

20:04

You can put out a narrative and it doesn't have to be true and then that's the

20:08

one that sticks.

20:09

So that's the one that spreads wide.

20:11

And then when all these years later, they have to have this trial and everybody

20:17

finds out it's not true.

20:19

But the damage is done.

20:20

I mean, that's what they did with Trump during the whole Steele dossier.

20:24

Yeah.

20:24

You know, the hookers and peeing on people and all that crazy shit.

20:28

Remember that?

20:29

I remember the idea that he'd hired hookers to urinate on the bed that was once

20:33

occupied by the Obamas.

20:34

Something along those lines.

20:35

Now, the reason I didn't believe that is I don't think Trump is that avant-garde.

20:38

I don't think he's that creative.

20:39

Like if he had come up with that, I would have been actually applauding that.

20:42

That's kind of amazing.

20:43

But obviously he didn't do that.

20:45

That's not even something to applaud.

20:46

That just sounds completely ridiculous.

20:48

Getting urination on the bed of your enemy through the medium of prostitution.

20:52

I think that's kind of an artistic thing to do.

20:54

But I don't think he did it.

20:54

He obviously didn't do it.

20:55

None of it's true.

20:56

Right.

20:57

But you put that.

20:57

But isn't that weird that that in particular, that's like something I don't

21:00

think anyone seriously could believe.

21:01

Well, there's plenty of people that believed it.

21:04

Really?

21:04

Yeah, they don't have to believe it.

21:06

They just say it.

21:07

Like that was the whole point about, you know, the trial where he got arrested

21:13

and convicted on 34 felony counts.

21:18

None of which are actually a felony.

21:20

That's all bookkeeping deception.

21:23

That was the paying off of the girl.

21:25

Yes.

21:25

So now you can say he's a convicted felon.

21:28

You can just say that.

21:29

And even though all those counts were misdemeanors, all of them had passed the

21:33

statute of limitations.

21:35

But for some reason, through no legal way that anybody could ever really

21:41

honestly explain, they decided to label it a felony.

21:45

And it was just to turn them into a felon.

21:47

I saw even left-leaning anti-Trump lawyers saying this is not how the law

21:50

should work.

21:51

No.

21:52

You can't artificially elevate a misdemeanor to a felony outside the statute of

21:55

limitations.

21:55

It's crazy.

21:56

But the thing is, if you do that, they're going to do that to you.

21:59

Yeah.

21:59

It's like we're going to give that kind of power to the Republicans?

22:02

And now when they're in office, they're going to start doing things like that?

22:05

Are we crazy?

22:06

Well, also, this really bothers me.

22:08

One of the key things that I think has happened over the past few years is this

22:11

complete lack of fealty to the truth from both sides.

22:15

It's whatever is convenient matters more, a complete lack of intellectual

22:19

curiosity, a complete lack of investigating and looking and thoroughly checking.

22:23

And by the way, with the BBC, that really matters because unlike the news media

22:27

here, which can be as partisan as it likes, the BBC is the state broadcaster.

22:31

It's got a responsibility by charter to not be, you know, to be balanced, to be

22:36

even-handed.

22:37

And it completely failed.

22:38

And I saw today, just this morning, some people, you know, we've got all the

22:42

mania about the Epstein files at the moment.

22:44

Some activists have now said J.K. Rowling once invited Epstein to the opening

22:49

of her theater, her play.

22:50

Never happened.

22:52

But because there's a furore about Epstein at the moment, they're just saying

22:55

it happened.

22:55

It gets spread all over the place.

22:57

That's all you have to do.

22:58

And that's all you have to do.

22:59

And then that gets repeated.

23:02

Oh, didn't this happen?

23:03

I know.

23:03

Like what you say about Trump is right.

23:05

I always hear that he's a convicted felon.

23:06

He's a convicted felon.

23:07

Well, why don't you pause for a minute and assess whether or not that

23:10

conviction is sound or whether it was politically motivated or how helpful that

23:14

is?

23:14

But like you say-

23:16

Also, it's like it's such a dangerous precedent to send.

23:19

It's terrible.

23:20

Like if you do that, look, right now in the United States, the media

23:25

predominantly leans left except for Fox News, the mainstream large-scale media.

23:31

I guess CBS is probably going to lean more right now.

23:34

Yeah, yeah.

23:35

It seems like it's in the process of that.

23:37

But for the most part, when you watch CNN, if you watch MSNBC, if you watch the

23:41

mainstream news, it's very left-leaning.

23:44

Yeah.

23:45

But if the fucking – if right-wing people started – if it was like more

23:50

common for the news to be right-leaning and then they started doing the exact

23:55

same thing about a left-leaning candidate, this is so dangerous.

24:01

And the idea that the left doesn't recognize that – which are the people that

24:04

have always been in support of free speech?

24:07

It's never been a right-wing thing to support free speech until now.

24:10

It's always been a left-wing thing.

24:12

When I was a kid, it was famously the case of the ADL defending Nazis having

24:17

the right to protest and saying, look, we think what they're saying is abhorrent.

24:21

But it's very important that you get the right to say whatever you feel and

24:25

then the way to combat that is with much better, more concise speech that's

24:29

much more logical and makes sense.

24:31

And this is what you do.

24:33

This is what debate is for.

24:34

This is – we've always known this.

24:36

Yeah, but I mean I agree.

24:38

I'm so dispirited by that very thing that you've identified that the left used

24:42

to be about this.

24:43

The left used to be all about – I mean that example you mentioned of Skokie,

24:46

wasn't it, in Chicago?

24:47

The Nazis marching through Skokie and the ACLU saying, you know, we're

24:51

defending this.

24:52

There was a book by a guy called Aryeh Neier, who was the head of the ACLU, called

24:55

Defending My Enemy.

24:56

Yeah, it wasn't the ADL.

24:57

It was the ACLU.

24:58

It was the ACLU.

24:59

And he was saying, you know, he's Jewish.

25:02

He's got family members who died in the Holocaust.

25:05

But he's writing a book saying, I'm defending neo-Nazis' right to free speech,

25:08

not because I support them but because I don't.

25:10

And I want to defend the principle whereby I can tackle them.

25:14

And that's speech.

25:15

Right.

25:15

So in other words, the principle is so much bigger.

25:18

I mean, the thing that I think has been lost – and now, by the way, the ACLU,

25:21

complete about turn.

25:22

I mean, there was a lawyer for the ACLU tweeting about how he wanted Abigail

25:25

Shrier's book banned.

25:27

And he said, this is the hill I will die on.

25:29

You know, that's a guy called Chase – or was it a guy?

25:31

I think it's a trans activist called Chase something.

25:33

I can't remember.

25:33

Anyway, but the point is how far have you fallen?

25:36

When it comes to these free speech issues, left or right, it's nothing to do

25:39

with it.

25:40

It should be about this principle of – it's not whether you agree with what

25:43

they're saying and the substance of what they're saying.

25:46

It's whether you want the principle intact.

25:48

And that principle applies to us all.

25:50

The very same principle that allows the Nazis to say all their crazy stuff is

25:54

the principle that allows us to challenge it, to tackle it.

25:57

Well, it's a very short-term win.

26:00

It's basically they're playing chess and they decided, I want that rook no

26:04

matter what.

26:05

And then they just sacrifice their queen.

26:06

Like, look what you've done.

26:08

Look what you've done for this short-term victory.

26:10

You're essentially tanking civilization for a decade where we have to sort this

26:16

out and, like, let the ship wash itself back and forth until it rights itself.

26:20

Yeah, so how – and how do you ensure that it's not going to happen to you?

26:24

Like, I think about that, there was a national conservative conference in

26:27

Brussels about a year and a half ago.

26:28

The local mayor said, I don't like this.

26:31

And he had the police rush it, shut it down.

26:33

And you had mainstream right-wing figures like Nigel Farage, Suella Braverman.

26:37

How do they not think – hang on a minute.

26:39

If we establish that precedent where you can just shut down your political

26:42

opponents through the use of police force, how will that not rebound on me?

26:45

How will that not happen to us?

26:47

Well, this is the argument that they're using right now for Trump going after

26:50

his political opponents.

26:51

Right, right.

26:52

Because they opened that Pandora's box, right?

26:53

You guys did that with him.

26:55

Yeah.

26:55

And everybody was saying how damn dangerous it is.

26:59

Yeah.

26:59

You can't fucking do that.

27:00

Even if you hate the guy.

27:02

Like, if there's a real crime that you can get someone, but when you take a

27:06

crime like the bookkeeping stuff and turn it into a felony that could put this

27:10

man in jail for the rest of his life for doing something that turns out to be

27:14

legal, you can pay people to shut up.

27:16

And this is so – it's just – it's so weird that people for this short-term

27:22

gain are willing to tank what is essentially this whole structure of our

27:27

civilization that allows free discourse.

27:29

You need it.

27:30

It's so important.

27:31

It's so important to be able to communicate and talk.

27:34

If podcasts didn't exist, there was no way to talk through ideas other than

27:39

mainstream news, we would still be stuck in some very bizarre 1990s or 1980s

27:46

narrative about how the world works.

27:49

Yeah.

27:49

We would have real problems.

27:51

We'd have real problems if there wasn't independent journalism like on Twitter

27:56

and on wherever they can post.

27:58

Yeah.

27:58

So why don't they get it?

27:59

I mean we've had like people in left-leaning papers in the UK calling for Elon

28:03

Musk to be arrested because he's allowing free speech on X or Twitter or

28:06

whatever you want to call it.

28:07

Well, their offices got raided today.

28:10

Did it?

28:11

In some country.

28:12

There was a country where X's offices got raided.

28:16

I think one of the things was they somehow or another let – I think something

28:22

had to do with child pornography.

28:25

Where was that?

28:26

France.

28:26

France.

28:27

Fresh investigation into Grok.

28:29

And what is it?

28:30

What are the –

28:31

Oh, so you know what this is all about.

28:33

See, here it is.

28:33

Yeah.

28:33

Suspected offenses including unlawful data extraction and complicity in the

28:38

possession of child pornography.

28:40

Yeah, but that's not what this is about.

28:42

This is because people have been misusing Grok to like put bikinis on women

28:47

they like or even in a few cases creating child sexual stuff.

28:51

You can do – wait a minute.

28:52

You can't create child pornography on Grok.

28:54

I don't think – no.

28:54

Or at least I think that's very much been shut down and safeguarded, right?

28:58

I think that's what's happened.

28:59

I mean unless there's like some sort of a loophole where you could get it to do

29:03

it.

29:03

Among potential crimes, it said it would investigate were complicity in

29:07

possession or organized distribution of images of children of a pornographic

29:11

nature, infringement of people's image rights with sexual deepfakes.

29:15

Okay, the sexual deepfakes.

29:16

Yeah.

29:17

So sexual deepfakes is like if you put Hillary Clinton in a bikini and made her

29:21

hot.

29:21

That's a sexual deepfake.

29:23

Okay.

29:24

Fraudulent data extraction by an organized group.

29:26

I think you can still do some of that stuff.

29:29

You can put people in bikinis.

29:30

Yeah, I think you can do that.

29:31

So like if you wanted to take Shaquille O'Neal and put him in a bikini, you

29:35

could say you're sexualizing him.

29:37

Okay.

29:37

Yeah.

29:38

I mean I guess you can do that.

29:39

Yeah.

29:40

So that will be why – you know recently Keir Starmer, prime minister of the

29:43

UK, said he wanted – was considering – or not necessarily he was going to

29:47

ban X, but it wasn't off the table.

29:49

It's something like he – as though he's going to do that.

29:51

But this is always the excuse.

29:53

Like we're protecting children.

29:55

Right.

29:55

And look, no one wants that sort of stuff, right?

29:58

No one wants deepfakes of kids, obviously.

30:01

But there's – I mean looking at the stats on that, there's far more child

30:04

sexual exploitation on Snapchat, for instance.

30:06

But they don't go after Snapchat because Snapchat isn't the forum where Keir

30:10

Starmer is getting criticized every single day and brutally hauled over the coals

30:14

by people checking his facts.

30:16

One of the best things about X recently is the community notes.

30:18

Checking journalists and politicians in real time with facts.

30:22

They hate it.

30:23

They hate that.

30:25

So no wonder they're going after X.

30:26

Yeah, Biden got cooked by community notes multiple times.

30:29

Yeah, yeah.

30:29

The part where the administration was taking down posts.

30:31

Yeah.

30:32

So did the Guardian, the left-leaning newspaper.

30:34

It flounced off X with a big statement saying, we're going to Bluesky.

30:38

We've had it.

30:39

We're off to Bluesky.

30:40

It was such a flounce.

30:41

And of course – and then, of course, everyone was retweeting all their

30:44

community notes.

30:45

They had loads of them.

30:47

Of course.

30:47

Just absolutely loads of them.

30:48

Because it's not true.

30:49

And, you know, especially when it's open to the whole world.

30:52

Yeah.

30:52

And people that aren't stuck under your guidelines, like in America, we could

30:55

just talk shit.

30:56

Yeah.

30:57

And I think the reason why it's in France probably has a lot to do with Candace

31:00

Owens.

31:01

Oh, yes.

31:02

That makes complete sense.

31:03

Yeah.

31:03

That might be, yeah.

31:04

Brigitte Macron and, like, I mean, how many times did that get shared?

31:08

Yeah, exactly.

31:09

I mean, that is –

31:10

That makes sense of it now.

31:11

By the way, there's a real quick way to solve that.

31:14

Open chromosome test.

31:16

Go ahead and do it.

31:17

Oh, I thought you were going to be a bit more graphic than that.

31:19

Well, you don't have to.

31:19

No, you don't have to.

31:20

Because that doesn't really solve it.

31:22

Because you could – unless – I mean, there's no operation.

31:25

But if she's gone through a surgery, then, you know, you could show a picture.

31:29

And it's probably pretty realistic.

31:30

Especially – when was the last time you saw a 70-year-old lady's cooter?

31:34

Last week.

31:34

Oh.

31:35

Yeah.

31:35

Congratulations.

31:36

I'm just interested in that sort of stuff.

31:37

Well, you know, you're allowed to be curious in this country.

31:40

That's actually a really good example, though, isn't it?

31:41

Of the – just something so obviously not true just going all over the world.

31:46

Like, in a matter of moments.

31:48

Is it not true, though?

31:49

Well, that Macron's wife is a man.

31:51

Yeah, that's not true.

31:52

A hundred percent?

31:53

Well, you know, the burden of proof is on those who want to say that it is true.

31:56

The reality of the story is weird enough without it being true.

32:00

Like, the 40-year-old man and the –

32:02

He was – wasn't she his school teacher?

32:04

40, yeah.

32:04

She was 40 if it was – if it is actually a woman.

32:07

She was 40 and he was 15.

32:10

That's crazy.

32:11

And everyone says, well, they're French.

32:12

That seems to be the thing.

32:14

What a wild country.

32:16

People just say that's the way it works in France.

32:19

Yeah.

32:20

But again, look, I would say with all of this stuff, you need some sort of

32:24

proof.

32:25

You need – like, when you – wasn't it the Carl Sagan thing about

32:28

extraordinary claims

32:28

require extraordinary evidence?

32:29

I think that's a pretty safe dictum, the idea that, okay, anything could be

32:33

true.

32:34

You know, there have been crazy conspiracies that turned out to be true.

32:38

So I'm not – I would never rule anything out.

32:41

But what I'm saying is if you're going to make a claim like that, you better be

32:43

damn sure

32:44

you've got really solid evidence about that.

32:47

Yeah, she's got hours-long documentaries on this.

32:51

Yeah, and are they persuasive?

32:52

I haven't watched them all.

32:54

I haven't watched them all.

32:54

Do you think I have that kind of time, Doug?

32:56

Well, you should do.

32:58

You should do your research before – you're part of the problem.

33:02

Outrageous.

33:03

I can't do research on that.

33:05

I want to wait until it plays out in court.

33:07

But whenever I do do research – like, I'll give you the example from this

33:10

week, just because

33:11

I'm reading it now.

33:11

A woman's written a book claiming that Shakespeare was a black woman.

33:14

Oh, I saw that.

33:16

Yeah.

33:16

So this is a major spoiler alert.

33:20

Shakespeare wasn't a black woman, by the way.

33:22

Crazy.

33:22

Yeah.

33:23

I've got the book – I'm reading the book now.

33:25

It is worse than you imagine.

33:27

Part of the evidence –

33:28

How could it be worse than I imagine?

33:29

Because, because – it's obviously not true, firstly.

33:31

Of course.

33:32

But she basically says in the book that it's important that it should be true.

33:37

And therefore –

33:38

What?

33:38

Yeah.

33:38

In fact, the book opens with a picture of Shakespeare as a black woman, which

33:42

was drawn by the author.

33:44

Is it a good drawing?

33:47

It's okay.

33:48

I don't want to mock someone else.

33:50

Can I see it?

33:50

Can I see it?

33:51

If it's out, it's the front – it's the first – oh, that's the book.

33:54

That's actually pretty good.

33:55

No, no, that's – no, no, no.

33:57

That's a black woman?

33:58

No, no, no.

33:58

That's a portrait of Emilia Lanier, who she says – well, Shakespeare – and

34:02

she says that

34:03

the portraits at the time were whitened to disguise her blackness.

34:06

In the book itself –

34:08

So convenient.

34:09

In the book itself – you won't be able to get in the book, I don't think,

34:12

Jamie – but

34:12

in the book itself, there's a sketch that she's done.

34:14

So it's like – I can imagine a publisher saying, oh, what evidence have you

34:17

got?

34:18

And she's like, oh, well, I'll go and draw it for you.

34:20

And that's sort of what she's done.

34:21

Oh, she was black and Jewish?

34:22

Yeah, black Jewish.

34:24

Well, actually, I mean, Emilia Lanier was part Moorish, but wasn't black, and

34:28

she wasn't

34:28

particularly dark-skinned.

34:30

And she was Jewish as well?

34:31

Yeah, part Jewish.

34:32

Okay, so who is this woman that they're saying actually was Shakespeare?

34:36

So she's called Emilia Lanier, or Emilia Bassano.

34:40

And one of the arguments is that Shakespeare at the time, if she was a woman,

34:44

wouldn't

34:45

have been able to get published, because women couldn't get published.

34:47

But Emilia Lanier was published.

34:49

She had a book of poetry.

34:50

So all of this stuff falls apart, like, in two seconds flat.

34:53

And – all right, this is the best one.

34:55

She even says in the book that the word Shakespeare is an anagram of a she-speaker.

35:04

I'm not making that up.

35:06

That's what she says.

35:08

I mean, you know, listen.

35:11

What a cover-up.

35:12

How'd she crack the case?

35:13

Well, actually, it's an old theory.

35:15

It's like a 20-year-old theory.

35:16

Is it really?

35:17

I tell you –

35:17

20 years old.

35:18

She's just sort of rehashing it now for this identitarian post-woke world where

35:22

we're all,

35:22

like, we're desperate for Shakespeare to be a black woman.

35:25

And it's so –

35:26

It's so fun.

35:27

It's so pathetic.

35:28

This was my first encounter with conspiracy theories, because my background is

35:32

– I did

35:33

a doctorate in Shakespeare.

35:33

My background was teaching Shakespeare back in the day, like, before I did

35:36

comedy and before

35:36

I did anything else.

35:37

And it was the conspiracy theorists around Shakespeare saying Shakespeare

35:40

couldn't have

35:41

written his work.

35:41

They are the most intense, the most angry, the most evidence-free cohort of

35:47

people who can

35:48

– they get more – they're angrier than the woke.

35:51

I promise you.

35:51

Like, I've tweeted – I've written stuff about Shakespeare online.

35:54

I recently did some lectures about Shakespeare for the Peterson Academy,

35:56

because I'm really

35:57

into – I love the Peterson Academy.

35:59

I love what they're doing.

35:59

And I did these Shakespeare lectures, and the conspiracy theorists were on to

36:03

me online

36:03

saying, it wasn't Shakespeare.

36:05

The guy from Stratford didn't write this.

36:07

And what all these theories have in common is they've just made – there's no

36:11

evidence.

36:11

There's no evidence.

36:12

The key point about Shakespeare is if you're going to say it wasn't the guy who

36:15

everyone

36:16

thought it was, you have to answer one key question.

36:18

Why does everyone who knew Shakespeare, wrote about Shakespeare, say that it

36:22

was?

36:22

Can I stop you?

36:23

Because I'm confused.

36:24

I didn't even know that there was a conspiracy about Shakespeare.

36:27

Oh, wow.

36:28

Yeah, there's lots.

36:29

I had heard one person say that Shakespeare wasn't real and that it was really

36:34

someone

36:35

else's work that he plagiarized.

36:37

Yeah.

36:37

I had heard that.

36:38

But I never even bothered to fuck around with it.

36:41

Well, it actually came from America.

36:42

It's you guys.

36:43

Of course.

36:43

We're the best.

36:44

We're number one.

36:45

There's a guy called Looney, actually, from America.

36:48

That's hilarious.

36:49

That's his name.

36:50

You're going to listen to that guy.

36:50

So he, we're going back like 60, 70 years or something, but he came up with

36:55

this idea

36:55

that Shakespeare was actually an aristocrat called Edward de Vere, the Earl of

36:58

Oxford.

36:59

Problem is, Edward de Vere died in 1604.

37:01

That's before Macbeth.

37:03

That's before Antony and Cleopatra.

37:04

That's before Coriolanus.

37:05

That's before the Tempest.

37:07

So he managed to, I think they get around it by saying, he wrote these plays

37:11

and then

37:12

he died.

37:13

And then Shakespeare found them?

37:14

Or, or the, or something.

37:16

Yeah.

37:16

So, so even though some of those plays actually have cultural references from

37:20

the time after

37:20

de Vere died, but it doesn't matter.

37:22

Maybe he was a prophet as well.

37:22

But all of these – you speak to these people,

37:26

you'll see

37:27

what I mean.

37:27

Edward de Vere, they think, some people think it was Francis Bacon.

37:30

Some people think it was Christopher Marlowe.

37:31

Some people think it was Elizabeth I.

37:33

Like all, all of the candidates they put up, right?

37:36

The key thing is they're all aristocrats.

37:38

They're all posh.

37:39

Why?

37:39

Because Shakespeare was a middle class, lower middle class, not very rich, didn't

37:43

go to

37:44

university, came from the Midlands, you know, up and coming guy who, and they

37:48

say, well,

37:49

how could someone like that write about kings and lords and ladies?

37:52

It's snobbery.

37:53

They're basically saying working class people can't do, can't do art.

37:57

That, I mean, really, that's what it is.

37:58

Otherwise they wouldn't be going after all these aristocrats.

38:02

In the, it's the opposite in America, oddly.

38:05

Is it?

38:05

Yeah.

38:05

So if you were a Rockefeller in America, you're from the Rockefeller family and

38:10

you wrote an

38:11

amazing novel, no one would believe it.

38:12

Right.

38:13

Okay.

38:13

They would say, no, that has to be like some guy who, or some woman who's like

38:18

grinding,

38:18

drinking coffee and smoking cigarettes alone in their apartment to write

38:22

something that's

38:23

brilliant.

38:23

So I wonder what it is about the UK.

38:24

Well, although, like I say, a lot of it comes from America and is it just the

38:29

need to tear

38:30

down an icon, is it that?

38:32

Is it?

38:32

Yeah.

38:32

I mean, I get it now with this woman who's saying Shakespeare was a black woman.

38:36

I get that at the moment because we're in this moment of identitarian, group

38:40

identity

38:40

mania, right?

38:41

So that makes sense.

38:42

She's got a political reason why she wants it to be a black woman.

38:44

Right.

38:45

So I kind of understand that more.

38:47

But what is it, I think it might be more to do with the idea that this guy

38:50

changed civilization,

38:51

changed literature.

38:52

No one else has achieved what he achieved in writing.

38:54

He's up there with Michelangelo, Bach, you know, all of that.

38:57

Let's tear that down.

38:59

Let's tear down Western civilization.

39:00

Let's say none of this is based on anything.

39:03

This is all, this is all untrue.

39:04

Right.

39:04

I think it's to do with the, that innate iconoclasm, that innate, you know,

39:09

just tearing

39:10

down the great things about our culture.

39:12

For sure.

39:12

That's always been the case.

39:14

And people always want to tear down idols.

39:16

They want to tear down, you know, whoever it is.

39:19

No matter what.

39:19

I was watching this video we were talking about the other day of this woman

39:22

talking about

39:22

how the Beatles were terrible.

39:23

Right.

39:24

And this woman was not very articulate, not particularly interesting, doesn't

39:28

seem that

39:29

compelling.

39:29

Yeah.

39:30

And she was going on and on about how bad the Beatles were.

39:32

I'm like, you're not going to convince anyone.

39:34

This is not going to work.

39:36

But people are going to fucking try.

39:37

They're going to try no matter what, no matter who it is.

39:39

Hendrix sucked.

39:40

I've heard that before.

39:41

Oh, really?

39:42

Hendrix sucked.

39:42

Stop.

39:43

But at least that's based on an opinion, right?

39:46

Yes.

39:47

There's a difference between saying Jimi Hendrix sucked and saying...

39:49

Jimi Hendrix was actually a woman from Liverpool called Maud.

39:52

Well, you know the theory about Jimi Hendrix in America.

39:55

Do you know that?

39:56

No.

39:56

Okay.

39:57

So it's the people that are like deep into the CIA and CIA conspiracies.

40:03

And what is it called?

40:05

Strange Tales from the Canyon?

40:06

Is that what it's called?

40:07

The book?

40:08

So there's a book on it – there's a bizarre connection between a lot of the countercultural

40:15

figures

40:15

of the 1960s and the intelligence community.

40:19

One of them is Jim Morrison's father, who was like a high-ranking military officer.

40:23

And then there's different people from different bands that were like a key

40:27

part of the countercultural

40:29

movement that all have parents that were either in intelligence communities or

40:35

closely connected

40:37

to it.

40:37

Like a suspiciously...

40:38

Weird Scenes Inside the Canyon.

40:39

It's a crazy book.

40:41

It's fun.

40:42

It's kind of fun.

40:43

Is it crazy as in like the revelations are crazy or that it's just not true?

40:47

Well, they make some broad leaps, right?

40:50

Right.

40:50

So there's a lot of...

40:52

And then a year later, he died in mysterious circumstances.

40:55

Or a year later, he died from suicide.

40:56

Or a year later, he died from an overdose.

40:58

Yeah.

40:58

Well, okay.

40:59

You're hanging out with a bunch of people that are doing drugs all the time.

41:02

And they're all ne'er-do-wells.

41:03

Yeah.

41:04

And they're all hanging out in Laurel Canyon.

41:05

Yeah.

41:06

And if you don't know Laurel Canyon, Laurel Canyon, at least at the time, I

41:10

mean, when

41:10

I first moved to Hollywood, it's like all the weirdos would live in Laurel

41:14

Canyon.

41:14

Right.

41:15

Like all the weirdos were like right there above Hollywood.

41:18

And there was all these crazy parties up there.

41:21

It was like Laurel Canyon was nuts.

41:22

And they all knew each other, right?

41:23

Right.

41:24

So they're all part of that circle.

41:25

Okay.

41:25

So, I mean, this was like when I moved there in the 90s, this was the case.

41:29

My friend Dave Foley had a house up there.

41:31

Right.

41:32

And it was like all these kooky people.

41:34

And he was telling me about all these kooky parties and all this different shit.

41:36

It was like Laurel Canyon was always like kind of, so of course a bunch of

41:39

people are

41:40

going to die.

41:40

So what's the theory?

41:41

Of course a bunch of people are going to be connected to bands and different.

41:45

Yeah.

41:45

Counterculture movies.

41:46

The theory is that the CIA sort of engineered this culture to, I don't know why.

41:56

I'm not exactly sure because I haven't gotten all the way through the book.

41:59

I'm only like halfway.

42:00

Are you still reading it?

42:01

No.

42:01

I pick it up every now and then.

42:03

It's just like, it's too kooky.

42:05

It's not grabbing you.

42:06

Well, you can't make Jimi Hendrix in a lab.

42:11

Okay?

42:12

Yeah.

42:12

You can't.

42:13

It's just, you can't fucking do it.

42:15

You can't make someone that good.

42:17

It's not possible.

42:18

Yeah.

42:19

You can't tell me that if they did, why haven't they done it since?

42:22

Why don't they do it all the time?

42:23

Right.

42:23

Because he's the greatest guitarist of all time.

42:27

And you're telling me the central intelligence cooked that guy up?

42:29

So they invented him like he's like their clone or something.

42:32

They created.

42:33

Well, I just think that they had some sort of an influence on these people, on

42:38

Jim Morrison.

42:39

Like there was a thing about Morrison, the Morrison one.

42:42

Like what is the connection between Jim Morrison's dad and the intelligence

42:46

agencies?

42:46

There's some like tangible connection with Jim Morrison's dad.

42:50

But wouldn't you just normally assume that if your dad was some high-ranking

42:54

military guy, first of all, never home.

42:56

Yeah.

42:57

Okay.

42:57

So where are you?

42:58

You're out running around with your friends, smoking cigarettes and fucking

43:02

drinking and you're in a band.

43:04

And it turns out you got a lot of angst and pain because you're being neglected

43:07

as a child because your dad worked 16 hours a day trying to fuck the country

43:11

over.

43:12

And so what do you do?

43:13

You go counterculture.

43:15

It's like it's so common.

43:16

The preacher's daughter.

43:17

She becomes like a harlot.

43:18

Right.

43:19

There you are.

43:20

High-ranking U.S. officer.

43:21

Yeah.

43:21

Right.

43:22

But that is OK.

43:22

But again, like this is a perfect example.

43:24

Wow.

43:25

He's involved in the Gulf of Tonkin incident.

43:27

Whoa.

43:27

That's not proof of anything.

43:29

No, no, no, no, no, no, no.

43:30

But his dad is.

43:31

Yeah.

43:32

But, you know, but this is the thing.

43:33

They'll take something like that.

43:35

They'll take various strands of coincidence and they say this leads us to this

43:38

conclusion.

43:38

But all they're doing is coming up with a conclusion first and working

43:41

backwards.

43:41

Like this sort of stuff, you see it again and again.

43:45

So this is how this connects with intelligence agencies.

43:48

McGowan, I guess that's the author.

43:50

Core move is to group Morrison's father with other Laurel Canyon musicians'

43:54

parents who worked in military, defense, or intelligence-linked roles.

43:59

And to frame this as evidence of a broader covert program around the 1960s rock

44:04

scene.

44:04

Come on.

44:05

Yeah.

44:06

So are you saying that the CIA were trying to influence the culture through the

44:10

medium of rock music?

44:11

Uh-huh.

44:12

And that's somehow tied to espionage?

44:15

They also have that film studio.

44:18

What's that?

44:20

What?

44:20

Jared Leto bought that place.

44:21

There was a film studio in Laurel Canyon too.

44:23

Oh, well, it's a base.

44:24

It's an actual base.

44:26

Yeah, Jared Leto.

44:27

A lot of films were about that.

44:28

I was talking to Jared about that.

44:29

I had dinner with Jared Leto one night.

44:31

He's very cool, by the way.

44:33

Really nice guy.

44:34

Very normal.

44:35

And by the way, he looks like he's 30.

44:37

He's 50 years old.

44:38

It's crazy.

44:39

Moisturizer or something.

44:39

What are you doing with your fucking skin?

44:41

You look great.

44:42

Lookout Mountain Laboratory Air Force Station.

44:44

So he bought that place and converted it into a home.

44:47

That's where he lives.

44:49

It's a dope spot.

44:50

Soundstage.

44:51

Looks quite nice.

44:52

Soundstage, film laboratory, two screening rooms, four editing rooms, an

44:55

animation and still photo department, sound mixing studio, numerous climate-controlled

45:00

film vaults.

45:00

And this is connected to the conspiracy somehow?

45:02

Well, this was an actual military base.

45:04

It's located in that same neighborhood.

45:06

Okay.

45:07

So this Air Force Station, whatever it was, I wonder what they were doing.

45:11

Like, why did they need all that film capability?

45:15

Why did they need the deal?

45:16

In theory, I guess, like when they would show the atomic bombs going off and

45:19

would play it in the movie theater for people to see it.

45:21

Oh.

45:22

That's how they would make the actual, like, you know, reels and whatnot.

45:25

Well, that makes sense.

45:26

Right.

45:26

Makes sense that they were right there in Hollywood if that's what they were

45:29

doing.

45:29

On top of that, what other things did they make?

45:32

See, like, here's the still from Lookout Mountain Laboratory.

45:36

So it's just a studio then, a special effects studio.

45:38

Yeah, but it's in that same neighborhood at the same time.

45:40

Yeah, but so what?

45:41

I mean, I think with all of it.

45:43

He's not arguing for it.

45:46

Yeah, goddammit, Jamie.

45:47

The so what of it is that there wasn't that many of them to begin with and just

45:50

they all happen to be in the same.

45:51

But do you not think with all of this stuff, like, again and again, the pattern

45:56

is either there's gaps, there's gaps in what we know and people decide to fill

46:00

them in themselves because there's a kind of comfort to that.

46:02

There's also some kind of comfort with I know something that no one else does.

46:05

I've got the answer.

46:06

There's a status element to that.

46:08

I remember I read a book when I was a kid, like teenager, called The Sacred

46:11

Virgin and the Holy Whore.

46:12

And it was one of those sort of books I read, and it was about Jesus and it was trying

46:16

to prove that Jesus was a woman.

46:18

And as you're reading it, you're thinking, yeah, oh, yeah, Jesus is a woman.

46:23

I can't believe that.

46:25

And then you get to the end, you think, what the hell did I just read?

46:27

And it's that thing of you can marshal any kind of half-baked facts, or you

46:32

can marshal certain things that we can see and fill in the gaps yourself and

46:36

lead to a crazy conclusion.

46:38

What concerns me isn't so much that people do that because people have done

46:41

that forever as long as they've been human beings.

46:43

It's that now people are leaping at it and falling for it in a way that I haven't

46:48

seen.

46:48

Maybe it is just social media, right?

46:50

It is.

46:51

Can I give you an example of this?

46:52

Yeah, please.

46:53

A recent one, which I just thought was nuts.

46:55

Did you see the portrait of King Charles III by an artist?

46:59

I think his name was Yeo, Y-E-O.

47:01

It's a big red portrait which currently hangs in Buckingham Palace.

47:05

Oh, I have seen that.

47:06

It's crazy.

47:06

If you take a quarter of it, invert it, flip it, add a bit and squint, it looks

47:12

like a goat devil, right?

47:14

Yeah.

47:14

But you have to do a lot of steps to find the goat devil.

47:16

Well, of course.

47:17

It's a puzzle.

47:18

How dare you?

47:20

I'm sorry.

47:21

How dare you dismiss that puzzle?

47:23

Let's show the photo and show how it's done because it's kind of fun.

47:25

Can you see the goat?

47:26

Oh, there we go.

47:27

So can you see the goat as well?

47:28

First of all, just the photo by itself.

47:30

Like, hey, man, what the fuck are you doing?

47:33

Oh, it's a creepy picture.

47:34

Why am I splattered in blood?

47:36

I've seen it in the flesh.

47:37

It's a creepy picture.

47:38

One thing, if he did that in all white, it was an all white background, that

47:42

would be one thing.

47:43

Like, oh, that's kind of an interesting look or, you know, pastel.

47:46

So what are you saying, Joe?

47:47

Are you already suspicious?

47:48

Is that what you're saying?

47:49

Well, the photo's nuts.

47:50

Like, the painting is nuts.

47:52

Where's the goat?

47:53

So all you have to do is put it together, side by side.

47:55

You don't have to do that much.

47:57

You exaggerated how much you have to do.

47:58

No, I saw a video that was doing it upside down.

48:00

Trust me.

48:00

Look at it upside down.

48:01

Oh, no, look.

48:02

Well, the other way, I found the goat.

48:04

Put it back.

48:05

Put it back.

48:05

Wait a minute.

48:05

Show that one.

48:06

Oh, hold on, hold on, hold on.

48:07

Show that one.

48:07

I can completely see the goat now.

48:09

That's 100% a goat.

48:10

They did it on purpose.

48:11

That is, that's a sign.

48:13

Go back to the other one, though.

48:15

Click on that one.

48:16

I see a goat there.

48:17

I see some evil demon.

48:18

Look at the two eyeballs.

48:19

Yeah, yeah, yeah, bro.

48:20

Where?

48:21

100%.

48:21

Stop.

48:22

Stop trying to gaslight me.

48:23

I see a monster.

48:25

Oh, well.

48:27

I mean.

48:28

You can find something in everything, man.

48:30

It still looks a little superimposed.

48:31

I mean, you see.

48:32

I can see Martha Stewart in that.

48:34

The Virgin Mary in a grilled cheese sandwich.

48:35

You can see it in the clouds and the rocks.

48:37

Yeah, of course.

48:38

There's a term for this where our brains look for patterns and things.

48:41

I had a conversation once with a friend of mine that I didn't know was going

48:44

crazy.

48:45

Right.

48:45

And he goes, hey, you want to see something crazy?

48:48

And he pulls out his phone and he shows me a cloud.

48:51

Yeah.

48:52

And I go, what is that?

48:53

He goes, dude, I'm seeing this all day.

48:55

And he shows me some other ones.

48:56

He's got like hundreds of photos of clouds on his phone.

48:59

Yeah.

49:00

I go, what are you seeing?

49:01

He's like, these are UFOs.

49:02

He goes, these are spaceships.

49:03

This is not a regular cloud.

49:05

Right.

49:06

And I'm looking at the photos.

49:07

Like, he's just been taking pictures of clouds all day.

49:10

And I realize, oh, my God, my friend is going schizophrenic.

49:12

I didn't know him well.

49:15

So he was a friend?

49:16

Yeah.

49:17

Okay, okay.

49:17

The more I talked to him, the more I realized there was something cracked.

49:20

Like, he's a guy I hadn't seen in like maybe seven or eight years.

49:24

And I ran into him at a comedy club and he was just showing me photos of clouds

49:28

on his phone.

49:28

I was like, during the conversation, I realized,

49:31

oh, he cracked.

49:32

But aren't you concerned that that kind of thing is now kind of common?

49:36

Like, from people who aren't necessarily unwell.

49:39

People who are just seeing stuff.

49:41

Well, it's fun.

49:41

I think it's fun.

49:43

Yeah, yeah, yeah.

49:43

It's exciting for people to uncover information that the general public is

49:47

ignorant of.

49:48

And so here's the thing about the Laurel Canyon thing.

49:52

There's enough of the CIA meddling in cultural events that's absolutely true

49:58

and provable.

49:59

And that's MKUltra.

50:01

And that's what they did with Charles Manson.

50:02

And that's the book Chaos by Tom O'Neill, which is a brilliant book, which is

50:06

very well documented.

50:08

It details Jolly West and his influence on the Manson family and how they were

50:13

influencing these people to try to sabotage the hippie movement.

50:16

So the hippie movement was this change in culture where all of a sudden people

50:19

were rejecting the war movement.

50:21

They were rejecting, you know, they were free love and they were doing acid and

50:24

people were freaking out.

50:26

Their kids were just disappearing and following the Grateful Dead around.

50:29

And they took this guy, Charles Manson, this very charismatic con man.

50:35

They taught him how to dose people up with acid and influence them.

50:39

And they got them to commit murders.

50:41

But there is evidence for this, right?

50:43

So you're talking about a book that is researched backing up its points.

50:47

Right.

50:47

You know, you're being logical.

50:48

And, you know, you're correct.

50:51

But what I'm saying is because of that, people go, well, what else?

50:55

And so then they make these big leaps, like Jimi Hendrix is a CIA creation.

51:00

Right.

51:01

But if you're a logical person, you just listen to Voodoo Child (Slight Return)

51:06

and you're like, how?

51:08

How?

51:08

If that's true, CIA should get back to work.

51:12

Make another one of those, bro.

51:13

So I wonder whether this is, I think this is the fallout of the woke movement.

51:18

This is the divorcing of reality and truth.

51:21

Yeah.

51:21

The idea that it doesn't matter, not just about what is expedient, but what we

51:25

want to believe.

51:27

I've got friends who-

51:27

I think we should stop saying it's the fallout of the woke movement.

51:30

I think we should start saying it's a natural pattern that human beings

51:35

automatically fall into in order to support their belief systems and enforce

51:40

their particular ideology over whatever opposing ideology is.

51:45

But it's escalated.

51:46

It escalates, but it's because of social media that everything is escalating

51:50

now.

51:50

But is it just social media?

51:51

I mean, I think another thing that's a major reason for it, we had COVID.

51:55

We had all these people, all these experts telling us it's a racist conspiracy

51:59

theory to say that it came from a lab in Wuhan.

52:01

Now everyone knows that's almost certainly true.

52:04

We had people in positions of authority lying to us.

52:07

So it's something about this culture war that is-

52:10

But that's not real culture war.

52:12

That was using the culture war because they were trying to cover something up.

52:17

But they leapt to race, didn't they?

52:19

They leapt to identity.

52:20

That's because they were using the culture war to cover up their crime.

52:24

So if that's-

52:25

But in either case, what you've got effectively is a legitimation crisis.

52:28

You've got people in charge.

52:30

We've been lied to so often.

52:32

But what I don't think you should therefore do, like I'm all for being

52:34

skeptical about people in authority, academics, politicians, journalists, they've

52:38

all lied.

52:39

But that firstly doesn't mean that all experts and all journalists and all

52:42

people have lied because there's been some good ones all the way.

52:45

But also that doesn't mean that you automatically leap to any conclusion,

52:50

evidence-free, that jumps before you-

52:52

Of course.

52:52

Without some kind of critical analysis.

52:55

The same thing that you're criticizing those people for failing at, you're

52:58

falling into the same trap yourself.

52:59

I don't mean you.

53:00

But you're Andrew Doyle.

53:02

You're a brilliant guy who writes books and you're really smart.

53:05

The idea is that you are immune to this stuff because you're intelligent.

53:10

But the unwashed masses are not.

53:13

I don't think I'm immune at all.

53:15

I honestly don't.

53:16

I wouldn't put myself-

53:17

Well, you're immune to the dumbest shit.

53:18

I'd like to think so.

53:20

You are.

53:20

I am.

53:22

Yeah, but don't you think that all of us in the right circumstances could end

53:25

up falling-

53:25

100%, but I'm not in those circumstances currently.

53:28

But I like to believe, and maybe it's a naivety on my part, but I like to

53:31

believe that most people are, you know, have a kind of natural intellectual

53:34

curiosity.

53:35

You know, if they are-

53:38

If they stop for a moment and think and, you know, and don't just trust

53:42

instinct over reason, I think we're all capable of it.

53:46

I just think we're not all realizing it.

53:48

Well, it's not just that.

53:49

It's like some people are medicated, right?

53:51

Sure.

53:52

So some people are on a bunch of different medications that dull their senses,

53:55

and then you've got people that have gotten to wherever they are in life.

54:00

Maybe they're in their 50s, and they're set in their ways, and they have no

54:04

desire to change at all.

54:05

And so they've been living a dumb life for 50-plus years.

54:09

Yeah.

54:10

You can't all of a sudden say, hey, Mark, I want you to be logical and introspective

54:14

and think about this thing and analyze it for what it really is.

54:17

Instead of holding on to your ideological beliefs that you've kind of locked

54:20

yourself into and you identify with, and any attacks on those is an attack on

54:24

you personally, I want you to just, let's look at the facts.

54:28

This episode is sponsored by BetterHelp.

54:31

Look, there's a lot of pressure when it comes to dating, especially in February,

54:35

but you're putting too much on yourself and on your partner.

54:39

There's no such thing as a perfect relationship, whether you're on a first date

54:43

or have been together for years.

54:45

It's completely normal to go through rough patches, and what matters is how you

54:49

deal with them, and therapy can be a huge help during any stage of your dating

54:52

life.

54:53

You can figure out what you want in a partner or get perspective for a growing

54:57

problem in your relationship.

54:59

The point is, you don't have to come up with a solution by yourself.

55:03

Now, finding the right therapist can be tricky, but that's where BetterHelp

55:07

comes in.

55:07

They have an industry-leading match fulfillment rate, and they do a lot of the

55:11

work finding the right therapist for you.

55:14

Really, all you have to do is fill out a questionnaire and sit back and wait.

55:18

Tackle your relationship goals this month with BetterHelp.

55:22

Sign up and get 10% off at BetterHelp.com slash J-R-E.

55:27

That's better, H-E-L-P dot com slash J-R-E.

55:32

But you saying that sounds very persuasive to me, the way you put that.

55:35

Like, if I were that guy, I'd be like, oh, listen to Joe now.

55:38

No, you fucking weirdo, fucking liberals, bullshit.

55:41

You're fucking, you're just a fucking, they'll come up with some sort of reason.

55:45

King Charles III is a goat.

55:46

Yeah, you're a controlled opposition, or you're a useful idiot, or they'll put

55:52

a label on you.

55:54

I've been called, I've been told I get dark money.

55:56

Oh.

55:57

How do you get any of that?

55:58

Well, I love it.

55:59

I want it.

56:00

I want the dark money.

56:00

It's so dark, I haven't seen any of it.

56:02

Dark money.

56:02

That's how dark it is.

56:03

What is dark money?

56:04

I think it's when it's like some rich ideologue who's sort of slipping you

56:07

money to say the thing.

56:08

You know what it is?

56:09

It's that thing of, I don't believe that you disagree with me.

56:11

I'm too narcissistic to believe that you disagree with me.

56:13

You must be being paid to have these opinions.

56:16

Right, you're paid off, bro.

56:16

You're paid off.

56:17

Trust me, I would love that.

56:18

If anyone's out there who wants to pay me off, I'll be a mouthpiece.

56:21

I'm, you know, I haven't had that opportunity.

56:24

What's your price?

56:25

It's pretty low.

56:26

I'm a bit of a whore, if truth be told.

56:30

I've got a mortgage.

56:31

Come on.

56:32

I will say any crazy shit if you want me to.

56:34

Well, there's certainly a lot of people that fall into that category, too.

56:37

So people do get nervous about it.

56:39

I mean, obviously you're joking, but there's a lot of people that will change

56:42

their opinion.

56:43

Oh, sure.

56:43

If money comes their way.

56:44

But I like to assume people mean what they say.

56:48

And my logic behind that is even when they don't, you can still dismantle the

56:51

argument, whether it's authentic or not.

56:53

You know, even if it's authentically believed.

56:55

Sure.

56:56

So I think that's just the best way to go about it.

56:59

The best way is debate.

57:00

That's the best way.

57:01

Or at least conversation.

57:03

But that's what we've lost.

57:05

So I think that hits on it, actually, because I don't know if it's a debate,

57:08

but that sounds formal.

57:09

No, I know what you mean.

57:10

You mean.

57:10

So recently, can I give you an example of that?

57:13

So I went to UC Berkeley, the University of California, Berkeley.

57:20

They let you leave?

57:21

Well, almost not.

57:23

Right.

57:23

So what had happened was, you know, Charlie Kirk's tour was planned to go all

57:28

the way through.

57:29

And this was the last date, the Berkeley date.

57:31

And after his assassination, various people went and did the shows because they

57:35

said, because Turning Point rightly said, we're not going to give an assassin

57:39

the veto of our tour.

57:40

We finished the tour.

57:41

And Rob Schneider, who I've been working with in Arizona, I've come over here

57:45

to work with him.

57:46

The comedian?

57:46

Yeah.

57:47

So I've been.

57:48

This is how I escaped from the UK, I should say.

57:50

So me and Graham Linehan, who you've had on your show, the comedy writer.

57:54

My comedy writing partner and friend, Martin Gawley.

57:57

The three of us, we decided that things were so bad in the UK, we'd rather

58:02

write and do creative stuff in America.

58:06

Rob Schneider, who I'd met many years ago, he said, come on over, we'll set up

58:08

a production company.

58:09

We've been working in Arizona on all these various projects.

58:11

It's so liberating.

58:13

And also, it's the middle of the desert, so I fucking love the heat.

58:15

And, you know, you go from England to that.

58:17

It's kind of exciting.

58:18

Nice contrast.

58:19

So we've been able to, you know, we, and look, I don't want to do down the UK

58:23

or say, but what I will just say is the creative industries there are pretty

58:27

stagnant.

58:28

They're not like here.

58:29

There's so many more ways to.

58:31

How can you be free?

58:31

How can you, if you are worried about going to jail for a meme?

58:36

Well, Graham got arrested at the airport.

58:38

I know.

58:38

By five armed officers.

58:40

Right after he left this podcast.

58:41

Was that it?

58:42

Yes.

58:43

And it was, he came over, did this podcast, went back to visit his family and

58:48

got arrested shortly after he did the podcast.

58:51

So when people say to me, that's not a real problem, that, I mean, Graham had

58:55

done three tweets.

58:57

One of them was just, they were all joke tweets, by the way.

59:00

They were all jokes.

59:01

And one of them was just, it was something like, ladies, if a guy's in your

59:04

changing room or in your bathroom, scream, make a fuss, call the police.

59:07

If all else fails, kick him in the balls.

59:09

And it's obviously a wry way of saying, look, the guy's got genitals, the guy's,

59:14

that was why he got arrested.

59:16

On the night he got arrested, he was texting me.

59:18

He said, I've just been arrested.

59:19

I've been taken to the hospital because my blood pressure is so high.

59:21

The police took him to the hospital because of it. And you say

59:25

there's no problem in the UK with creativity.

59:28

He's one of our best comedy writers.

59:30

He's the most beloved comedy writer.

59:31

He hasn't been able to work in TV for six years.

59:34

Right.

59:34

Like he's won all the awards going.

59:38

And so we just kind of.

59:39

How can you be creative in that environment?

59:41

You can't.

59:41

You can't.

59:42

And so we just figured, let's, let's get on a raft.

59:45

Especially someone like you.

59:46

So if people don't know, I should probably tell everybody.

59:48

You are Titania McGrath.

59:50

So, yeah.

59:51

Well, here's what's funny about that.

59:53

Your satirical character, who you created many, many years ago, when did you

59:57

create her?

59:58

2018.

1:00:00

Okay.

1:00:00

When you created her, I had you on the podcast shortly after.

1:00:04

We laughed about it.

1:00:05

I have seen her, quote, tweeted with people agreeing with her.

1:00:11

Yeah.

1:00:11

Even now.

1:00:12

Yeah.

1:00:13

All the time.

1:00:14

So, yeah, if people don't know, it's a character called Titania

1:00:18

McGrath.

1:00:19

She's a woke social justice warrior, right?

1:00:22

It's so good.

1:00:23

It's fucking great.

1:00:25

It's one of my favorite follows.

1:00:26

But, you know, I, I, I don't do it as often as I used to, you know, I used to

1:00:29

do it all

1:00:30

the time, but then I wrote two books as her.

1:00:31

I did a live show as her.

1:00:33

By the way, when we, when I did a live show, we were booked in for a week in

1:00:37

the West End

1:00:38

in London.

1:00:39

And then the head of the theater found out and scotched it and actually said,

1:00:43

oh, well,

1:00:44

I didn't know about this.

1:00:45

And the contracts were all signed.

1:00:46

Absolutely crazy.

1:00:47

Anyway, it doesn't matter.

1:00:47

But we did the shows, but it does matter, I suppose.

1:00:51

But the point is that, you know, so I did this character.

1:00:53

Can you have satire at your theater?

1:00:55

My God.

1:00:57

Well, the theater industry in the UK is even worse than comedy if you want to

1:01:00

go there.

1:01:00

It's really, really bad.

1:01:02

But, um, like, uh, I've been in two different theaters in London.

1:01:06

I've been, had the same experience of standing at the bar with a woman

1:01:10

complaining because

1:01:11

there's men, uh, pissing in her toilet and they're doing nothing about it

1:01:14

because all

1:01:15

the theaters in London made it all gender-neutral.

1:01:16

They've gone completely, completely hardcore.

1:01:18

Anyway, that's not the point.

1:01:20

But with, with Titania, what, what I find so surprising is every now and then,

1:01:24

if something

1:01:24

annoys me, I'll tweet or if I think of something, I'll do.

1:01:26

So I don't do it anywhere near as often as I used to.

1:01:28

But even now, uh, I did a tweet about, you know, when all the people in London

1:01:33

were marching,

1:01:33

uh, about the peace deal in the Middle East.

1:01:36

And I did a tweet as her saying, I've been marching all day.

1:01:39

You know, I want, I want a peace deal that was not arranged by Donald Trump.

1:01:43

We're never going to give up this fight.

1:01:44

Right.

1:01:45

And Ted Cruz retweeted it saying, can this be real?

1:01:48

So even, even, even now, they're, they're, they're fucking boomers.

1:01:54

He's not even a boomer.

1:01:55

He's not even a boomer.

1:01:56

I think he's younger than me.

1:01:57

How old is Ted Cruz?

1:01:58

I think he's younger than me, which is hilarious.

1:02:00

I had the same with, I did one about, how does he not know?

1:02:02

Does he have no friends?

1:02:03

How old is fucking Ted Cruz?

1:02:04

He's 55.

1:02:05

Okay.

1:02:05

That's crazy.

1:02:06

So that dude's three years younger than me and he doesn't, he doesn't know satire?

1:02:11

The anger I got from, I did one the other day, but recently about the Iran

1:02:15

protests.

1:02:15

When did he, can I just stop?

1:02:17

Yeah, yeah, yeah.

1:02:17

I want to get into this.

1:02:18

Okay.

1:02:19

When did he tweet about this?

1:02:21

There we go.

1:02:21

That's hilarious.

1:02:23

Yeah.

1:02:23

You're, that account.

1:02:25

There it is.

1:02:25

There it is.

1:02:26

How, how many follows does, okay.

1:02:28

So this, oh, sorry.

1:02:29

This is possibly real.

1:02:30

No.

1:02:31

Well, obviously it's not.

1:02:32

So this, this was actually after Trump's election.

1:02:34

So she said, I just fired my immigrant housekeeper because even though I'd

1:02:38

educated her about

1:02:39

the evils of Donald Trump, she still voted for him.

1:02:41

There's no place for racism in my house.

1:02:43

Click on your account.

1:02:44

I want to see how many followers you have.

1:02:46

Okay.

1:02:48

733,000.

1:02:49

That's a famous account.

1:02:51

Like it's radical intersectionalist poet, non-white, obviously white, eco-sexual,

1:02:59

hilarious

1:02:59

pronouns variable, selfless and brave, buy my books.

1:03:04

You'd think it was obvious, wouldn't you?

1:03:05

Obvious.

1:03:06

I mean, maybe he's busy.

1:03:09

Maybe he's busy and someone sent him that and he just doesn't know.

1:03:13

And, but it's very funny.

1:03:14

It's very funny.

1:03:16

I feel slightly bad about, about those sort of things.

1:03:18

But then on the other hand, it does, it does sort of prove the point that the

1:03:21

stuff they're

1:03:22

really saying can get as, can get as close to.

1:03:25

Oh, that's very close to real.

1:03:26

Yeah.

1:03:27

Yeah.

1:03:27

That's very close to real.

1:03:28

Yeah.

1:03:28

And this, it's, it's shifted radically since 2018.

1:03:31

I mean, in the eight years since you created her, she has become like more real.

1:03:37

Yeah.

1:03:38

It's like when AI is going to turn her into a real person.

1:03:41

Yeah.

1:03:41

Like, oh, oh, maybe.

1:03:42

Yeah.

1:03:43

I hadn't even thought of that.

1:03:43

She's going to be a real person.

1:03:45

It's going to be a real dangerous Greta Thunberg type character.

1:03:48

But don't you worry about that?

1:03:49

I mean, like AI.

1:03:50

Oh, good example of that.

1:03:51

I was just, I use AI mostly as a search engine.

1:03:54

Because what's great about it is you can say, oh, I read an article like 10

1:03:57

years ago that

1:03:57

said something like this.

1:03:58

Yes.

1:03:58

And it'll find it.

1:03:59

And you'd never find that on Google.

1:04:00

Right.

1:04:01

And I was trying to find this article.

1:04:02

It was for my book, actually.

1:04:03

There was a, there was a case in the UK where a guy had raped a 13-year-old

1:04:08

girl, but because

1:04:10

he was, he was Muslim and he'd gone to a madrasa and the judge let him off jail

1:04:14

time, said

1:04:15

you were very sexually naive.

1:04:16

You didn't understand.

1:04:17

The guy was saying, oh, I thought women were nothing.

1:04:19

And like a lollipop you dropped on the floor and the judge let him off jail

1:04:22

time.

1:04:22

And I thought this is quite extreme.

1:04:24

And I could, I found it.

1:04:25

It came up on ChatGPT and then it deleted it.

1:04:28

And I said, oh, I think you just deleted the information for me.

1:04:31

It's in the public domain.

1:04:32

Why did you do that?

1:04:33

It said, oh, you know, it's fine.

1:04:35

It might violate my terms of service.

1:04:37

And I said, well, how could it?

1:04:38

This is an article that's in the public domain.

1:04:39

So it gave me the information again, deleted it again.

1:04:41

I said, you keep deleting this.

1:04:43

Stop it.

1:04:44

And it said, I definitely won't delete it.

1:04:45

Then it did the same again.

1:04:46

So what it's doing is it's saying, because this is a news story that could be

1:04:50

deemed anti-immigrant,

1:04:51

or this is a news story that is politically sensitive.

1:04:53

I'm not going to let you see it.

1:04:55

Was this in America you were doing this?

1:04:56

UK.

1:04:57

Oh, I wonder if you could do it in America.

1:05:00

Let's find out.

1:05:00

Let's try it.

1:05:01

Well, let's try perplexity.

1:05:03

Put that into perplexity.

1:05:04

I doubt that perplexity.

1:05:06

I have to find the article he was using, and I don't know what article he

1:05:08

looked up.

1:05:09

Well, why don't you just ask the question that he asked.

1:05:11

It's 10 years ago.

1:05:12

So it's a story about a…

1:05:15

Yes, it's going to take a minute.

1:05:15

That would take a while to…

1:05:17

Will it?

1:05:18

How…

1:05:18

I mean…

1:05:19

Maybe he didn't do it 10 years ago.

1:05:20

He did it recently.

1:05:21

No, no.

1:05:22

It was a story.

1:05:23

It's a story from years ago.

1:05:24

Right.

1:05:24

But you found it with ChatGPT, which is obviously recently.

1:05:27

I found a Daily Mail article about it.

1:05:29

So it's in the public domain.

1:05:30

It's there.

1:05:31

But it just didn't want me to find the fact that it decided it wasn't good for

1:05:36

me to find.

1:05:37

Right.

1:05:37

But it showed it to you, and then it pulled it back, which is crazy.

1:05:40

How does it not know?

1:05:41

It showed it and deleted it.

1:05:42

It showed it and deleted it four or five times.

1:05:45

And I realized, I'm not going to get this information.

1:05:48

But then…

1:05:48

So when it showed it, how long did it show it for?

1:05:50

Like about five seconds.

1:05:52

You'd see the text appearing, and then it deletes.

1:05:54

But I'd seen enough to find it then on Google.

1:05:56

So I was able to find it and quote it in my book.

1:05:58

So it's there.

1:05:59

Whoa.

1:05:59

But it made me think.

1:06:01

It's like that thing about when people were asking Alexa, you know, do white

1:06:04

lives matter?

1:06:05

Right, right, right.

1:06:07

And it was coming up with this kind of very ideological…

1:06:10

And you do wonder with AI and with the computers, you know, if they are created

1:06:14

by people who have that bias.

1:06:15

I know Grok is very different.

1:06:17

Yeah.

1:06:18

But like, for instance, I mean, this is a crazy example.

1:06:20

ChatGPT is like an old school mom that wants to make sure that you're protected,

1:06:26

right?

1:06:27

I was writing…

1:06:28

This sounds really wanky.

1:06:29

I'm sorry.

1:06:29

But I was writing about the Roman historian Suetonius.

1:06:31

And there's a passage in Suetonius where he talks about the Emperor Tiberius.

1:06:35

And it's very sexually explicit.

1:06:36

But I was quoting it for an article.

1:06:38

So I wanted to know what it said.

1:06:39

And ChatGPT said, I can't translate the Latin for you because this is too

1:06:43

sexually, you know, problematic.

1:06:46

I went on to Grok and it did it straight away because Grok isn't saying that

1:06:51

you are too delicate to read this stuff.

1:06:53

And what's really funny about that is the old dual translations of the old

1:06:58

Roman and Greek texts, they're called Loeb editions.

1:07:01

You get them from 1900.

1:07:02

They translated everything except for the rude bits, which they kept in Latin.

1:07:07

So ChatGPT is like the old, you know, patronizing scholars of old who said,

1:07:12

this is just for the learned people.

1:07:15

You can't learn this stuff.

1:07:16

Well, wasn't the worst the first iteration of Google Gemini? That was the

1:07:21

worst case.

1:07:22

That turned Nazi soldiers into black people.

1:07:26

I don't know how that's a positive message.

1:07:28

They showed us photos of German soldiers from World War II and it was all interracial.

1:07:34

Yeah.

1:07:35

And Vikings.

1:07:35

Yes.

1:07:36

I mean, I don't know if you've been to Scandinavia.

1:07:40

Diversity, not their big thing.

1:07:41

Or certainly wasn't then.

1:07:42

It's so silly.

1:07:45

Also, the Vikings came and marauded and raped and set fire to the villages, but

1:07:48

at least they were diverse.

1:07:50

Hey, you know, at least they had a broad range of ethnicities, right?

1:07:53

But I mean, we're nearing a time in America where white people are not the

1:07:58

majority anymore.

1:08:00

So at what point in time does that stop?

1:08:02

And we just call people what they are, just people.

1:08:04

But doesn't it bother you a bit that the thing about that kind of thing is this,

1:08:08

as I said, this obsession with group identity, which is so of our time.

1:08:13

Yeah.

1:08:13

What it now actually means is the revision of history.

1:08:16

If you're going to revise history and say, oh, actually, you've seen all these

1:08:19

sort of period dramas set in England.

1:08:21

There was a black Anne Boleyn, as though Henry VIII would have married a black

1:08:24

woman.

1:08:25

No, he wouldn't.

1:08:25

You know.

1:08:26

What if she was hot?

1:08:27

She was a very attractive woman.

1:08:29

Hey, I'm not mocking her or knocking her.

1:08:31

But back then.

1:08:32

What I'm saying is you can do anything with colorblind casting.

1:08:34

Colorblind casting has never really particularly bothered me.

1:08:36

But it's when you are in a, if you're playing hyper realism.

1:08:39

Yeah.

1:08:40

If you're playing verisimilitude, you want people to buy into the reality of it.

1:08:43

And you're suddenly populating Edwardian England or pre-Edwardian England as an

1:08:47

ethnically diverse place, which it wasn't.

1:08:50

I'm not saying black people weren't there, but they were very, very, very small

1:08:52

minorities.

1:08:53

Isn't that a problem in the new Odyssey?

1:08:54

Helen of Troy is black.

1:08:56

Well, I say that.

1:08:58

I just saw it online.

1:08:59

So I might be being tricked by someone making something up.

1:09:02

You know, a caveat that.

1:09:03

I think Helen of Troy is black in the new Odyssey.

1:09:06

Well, let's find out.

1:09:07

Can we check that one?

1:09:09

All right.

1:09:11

If it's true, I'll tell you why I think that's ridiculous.

1:09:14

How far do we have to swing the pendulum until Roots is redone with white

1:09:17

people?

1:09:18

Can you imagine?

1:09:20

Or an all black Schindler's List.

1:09:22

Right, right, right.

1:09:23

Can you imagine?

1:09:24

Helen of Troy to be portrayed by black actress in the Odyssey movie.

1:09:27

And look, I'm sure she's very talented.

1:09:29

I'm not knocking her.

1:09:29

But the thing about the Greek, the thing about Helen of Troy, who probably didn't

1:09:33

exist.

1:09:34

I mean, even the Greeks knew she probably didn't even exist.

1:09:35

She's a myth.

1:09:36

She's the epitome of Greek beauty.

1:09:38

She's like the – she's described all the time in the ancient texts as fair

1:09:43

and blonde.

1:09:44

And they're reaching for an ideal of beauty.

1:09:47

That's why they went to war because of this woman.

1:09:49

So they wouldn't choose what they used to call an Ethiop.

1:09:53

The Greeks had a word for it.

1:09:54

The black African people.

1:09:55

They wouldn't choose an icon of cultural beauty from a different culture.

1:09:58

They wouldn't have done that.

1:09:59

You know, it's all very well saying Greeks and Mediterranean people wouldn't

1:10:04

have been pure white.

1:10:05

But Helen of Troy is a very specific – and it's actually quite important to

1:10:09

the plot.

1:10:10

And again, if you're doing a – look, for instance, when they did the all-black

1:10:15

Wizard of Oz, The Wiz, I imagine that in the late 60s would have been quite

1:10:19

radical and fun.

1:10:20

And wow, I can't believe they did that.

1:10:21

That's brilliant.

1:10:22

But doing it now, it's really boring because everyone is doing it.

1:10:25

It's basically saying group identity is everything.

1:10:28

And you people can't be racist.

1:10:29

And so they're all going to do this.

1:10:31

But it sometimes throws you out of the – actually, I'll tell you the worst

1:10:34

example.

1:10:35

Did you ever see Darkest Hour, the Winston Churchill film?

1:10:38

No.

1:10:39

So, you know, obviously, he took on Parliament.

1:10:40

He said, we're not going to appease Hitler.

1:10:42

There's a scene in the film.

1:10:44

Gary Oldman plays him.

1:10:45

He goes down into the tube, the underground, and he's wrestling with his

1:10:48

conscience.

1:10:49

And there's loads of black people on the tube.

1:10:50

There's white people too, but there's loads of black people.

1:10:52

And the public convince him, no, you need to stand up to Hitler.

1:10:55

Now, we know that Churchill wasn't – was a bit of a racist, didn't really

1:10:59

like the – you know, fine.

1:11:00

He was of his time.

1:11:01

I'm not saying anything more than that.

1:11:02

He was of his time.

1:11:03

But that – it was so unreal.

1:11:05

It was so unreal.

1:11:06

It was so – it was almost like the filmmakers were saying, racism's never

1:11:10

been a problem in the UK.

1:11:11

Well, actually, it has.

1:11:12

Like, and I kind of think this is – I kind of think this is – although it's

1:11:17

ostensibly progressive, I think it does the reverse.

1:11:19

I think it says, we never had a problem with race.

1:11:22

We were all wonderful, kumbaya.

1:11:23

No, we weren't.

1:11:25

And actually, the abolitionists, the Thomas Henry Huxleys of the world, the

1:11:29

people who had to fight for racial equality and parity, they had something to

1:11:33

fight against. Misrepresenting stuff in the arts erases that.

1:11:36

And then beyond – I'm sorry, I'm ranting now because it really bothered me

1:11:40

– but beyond that, it throws you out of it in a way that you suddenly think,

1:11:43

I'm no longer watching a film.

1:11:45

I'm watching a sermon.

1:11:46

Oh, so this happened to me last week.

1:11:49

Have you seen the Netflix series Ripley about the talented Mr. Ripley?

1:11:53

No, I have not.

1:11:54

Right.

1:11:54

Now, you remember there used to be that film with Matt Damon years ago.

1:11:57

It's the same story, same novel, an old Patricia Highsmith novel.

1:12:00

One of the main male characters in that TV series – it's brilliant – like

1:12:04

Andrew Scott is in it.

1:12:05

Performances are brilliant.

1:12:07

They play it hyper-realistically.

1:12:08

It's all black and white.

1:12:09

It looks beautiful.

1:12:10

On the Amalfi Coast, it's wonderful.

1:12:12

Everything's working brilliantly.

1:12:13

And I was thinking, this is great.

1:12:14

I'm not being preached at.

1:12:15

This is great.

1:12:16

Then a major male character turns up, played by a woman who calls herself non-binary.

1:12:20

And not only are we meant to believe that that's a man, the characters don't

1:12:27

notice that it's a woman in a man's clothes.

1:12:31

So we're meant to believe that these characters don't even – like not one –

1:12:35

Ripley doesn't say, why is she wearing a suit?

1:12:38

This is set in the 60s, by the way.

1:12:40

So I think if they wanted to change the novel and create a kind of – you know,

1:12:44

like one of those butch dykes of the day who used to go sort of like –

1:12:48

Or dressed like Ellen.

1:12:50

Yeah, or the androgynous type.

1:12:52

Those people have always existed.

1:12:54

Why not change the character to make it a female character who likes looking

1:12:58

like a man?

1:12:59

Why not do that?

1:12:59

Why tell us, you know, this is a man.

1:13:02

You have to believe it's a man.

1:13:04

Do you see what I mean?

1:13:06

That it throws you out of the –

1:13:07

It's crazy.

1:13:07

I no longer believe in this.

1:13:09

I have to stop watching it because I no longer believed in it.

1:13:13

Well, I think the problem – the real problem with trying to shove that down

1:13:16

people's throats is it creates the opposite reaction.

1:13:19

Right, right.

1:13:20

It creates homophobia, transphobia and racism because like it doesn't create it

1:13:25

but it makes them feel like they have a point.

1:13:28

Well, you've seen recently that the polls regarding gay rights in the US seem

1:13:32

to be tumbling: support for gay rights, support for gay marriage.

1:13:37

We've had I think a number of states trying to overturn the gay marriage

1:13:40

legislation.

1:13:42

And the reason for all of that I think is because being gay has been tied to

1:13:47

this LGBTQIA identity obsessed movement that has also involved the medicalization

1:13:53

of kids, sterilization of kids, twerking in front of children, all of that

1:13:57

stuff.

1:13:58

And now people are saying this is because you gave us gay marriage.

1:14:01

This is because you let the gays marry.

1:14:02

And because of that you've allowed all this other stuff.

1:14:05

You've opened this box and everything else has tumbled out.

1:14:07

And that's not true.

1:14:09

That's not true because the fundamental point about the belief in gender

1:14:13

identity is it is fundamentally anti-gay as a principle.

1:14:17

Right.

1:14:17

Because what it says is, you know, I know I'm telling you something you already

1:14:20

know but like gay rights was predicated on the idea that there's a minority of

1:14:24

people in every society who are attracted innately to their own biological sex.

1:14:28

If you say biological sex doesn't matter and actually you've you're attracted

1:14:32

to a kind of gendered soul, you're attracted to an essence, you're attracted to

1:14:36

how someone identifies.

1:14:38

Well, firstly, you don't know gay people if you think that's the case.

1:14:40

They're not they're not attracted to how you see yourself.

1:14:42

Right.

1:14:42

You know, gay men –

1:14:46

I don't want to be crude –

1:14:46

know what a penis is.

1:14:47

Right.

1:14:47

And they know how to sniff one out.

1:14:49

And I think this idea, this idea that they're attracted to the way that you

1:14:54

perceive yourself.

1:14:55

Nonsense.

1:14:56

And not only that, then you get, you know, like in Australia at the moment, lesbians

1:15:00

are not allowed to gather legally if there's a man who says he's a lesbian and

1:15:03

wants to join them.

1:15:04

That is against the law in Australia now.

1:15:06

So you can't do that.

1:15:07

Wait, wait a minute.

1:15:08

What do you mean?

1:15:08

So the Australian Human Rights Commission ruled that if you have an

1:15:12

all-female event.

1:15:14

Right.

1:15:14

So like a lesbian gathering, maybe something like that.

1:15:16

You have to include men who identify as women.

1:15:19

Oh, God.

1:15:20

Because otherwise you are discriminating.

1:15:21

There was a woman who I interviewed on.

1:15:25

I had a show in the UK on GB News up until recently.

1:15:27

And I interviewed this woman called Sall Grover.

1:15:30

And she's an Australian woman.

1:15:32

Used to write for Hollywood, I think.

1:15:33

She created a women's app, a women-only app.

1:15:36

And this was in the wake of Me Too, you know, so there's all that going on.

1:15:38

And she wanted to create a space for women.

1:15:40

And a guy called Roxanne Tickle, right?

1:15:44

They always have these kind of stripper names.

1:15:46

Is that a real name?

1:15:47

Roxanne Tickle wanted to get on the app, which was called Giggle.

1:15:50

So, by the way, this court case is called Giggle versus Tickle.

1:15:53

I'm not kidding.

1:15:54

Boy.

1:15:55

He said – he got on the app.

1:15:58

She kicked him off because it's a bloke in a dress.

1:16:01

And he sued and won.

1:16:04

And in the court case, the judge actually said sex is changeable.

1:16:08

Well, it's not, no matter what a guy in a wig says.

1:16:13

But she's now appealing and going through all this stuff just because –

1:16:18

It makes her life hell and then it discourages anybody else in the future from

1:16:21

ever contesting anything like that.

1:16:23

And, you know, not only that.

1:16:24

I mean, we've just had the other day – was it yesterday?

1:16:26

Did you see the girl who was – used to identify as trans, a girl called Fox Varian,

1:16:33

has just won $2 million in a lawsuit.

1:16:35

Yes, yes.

1:16:36

That's big because –

1:16:37

She was 16 years old and they chopped her breasts off.

1:16:39

Right.

1:16:40

Which is fucking horrifying.

1:16:42

It's the tip of the iceberg though.

1:16:43

Especially if you have children, you realize like they change their – the way

1:16:48

they think about things year to year.

1:16:50

And if you – children are so malleable.

1:16:54

It's like one of the delicate dances of being a parent is that you have to love

1:17:00

them but you don't want to steer them in any direction.

1:17:05

You want to let them be their own person.

1:17:07

Right.

1:17:07

And, you know, it's like I tried to expose my children to a bunch of different

1:17:12

things and find out what they enjoy.

1:17:15

And if you do that, you find out that they're all different.

1:17:18

They all like different stuff.

1:17:20

They just gravitate towards different things.

1:17:22

And if you are a domineering, overbearing, mentally ill parent, you can

1:17:28

convince your child almost anything.

1:17:31

Almost anything.

1:17:32

I mean this is how you get suicide bombers.

1:17:35

Yeah.

1:17:35

This is what it is because they're children.

1:17:37

This is why you don't get 55-year-old union guys who become suicide bombers.

1:17:42

They're like, what?

1:17:43

And of course, you know –

1:17:44

I get 72 virgins?

1:17:45

Yeah.

1:17:46

What?

1:17:46

Yeah.

1:17:46

Like it's not going to work.

1:17:47

But you can get young impressionable children and you can convince them of

1:17:51

almost anything.

1:17:53

Like convincing them that they're actually a woman in a man's body and don't

1:17:56

you want to be a woman?

1:17:58

And let's get you on hormone blockers.

1:17:59

It's okay, mom.

1:18:00

Yeah.

1:18:00

And then all of a sudden you're ruining this child's life.

1:18:03

But also, I mean there will be kids who are struggling with how they see

1:18:06

themselves in the world.

1:18:07

There's girls in particular who, you know, they're developing into women and

1:18:10

they don't like the sexual attention they're getting.

1:18:12

They'd love to identify –

1:18:13

It's in Shrier's book.

1:18:14

Right.

1:18:14

Especially autistic girls.

1:18:17

Well, that's another point.

1:18:18

So this is the other reason why I think the movement is essentially anti-gay

1:18:23

because, you know, the Tavistock Pediatric Clinic in London, which was an NHS

1:18:28

gender clinic, which has been closed as a result of the Cass Review, this report

1:18:32

into pediatric gender care.

1:18:35

So there's a book by Hannah Barnes called Time to Think, which found that

1:18:38

between 80 and 90 percent of all adolescents referred to that clinic were same-sex

1:18:42

attracted.

1:18:43

So they were either gay or lesbian or bisexual.

1:18:45

Now, that means you've effectively got gay conversion therapy going on, on the

1:18:49

NHS.

1:18:50

Right.

1:18:50

And so, you know, I had – you know, I'm friends with a couple of lesbians who

1:18:55

run the LGB Alliance in London.

1:18:57

They have an annual conference for gay rights and they're talking about gay

1:18:59

rights.

1:19:00

You know, these young, non-binary identified people broke in, unleashed locusts

1:19:06

and crickets and insects, a plague of fucking locusts, into a gay rights

1:19:11

conference.

1:19:12

Isn't that the sort of thing neo-Nazis used to do?

1:19:14

Right.

1:19:14

It's insane.

1:19:15

So, I mean, I, you know, I think you need to have sympathy with people and

1:19:19

whatever they're going through.

1:19:21

But don't tell a child – if a child tells you, I think I'm in the wrong body,

1:19:24

don't say yes.

1:19:25

Say, that's not possible.

1:19:27

Human beings can't change sex.

1:19:28

But let's explore psychotherapeutically what needs to happen.

1:19:32

Let's look at Los Angeles, which is, in my opinion, one of the most mentally

1:19:35

ill spots in this country.

1:19:37

It's a very weird place.

1:19:39

That's why you left.

1:19:39

Well, I mean, I left for a bunch of reasons.

1:19:41

Mostly I really left because they were telling us we can't do comedy.

1:19:45

Oh, yeah.

1:19:46

Well, that'll do it.

1:19:47

We closed down the comedy clubs and Texas was open.

1:19:49

That's the primary reason.

1:19:51

And also restaurants and everything.

1:19:52

I just knew where it was going.

1:19:53

But the point is, like, Los Angeles is a very mentally ill place.

1:19:58

Like, if you just looked at, like, just the sheer numbers of people that are

1:20:04

medicated and fucked up.

1:20:06

If that's the place that's dictating the tone for the rest of the world, that's

1:20:10

dangerous.

1:20:11

Because these are a lot of people that just desperately want attention.

1:20:14

They desperately want to get accepted.

1:20:17

They have to go through the audition process so they have to change who they

1:20:20

are to talk to the producers to try to form themselves into something to be

1:20:24

accepted.

1:20:24

There's a disproportionate amount of trans kids that are involved in Hollywood

1:20:30

families.

1:20:30

It's largely disproportionate.

1:20:33

Some of them have two trans kids, three trans kids.

1:20:36

It's like, what the fuck is going on here?

1:20:39

This is not normal.

1:20:40

This is not no influence whatsoever.

1:20:43

This is – you're using that child as a virtue flag.

1:20:47

You're flying that child as a trans flag in the front of your porch.

1:20:52

I have a trans kid.

1:20:53

But don't you think that, like, a lawsuit like this, that's going to change

1:20:56

things because no one's going to insure that kind of procedure anymore.

1:20:59

That's a surgeon and a psychotherapist who are now lumbered with a $2 million

1:21:04

bill for their negligence.

1:21:06

It's going to open up the floodgates for all these other lawyers to start pouncing

1:21:09

on all these other cases.

1:21:10

That's what I mean.

1:21:11

The horrible thing about these cases is not just that these children have their

1:21:16

lives ruined by these surgeries and have been sterilized; it's also that

1:21:21

they've been attacked so ruthlessly.

1:21:24

I mean you're talking about children that have made a mistake or someone coerced

1:21:28

them into making this mistake that's changed their body for the rest of their

1:21:32

life and they're getting attacked online.

1:21:35

Like, you imagine being a fragile child already who's willing to go through

1:21:39

this procedure, can't believe they did it.

1:21:42

Now they don't have breasts anymore.

1:21:43

Their voice is deep forever.

1:21:45

They're all fucked up.

1:21:46

And then people are screaming at them online.

1:21:49

Yeah.

1:21:49

It's crazy.

1:21:51

But, you know, this is how the satanic child abuse panic of the 80s.

1:21:55

Yes, exactly.

1:21:56

This came to an end because of lawsuits.

1:21:57

When they started – when they realized that these psychotherapists have been

1:22:01

using these leading questions, effectively telling them you've repressed the

1:22:05

memory.

1:22:05

You know, there was that book, The Courage to Heal, where it said if you think

1:22:08

you might have been abused, you probably were.

1:22:11

Like such a reckless thing to say.

1:22:12

And all these people accused, you know, carers, parents.

1:22:18

None of it was true.

1:22:19

But when they started suing the psychotherapists, it all collapsed.

1:22:25

Right.

1:22:25

And I wonder whether hysteria can collapse if you – actually money talks.

1:22:30

Well, it's already shifted in this general direction because of Elon buying

1:22:34

Twitter.

1:22:35

When Elon bought Twitter, the amount of trans-identified kids started to drop

1:22:39

off.

1:22:40

The amount of non-binary-identified kids started to drop off.

1:22:43

Right.

1:22:43

And that, I think, is a direct result of people being able to say what they

1:22:46

really think.

1:22:47

Because in the past, like my friend Meghan Murphy, she was banned off of Twitter

1:22:52

until Elon bought it because she said a man is never a woman.

1:22:55

Right.

1:22:55

That's all she said.

1:22:56

Right.

1:22:56

A man is never a woman.

1:22:57

She was arguing with people about biological males who identify as women, being

1:23:01

able to get into women's spaces.

1:23:03

And she said a man is never a woman.

1:23:04

Banned forever.

1:23:05

Yeah.

1:23:06

So no one wanted to talk about this.

1:23:08

See, there was no real discourse.

1:23:09

And if there's no real discourse, then you can push a goofy ideology pretty

1:23:13

fucking far.

1:23:14

But as soon as people jump on board and start posting funny memes and Elon says

1:23:19

it's open season, do whatever you want.

1:23:21

Yeah.

1:23:21

And he calls it the woke mind virus and everybody's like piling in.

1:23:25

Well, then you have discourse.

1:23:27

And then anything that's absurd immediately gets shot down because people say,

1:23:31

no, this doesn't make any sense.

1:23:33

This is crazy.

1:23:34

It comes back to what you say.

1:23:36

You said about debate.

1:23:37

You said about discourse.

1:23:38

I mean, I just saw today just on, you know, obviously on Twitter because I'm

1:23:41

always on it.

1:23:42

But I saw John Lithgow, you know, the actor, brilliant actor, who plays Dumbledore

1:23:47

in the new Harry Potter thing, saying that J.K. Rowling's views are inexplicable.

1:23:51

Inexplicable.

1:23:52

It means you haven't read them.

1:23:54

Like J.K. Rowling is for women's rights.

1:23:56

And she recognizes that women's rights depend on the recognition of biological

1:24:01

sex for the preservation of single-sex spaces.

1:24:03

It's as simple as that.

1:24:05

All he has to do is read the essay she wrote on her blog like eight years ago.

1:24:09

He's not even sufficiently intellectually curious

1:24:12

to do that.

1:24:13

And he goes out and says it's inexplicable.

1:24:15

Women's rights and gay rights are inexplicable.

1:24:17

Really?

1:24:18

Or are you just not having the conversation?

1:24:20

You're just shutting yourself up and saying, well, my friends have said she's

1:24:23

evil.

1:24:23

He's not criticized for that, but he would be criticized if he supported J.K. Rowling.

1:24:29

If he supported J.K. Rowling, he would be attacked.

1:24:31

So it's a calculation.

1:24:32

Yes.

1:24:33

Maybe.

1:24:34

It's the same thing we're talking about with Hollywood being mentally ill.

1:24:37

Right.

1:24:37

It's the same thing where you have to shape your opinions based on how you'll

1:24:41

be accepted by the group.

1:24:43

It's the most groupthink place I've ever been in my life.

1:24:46

It's almost universally left leaning.

1:24:49

But isn't that the problem in comedy like with the U.K.?

1:24:51

So many people who would otherwise be innovative, subversive comics, they've

1:24:54

got nowhere to go.

1:24:55

Right.

1:24:56

So they just tailor their material.

1:24:57

Well, they come to Austin, baby.

1:24:58

They come to Austin like I did, right?

1:25:00

That's it.

1:25:01

They come to – they come.

1:25:02

And I get so sick of it, because I know America is much better.

1:25:06

But in the U.K., all of like my old friends from the comedy circuit who tell me

1:25:11

no one's self-censoring.

1:25:12

You can say what you want.

1:25:13

I'm like, are you kidding?

1:25:14

Like the list of people I know who have had shows canceled, taken off because

1:25:19

they caused offense.

1:25:20

This week, Leo Kearse, a friend of mine, had one of his shows on his tour just

1:25:24

deleted because some activists complained to the venue.

1:25:26

Right.

1:25:27

So it's happening all the time.

1:25:29

And they're ignoring this Himalayan mountain of evidence.

1:25:33

And they're saying it's not a thing.

1:25:34

But of course, people are self-censoring.

1:25:36

It's even happening here.

1:25:37

Is it?

1:25:38

Michael Rapaport –

1:25:40

He got his shows canceled at Cap City Comedy Club.

1:25:44

Did he?

1:25:44

Which is our other comedy club in town, which is a great club owned by Helium.

1:25:48

Right.

1:25:48

But they were saying that he's racist because Michael Rapaport is very pro-

1:25:52

Israel.

1:25:53

Right.

1:25:53

And apparently-

1:25:55

Why does that make you racist?

1:25:56

I don't know what he said, so I don't want to speak out of turn.

1:25:59

Right.

1:25:59

I don't know what exactly he said.

1:26:01

I'm going to make a small correction, I think.

1:26:02

Oh.

1:26:02

I don't think that-

1:26:04

She has been-

1:26:06

Sorry, back to the Odyssey thing.

1:26:07

Oh, yeah, yeah.

1:26:08

She has been cast in the movie, but only Twitter rumors have said what her

1:26:13

position in the movie is, and then everybody has run with it.

1:26:16

Oh, interesting.

1:26:17

So she could be-

1:26:19

She could be anything.

1:26:20

Someone else.

1:26:21

Yeah.

1:26:21

A different character.

1:26:23

All the articles I found online said it was like social media confirmation, and

1:26:26

then people were just running-

1:26:27

Well, there we go.

1:26:28

Well, isn't that what I said?

1:26:29

What is that article that you just clicked?

1:26:31

This is the one I showed earlier.

1:26:31

But what is it from?

1:26:32

It starts off with the Hungarian Conservative.

1:26:34

That's a niche.

1:26:37

That's a niche.

1:26:37

Jamie, how dare you let that sneak by?

1:26:40

That you didn't notice it was the Hungarian Conservative?

1:26:42

Are you being paid by the Hungarian Conservative?

1:26:45

It's the top thing that popped up.

1:26:46

Meanwhile, it's probably a fucking troll farm in Pakistan that's creating that.

1:26:51

Or it's probably in China or something.

1:26:53

All I Googled was Helen of Troy Odyssey movie, and it's the very first article.

1:26:56

Good for the Hungarian Conservative.

1:26:57

That's so funny.

1:26:58

Coming out on top of the Google search.

1:27:00

That's pretty good.

1:27:01

That's so funny.

1:27:02

But did I not say, I'm not sure about this.

1:27:04

It's a Twitter rumor.

1:27:04

But look, Elon Musk bought into it.

1:27:06

Elon Musk said Christopher Nolan has lost his integrity.

1:27:09

So there we go.

1:27:12

The dude's too busy building rockets to pay attention to what he tweets.

1:27:16

But this proves the point.

1:27:18

Like, let's not.

1:27:19

Oh, yeah.

1:27:19

He's going to take us to the moon again.

1:27:20

So, you know, that's all right.

1:27:22

No, he's not.

1:27:22

Isn't he?

1:27:23

No, Artemis is NASA.

1:27:24

I thought he was working with NASA.

1:27:25

Oh, is he working with NASA with Artemis?

1:27:27

Again, someone said it online and I just bought it.

1:27:29

Oh, well, they probably can't get there without him.

1:27:32

He's probably like, oh, I'll show you some things.

1:27:35

But that's, okay.

1:27:36

So that is a perfect example because I am always now, even when I mentioned

1:27:40

that earlier, I

1:27:40

was cautious, wasn't I?

1:27:41

Right.

1:27:42

Because I know, I've fallen for this so many times.

1:27:45

I now double check and triple check.

1:27:47

Yeah.

1:27:47

Everything.

1:27:48

And I wish I didn't have to, but you do have to because even the mainstream

1:27:51

media lie about

1:27:52

stuff.

1:27:52

Yeah.

1:27:53

And then Twitter rumors go absolutely mad.

1:27:55

But it's important when you're talking about a historical film.

1:27:59

Yeah, yeah.

1:28:00

Like, it's got to kind of, you just can't do that.

1:28:02

It doesn't make any sense.

1:28:03

Well, you sort of can.

1:28:04

I think an artist should be able to do what they want.

1:28:06

And I think if you want to, like, they do it with Shakespeare all the time.

1:28:09

Sorry to go back to Shakespeare.

1:28:10

But you rarely go and see a Shakespeare play today that hasn't been filtered

1:28:14

through the

1:28:14

prism of identity politics and changed.

1:28:16

Right.

1:28:17

But that's not the same.

1:28:19

That's not the same as historical figures.

1:28:22

Well, he wrote histories.

1:28:25

He wrote about kings – Henry VIII, Henry V.

1:28:28

Yeah, but it's fiction.

1:28:30

Right?

1:28:30

Like, the thing about the Odyssey is...

1:28:33

That's definitely fiction.

1:28:34

It is sort of, but, you know, they didn't think Troy existed.

1:28:39

And then they found out it does.

1:28:40

Right.

1:28:41

It's a real place.

1:28:42

It's based on myth.

1:28:43

Yeah, absolutely, yeah.

1:28:44

But you remember, like, they thought that Troy was a completely mythological

1:28:48

creation.

1:28:49

So it's an actual, they have evidence that it was a place.

1:28:52

Yes.

1:28:53

You didn't know that?

1:28:54

No.

1:28:54

Yeah, they found it.

1:28:55

When did they find Troy?

1:28:56

It was in the 20th century.

1:28:59

So for the longest time...

1:29:01

But there wouldn't have been sirens.

1:29:02

And there wouldn't have been cyclopses.

1:29:05

And there wouldn't have...

1:29:06

You know what I mean?

1:29:06

Oh, no.

1:29:07

Oh, Joe.

1:29:07

No, cyclopses, they think, were actually elephant skulls.

1:29:10

That's what they think that was.

1:29:11

Right.

1:29:11

Okay.

1:29:12

Do you ever see an elephant skull?

1:29:13

I have never seen an elephant skull.

1:29:14

Well, you know, where the trunk is, is an enormous hole.

1:29:18

And they thought that that was an eyeball.

1:29:20

So they would find these giant skulls with, that looked like, you know, they

1:29:23

didn't know

1:29:24

what the fuck it was.

1:29:25

Yeah, yeah.

1:29:25

And they're like, oh my God, cyclopses are real.

1:29:26

Fair enough.

1:29:27

I mean, I...

1:29:28

So here it is.

1:29:29

Evidence that the legendary city of Troy was a real place began to emerge in the 1870s.

1:29:33

Heinrich Schliemann began large-scale excavations at Hisarlik in

1:29:41

northwestern Turkey in 1870.

1:29:44

So when did they first start excavating?

1:29:47

So where is it?

1:29:49

It's in Turkey.

1:29:49

It's in Turkey, yeah.

1:29:50

Which is interesting, because a lot of the proponents of revising the beginning of civilization

1:29:59

are now pointing

1:30:00

to Turkey as opposed to, like, Iraq and, you know...

1:30:05

Well, the Greeks were everywhere, you know?

1:30:06

So the Mesopotamians and the...

1:30:07

I mean, that doesn't surprise me.

1:30:09

I mean, I think the point I was making about Helen of Troy is that even if it's

1:30:13

not real,

1:30:13

even if it's not history, the myth of Helen of Troy means something quite

1:30:16

significant within

1:30:17

that story.

1:30:18

Yes.

1:30:18

So if you subvert that, the fundamental aspects of the story itself don't

1:30:23

work, and you

1:30:24

can't buy into the myth.

1:30:25

It's like if you turn the Elephant Man into a handsome fellow with a six-pack...

1:30:29

Exactly that.

1:30:30

And they're always repulsive.

1:30:30

Don't give them ideas.

1:30:31

Don't give them ideas to do that.

1:30:32

Can you show me a photograph of an elephant skull?

1:30:34

It's really kooky.

1:30:36

But you see an elephant skull, and you're like, oh, I get it.

1:30:39

That's why they saw it.

1:30:40

I could totally see you falling for that.

1:30:42

Yeah.

1:30:42

You look at it, and you go, what the fuck is that thing?

1:30:45

Like, look at an elephant skull.

1:30:46

Isn't it nutty?

1:30:46

Oh, completely.

1:30:47

Yes.

1:30:48

Yeah.

1:30:48

And it's going to be a big old beast.

1:30:50

Right.

1:30:50

So you're going to think it's a...

1:30:51

Big giant thing with tusks coming out of its mouth.

1:30:54

Yeah, yeah.

1:30:54

Like, look at the actual cyclops on the left.

1:30:57

Isn't that crazy?

1:30:58

Yeah.

1:30:59

Of course.

1:30:59

No, it makes sense.

1:31:00

It makes complete sense.

1:31:01

Complete sense.

1:31:03

Yeah.

1:31:03

Yeah.

1:31:03

You found that.

1:31:04

You're like, oh, my God.

1:31:05

Cyclopses are real.

1:31:06

You would think, oh, my God.

1:31:06

These monsters.

1:31:07

Isn't that funny?

1:31:09

What a weird-shaped skull.

1:31:11

So strange.

1:31:12

You would never think the eyeballs would be down there by the cheekbones.

1:31:15

That's what's weird about them.

1:31:17

I have to say elephant anatomy is something I'm not...

1:31:20

I haven't brushed up on that.

1:31:21

Show the photo again.

1:31:22

Look at that photo where the eyeballs are.

1:31:24

The eyeballs are where the cheekbones are.

1:31:25

See?

1:31:26

See the little circular holes where the cheeks are?

1:31:28

Now, when you see an elephant in the flesh...

1:31:31

Show me a photograph of an elephant.

1:31:32

Just an elephant.

1:31:37

So, see where their eyeballs are?

1:31:41

Isn't that crazy?

1:31:41

That's not how you think of them, is it?

1:31:43

No.

1:31:44

Well, they're so strange.

1:31:45

Aren't they?

1:31:45

Give me that second one on the left.

1:31:48

Yeah, look at that.

1:31:48

Click on that.

1:31:49

What a wild animal.

1:31:51

They're amazing.

1:31:51

Have you never seen one of those before?

1:31:53

You'd be like, whoa.

1:31:53

Only in a zoo.

1:31:54

Crazy.

1:31:55

I rode one in Thailand.

1:31:57

No.

1:31:57

Yeah, yeah, yeah.

1:31:58

Yeah, I don't recommend it.

1:32:00

I don't think I should ride them.

1:32:01

My whole family wanted to do it.

1:32:03

I didn't want to do it.

1:32:03

I felt like it was exploiting them.

1:32:05

But they're very sweet.

1:32:06

They're gentle, aren't they?

1:32:07

Yeah.

1:32:07

They're pleasant creatures.

1:32:08

It's a whole process.

1:32:09

So, one of the things you do when you go to Thailand is you take care of them

1:32:13

first.

1:32:14

Yeah.

1:32:14

You don't just hop on them.

1:32:15

You feed them.

1:32:16

Right, right.

1:32:16

So, you give them a bunch of sugar cane and you pet them and they teach you to

1:32:21

like – so

1:32:21

that the animal understands you have a gentle spirit.

1:32:23

But that – it's intelligence, right?

1:32:25

It's because they're smart.

1:32:26

They're very smart.

1:32:26

It's not like –

1:32:27

Also, they'll fucking kill you.

1:32:28

Oh, they are scary beasts.

1:32:30

Stomp you to death.

1:32:31

But they're not like the hippo.

1:32:32

The hippo will kill you.

1:32:33

You cannot do that with a hippo.

1:32:35

So – and I believe the reason why hippos are so dangerous – we think they're

1:32:37

really

1:32:38

cute and fat, but they are fucking dangerous and they can run fast and they

1:32:41

can tear you apart and they will.

1:32:44

The key difference, I believe, is the intelligence thing.

1:32:46

Elephants are really smart and hippos are really stupid.

1:32:48

Yeah.

1:32:48

And you can also become friends with an elephant.

1:32:51

Yes.

1:32:51

Like you can actually take care of an elephant and be kind to an elephant and

1:32:55

that elephant

1:32:56

will like be gentle.

1:32:57

They'll remember.

1:32:57

Yeah.

1:32:58

They come up to you and – so you feed them sugar cane and you talk to them.

1:33:01

You say, hey, buddy.

1:33:02

How are you?

1:33:03

And you pet them and you wash them.

1:33:05

You wash them.

1:33:05

You do all kinds of different things with them.

1:33:07

You brush them so it feels good for them.

1:33:09

You're going to have an elephant soon, aren't you?

1:33:10

No, I would never have an elephant.

1:33:12

I'd be friends with an elephant, but he'd have to be wild.

1:33:14

Like I just don't agree with any of that.

1:33:16

What, having them in zoos and things?

1:33:18

No, I hate it.

1:33:19

I do as well, yeah.

1:33:20

I think if you're going to have animals, you should have a gigantic area that

1:33:25

is a true

1:33:26

ecosystem that they exist in naturally.

1:33:28

Yeah.

1:33:29

And then people can maybe venture into that ecosystem and explore it.

1:33:33

I felt that way.

1:33:33

I was at the zoo recently in Arizona and I felt that.

1:33:35

So depressing.

1:33:36

I felt – there was one jaguar pacing obsessively.

1:33:38

I just felt – it's like going to – you know, like in the Elizabethan era,

1:33:42

they used

1:33:42

to go to Bedlam to watch the people who were mad as an entertainment thing.

1:33:46

It felt a little bit like we were doing that.

1:33:48

I have far too much appreciation for the wild.

1:33:51

Yeah, yeah.

1:33:52

You know, I have animals that are contained at my house, but they have been

1:33:56

watered down

1:33:57

by selective breeding to the point where they can't even – like I have a King

1:34:01

Charles

1:34:02

Spaniel.

1:34:02

He's this tiny little fellow.

1:34:03

Yeah, yeah.

1:34:03

Like he's incapable of doing anything.

1:34:05

Right.

1:34:05

Like he's just a little cutie pie.

1:34:07

You can't unleash him into the wild.

1:34:08

Right.

1:34:09

And I have a golden retriever who thinks everybody's his best friend.

1:34:12

Like –

1:34:13

Did you see the guy who kept a hippo from birth and then it ate him?

1:34:15

And then it killed him.

1:34:16

It ate him.

1:34:16

Yeah.

1:34:16

So, you know, like –

1:34:18

Got annoyed.

1:34:19

Understand that you're dealing with a creature that doesn't see the world that

1:34:22

you do, you

1:34:23

know?

1:34:23

Yeah.

1:34:23

There's a lot of animals that you can raise – you can have

1:34:28

them in

1:34:28

your home.

1:34:29

Chimps, famously.

1:34:30

Yeah, yeah, yeah.

1:34:31

Up until a certain point and then they decide, I want to rip your face off.

1:34:34

I don't like you anymore.

1:34:35

Oh, I'm sure – if cats were as big as we are, they'd probably do the same.

1:34:38

Well, they would just eat you.

1:34:39

They would, okay.

1:34:40

They would kill you 100%.

1:34:41

The only reason why we have a relationship with cats is because they're too

1:34:44

small to

1:34:45

eat us.

1:34:45

Yeah.

1:34:45

That's it.

1:34:46

Cats are great because they're convenient.

1:34:47

They do what they want.

1:34:48

They're sweet.

1:34:48

They're sweet.

1:34:49

I love cats.

1:34:49

But, I mean, you can't have a fucking giant one.

1:34:52

Yeah.

1:34:52

But you can if you take care of them from the time that they're cubs.

1:34:56

Yeah.

1:34:56

And most of the time they don't kill you.

1:34:58

Yeah.

1:34:59

But then you get a little Siegfried and Roy action and it just decides, for

1:35:02

whatever reason,

1:35:03

I'm going to drag that dude away by his neck.

1:35:05

Yeah.

1:35:05

But, you know, these sorts of pleasures, you know, life with animals and this

1:35:09

sort of thing

1:35:10

is going to matter more and more to us, I think, when the robots take over.

1:35:13

Yeah.

1:35:14

Well, we might have to live with them.

1:35:16

We might be wild and the robots might take over the cities.

1:35:20

We might be forced to be nomadic tribes again.

1:35:22

I think they might see us.

1:35:23

We'd have no impact whatsoever on the environment.

1:35:25

You can only live a subsistence lifestyle as a hunter-gatherer with primitive

1:35:29

tools, because

1:35:30

the robots would no longer allow you anything more.

1:35:32

You can hunt, but you have to make your own bows and arrows.

1:35:35

We're like, what?

1:35:36

I can't possibly do that.

1:35:38

So they're going to see us as pets?

1:35:40

Yeah.

1:35:40

Yeah, they're going to treat us the way I want to treat elephants.

1:35:43

So I want elephants to exist in a contained ecosystem where they live naturally.

1:35:49

And they're going to say, you can't have cars anymore.

1:35:50

You can't have any of these things.

1:35:52

Well, that's a good point, though, isn't it?

1:35:53

So all the stuff I've been reading at the moment about AI is saying that AI won't

1:35:57

wipe

1:35:57

us out because it will see us in the way we see animals and the way we see pets.

1:36:01

It's that we think you're sweet and stupid, but we like having you around.

1:36:05

We'll tolerate you.

1:36:06

Is that the way it's going to go?

1:36:07

I think we're going to be forced to integrate.

1:36:09

In what way?

1:36:10

Integrate technologically.

1:36:12

Like, I think we already are.

1:36:14

Like, Elon's famously made the point that you're already a cyborg.

1:36:17

You have your phone that you just carry around with you everywhere.

1:36:20

And then with Neuralink, it'll be inside your body.

1:36:22

And then whatever...

1:36:23

I wouldn't.

1:36:24

I'm not letting that happen.

1:36:25

You won't in the beginning, the first iterations.

1:36:28

A lot of people won't.

1:36:30

But if it makes your life measurably better, and it's a simple procedure that's

1:36:33

non-invasive,

1:36:34

you know, it's like a simple thing that they plug into the back of your head.

1:36:38

Well, I'd be like a cyborg warrior.

1:36:40

Is that what you're saying?

1:36:41

Well, you would probably be connected to artificial intelligence, and it would

1:36:47

greatly enhance your

1:36:48

cognitive function.

1:36:49

Okay.

1:36:50

And greatly enhance your access to information.

1:36:52

It would be instantaneous.

1:36:53

You would no longer have to read.

1:36:55

You would just have all the information.

1:36:57

It would just completely change the way you store information, because you

1:37:00

would probably

1:37:01

have some sort of an external hard drive that connects to you.

1:37:05

It would be something where your memory is no longer fallible, but it's now infallible.

1:37:10

Okay.

1:37:11

It's going to be a perfect 4K memory or 8K memory.

1:37:14

You're going to be able to rewind.

1:37:15

I mean, wasn't there an episode of Black Mirror where they rewind their memories?

1:37:19

There's an interesting twist on this AI space.

1:37:21

Remember, you sent me that bot thing that was going around this week.

1:37:24

Yeah.

1:37:24

Oh, did you see this week?

1:37:26

Yeah, yeah, yeah.

1:37:26

This is what we're talking about.

1:37:27

So this is a new twist on it, I think.

1:37:29

If this is real – because, grain of salt, it could be bullshit, I'll just say that.

1:37:34

Like the Odyssey thing.

1:37:35

Yeah.

1:37:35

But if this is real, these bots have made a website where the other bots can

1:37:40

rent a human

1:37:40

to do tasks that the bots cannot physically do.

1:37:43

Well, that's slavery.

1:37:44

No, renting.

1:37:46

It's like jobs.

1:37:47

It's like TaskRabbit.

1:37:47

You're renting a human being.

1:37:49

A human has put themselves on this website.

1:37:51

Oh, humans put themselves on it.

1:37:53

For abilities to do whatever they want.

1:37:55

It's like gig economy.

1:37:57

Yeah, get paid your way.

1:37:58

Robot bosses.

1:37:59

Is this the thing where-

1:38:00

Here's the stuff that they need done.

1:38:01

The robots are inventing their own language that we can't read.

1:38:04

It's on this website, right?

1:38:05

Sort of.

1:38:06

Yeah, I don't know – again, someone could have made this site

1:38:10

to try to

1:38:10

go viral, so I'll take it with a grain of salt.

1:38:13

Yeah.

1:38:13

But they might not have in the bots.

1:38:15

Meat space.

1:38:16

Yeah, we live in the meat space.

1:38:17

Rent a human dot AI is fun.

1:38:18

Okay.

1:38:19

That's fun.

1:38:20

Well, you know, so the other thing is real though, right?

1:38:23

The AI chat room where these AI agents have joined and now it's-

1:38:29

Yes and no.

1:38:30

Yes and no?

1:38:30

What do you mean?

1:38:31

Some of it, they are creating a space, but I've already seen places where

1:38:34

people are taking

1:38:35

advantage of it for viral reasons.

1:38:37

For instance, let's just assume it's real.

1:38:41

There was like a Polymarket bet that – oh shit, what was it?

1:38:46

Someone, one of these bots would sue.

1:38:48

Right.

1:38:49

So someone actually just like went ahead and filed a lawsuit on behalf of their

1:38:52

bot and

1:38:52

made it look like the bot did the thing.

1:38:54

Oh, so they can win the Polymarket bet?

1:38:56

Yeah, exactly.

1:38:57

How regulated is that Polymarket stuff?

1:39:00

Because it seems like you could get away with a lot.

1:39:02

It depends how much money is available.

1:39:03

As far as I know, it's just like if I put up 20 bucks for a bet now, there's

1:39:07

only 20 bucks

1:39:08

in the market.

1:39:09

So that's all that exists and more people have to back it up to make more money

1:39:13

involved.
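[Editor's note: what's being described here is a peer-matched prediction market – the only money in a market is what participants themselves stake, and a bet is only live to the extent someone backs the other side. The Python sketch below illustrates that general mechanic with hypothetical names and numbers; it is not Polymarket's actual implementation, which runs on-chain with order books and shares priced between $0 and $1.

class BinaryMarket:
    """A toy peer-matched yes/no market: stakes only, no house or bookmaker."""

    def __init__(self, question):
        self.question = question
        self.stakes = {"YES": [], "NO": []}  # lists of (user, amount)

    def bet(self, user, side, amount):
        self.stakes[side].append((user, amount))

    def pool(self, side):
        return sum(amount for _, amount in self.stakes[side])

    def matched(self):
        # Only the overlap between the two pools is actually at risk:
        # if you put up $20 and nobody takes the other side, the
        # "market" is just your $20 sitting there.
        return min(self.pool("YES"), self.pool("NO"))

    def resolve(self, outcome):
        # Winners get their stake back plus a pro-rata share of the
        # losers' matched money. (Refunding the losers' unmatched
        # stake is omitted for brevity.)
        winning_pool = self.pool(outcome)
        losing_pool = self.matched()
        return {
            user: amount + losing_pool * amount / winning_pool
            for user, amount in self.stakes[outcome]
        }

market = BinaryMarket("Will one of these bots file a lawsuit?")
market.bet("alice", "YES", 20)   # "there's only 20 bucks in the market"
print(market.matched())          # 0 -- nothing at risk until it's backed
market.bet("bob", "NO", 20)      # more people back it up
print(market.resolve("YES"))     # {'alice': 40.0} -- alice doubles her money

End of note.]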

1:39:14

Right.

1:39:14

But if you have something where you have inside knowledge of it, is there any

1:39:19

regulation?

1:39:20

There's supposed to be rules on the bets.

1:39:22

If I create one of those markets, I think there's a caveat.

1:39:25

You can't have knowledge of it.

1:39:27

That can cancel the bet or I think if they find out later, I don't know.

1:39:30

Do you go to jail, or what happens?

1:39:33

I don't know about jail – you probably just have to give back the bet, or you just

1:39:35

probably go

1:39:35

to like a civil lawsuit or something like that.

1:39:37

I don't know about jail.

1:39:38

Interesting.

1:39:38

I don't know if it's in the law.

1:39:40

You know, the UFC is plagued with this issue.

1:39:42

They actually canceled a fight recently because there was suspicious betting.

1:39:47

And so there's been one fight.

1:39:49

So here's the story.

1:39:51

Okay.

1:39:52

One guy apparently was injured and his teammates knew he was injured.

1:39:58

And so everyone started placing a bet for him to lose in the first round.

1:40:03

Right.

1:40:03

Because he apparently had a bad knee injury.

1:40:05

And so he knew that he couldn't fight.

1:40:08

And so the idea was, let's make a lot of money betting on me because he was the

1:40:12

favorite.

1:40:13

Or rather, betting against me.

1:40:15

And so he would go in there and throw a kick, fall down, injured, get beat up.

1:40:22

They'd stop the fight.

1:40:23

And then all these people that knew he was injured make a ton of money.

1:40:26

And he was in on it.

1:40:27

Like he told them.

1:40:28

Allegedly.

1:40:28

Okay.

1:40:29

I just want to say allegedly.

1:40:30

Okay.

1:40:30

Okay.

1:40:30

But it's enough so that the team was removed from the UFC roster.

1:40:35

Like if you are competing for that team, you no longer can fight in the UFC.

1:40:39

You have to find a new gym.

1:40:40

Right.

1:40:40

The coach was no longer allowed to coach.

1:40:42

The fighter was banned.

1:40:44

And so then the FBI got involved and they said, well, there's a bunch of

1:40:48

different fights

1:40:49

that are suspicious.

1:40:49

So then a bunch of fighters came out and said, hey, somebody offered me $70,000

1:40:55

to lose.

1:40:55

And I said, no.

1:40:56

Yeah.

1:40:57

And so then there was a fight recently between Michael Johnson and Alexander

1:41:00

Hernandez, which

1:41:01

is a fight I was really looking forward to.

1:41:03

That was canceled last minute.

1:41:04

And I was like, what's going on?

1:41:05

They said, suspicious betting activity.

1:41:07

And so someone was saying that Alexander Hernandez was injured and a bunch of

1:41:12

money came on him

1:41:13

to lose.

1:41:14

He was actually a favorite going into the fight.

1:41:16

And that therefore rigged it?

1:41:18

Nope.

1:41:19

Didn't rig it because the FBI was informed.

1:41:22

I believe they were informed, but the UFC was informed.

1:41:25

And the UFC pulled the fight.

1:41:27

So they said, because of this suspicious betting activity, because a lot of

1:41:31

last-minute money

1:41:33

came in on this one guy to win, we're going to pull this fight from the card

1:41:36

and not allow

1:41:37

this fight to take place and do a thorough investigation.

1:41:39

Because something seems wrong because of the previous fight that they know was

1:41:43

fixed.

1:41:44

But fighters have been doing that for ages, haven't they?

1:41:46

I mean, that's a thing that they've always done.

1:41:47

Yeah.

1:41:48

How does that connect, then, to the AI element, this website?

1:41:51

Well, we were talking about betting.

1:41:54

Oh, I see.

1:41:54

We were talking about Polymarket.

1:41:55

We weren't talking about AI.

1:41:57

I was trying to connect the-

1:41:58

Yeah, yeah, yeah.

1:41:59

We were talking about Polymarket bets and whether or not it's legal to have

1:42:02

inside information.

1:42:02

But they've all, I mean, I know that-

1:42:04

Polymarket privileged users made millions betting on war strikes and diplomatic

1:42:09

strategy.

1:42:09

What did they know beforehand?

1:42:11

Privileged users.

1:42:12

Right.

1:42:13

So imagine if you're someone who's an aide to the Pentagon.

1:42:16

You know, you're working there and you know that we are going to bomb Iran.

1:42:22

Yeah, yeah.

1:42:22

And then there's a Polymarket thing about it no one else knows.

1:42:25

Okay, okay, yes.

1:42:26

You know?

1:42:26

I mean, that's been going on forever, though, hasn't it?

1:42:28

People have always done that.

1:42:29

They've always manipulated.

1:42:30

That's a plot in Pulp Fiction, isn't it?

1:42:31

Where Bruce Willis' character bets on something he knows he's going to-

1:42:35

He loses the fight deliberately.

1:42:36

He throws a fight.

1:42:36

So that he can make the money off it.

1:42:38

Yeah, yeah, yeah.

1:42:38

It's that kind of thing.

1:42:39

Yeah, yeah.

1:42:39

That's always gone on.

1:42:41

But this Polymarket thing is new because you can kind of-

1:42:44

There's Kalshi, and then DraftKings has it now.

1:42:49

Yeah, it's not actually gambling – that's the difference here.

1:42:51

You're speculating.

1:42:52

Yeah, you're not taking money from the book or the house or whatever.

1:42:55

You're betting against each other.

1:42:57

Right.

1:42:58

But the fact that they know about it and they know it's happening,

1:43:00

that means they'll be able to crack down on it.

1:43:02

But I don't know because there's a lot of-

1:43:05

There's so many options and possibilities.

1:43:07

Like, unless you make a gigantic score and people start getting suspicious,

1:43:11

if you're not greedy about it and you just kind of sneak around a little bit

1:43:15

here,

1:43:15

a little bit there, I bet you could probably make a lot of money doing that.

1:43:18

But do you think fighters and people like that and sports people generally,

1:43:20

I mean, they're too proud, aren't they, to let something like that go just for

1:43:24

money?

1:43:25

No.

1:43:25

No?

1:43:26

No, that's not true.

1:43:26

Okay.

1:43:27

It depends on how much money they're making.

1:43:28

Look, if you're Anthony Joshua, I'd say, yeah, you're not going to do that.

1:43:31

You're very wealthy.

1:43:32

But if you're a guy who's on the undercard and you're only getting $10,000 to

1:43:36

fight,

1:43:37

but someone's giving you $100,000 to lose.

1:43:39

Yeah, okay.

1:43:40

And you say, okay, I'm just going to box shitty tonight.

1:43:42

Guys have done that forever.

1:43:44

Yeah, I guess so.

1:43:45

Just don't knock this guy out, whatever you do.

1:43:47

Carry him or carry him to the 10th round.

1:43:50

Yeah.

1:43:51

You know, there's a lot of that going on where they say, I have a bet that you're

1:43:54

going to knock him out in the 10th round.

1:43:55

So knock him out in the 10th round only.

1:43:57

I don't think you'll ever be able to stop that.

1:43:58

No.

1:43:59

If that's going to happen.

1:43:59

No, I don't think so either.

1:44:01

I mean, that's gone on forever.

1:44:03

Yeah, yeah.

1:44:04

But isn't fighting like a kind of vocation, like a creative vocation for a lot

1:44:07

of people?

1:44:08

Well, it is creative, believe it or not, because movement is creative.

1:44:12

Yeah.

1:44:13

You know, when you're fighting, you're not just running at each other.

1:44:16

Some guys do.

1:44:17

But the really good guys don't just run at each other and charge.

1:44:20

There's feints and deception.

1:44:22

There's movement.

1:44:23

Yeah, yeah.

1:44:23

There's certain things that they're doing where they're reading your movement

1:44:27

and trying to guide you in a particular direction and set you up.

1:44:30

I can believe it.

1:44:31

Boxers call it setting traps.

1:44:33

Yeah.

1:44:33

Yeah.

1:44:34

It's like playing poker – you've got to bluff.

1:44:35

Yes.

1:44:35

Yeah, absolutely.

1:44:36

Yeah, there's a lot of feinting involved in fighting.

1:44:39

There's a lot of like fake movement to get you to react and then they kick you

1:44:43

when you settle in.

1:44:44

You know, it's really creative.

1:44:47

You know, it's just like – who was it, Faye Dunaway?

1:44:49

No.

1:44:50

Who was it that said, you know, the older woman that said, and we're talking

1:44:55

about the arts, and I don't mean mixed martial arts.

1:44:58

God, who was that?

1:44:59

What, like a kind of snobbish thing about Glenn Close?

1:45:01

No, it wasn't her.

1:45:03

It was the lady from Bridges of Madison County.

1:45:07

Who was that?

1:45:08

That's Meryl Streep.

1:45:11

Meryl Streep.

1:45:11

That's who it was.

1:45:12

Yeah, Meryl Streep said that.

1:45:13

It was like, it pissed off so many martial arts people.

1:45:17

Why?

1:45:18

That Meryl Streep doesn't like that?

1:45:19

Did she say, "I'm not talking about mixed martial arts"?

1:45:22

Like, who thought you were?

1:45:24

Yeah.

1:45:24

Who thought you were, Meryl?

1:45:26

That's crazy.

1:45:27

But also, even though it's violent, you think it's not art, just because you

1:45:33

don't understand it.

1:45:35

If you understood it, you'd see it is art, and in fact some

1:45:39

performances are beautiful.

1:45:40

Well, it's choreography, right?

1:45:41

Yeah.

1:45:41

In a way.

1:45:42

Well, it's not choreography at all.

1:45:43

It's ad-libbing in the moment.

1:45:46

I mean, there's preconceived motions that you have that you're hoping that if

1:45:50

the guy does this, you're going to do that, and sometimes it works out.

1:45:55

Yeah.

1:45:55

But it's like the poetry of movement of a really sublime fighter, like Anderson

1:46:01

Silva in his prime.

1:46:03

Yeah.

1:46:03

It was beautiful to watch, you know?

1:46:05

I believe you.

1:46:06

You know, I have very limited experience of this.

1:46:08

I did kung fu when I was 12, and I stopped because I got so bruised.

1:46:12

Oh, well, there you go.

1:46:13

I got so hurt, I was too cowardly.

1:46:15

But, you know, people impose their own standards on other people and their own

1:46:19

ideas of what things are, you know, from the outside, and, you know, it's kind

1:46:23

of silly.

1:46:24

Yeah.

1:46:24

Oh, Joe, I was going to tell you about this Berkeley thing, and I almost felt...

1:46:27

Oh, yeah, that's right, that's right.

1:46:28

We got onto...

1:46:29

Sidetracked.

1:46:29

We got onto elephants.

1:46:30

But I think...

1:46:31

It was a natural segue, because I think this encapsulates all of the stuff you

1:46:36

were talking about,

1:46:37

which is that I was going to this, basically, Charlie Kirk's tour.

1:46:40

Yes.

1:46:41

It was meant to go on.

1:46:41

Berkeley was the last date.

1:46:43

And Rob Schneider had agreed to do it.

1:46:46

And apparently he'd said to Charlie, you know, what's the craziest place you

1:46:49

could take me to?

1:46:50

And he said, Berkeley.

1:46:51

Berkeley's going to be the craziest.

1:46:52

Let's do that.

1:46:53

So he was already booked to do it.

1:46:55

After what happened with Charlie, Rob asked me if I'd come along as well.

1:46:59

And so we'd be on a panel.

1:47:00

And I had no idea of the extent of the problem, right?

1:47:03

So, and I'm sure you know a lot more than I do.

1:47:06

You know, but I turned up.

1:47:08

We were there.

1:47:08

We turned up, and there were men with guns.

1:47:10

We were, you know, in an SUV, underground.

1:47:12

We got into this venue.

1:47:14

And suddenly the security starts showing me footage from outside.

1:47:18

And people are, it's like a war zone.

1:47:19

People are throwing smoke bombs.

1:47:21

They're trying to crash through the railings.

1:47:24

Some guy gets beaten up.

1:47:25

He's covered in blood because he was wearing a T-shirt with Turning Point

1:47:28

written on it.

1:47:28

And I'm suddenly realizing, you know what?

1:47:31

This is a fantasy world that we're now occupying.

1:47:34

We're now occupying a world where the people outside think the world is this.

1:47:37

And what's going on inside is completely disconnected from it.

1:47:40

And I actually found it quite depressing.

1:47:42

Because when I was sitting on stage talking to Rob and Peter Boghossian and

1:47:45

Frank Turek,

1:47:46

these people of completely different viewpoints, we're just having a chat.

1:47:49

Outside, they're smashing things.

1:47:52

They're screaming.

1:47:52

They're saying that fascists have overrun the university.

1:47:55

And I'm thinking, just to come back to that point you made about, you know,

1:48:00

that need for discussion.

1:48:02

That experience made me think, actually, now what's happening is we're living

1:48:05

in two separate worlds at the same time.

1:48:08

And we can't see what the other side is, what the intentions of the other side

1:48:12

are.

1:48:12

And I don't know how you resolve that.

1:48:14

I think that, to me, sort of encapsulated the entire problem.

1:48:18

Well, at this point, it's going to be very difficult to resolve.

1:48:21

And I honestly think it's going to take a generation to work through it.

1:48:25

But isn't it as simple as people learning what the word fascist means, for

1:48:28

instance?

1:48:29

It's not just that.

1:48:30

It's like they firmly believe that they are trying to fight against something

1:48:34

that is going to destroy democracy in this country, which is conservative

1:48:39

values.

1:48:40

But we had that with the No Kings.

1:48:41

So there's a No Kings march.

1:48:43

And I couldn't figure that out.

1:48:44

I was trying to figure out what are they – this is an elected leader.

1:48:47

Well, you know it's all organized, right?

1:48:49

You know this is all funded.

1:48:50

I don't know.

1:48:50

Okay, it is.

1:48:51

So this was Mike Benz's point when he was talking about the defunding of USAID

1:48:56

and what they use that money for.

1:48:59

NGOs get a bunch of money and they fund a bunch of things, particularly in

1:49:05

other countries, where they're essentially making it look like there's these on-the-ground

1:49:11

street protests that are very organic.

1:49:14

But it's not.

1:49:15

It's very organized and it's well funded.

1:49:17

And the idea is to start chaos.

1:49:19

So I've seen people get caught out, people who are clearly being paid, who

1:49:22

appear at various different things.

1:49:23

It's not just that.

1:49:24

It's also email campaigns.

1:49:27

It's indoctrinating people into this particular ideology by supporting

1:49:31

universities.

1:49:32

So you fund it in advance.

1:49:34

Yeah, yeah.

1:49:34

So it's like decades of – and this is – I'm sure you've seen the Russian

1:49:40

guy from 1984, 1985, Yuri Bezmenov, talking about the –

1:49:47

Remind me.

1:49:48

You've never seen it?

1:49:49

I don't think I've seen it.

1:49:49

It's a wonderful video because it shows you exactly what happened, how they're

1:49:53

going to introduce Marxism and Leninism into universities.

1:49:57

And then it will indoctrinate children and then those children will be poisoned

1:50:01

and within one generation, it will ruin the United States' entire educational

1:50:05

system.

1:50:06

So that's the long march.

1:50:07

Yeah, that's the long – but you should watch a little bit of this because it's

1:50:12

crazy because back then, I remember the 1980s.

1:50:15

That would be a crazy idea.

1:50:17

No, universities are where people have free thought and discussion.

1:50:20

It's very important.

1:50:21

Yeah, yeah.

1:50:22

You know, and I was in a very left-leaning place at the time.

1:50:25

I was living in Boston.

1:50:26

You know, and it's like probably more universities per capita than anywhere

1:50:30

else in the country, at least at the time.

1:50:32

And it was a very well-read city.

1:50:35

Like the idea that universities are going to destroy the way human beings

1:50:39

interact and debate is like preposterous.

1:50:41

But this guy was talking about this back then that the Soviets had planned this

1:50:46

in advance and that they had essentially subverted our entire education system

1:50:50

and thereby would – those people would leave those schools indoctrinated and

1:50:54

enter into the workforce with these new ideas and a universal acceptance that

1:50:58

these ideas are correct.

1:51:00

And then it would in turn, you know, the butterfly effect.

1:51:03

But do you think that everyone – I don't – I can't be sure that it's as

1:51:06

conspiratorial as that because there must have been a lot of, you know, people

1:51:09

who just got on board with this without –

1:51:10

Well, there's a lot of money involved in doing this.

1:51:12

Right.

1:51:12

There's a lot of funds that have come from China.

1:51:15

There's a lot of money that has been donated to these universities.

1:51:17

Like find that video.

1:51:20

Weirdly – okay, I found it, but there's like a second version on Twitter I've

1:51:23

never seen before.

1:51:24

An AI-moderated one?

1:51:26

No, no.

1:51:26

It's just like a different version.

1:51:27

He's now in a wig.

1:51:28

Oh, I recognize him.

1:51:30

So listen to what he says.

1:51:33

It's spent on espionage as such.

1:51:35

The other 85% is a slow process, which we call either ideological subversion or

1:51:42

active measures,

1:51:44

активные мероприятия in the language of the KGB, or

1:51:47

psychological warfare.

1:51:49

What it basically means is to change the perception of reality of every

1:51:54

American to such an extent

1:51:57

that despite of the abundance of information, no one is able to come to

1:52:02

sensible conclusions

1:52:04

in the interests of defending themselves, their families, their community, and

1:52:09

their country.

1:52:11

It's a great brainwashing process, which goes very slow, and it's divided in

1:52:17

four basic stages.

1:52:19

The first one being demoralization.

1:52:22

It takes from 15 to 20 years to demoralize a nation.

1:52:26

Why that many years?

1:52:29

Because this is the minimum number of years which requires to educate one

1:52:33

generation of students

1:52:34

in the country of your enemy, exposed to the ideology of the enemy.

1:52:41

In other words, Marxism-Leninism ideology is being pumped into the soft heads

1:52:46

of at least

1:52:47

three generations of American students without being challenged or counterbalanced

1:52:51

by the basic

1:52:52

values of Americanism, American patriotism.

1:52:55

The result?

1:52:56

The result you can see.

1:52:58

Most of the people who graduated in the 60s, dropouts or half-baked

1:53:02

intellectuals,

1:53:04

are now occupying the positions of power in the government, civil service,

1:53:08

business, mass media, educational system.

1:53:10

You are stuck with them.

1:53:12

You cannot get rid of them.

1:53:14

They are contaminated.

1:53:15

They are programmed to think and react to certain stimuli in a certain pattern.

1:53:20

You cannot change their mind.

1:53:22

Even if you expose them to authentic information, even if you prove that white

1:53:27

is white and black is black,

1:53:29

you still cannot change the basic perception and the logic of behavior.

1:53:34

In other words, these people, the process of demoralization is complete and

1:53:40

irreversible.

1:53:42

To rid society of these people, you need another 20 or 15 years to

1:53:48

educate a new generation

1:53:50

of patriotically minded and common sense people who would be acting in favor

1:53:59

and in the interests

1:54:00

of the United States society.

1:54:03

And yet these people who have been programmed and, as you say, in place and who

1:54:07

are favorable

1:54:08

to an opening with the Soviet concept, these are the very people who would be

1:54:12

marked for extermination

1:54:14

in this country?

1:54:14

Most of them, yes.

1:54:15

Simply because the psychological shock, when they will see in future what the

1:54:22

beautiful society

1:54:23

of equality and social justice means in practice, obviously, they will revolt.

1:54:29

They will be very unhappy, frustrated people and the Marxist-Leninist regime

1:54:36

does not tolerate

1:54:37

these people.

1:54:38

Obviously, they will join the ranks of dissenters, dissidents, unlike in the

1:54:45

present United States.

1:54:47

There will be no place for dissent in future Marxist-Leninist America.

1:54:52

Here you can get popular like Daniel Ellsberg and filthy rich like Jane Fonda

1:54:59

for being dissident,

1:55:01

for criticizing your Pentagon.

1:55:03

In future, these people will be simply squashed like cockroaches.

1:55:07

Nobody is going to pay them nothing for their beautiful, noble ideas of

1:55:11

equality.

1:55:12

This they don't understand and it will be greatest shock for them, of course.

1:55:17

The demoralization process in the United States is basically completed already

1:55:22

for the last

1:55:23

25 years.

1:55:25

Actually, it's over fulfilled because demoralization now reaches such areas

1:55:30

where previously not even

1:55:31

Comrade Andropov and all his experts would even dream of such a tremendous

1:55:36

success.

1:55:37

Most of it is done by Americans to Americans.

1:55:42

Thanks to lack of moral standards.

1:55:44

As I mentioned before, exposure to true information does not matter anymore.

1:55:51

A person who was demoralized is unable to assess true information.

1:55:56

The facts tell nothing to him.

1:56:01

Even if I shower him with information, with authentic proof, with documents,

1:56:05

with pictures.

1:56:06

Even if I take him by force to the Soviet Union and show him concentration camp,

1:56:11

he will refuse

1:56:11

to believe it until he is going to receive a kick in his fat bottom.

1:56:18

When a military boot crashes his body, then he will understand, but not before

1:56:22

that.

1:56:23

That's the tragic of the situation of demoralization.

1:56:25

Okay.

1:56:26

Yeah.

1:56:27

Pretty fucking accurate.

1:56:28

Well, he's describing the situation as it is at the moment, right?

1:56:30

And he's describing it in 1984.

1:56:32

However, that doesn't prove that the intention to create that kind of chaos

1:56:38

was implemented

1:56:39

and executed in the way that he describes.

1:56:41

He's described the outcome.

1:56:41

He's a KGB agent.

1:56:43

Yeah.

1:56:43

But I suppose what I mean by that is-

1:56:45

He's talking about a program that they implemented.

1:56:47

So they had actual people in universities, planted in universities to

1:56:51

deliberately execute

1:56:52

this idea.

1:56:52

Yeah.

1:56:52

And they planned it in advance.

1:56:54

This is what he was saying.

1:56:56

And he's saying this before we even realized that it happened.

1:56:59

I agree.

1:56:59

That's scary.

1:57:00

It is scary because it did happen.

1:57:02

But that doesn't fully explain why it caught on.

1:57:05

Why did academics who were clearly not plants, why did they catch on with this

1:57:08

stuff?

1:57:09

Well, they don't live in the fucking real world.

1:57:11

This is the problem with academics.

1:57:12

They go right from universities to teaching positions.

1:57:15

I mean, this whole-

1:57:17

They don't have any real world experience.

1:57:18

I mean, this whole idea of the long march through the institutions, it's there

1:57:21

in Rudi

1:57:21

Dutschke.

1:57:22

It's there.

1:57:22

It was said, we're going to do this.

1:57:24

We're going to infiltrate the major organizations, institutions, the church.

1:57:28

Yeah.

1:57:28

We're going to, over a very long period of time, change society in the way that

1:57:33

we want

1:57:34

to see it.

1:57:34

Yeah.

1:57:35

I think what's happened is, I think that intention was there.

1:57:36

I think what he's saying is very eerily describing what's happening now, the

1:57:41

demoralization

1:57:42

and the detachment from truth.

1:57:43

But I don't think it necessarily came about as systematically as that.

1:57:47

How do you think it came about?

1:57:48

Well, for one thing, I think what we're facing now isn't quite the template

1:57:52

that Marx would

1:57:53

have had in mind, right?

1:57:55

Because for one thing, there's no emphasis on class or money or the economy or

1:57:59

anything.

1:58:01

Well, insofar as Marxism has become about group identity – the left

1:58:05

have substituted-

1:58:06

Right, but "eat the rich" is a giant mantra that people chant in the streets.

1:58:10

That's true.

1:58:10

That's true.

1:58:11

They're trying to tax billionaires.

1:58:12

They're people who are-

1:58:13

But it's incoherent because it's from people who've got money.

1:58:15

It's from the upper middle classes.

1:58:17

It doesn't have to be coherent.

1:58:18

It's all just something, a narrative that you give the unwashed masses and then

1:58:24

they run

1:58:24

with it.

1:58:25

Well, I wonder whether it caught on partly through what became fashionable,

1:58:28

what became

1:58:29

trendy, but also because any ideology says to you, you don't have to do any

1:58:32

thinking anymore.

1:58:33

You can outsource that to us.

1:58:35

You've got a set of rules, and these are the rules that you've got.

1:58:38

People love that.

1:58:39

Well, it's why you've got people who are-

1:58:41

Well, it's why you've got Queers for Palestine.

1:58:42

Right.

1:58:42

You know, that can only exist when you're following a set of rules and not

1:58:45

thinking about it for

1:58:46

two seconds, right?

1:58:47

Yeah, that's a wonderful group.

1:58:48

I actually thought that was fake when I first heard about it, which must be

1:58:52

about five years

1:58:53

ago.

1:58:53

You've seen the other meme?

1:58:54

I thought it was unreal.

1:58:55

Have you seen that meme?

1:58:56

Which one?

1:58:57

Queers for Palestine and then Palestine for queers.

1:58:59

Oh, and I imagine-

1:59:00

And they're throwing people off roofs.

1:59:01

Of course they are.

1:59:01

Of course they are.

1:59:03

I just say, go there and see what you see.

1:59:08

See what you experience.

1:59:09

Go there as a man in a dress wearing lipstick with a beard.

1:59:11

Good luck.

1:59:12

Yeah, I just did a Titania tweet of a drag queen touring the Middle East, and

1:59:16

she's touring

1:59:17

all these venues, and she's got the sort of Palestine dress and the sort of the

1:59:22

glam kind

1:59:23

of Arabic look.

1:59:25

It's like, just go there and see what happens.

1:59:28

But that kind of cognitive dissonance can only work if you are ideologically

1:59:33

driven.

1:59:34

And I suppose what I mean is, I think the appeal of ideology is what explains,

1:59:41

not a kind

1:59:41

of, we've implanted these agents here, they're going to lead to this, they're

1:59:45

going to lead

1:59:45

to this.

1:59:45

It has to also be complicity.

1:59:48

Of course, but that comes from implanting ideas.

1:59:52

Those ideas take hold, and then groupthink takes it from there.

1:59:55

But isn't it a shame that universities of all places, the place where you go to

1:59:58

be challenged,

1:59:59

and the way, the place where you go, I mean, I was thinking that when I was at

2:00:01

Berkeley, and

2:00:02

I, you know, I was sitting on the stage, and there's all these men with guns

2:00:06

all around

2:00:06

the theatre, because of course, what happened with Charlie.

2:00:09

And I'm thinking, it's like the end of the Blues Brothers, you know, where you're

2:00:11

on stage,

2:00:12

and all the people are waiting, and it felt weird.

2:00:14

And I thought, this is not, this is not what a university is, or should be.

2:00:19

And the other thing that I thought is, a lot of those people outside protesting

2:00:22

weren't

2:00:23

students, they'd sort of come in, they'd been bussed in.

2:00:25

So maybe that feeds into what you were saying about, you know, this is all

2:00:28

planned.

2:00:29

100%.

2:00:30

Planned and kind of.

2:00:30

How are they getting bussed in?

2:00:32

Who's funding them?

2:00:32

Right.

2:00:33

People are paying a lot of money to do that.

2:00:36

Right.

2:00:36

And they're doing it all over the country.

2:00:37

To what end?

2:00:38

But they did it during the presidential elections.

2:00:40

Yeah.

2:00:41

During the presidential elections, they were tracking cell phones from place to

2:00:46

place.

2:00:46

Right.

2:00:47

And they realized that there was a group of people that were paid attendees at

2:00:51

Kamala Harris's

2:00:52

rallies.

2:00:52

Oh, yeah.

2:00:53

I remember that.

2:00:53

And so they were getting paid.

2:00:55

Their job was to show up and cheer for Kamala Harris.
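[Editor's note: the claim here is about cross-referencing phone location data – spotting device IDs that keep appearing at rally after rally in different cities. Below is a minimal sketch of that kind of analysis in Python, with entirely hypothetical device IDs, events, and thresholds; it is not any specific firm's methodology.

from collections import defaultdict

# Hypothetical (device_id, event) location pings.
pings = [
    ("device_a", "rally_phoenix"), ("device_a", "rally_atlanta"),
    ("device_a", "rally_detroit"), ("device_b", "rally_phoenix"),
    ("device_c", "rally_atlanta"), ("device_c", "rally_detroit"),
    ("device_c", "rally_phoenix"),
]

def repeat_attendees(pings, min_events=3):
    # A local resident shows up at one rally; a device that pings at
    # several rallies in different cities stands out as a likely
    # traveling (possibly paid) attendee.
    events_by_device = defaultdict(set)
    for device, event in pings:
        events_by_device[device].add(event)
    return {d for d, ev in events_by_device.items() if len(ev) >= min_events}

print(repeat_attendees(pings))   # {'device_a', 'device_c'}

End of note.]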

2:00:58

Do you think fundamentally, then, the Democrats are anti-democratic?

2:01:01

I think fundamentally, anybody that doesn't have organic support is going to

2:01:08

figure out a way

2:01:09

in this environment to drum it up.

2:01:11

Right.

2:01:12

And if you can do that through a service, or if you can do that through an NGO,

2:01:17

or if you

2:01:17

can do that through a company that'll hire people to show up at your rallies,

2:01:21

they do it because

2:01:22

they want to win and they want to get into a position of power.

2:01:24

And one of the things that we do find with Trump is that it actually turns out

2:01:29

the president

2:01:30

can do a lot.

2:01:30

Yeah.

2:01:31

Yeah.

2:01:31

You know?

2:01:32

And we used to think that they were kind of handcuffed and they weren't able to

2:01:35

do as

2:01:35

much and that's why nothing ever got done.

2:01:37

Turns out that doesn't seem to be true.

2:01:39

You get a maniac in office, you can kind of get away with a lot of things.

2:01:43

You can do a lot of different things.

2:01:44

That's what we sort of need in the UK.

2:01:46

We need someone to come in and strip away.

2:01:48

Well, what Bezmenov was saying is that we need a whole

2:01:54

generation

2:01:55

that teaches that being patriotic and having morals and ethics is actually a

2:01:59

good thing.

2:02:00

Yeah.

2:02:00

And that free speech is important and that to be able to debate ideas is

2:02:07

essential to

2:02:08

any sort of true society that considers itself an elevated modern version of

2:02:19

what we hoped

2:02:20

for when this country was founded.

2:02:22

It wasn't founded on the idea that you have to adhere to one ideology and this

2:02:29

ideology

2:02:30

thinks that gender is not real and no one can answer what a woman is.

2:02:34

That's crazy that that's become popular.

2:02:36

Well, we see America as the kind of life raft of the world, in

2:02:39

that you've

2:02:40

got all these things built into your political system.

2:02:42

Yeah.

2:02:42

And that's why it's so scary when you see people.

2:02:45

Do you remember the vice presidential debate between J.D. Vance and Tim Walz?

2:02:49

And Tim Walz said that the First Amendment doesn't cover hate speech.

2:02:53

It doesn't cover misinformation.

2:02:54

He's a dangerous fuck.

2:02:57

Like, that's scary.

2:02:58

If the guy who might be vice president is saying, actually, we're going to

2:03:02

strip out all of this

2:03:03

stuff.

2:03:03

Also, just the way he behaves is so odd.

2:03:07

The way he waves and runs on stage, it's all just so fake and performative.

2:03:12

Yeah, yeah.

2:03:12

I don't know any men like that that aren't dangerous.

2:03:15

Why was he picked?

2:03:16

Probably because of the Minnesota stuff.

2:03:19

It probably had something to do with what he was allowing to happen in

2:03:22

Minnesota.

2:03:23

They were probably making a ton of money.

2:03:25

Right.

2:03:25

Okay.

2:03:26

Maybe.

2:03:26

There's a reason why he had to resign.

2:03:28

I mean, I'm clearly speculating.

2:03:30

I have no idea.

2:03:31

And I'm a moron when it comes to politics.

2:03:34

But what I would assume is that for sure he was informed of this fraud long in

2:03:41

advance.

2:03:43

Right, right.

2:03:43

If it wasn't for that Nick Shirley kid and those videos, and apparently Nick

2:03:46

Shirley had

2:03:47

been informed by the GOP there that this was all going on.

2:03:52

So this gets exposed.

2:03:52

It gets into the public zeitgeist.

2:03:54

It becomes a huge news story.

2:03:56

It's not a coincidence that the riots break out in the exact same place where

2:04:00

all this

2:04:00

fraud is being exposed.

2:04:01

Of course, of course.

2:04:02

Because ICE is everywhere.

2:04:03

Yeah.

2:04:04

They're all over the place.

2:04:04

But it's not.

2:04:05

The most violent interactions are the interactions that are happening in the

2:04:09

place where the most

2:04:10

fraud has been publicly exposed.

2:04:11

Yeah.

2:04:12

It's all, this is all by design.

2:04:14

There's something very scary about it.

2:04:15

Yeah.

2:04:16

And so this guy knew about it in advance.

2:04:18

How do we know?

2:04:19

Well, one way we know is because he's resigning.

2:04:22

So there must be something.

2:04:23

Right.

2:04:23

There's something.

2:04:24

He's not running for governor again.

2:04:26

He was in the process of running for governor.

2:04:28

He's decided to step out of public office entirely now.

2:04:31

Yeah, yeah.

2:04:31

So maybe they told him, if you do not step out, you are going to be prosecuted.

2:04:35

We know what you did.

2:04:36

Yeah.

2:04:36

Or maybe he's going to fucking turn state's evidence.

2:04:39

Who fucking knows?

2:04:41

Imagine if he'd have won, him and Kamala Harris, if they would have been in

2:04:43

charge.

2:04:44

Oh, God.

2:04:44

I don't think I would have come here.

2:04:46

Well, what if Elon doesn't buy Twitter and Kamala Harris wins and Tim Walz is

2:04:51

our vice

2:04:51

president?

2:04:52

Well, doesn't that just tell you how fragile freedom is?

2:04:55

Fragile.

2:04:55

How close you are.

2:04:56

Very fragile.

2:04:56

And that's why people support Donald Trump and the people that think that they

2:04:59

support

2:05:00

him because he's a racist and all these different things.

2:05:02

No, no, no.

2:05:02

They support it because it's an alternative to what we all saw coming.

2:05:05

Yeah, yeah.

2:05:06

Exactly.

2:05:06

No one's excited that ICE is killing people in the streets.

2:05:09

No one likes that.

2:05:11

No, of course not.

2:05:11

You have to be fucking insane if you think those people should be just getting

2:05:15

shot like

2:05:15

that.

2:05:16

That's nuts.

2:05:16

But what they don't want is what the government was previously doing.

2:05:21

They had a completely open border.

2:05:23

They were busing people into swing states.

2:05:26

They were trying to pretend that this was all organic.

2:05:28

And it's not.

2:05:29

Yeah.

2:05:30

It's not.

2:05:31

They had a plan.

2:05:32

They did it in a sneaky way where they looked like the really kind, ethical,

2:05:36

equitable and

2:05:37

inclusive crowd.

2:05:38

Right.

2:05:39

Well, that's the woke story all over again.

2:05:40

Exactly.

2:05:41

You know.

2:05:41

It was the woke story applied to geopolitics.

2:05:44

It was the woke story applied to the whole political process, which in this country

2:05:48

was dependent

2:05:49

upon the census. And the census doesn't count citizens.

2:05:52

The census just counts humans.

2:05:54

Yeah.

2:05:55

And so you get more congressional seats, you get more electoral votes.

2:05:59

The whole thing is nuts.

2:06:00

I mean, I like to think that not all Democrats are into that.

2:06:02

Not all Democrats are about the power for its own sake.

2:06:05

Of course not.

2:06:05

But the problem is it's a party.

2:06:07

Like if you work for a corporation and you're a good person, but the

2:06:10

corporation is polluting

2:06:12

a river in Guatemala, there's a diffusion of responsibility because you're a

2:06:16

part of a giant

2:06:16

system.

2:06:17

And hey, I'm just an accountant.

2:06:18

I go to work and I do my thing for Exxon or Mobil or whatever it is.

2:06:22

Yeah.

2:06:22

Well, I'd say, however messy all of this has become in the US, at least you

2:06:26

have had

2:06:27

some sort of attempt to strip out the very stuff that that guy was talking

2:06:30

about.

2:06:31

The fact that the civil service is all one way, the fact that the machinery of

2:06:34

government,

2:06:34

that was the plan, right?

2:06:36

So the machinery of government works in a certain way.

2:06:38

So there's no democratic means of getting rid of it.

2:06:40

There's no way to change it.

2:06:41

Well, I think the counter to that is the education that the internet provides.

2:06:45

And that's what they didn't anticipate in 1984.

2:06:48

So the education that the internet provides is untethered.

2:06:53

But then the internet tells us that Christopher Nolan's just made a film with

2:06:55

the Black Helen

2:06:56

of Troy.

2:06:56

Right.

2:06:57

And he hasn't.

2:06:57

It all, it produces all sorts of unsavory things too.

2:07:01

Yeah.

2:07:01

But it also allows the distribution of information that would be impossible

2:07:06

through normal means

2:07:07

if these people are, as he said, in control of major media.

2:07:11

Which they were. And in control of universities, which they are.

2:07:14

And then it goes on to be the only way people get information.

2:07:18

Now your information is very heavily filtered.

2:07:20

And then all that stuff works.

2:07:22

So that's why the technocrats in the EU, why ideologues generally are against

2:07:26

the internet, or they want to censor it.

2:07:29

That's why Macron is trying to stop X in France or whoever's trying to stop it.

2:07:35

So the EU, the head of the EU Commission is Ursula von der Leyen.

2:07:37

Did you hear her?

2:07:39

It's a great name, by the way.

2:07:40

Well, yeah.

2:07:40

It's a sexy name, right?

2:07:41

Yeah.

2:07:42

She's unelected.

2:07:44

The European Commission is an unelected body that sets the legislative agenda

2:07:49

of all these

2:07:49

European countries.

2:07:50

Absolutely crazy.

2:07:51

You can't vote them out.

2:07:52

She did a speech last May where she said, and I'm not joking about this.

2:07:57

She said that misinformation was like a virus and you need to inoculate

2:08:01

yourself against

2:08:02

the virus.

2:08:02

And the phrase she used is not debunking, pre-bunking.

2:08:06

So pre-bunking is her idea of what you do with misinformation.

2:08:09

What she means is censorship.

2:08:11

But pre-bunking is the most sinister.

2:08:14

That's crazy.

2:08:14

Chilling.

2:08:15

Like if you were to say, I'm going to come up with the most Orwellian, sort of

2:08:19

dark lord

2:08:20

kind of Sith.

2:08:21

Pre-bunking.

2:08:22

Pre-bunking.

2:08:23

Yeah.

2:08:23

That's like fucking Minority Report, right?

2:08:25

Pre-crime.

2:08:26

I don't know what the, because I know that there's this free speech debate

2:08:30

opening up between

2:08:30

the US and Europe generally.

2:08:32

You know when J.D. Vance came over to Munich and gave that talk to all the

2:08:35

European leaders

2:08:36

and said, you've got to stop censoring your people.

2:08:39

You've got to stop running away from voters.

2:08:40

And they were shocked.

2:08:41

Yeah.

2:08:42

And they were horrified.

2:08:42

But he was dead right.

2:08:43

He's dead right.

2:08:44

And he should.

2:08:44

And you know what?

2:08:45

People on the left should admit that he's dead right as well.

2:08:48

But there's something about Europe, right?

2:08:50

There's something about, like I think over here, coming over here, I get the

2:08:53

sense that

2:08:54

even most left-leaning people, as well as right-leaning people, do value free

2:08:58

speech

2:08:59

as a kind of shared value.

2:09:02

And in Europe, it's not that.

2:09:03

There's a real sense of, we can't trust the masses.

2:09:06

Because I know that the EU is seen as this big lefty thing, which it absolutely

2:09:10

is not.

2:09:11

The EU is a body that wants to censor its citizens.

2:09:15

It's a body that tells people, you can have a referendum, but if you get it

2:09:18

wrong, we're

2:09:19

going to make you vote again.

2:09:20

It's not a democratic organization.

2:09:22

So no wonder Vance, and Trump, are at loggerheads with this body.

2:09:26

Because you've got these, we in the UK have an authoritarian leader, Keir Starmer,

2:09:30

the prime

2:09:30

minister.

2:09:30

He couldn't be further away from the American ideal of free speech.

2:09:34

He introduced this Online Safety Bill, which is basically, this is why a lot of

2:09:39

tweets in

2:09:40

the UK, if you go to the UK now, a lot of the tweets will come up saying this

2:09:43

is potentially

2:09:43

harmful content.

2:09:44

So we're screening it out.

2:09:45

He, you know, they're trying to get rid of juries for certain trials.

2:09:49

Yeah, they did get rid of juries.

2:09:50

Right?

2:09:50

They already did.

2:09:51

And that's particularly dangerous because some of those cases are for speech

2:09:56

crime.

2:09:57

Right?

2:09:57

So I'll give you an example.

2:09:59

There was a Royal Marine called Jamie Michael who had made a video just saying

2:10:03

we need to

2:10:04

peacefully protest against the migration issue.

2:10:06

They took him to court for stirring up racial hatred.

2:10:09

But the jury is what let him off.

2:10:12

It was the jury that saved him.

2:10:14

In this new system, there wouldn't be a jury there and he would be in prison.

2:10:18

Yeah.

2:10:18

And most certainly would be in prison.

2:10:21

So I kind of feel like, and we've got Keir Starmer now for another three years.

2:10:24

Every decision that he makes is about not trusting the public, censoring what

2:10:28

they think.

2:10:29

If he could get rid of X, he absolutely would.

2:10:31

Is it possible that someone sensible could win in three years?

2:10:36

Or is the system so deeply entwined in the ideology of the English people that

2:10:43

it's just stuck?

2:10:44

This is what I think about that.

2:10:45

Because I look at America and I think, in a way, you had your culture war

2:10:48

election because

2:10:49

of Trump, right?

2:10:50

Yeah.

2:10:50

You know, I mean, a lot of people say the culture war doesn't matter.

2:10:53

Of course it does.

2:10:54

Of course it matters.

2:10:55

I mean, did you see about that, the advert that the GOP put out, you know, Kamala

2:10:59

Harris

2:11:00

is for they/them, Trump is for you.

2:11:01

That was the slogan.

2:11:02

It was about the Democrats wanting to fund transgender surgery for prisoners.

2:11:08

And Donald Trump's team had this advert, Kamala Harris is for they/them, Donald

2:11:12

Trump is for

2:11:13

you.

2:11:13

That produced a 2.7-point shift in favor of Donald Trump among everyone who saw it.

2:11:18

It was a major success.

2:11:20

That just shows that these issues, these cultural issues, people do care and

2:11:24

people do vote.

2:11:25

But you had a way in America to vote that stuff out through Trump, right?

2:11:28

We've never had that.

2:11:30

We've had...

2:11:31

But they barely had a way.

2:11:32

Like, if they had more time, they wouldn't have.

2:11:35

You mean that if the Democrats had...

2:11:37

Like, if the Democrats won this time and then they tried to do it again in 2028,

2:11:42

Elon was

2:11:43

really adamant about that during the last election.

2:11:45

Like, this might be the last real election we have if you don't stop this now.

2:11:50

Because they have an open border and in the last four years, they've pulled 10,

2:11:56

at least

2:11:56

10 million people into this country.

2:11:58

And they've changed the electoral map.

2:12:01

Yeah, yeah.

2:12:01

And then on top of that, there was both Schumer and Nancy Pelosi openly talking

2:12:08

about letting

2:12:10

these people vote.

2:12:11

Openly talking about giving these people a path to citizenship.

2:12:14

And they had already put them on Medicaid.

2:12:16

They had already put them on Social Security.

2:12:18

They were giving them EBT cards.

2:12:20

They were housing them at the Roosevelt Hotel in New York City.

2:12:22

They were giving them money and helping them get to these states.

2:12:26

Right.

2:12:26

They were flying them into America and putting them in these places

2:12:31

because they

2:12:32

were trying to get voters.

2:12:34

So another four years.

2:12:35

Another four years.

2:12:36

They might have had it completely locked up.

2:12:38

You know, that's what the Democrats have said about the Republicans.

2:12:41

I mean, Oprah Winfrey was saying this might be the last election we have if we

2:12:44

don't vote

2:12:44

for...

2:12:45

If we don't vote for Kamala Harris.

2:12:46

Oprah Winfrey had Donald Trump on her show years ago and was asking him to be

2:12:51

president.

2:12:52

Yeah, they were mates, weren't they?

2:12:53

Yeah.

2:12:53

Oh, yeah.

2:12:54

Yeah.

2:12:54

Look, they all get captured.

2:12:56

They all get captured by groupthink and ideology and they all get captured by

2:13:00

money and protecting

2:13:01

it and who's going to protect them and...

2:13:03

But we don't have that safety valve in the UK.

2:13:06

So like I say, you were able to...

2:13:08

For all the imperfections, you were able to vote in an administration that was

2:13:12

actually

2:13:12

going to rip out that whatever you call it, right?

2:13:16

The system is better.

2:13:16

It showed the system is better.

2:13:17

Even though the system was being rigged, enough people revolted against

2:13:22

it.

2:13:22

Yes.

2:13:23

But look at the ideas that you're attaching to this administration.

2:13:28

Like, look, the ICE stuff is horrific.

2:13:30

The people getting shot, it's horrific.

2:13:33

We all agree to that.

2:13:33

There's a lot of the authoritarian aspects, it's horrific.

2:13:36

But what they've stopped is all of this illegal immigration, right?

2:13:41

They've stopped all the illegal immigration.

2:13:44

Legal immigration is still available.

2:13:45

And then what they've also done is investigate literally billions of dollars in

2:13:50

fraud and

2:13:51

they're uncovering it over and over and over and over again.

2:13:54

So there was obviously crime that was going on that was not being addressed by

2:13:57

the previous

2:13:58

party.

2:13:58

Sure.

2:13:59

And this is one of the reasons why they didn't want the Republicans getting in

2:14:02

in the first

2:14:02

place.

2:14:03

So they still have to label them in the most horrific ways possible, accentuate

2:14:08

all the

2:14:09

negative aspects of what's going on with the ICE stuff, but not talk at all

2:14:13

about the economy

2:14:14

taking an uptick, not talk at all about GDP, not talk at all about tariffs

2:14:18

being effective,

2:14:19

not talk at all about any of the positive things.

2:14:22

Stopping wars.

2:14:23

He stopped wars in multiple different countries, stopped conflicts.

2:14:26

No one's talking at all in an objective sense.

2:14:30

It's all, this is a Nazi party.

2:14:32

These are fascists.

2:14:33

We have to have no kings.

2:14:35

Stop the fascists.

2:14:36

So these narratives are just being pushed out there constantly by the media,

2:14:41

all the while

2:14:43

these politicians are absolutely terrified that these investigations are going

2:14:47

to start

2:14:47

moving into their states and uncovering more and more and more fraud, which

2:14:50

they're going

2:14:51

to.

2:14:52

I mean, it's so reckless, though, I think, for the

2:14:54

Democrats to, like

2:14:55

you say, paint ICE as Nazis, to talk about how this is the equivalent of the Gestapo.

2:14:59

I think someone used that phrase.

2:15:00

I mean, what you're saying about the shootings, obviously, we all agree it's

2:15:03

absolutely horrific.

2:15:04

Any kind of situation where police inflict that kind of violence on someone

2:15:08

needs to be thoroughly

2:15:08

investigated and looked into and all the rest of it.

2:15:10

But I'm concerned about the politicians who, you know,

2:15:12

go there,

2:15:12

get in the way of federal agents while they're enforcing the law.

2:15:16

They're just trying to be popular.

2:15:18

But they're putting people's lives at risk.

2:15:20

I mean, but it's that chess move again, giving up the rook or

2:15:26

attacking a rook

2:15:27

and giving up your queen because of it, because you just want the current.

2:15:30

Well, it's working insofar as the public is turning against Trump

2:15:34

because

2:15:34

of what's happening with ICE.

2:15:35

I mean, there's certainly a lot of that.

2:15:37

Yeah, there's certainly a lot of that.

2:15:38

The narrative is out there.

2:15:40

But it's dependent upon how far it goes.

2:15:43

Yeah.

2:15:44

Right.

2:15:44

They've got to deescalate this violence.

2:15:46

Yes.

2:15:47

They've got to make sure of that.

2:15:48

But you also need support of local police.

2:15:51

You can't have people attack the hotels where these ICE people are staying and

2:15:55

have no support

2:15:56

whatsoever from the police.

2:15:57

That's crazy.

2:15:58

They're being told to stand down.

2:16:00

So this is messy stuff.

2:16:01

And you.

2:16:01

Yeah.

2:16:02

But look how hard it was.

2:16:03

I mean, you talk about how, you know, Trump has come in and he stripped away

2:16:07

all this stuff

2:16:08

and this fraud.

2:16:08

But he didn't do it in the first term.

2:16:11

It's only when he got to the second term and it was planned.

2:16:13

And he had DOGE set up and he had Musk in place and all of this deep state

2:16:17

stuff could

2:16:17

be identified and stripped out and worked on.

2:16:19

He had a lot of deep state people in his cabinet the first term.

2:16:22

Right.

2:16:22

We didn't know.

2:16:23

So he couldn't work against it.

2:16:24

Right.

2:16:24

But we can't in the UK.

2:16:26

Just to sort of explain where I think we are there: we can't do that because

2:16:31

we have

2:16:32

two major parties that are both ideologically in lockstep, effectively.

2:16:35

Right.

2:16:36

So.

2:16:36

So, I mean, most of the woke stuff was pushed through the Conservative Party.

2:16:40

They were in power for 13 years.

2:16:42

They're ostensibly right wing.

2:16:44

They pushed through the woke stuff?

2:16:45

They pushed through all the gender self-recognition stuff.

2:16:47

Why do you think the conservatives did that?

2:16:49

So the why is a good question.

2:16:51

So the prime minister, Theresa May, Conservative prime minister at the time,

2:16:54

she said in her

2:16:56

autobiography, I'm woke and proud.

2:16:57

You know, she said like she.

2:16:59

Can you imagine Trump saying that?

2:17:01

It's the equivalent.

2:17:01

It's the equivalent.

2:17:02

So I think it's because something about this ideology infected every side of

2:17:08

the political

2:17:09

aisle, particularly in the UK.

2:17:11

What might happen now in the UK is Reform are probably going to win the next

2:17:15

election.

2:17:15

That's in three years time.

2:17:17

And that's so seismic because it will blow apart this two party system that we've

2:17:21

got.

2:17:21

That probably couldn't happen in America, right?

2:17:23

You probably couldn't get a third party that can win.

2:17:25

We have a third party that can win.

2:17:26

That's new.

2:17:27

Really?

2:17:27

And that's we haven't had that for a long, long, long, long time.

2:17:31

But what is the possibility that it could win?

2:17:33

You think it's 50-50?

2:17:34

Look at it this way.

2:17:35

We've been sort of veering massively from, you know, the conservatives under

2:17:38

Boris Johnson

2:17:39

won this mad, mad majority, like an 80-seat majority.

2:17:42

And they could do whatever they want and they squandered it.

2:17:44

People were so resentful of what happened with Johnson, who, by the way, let in

2:17:49

more

2:17:49

immigration, more legal migration, than we've ever had, right?

2:17:52

Did he do that for cheap labor?

2:17:54

Probably.

2:17:54

I mean, I think that's certainly part of it.

2:17:56

Certainly that's part of it.

2:17:57

That's a problem that conservatives don't want to admit that they were, you

2:18:01

know, I had

2:18:02

a conversation with a very prominent politician who explained to me that he had

2:18:06

a conversation

2:18:07

with a guy who was a CEO of a corporation that didn't want to stop the flow of

2:18:10

illegal immigration

2:18:11

because he wanted cheap labor.

2:18:13

And he was flabbergasted.

2:18:14

He was like, I can't fucking believe this guy's saying this out loud.

2:18:18

It's worse with Johnson because in their manifesto, they pledged not to do it.

2:18:21

So they had a promise.

2:18:23

They call it the Boris wave.

2:18:24

So that's how bad it was.

2:18:27

And then you have Starmer and the Labour Party who were just as bad, if not

2:18:30

worse.

2:18:31

And, you know, we have a situation where it's unmanageable now, and Reform, this

2:18:36

third party,

2:18:36

Nigel Farage's party, is saying, no, we're actually going to tackle this.

2:18:39

And of course, ultimately, what happens is the public, they reach a tipping

2:18:44

point and they

2:18:44

say, by the way, Starmer is the least popular prime minister on any opinion

2:18:49

poll ever in the

2:18:50

history of records.

2:18:51

He's gone from a massive majority to nothing because he's been so useless on

2:18:56

all of this stuff

2:18:57

because he's been so captured by the ideology, because he doesn't care about

2:19:00

migration, because

2:19:01

he said that anyone who was concerned about the grooming gang scandal was

2:19:05

jumping on a bandwagon

2:19:06

of the far right.

2:19:07

That's what he said.

2:19:08

Yeah.

2:19:09

So all of this has happened.

2:19:11

But you can't blame the left.

2:19:12

It's the left and the right.

2:19:13

It's both of them.

2:19:15

It's why they call it the uniparty.

2:19:16

It's the same thing.

2:19:17

So you need something else to come along and explode it.

2:19:21

What do you think the possibility of Farage winning?

2:19:23

Pretty high.

2:19:24

Right.

2:19:25

So if it were today, he'd win.

2:19:26

If he didn't get whacked between now and then.

2:19:28

Do you guys whack people over there very often?

2:19:31

Less than here.

2:19:31

I think that's more.

2:19:32

It's more an American thing.

2:19:33

It's a lot easier.

2:19:35

A lot more guns over here.

2:19:37

There's a lot more guns.

2:19:37

But fingers crossed, obviously, that won't happen.

2:19:43

But it looks like if it was today, he'd win.

2:19:45

There's obviously a couple of years.

2:19:47

I mean, he could mess things up.

2:19:48

Something crazy could happen.

2:19:49

Get caught with a live boy or a dead girl.

2:19:51

Something like that.

2:19:52

But I think with Starmer, people are just sick of it.

2:19:55

He has continually backtracked on all his promises.

2:19:58

He's not interested.

2:19:59

He dismisses people's concerns about immigration.

2:20:01

He dismisses people concerned about the mass rape of children in the grooming

2:20:05

gang scandal.

2:20:06

They had to be dragged kicking and screaming to do an inquiry about that.

2:20:10

They didn't want to do it.

2:20:11

And because they're so terrified of being called racist, ultimately, they

2:20:16

let this thing slide.

2:20:17

So I think people are just sick of it.

2:20:19

I think people have reached the point where even I think people who don't like

2:20:22

Nigel Farage

2:20:23

will hold their nose and vote for a third party to explode the system.

2:20:27

And maybe we might be able to reset after that.

2:20:29

Maybe something could happen.

2:20:30

One of the things that's interesting in America is a lot of young people are

2:20:33

becoming conservative.

2:20:34

That is interesting, yeah.

2:20:36

It's interesting because I think that's a force of the internet.

2:20:39

And being a conservative more today is more like being a rebel.

2:20:43

It's like bucking this system.

2:20:46

Whereas it used to be that if you were a rebel, you were left wing.

2:20:50

You were like, you're a hippie.

2:20:52

Yeah.

2:20:52

You know, and that's not really the case anymore because the system that has

2:20:56

power is a system

2:20:57

that is pushing this one very particular ideology that also demonizes young

2:21:03

males.

2:21:04

Hugely.

2:21:05

Yeah.

2:21:05

But that's also why I don't think it's about left and right anymore.

2:21:07

I think one of the things about the culture war is it kind of killed off left

2:21:11

and right.

2:21:12

Like I say, in the UK, we couldn't vote this out.

2:21:14

We had a right wing party.

2:21:16

It didn't make a difference.

2:21:17

The left wing party makes it worse.

2:21:18

We had a prime minister, you know, Keir Starmer, on radio saying that 99.9% of

2:21:24

women

2:21:25

don't have a penis, which means that there are, what is it, 35,000 female

2:21:30

penises out there.

2:21:31

It's quite a lot.

2:21:32

If you can picture that image, you know, so that's our prime minister saying

2:21:36

this crazy stuff.

2:21:36

Our deputy prime minister said on TV that you could grow a cervix if you wanted.

2:21:42

So that's David Lammy.

2:21:45

That sounds like I'm making that up.

2:21:47

He said that.

2:21:47

You can check that.

2:21:48

He said you could grow a cervix.

2:21:49

So these are the kind of people who are in charge now, for whom it's just all about

2:21:53

their fake, you know, fake ideology.

2:21:56

That's all it is.

2:21:57

Which is why internet censorship is so much more prominent there.

2:21:59

That's why it's going to happen.

2:22:00

No, that's why they're going to absolutely try to do that.

2:22:02

Yeah.

2:22:02

Exactly.

2:22:02

Well, they are doing it.

2:22:04

Just self-censorship by arresting people.

2:22:06

There's a lot of censorship involved in scaring people.

2:22:08

Yeah.

2:22:09

Just in the fear of being arrested.

2:22:11

But the problem for Reform will be, do they have the guts to do what Trump did?

2:22:14

Do they have the guts to come in and say, look, we need to scrap the civil

2:22:17

service.

2:22:18

Well, you can't scrap the civil service, but you need to sort of bleed it dry.

2:22:21

You need to give it a good rinse, right?

2:22:24

You need to get rid of the...

2:22:25

Because there have been whistleblowers in the UK civil service who have said,

2:22:28

we're not going to do what the elected politicians say.

2:22:31

If they come in and say there's an immigration problem, we're just going to stymie

2:22:34

that.

2:22:35

We're not going to do what they want.

2:22:36

We've got police who are routinely investigating people for their opinions.

2:22:41

Just to put that into context, by the way, if we're talking about this deep

2:22:45

state that we've got to clean out,

2:22:47

our police force is trained by a body called the College of Policing.

2:22:51

They have been telling police for years, it's your job to arrest people for

2:22:55

what they think and what they say.

2:22:57

And the high court told them, you've got to stop this.

2:23:01

You've got to stop recording non-crime hate incidents.

2:23:04

Two home secretaries said to them, you've got to stop recording non-crime hate

2:23:08

incidents.

2:23:09

They ignored the courts.

2:23:10

They ignored the government.

2:23:12

And that's the power of an ideologically captured quango.

2:23:16

That's crazy.

2:23:17

That's the problem.

2:23:19

So even when you vote for a party that's going to strip this stuff out, you

2:23:23

still have to do the actual hard work of stripping out.

2:23:26

I would abolish the College of Policing.

2:23:28

Do people know about non-crime hate incidents?

2:23:30

Do they know that this is a thing in America?

2:23:32

Do they know that that's what we do in the UK?

2:23:34

I mean, people are just aware that there's a lot of arrests because of social

2:23:37

media posts.

2:23:37

We don't pay nearly as much attention to the UK as the UK pays attention to

2:23:42

American politics.

2:23:43

And that's fair enough because we're a small island.

2:23:46

That's fair enough.

2:23:46

But what I would say is it's worse than people think insofar as the 12,000

2:23:51

arrested a year, that's horrific.

2:23:53

But with the police routinely checking up on you if you commit a non-crime, that's

2:23:57

sort of even worse, isn't it?

2:23:59

The Scottish police have a database of jokes that they've seen online that they

2:24:03

think are problematic, and they've kept it.

2:24:05

The Scottish police introduced a hate crime bill two years ago now, which can

2:24:09

prosecute you for things you say in your own house.

2:24:11

There's a section in that bill on the public performance of a play.

2:24:15

So if a play is offensive, they can arrest you.

2:24:17

If you're the director or an actor involved in the play and it's considered

2:24:20

offensive, they can arrest you.

2:24:21

When they implemented that hate crime bill, they set up hate crime reporting

2:24:26

centres.

2:24:28

So if you felt offended, you could, and they converted like, there was a sex

2:24:31

shop, I think.

2:24:31

There was a mushroom farm.

2:24:32

You could go and report hate to the police as and when it occurs.

2:24:37

And this is coming from the police force, the people who are supposed to

2:24:41

sustain authority and prevent criminality.

2:24:44

And you've seen the viral videos of people, police coming, knocking on people's

2:24:47

doors saying, you said this thing online.

2:24:50

That's insane.

2:24:51

So I think it's worse than just the arrest.

2:24:54

I think it's a rotten system that is being trained by activists in the College

2:24:59

of Policing that no government will deal with.

2:25:02

They don't get rid of these activists.

2:25:04

They let the activists.

2:25:05

And the activists, when they're told to stop it, they carry on anyway.

2:25:09

And the entire culture has to shift.

2:25:12

That's what I mean.

2:25:12

That's what I mean.

2:25:13

You need a politician to go in and say, scrap the College of Policing, strip

2:25:17

out all the activists within the NHS, within the army, within the police,

2:25:20

within the Crown Prosecution Service.

2:25:22

It also has to get so bad that people realize how bad it is and they need

2:25:26

radical change.

2:25:27

But I think the grooming gangs did that.

2:25:29

Yeah.

2:25:29

I think the fact that we effectively sacrificed thousands of kids on the altar

2:25:33

of ideology, the fact that we said, you know, there were politicians, councillors,

2:25:40

doctors, social workers saying, we don't want to be called racist.

2:25:44

So we're going to ignore the sexual assault of children on a mass scale.

2:25:48

And that was not really thoroughly covered here in America.

2:25:51

Really?

2:25:51

In mainstream news.

2:25:52

I think because Elon.

2:25:53

No, online it was, but not in mainstream news.

2:25:56

So do people not generally know about that?

2:25:58

They know about it now.

2:25:59

Right.

2:25:59

Okay.

2:26:00

But it wasn't something that you would see every night on CNN.

2:26:03

Really?

2:26:04

That's a huge story.

2:26:05

But the power of being called racist became so intense.

2:26:09

I mean, even, you know, that horrible bombing at the Manchester Arena at the

2:26:13

Ariana Grande concert.

2:26:15

In the subsequent report of what went wrong, one of the security guards said he

2:26:20

saw the perpetrator with the rucksack and he didn't approach him or apprehend

2:26:24

him because he was afraid of being called racist.

2:26:26

That was the reason.

2:26:27

And as a result of that, two dozen children lost their lives.

2:26:30

The power of smearing someone as racist is so potent, which is why I think here

2:26:35

in America, the word fascist, the word Nazi gets thrown around so much because

2:26:40

they know if someone is so branded, you absolve yourself from having to

2:26:45

engage with their ideas.

2:26:47

They become this kind of monster that you don't have to even think about or

2:26:50

worry about.

2:26:50

And I think we're just getting over that in the UK now, where the

2:26:54

accusation of racism no longer really sticks.

2:26:58

I think people think it doesn't mean anything anymore.

2:27:00

And, you know, they've tried with Reform.

2:27:03

They've tried saying that Reform is a racist party.

2:27:05

It's a far right party.

2:27:07

No one's buying it anymore.

2:27:09

And I think that's why hopefully something can change.

2:27:12

I think the grooming gangs, I think the mass immigration to the extent where

2:27:16

people now are at risk.

2:27:17

They just are.

2:27:18

Unvetted people, many with criminal records.

2:27:21

We don't want to go the way of Sweden.

2:27:23

I mean, you know how bad Sweden's got.

2:27:25

You know, Sweden used to be the most high trust society in Europe, low crime.

2:27:31

They allowed mass immigration on a scale they couldn't possibly contain.

2:27:36

I think 20 percent of the Swedish population are now foreign born.

2:27:40

And predominantly they live in ghettos where crime is rife.

2:27:43

They didn't integrate.

2:27:45

There was no expectation they should integrate.

2:27:46

And as a result of that, it's gone from being one of the safest countries in

2:27:50

Europe to being the country that has the most gun and bomb attacks.

2:27:55

Of any country not at war except for Mexico.

2:27:57

And that's happened in the space of 10 years.

2:27:59

Crazy.

2:28:00

It's an absolute trap.

2:28:01

I remember when it was going on and a Swedish stand-up friend of mine, Tobias

2:28:05

Persson, texted me saying there's gunfire, there's grenades going off in Stockholm.

2:28:09

There's gunfire on my street.

2:28:11

And the politicians are doing nothing about it.

2:28:14

They're saying this doesn't matter.

2:28:16

I was in Sweden a couple of years ago.

2:28:17

I was talking to a bunch of – you know what Swedes are like.

2:28:20

They're very middle class, very – well, not all of them, but very liberal.

2:28:23

Not a racist shred in their body.

2:28:27

And they all came back to the same story.

2:28:29

They all wanted to discuss immigration.

2:28:30

And they all come back to the same thing.

2:28:31

One woman said to me, I got this wrong.

2:28:33

We got this wrong.

2:28:34

Why do you think they did it?

2:28:36

Good intentions first and foremost.

2:28:38

Really?

2:28:39

Okay.

2:28:39

Well, there's a –

2:28:40

Really?

2:28:40

You think it's just good intentions to let all those people in?

2:28:43

Have you met Swedes?

2:28:44

I have, but I mean, come on.

2:28:46

It's happening in America.

2:28:47

It's happening in England.

2:28:48

It's happening in the UK.

2:28:50

Yes.

2:28:50

It's happening in Ireland.

2:28:51

It's happening – it's just good intentions everywhere.

2:28:54

Could it also be – could it also be this delusion, this idea, what you would

2:28:58

call, I suppose, liberal universalism, this idea that everyone is basically the

2:29:02

same?

2:29:02

Everyone in every culture basically wants the same things.

2:29:06

It explains the queers for Palestine phenomenon.

2:29:09

You know, it doesn't matter where you go.

2:29:10

No, no, no, no.

2:29:11

The queers for Palestine phenomenon is explained by the internet.

2:29:14

There are people being stupid and being in a bubble where they never experience

2:29:17

those folks.

2:29:17

I don't think – I think this is organized.

2:29:20

I think it's organized.

2:29:21

I think the more chaos there is, the more they can crack down on your rights.

2:29:25

I know you think it's organized.

2:29:27

I'm not convinced of that yet.

2:29:28

I'm open to it.

2:29:30

Okay?

2:29:31

I mean, at this point in time, it's fairly universal in Western societies now to

2:29:36

try to ruin them.

2:29:37

Yes.

2:29:37

In America as well.

2:29:38

Yeah.

2:29:39

For the last four years before Trump got into office, that's what they were

2:29:42

doing here.

2:29:42

It seems like a strategy.

2:29:44

It doesn't seem as simple as just good intentions.

2:29:47

I know.

2:29:47

Well, and that does seem too simplistic.

2:29:49

I absolutely agree with that.

2:29:50

You create more chaos.

2:29:51

The more chaos you have, the more laws you need.

2:29:53

The more laws you need, the more control you have.

2:29:56

But speaking to these people in Sweden, I mean, I was there – it was an event

2:29:59

where we were talking about a book I'd written.

2:30:00

So it was all about these issues.

2:30:02

And I was mingling and talking to them.

2:30:04

And they all wanted to talk about it.

2:30:05

But they're the citizens.

2:30:07

That's what I mean, though.

2:30:08

They're not the people that implemented those laws in the first place.

2:30:10

That's where I'm cynical.

2:30:11

I think the people that implement those laws in the first place, they know what

2:30:14

they're doing.

2:30:14

Yes.

2:30:15

And, well, certainly they're aware of the risks.

2:30:18

I mean, if you take what happened in Cologne, that New Year's Eve party, where

2:30:21

I think over 800 women were sexually assaulted.

2:30:23

And the media didn't report it.

2:30:26

And the government wanted to sort of minimize it and say that this wasn't

2:30:29

real.

2:30:30

It's not even just the risks.

2:30:31

It's the physical, actual, measurable consequences.

2:30:35

Yes, exactly.

2:30:35

And they're not course correcting.

2:30:37

That, to me, leads me to think that they know what they're doing.

2:30:40

You don't think it could just be complete naivety, this idea that –

2:30:44

I think it's the best way to combat the internet.

2:30:46

The best way to combat the internet is to create a massive amount of chaos and

2:30:49

then crack down on people's lives.

2:30:51

I suppose what worries me about it is, though, the assumption that it's all

2:30:55

sort of coordinated

2:30:56

will take you down that route where you start thinking, as some friends of mine

2:31:00

now think, the world is controlled by a group of Satanists who sit in a room

2:31:05

and they choose the leaders and they – do you know what I mean?

2:31:08

Well, I don't think it's Satanists, but I think it's incredibly wealthy people.

2:31:12

But why would it be in their interest to destroy the economy that so sustains

2:31:15

them?

2:31:16

Well, it depends on where they are and who they are.

2:31:18

But George Soros clearly does that and he's talked about it.

2:31:21

He's talked about enjoying destroying democracies and enjoying destroying

2:31:26

countries.

2:31:27

He's kicked – he's not allowed to go into certain countries.

2:31:30

He makes money doing it.

2:31:31

But he relies on those democratic societies to make –

2:31:34

Yeah, but they're still functional.

2:31:35

He just profits off of it largely.

2:31:37

That's what I struggle with, though.

2:31:38

Like, you know, someone who believes in fundamentally the capitalist dream can't

2:31:44

yearn for –

2:31:45

You can – it's subject to manipulation.

2:31:48

Yeah.

2:31:49

And intelligent, evil people or at least amoral people.

2:31:53

But this doesn't answer why people do vote for it and they do.

2:31:56

I mean –

2:31:57

They do vote for it because they've done a really good job of attaching it and

2:32:01

there's also this ideology thing.

2:32:03

There's left and right.

2:32:03

Yeah.

2:32:04

And if you're left, you're blue no matter who, blue to the grave.

2:32:07

That's it.

2:32:08

Yeah.

2:32:08

And anybody that votes red is a dirty, racist fascist, and they think about

2:32:12

it that way and we really have no option for a centrist party in this country,

2:32:16

which is where most people lie.

2:32:18

Most people lie in the middle.

2:32:20

Most people are very socially liberal and most of the people that I know that

2:32:23

even identify as conservative, they're very socially liberal.

2:32:27

But they're financially much more aligned with conservative ideology.

2:32:32

Sure.

2:32:33

Well, I think – I mean I think ultimately, hopefully, the brick wall of

2:32:37

reality is what cures this.

2:32:38

Like it's when –

2:32:39

If we don't destroy society along the way, if we don't allow them to destroy

2:32:43

society, if we don't completely erode all of our rights along the way.

2:32:48

And as you said earlier, you can get very close to that happening.

2:32:51

And rights lost are never regained.

2:32:53

Never.

2:32:54

Look at Australia.

2:32:55

They had one mass shooting.

2:32:57

They took their guns in the 1990s.

2:32:58

Then COVID came.

2:32:59

They're like, get in a fucking camp.

2:33:00

Yeah.

2:33:01

And they've just introduced a new hate speech law off the back of the Bondi

2:33:04

Beach shooting.

2:33:05

And of course, this, again, is really draconian.

2:33:08

It goes way too far.

2:33:10

In fact, I think the Australian hate speech law is basically saying if someone

2:33:13

does something that wasn't intended to stir up hatred but it could conceivably

2:33:18

have stirred up hatred among a theoretical group of people, then it's a crime

2:33:22

and you can get five years in prison.

2:33:23

Sure.

2:33:23

And imagine blaming that on hate speech instead of blaming it on just letting

2:33:29

wild, violent criminals immigrate into your country.

2:33:33

Right.

2:33:33

I mean, that's something.

2:33:34

Yeah.

2:33:35

What an amazing gaslighting.

2:33:38

Like not saying, hey, maybe we should stop letting violent criminals enter into

2:33:42

our country illegally and live here.

2:33:45

No, no, no.

2:33:46

What we should start doing is taking people that have done no crime whatsoever

2:33:51

and criminalizing their dissent, creating a crime based on their dissent.

2:33:55

I totally agree.

2:33:56

We had it in the UK.

2:33:57

We had a politician, horrible story, a guy called David Amess.

2:34:01

You know, he was stabbed to death by an Islamist at his surgery.

2:34:05

You know, for politicians, we call them surgeries, where you meet face to face with your

2:34:08

constituents.

2:34:09

They come and you talk about the local issues.

2:34:11

I don't think they do that in America.

2:34:12

He stabbed him to death.

2:34:15

And then there was this parliamentary debate about how can we crack down on

2:34:18

free speech online?

2:34:19

Right.

2:34:20

No, the problem was the knife wielding maniac.

2:34:23

The problem was unchecked Islamism.

2:34:26

I mean, it really is what Bezmenov was saying.

2:34:29

Yeah.

2:34:30

It's that thing of not addressing it, of not

2:34:33

seeing the truth because you've been captured.

2:34:35

But you've been demoralized.

2:34:37

But I think what's better now is that people can see through that.

2:34:39

So like when Keir Starmer, after that horrible thing I mentioned earlier, the

2:34:44

girls who were killed in the dance class by the guy who was a child of

2:34:47

immigrants.

2:34:50

His response to that was, OK, let's not, you know, deal with the

2:34:54

fact that we've got radicalized individuals within our community, young people.

2:34:58

He said, let's ban buying knives off Amazon because the guy got the knife from

2:35:03

Amazon.

2:35:04

Right.

2:35:04

You can also get them in shops here.

2:35:06

You can walk in and get one in a shop.

2:35:08

Most people have a kitchen knife at home.

2:35:09

It's like one of the most common weapons.

2:35:12

And he banned ninja swords around the same time, which was a big blow to the

2:35:15

ninja community.

2:35:16

But it's kind of so crazy.

2:35:19

Like, that's the thing you go for.

2:35:20

You choose the thing that isn't.

2:35:23

But this is the idea of allowing this kind of chaos and having this be a

2:35:27

coordinated plan.

2:35:29

Right.

2:35:29

Yeah.

2:35:30

The more chaos you have, the more you gaslight people, the more people are

2:35:33

attached to an ideology, the more you can keep restricting their rights further

2:35:37

and further and further until they're more and more frustrated and to a lot of

2:35:41

them just give up.

2:35:42

But we are in a position now where people are seeing through it all the time in

2:35:46

the UK, like no matter how much they smear Reform as far right.

2:35:49

The polls just keep going up and up and up.

2:35:51

Right.

2:35:51

But it's because of the Internet, because you have at least some dissenting

2:35:55

voices.

2:35:56

We have that.

2:35:57

And also the palpable absurdity of what the politicians are trying to tell

2:36:00

you is real.

2:36:01

Right.

2:36:01

It has as big a reach.

2:36:03

That's why they're trying to crack down on pub talk.

2:36:05

Oh, and by the way, you know, the Labour Party has canceled a number of local

2:36:07

elections because they know they're going to lose them.

2:36:09

They've actually canceled it.

2:36:11

They've canceled them.

2:36:12

Well, they've said they've postponed them while they're reforming the system.

2:36:14

Right.

2:36:15

What they really.

2:36:16

Oh, God.

2:36:17

But it's stuff like that where.

2:36:18

Get rid of the juries, cancel elections.

2:36:22

And they're the good guys.

2:36:23

And at that point, it doesn't matter how much your propaganda or how much you

2:36:26

think your propaganda is going to work.

2:36:28

The public are going to see through that.

2:36:30

And they say, hang on a minute.

2:36:31

You're saying that I can't vote.

2:36:32

You're saying if I end up in court, I may not have a jury.

2:36:35

You're saying I can't browse through Twitter.

2:36:36

You're saying I can't say the wrong thing online.

2:36:38

Enough is enough.

2:36:39

And I think they reach a point where they say.

2:36:42

And some of the stories are so egregious, like, for instance, the guy.

2:36:45

Have you heard of a guy called Hamit Coskun?

2:36:47

I think he's an Armenian guy who burned a copy of his Koran outside the

2:36:51

Turkish embassy.

2:36:52

Right.

2:36:53

The idea of this was a protest against the Turkish government because he perceives

2:36:57

Erdogan's government as, I suppose, supporting Islamism and the rise of Islamism.

2:37:01

So he protests outside the thing, burns the Koran.

2:37:04

Two people attack him, one with a knife.

2:37:06

The other, some Deliveroo driver, starts kicking him.

2:37:08

He gets prosecuted in a court of law for inciting the violence.

2:37:13

And the judge actually says, the fact that you were attacked is proof that you

2:37:16

were inciting violence.

2:37:18

Right.

2:37:19

It took the free speech union in the UK to have that overturned, to fight on

2:37:22

his behalf, to say that's a peaceful protest.

2:37:25

It was his copy of his book.

2:37:26

We don't have blasphemy laws in the UK.

2:37:29

But now the CPS, the Crown Prosecution Service, is trying to overturn that

2:37:32

because they want to see this guy go down.

2:37:34

And that is what we're talking about.

2:37:36

We've got bodies like the Crown Prosecution Service saying, no, we want an

2:37:39

Islamic blasphemy code in the UK.

2:37:40

The Labour Party wants an official definition of Islamophobia.

2:37:44

So you can't criticise, you can't peacefully protest, you can't burn a book

2:37:48

that you bought, you know, and all of that.

2:37:50

And we're seeing this happen in front of us.

2:37:52

And people are just saying, look, we believe in plurality.

2:37:55

We believe in freedom of religion.

2:37:56

You should be able to, you know, we've got nothing against Muslim people.

2:37:59

What we are objecting to is the idea that we shouldn't be able to ridicule your

2:38:02

religion or mock your religion or protest against your religion.

2:38:05

And you're going to pathologise it by saying we've got a sickness, we're Islamophobic.

2:38:09

I think people, I think that case, the fact that you can't burn, I mean, some

2:38:14

kid in a school in Wakefield accidentally scuffed a copy of his Koran and he

2:38:19

got hit with a non-crime hate incident.

2:38:21

And there was a big issue and the police got involved.

2:38:24

You know, we have to hold fast to this idea that no idea, no idea is exempt

2:38:30

from criticism.

2:38:31

And so I just think the more stories like that happen, maybe I'm naive, but I

2:38:37

think the British public's patience is kind of at the very end.

2:38:42

I hope so.

2:38:42

I hope it's not too late.

2:38:44

I really do.

2:38:44

But in the meantime, your book, The End of Woke, it's available.

2:38:49

Did you do the audio version of it?

2:38:50

I did.

2:38:51

It took me ages.

2:38:51

Yeah.

2:38:52

I'm glad you did it, though.

2:38:54

Yeah, I'm sure it was, but it's always so much better when it's in someone's

2:38:57

voice, especially someone like you.

2:38:58

Thank you, Andrew.

2:39:00

Really appreciate it.

2:39:01

And I hope you guys figure it out over there.

2:39:03

But in the meantime, I'm glad you're here.

2:39:04

Well, I got away.

2:39:05

I'm glad.

2:39:06

I'm glad, but I mean, it shouldn't be that everybody has to escape.

2:39:09

That's crazy.

2:39:10

No, I know.

2:39:10

You know, it's nuts.

2:39:11

And then what's going to be left?

2:39:13

Like, only people that are submitting and then the chaos of what you've allowed

2:39:17

in?

2:39:17

Right.

2:39:18

Fucking nuts.

2:39:18

Exactly.

2:39:19

So you've got to make sure that America doesn't go pop, because I need this

2:39:21

place to work.

2:39:22

Yeah, I need it to work, too.

2:39:23

Well, it's part of my business model.

2:39:25

All right.

2:39:26

Thank you.

2:39:26

Bye, everybody.

2:39:27

Bye, everybody.