Joe Rogan Experience #2494 - Chamath Palihapitiya


Chamath Palihapitiya is a venture capitalist, engineer, founder of Social Capital, and a co-host of the podcast “All-In.”

https://www.youtube.com/@allin
https://www.youtube.com/@chamath
https://chamath.substack.com
https://www.socialcapital.com


Timestamps

0:00 UAP disclosure skepticism to 'attention' as the driver of tech, AI, and social unrest
9:55 Wage vs. capital taxation, inequality, and shifting the tax burden to corporations
19:53 Tech power, narrative control, and AI-driven disruption (education, parenting, jobs)



Transcript

0:00

I was listening to Tim. First of all, hello. What's up? Good to see you, my friend.

Great to see you.

We were listening to Tim Dillon. I was listening to it on the way over here, and he was talking about Anna Paulina Luna and Tim Burchett and Trump. They're all talking about the UAP disclosures, and why now? What are they doing? Why are they distracting us with this? Tim Burchett said that whatever they're going to release, it will be indigestible. What does that mean?

Right.

Indigestible as in, well, then it doesn't mean that it's real then.

Well, I think it means that it'll be so crazy if it's real. So crazy. He's the one that's been saying that there are these confirmed bases under the ocean, that there are these specific locations. I think you talked to... you're shaking your head. You don't believe a word of it.

No.

How come?

1:02

I think it's true that there... look, it's completely implausible that there aren't other species.

Right.

Just the vastness of what we're dealing with. So the real question is, why haven't we encountered people, or those things, those beings? And it's probably because they have bigger fish to fry. So by the time that we meet them and they meet us, we're going to kind of be at the edge of... we've kind of been there, done that on our own planet, and then we've kind of developed the technology, I guess, to get beyond it. But somewhere along the way, there must have been a few encounters; it's just mathematically impossible otherwise. So then the question is, is it buried? Or were people confused when it first came? If you had a spaceship land in, like, the 1800s, what would people have done? They would have just freaked out. They wouldn't have understood it. Maybe they would have buried it. Depending on where it was, maybe they'd have started to pray to it.

Right.

And you would have just moved on. And then that isn't documented in history. So...

2:05

But it is.

But how?

It is. There's a lot of it documented in history.

Oh, you mean like hieroglyphics and monuments and...

Well, the Book of Ezekiel. The Book of Ezekiel goes in depth about some sort of a UFO encounter that Ezekiel experiences.

Right.

Where it's a wheel within a wheel, and a cloud with fire flashing forth continually, and in the midst of it, as it were, gleaming metal. And from the midst of it came the likeness of four living creatures, and the creatures darted to and fro like the appearance of a flash of lightning. This is all in the Bible. It's also in the Mahabharata. They talk about Vimanas, these flying craft. And I think it's entirely possible that we have been visited periodically, and that we have been monitored, and that we are monitored currently. And if I was going to hide, I would hide in the ocean.

2:59

Well, to be honest, as I get older, I'm convinced we're basically in some form of a simulation. There are all these little ingredients, and when you start to see these little clues, they all seem so odd in isolation. And then when you put them together, I feel like a crazy person. So I ignore myself.

Right.

3:18

But I wonder, like, why did this happen? Yesterday I was at a dinner in LA before I came to see you, and I told this very interesting story. Well, I thought it was interesting at the time. So think about 2000, right? If you think of what happened in tech since 2000, the last 26 years, people can give you all kinds of fancy theories. But there's this weird word that's been at the center of every single technological revolution for the last 30 years, and that word is attention. Let me explain this to you. Google, they invent Google. What is Google? Google is an algorithm. It's called PageRank. But if you look inside of it, what is it? It says, well, Chamath's website has five links to it, Joe's website has two links. He's getting more attention. Okay, Chamath's website is more important. That's the sum total of Google. Now they've made that a lot more refined, and they've done all these other fancy things, but it's all about attention.
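As a rough sketch of the link-counting idea he's describing, here is a minimal PageRank power iteration. The damping factor, iteration count, and the toy link graph are standard textbook illustration choices, not details from the conversation.

```python
# Minimal PageRank power iteration: a page's score is built from the
# scores of the pages linking to it, each spread over its outbound links.
def pagerank(links, damping=0.85, iters=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {}
        for p in pages:
            # "Attention" flowing in: every page q that links to p
            # passes along a share of its own rank.
            inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) / len(pages) + damping * inbound
        rank = new
    return rank

# "Chamath's website has five links to it, Joe's website has two":
# more inbound links means more attention, which means a higher rank.
graph = {
    "chamath.example": {"joe.example"},
    "joe.example": {"chamath.example"},
    "a.example": {"chamath.example"},
    "b.example": {"chamath.example"},
    "c.example": {"chamath.example", "joe.example"},
    "d.example": {"chamath.example"},
}
print(pagerank(graph))  # chamath.example ends up with the highest score
```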

4:28

Fast forward to 2007, '08, '09, when Zuck, and then when I went to work for Zuck and we got on the scene, we're like, what does everybody care about? Attention. And so what is the Facebook algorithm? What's the Instagram algorithm? How do we construct News Feed? All around attention. Joe had 35 likes, Jamie had 12 likes: your thing is more important, let's give it more importance, because it's seemingly meeting all these human needs. Attention, attention, attention.
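A deliberately tiny sketch of that ranking idea: order the feed by an engagement score. Real News Feed ranking blended many more signals than raw likes; the numbers here are just the ones from the example above.

```python
# Toy engagement ranking in the spirit of "Joe had 35 likes, Jamie had
# 12 likes, your thing is more important": order the feed by attention.
posts = [
    {"author": "Jamie", "likes": 12},
    {"author": "Joe", "likes": 35},
]
feed = sorted(posts, key=lambda post: post["likes"], reverse=True)
for post in feed:
    print(post["author"], post["likes"])  # Joe's post is shown first
```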

4:56

So phase one, attention. Phase two, attention. And this is where I'm like, how can this be possible? In phase three, we're looking at AI. And when you look backwards four years, the seminal paper is called "Attention Is All You Need." It's about this word again. And when you look inside of the core part, if you peel apart AI, the little brain that makes it so capable is called an attention mechanism. It's just attention. It's all about, again, this idea of: I'm going to scour all this information, I'm going to figure out what patterns repeat themselves, and I'm just going to double down on the stuff that I see more of, because that attention must mean it's more important. It's more true. It's more knowledgeable.
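For the curious, the attention mechanism he's referring to reduces to a few lines of linear algebra. This is a single scaled dot-product attention head in NumPy; the toy shapes are arbitrary illustration choices.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # how much each token attends to every other token
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: rows sum to 1
    return weights @ V               # attention-weighted mix of the values

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 tokens, 8-dimensional queries
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(attention(Q, K, V).shape)  # (4, 8): one blended vector per token
```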

5:40

And then I think, how could it be? Why is it that these things are just repeating over and over again? And I just get confused. I don't know. I don't exactly know how to explain it. So are there other ways in which we should be doing things? Absolutely. Have we even explored them? No. So then I think, well, is this just a simulation? Some kid fucking in his house just playing some simulation, and we're all just party to it, and all he understands is attention. I don't know.

6:04

I don't know. I don't think it's that simple, that there's a person playing a game. But if you break down just attention, well, that's all of human history: paying attention to the king, paying attention to the war, paying attention to resources, paying attention to who says the thing that resonates the most with the people. It's all about what human beings are paying attention to.

6:30

I think it's part of it. Then there's also what is actually true. And I think sometimes what is true and what people pay attention to are not the same thing.

True. Yeah.

And sometimes the thing that you should be paying attention to gets lost, because the thing that you are paying attention to gets more attention, because it's more interesting and useful. That's sort of where we are right now. We're in this really weird phase where you actually should be focused on this thing over here, and instead we're all focused on all these things over here.

Give me an example.

7:05

Here's a very big one. I think it's pretty fair to say that since the last time you and I saw each other on the show, the attitude towards technology has been pretty profoundly negative. It's kind of tilted. It's relatively anti-AI, anti-billionaires, anti all of this stuff. And it manifests in all of these interesting ways. There are protests, there are data centers, there's all of this stuff that's happening. People are worried about job loss. All of that stuff is real.

Do you want a cigar?

No, I'm okay. I'm okay. But what should they really be focused upon?

7:51

And I think what they should be really focused upon is that we're at the tail end of a cycle that doesn't work anymore, which is all about this tension between labor, the people that do the work, and capital, the people that fund it and then make all the returns. And over the last 40 years, we've basically gone to this completely upside-down world where capital extracts all of the upside, and labor has extracted less and less and less. And all of this pushback, it manifests in AI, it manifests in politics, it manifests in social issues, it manifests in, you know, Israel, whatever you want to talk about. All of these issues, I think symptomologically, come from this other issue, which is that we are out of balance. This total compact that we used to have, a liberal democracy and a free market, has totally collapsed. And there are simple ways to fix that. But that never gets the attention, because it's not what you want to talk about. The attention is here: vote no on the data center, this model is going to take out all the jobs, this social issue is really important, that war should not be fought, that war should be fought. All of these things, while important, distract us from what the core issue is. And the core issue is that we as a society, I think, are out of balance; the natural compact between all of us is broken. And there are some simple ways to fix that compact: get people more invested, get people more engaged in the upside, have people have a positive-sum view of what's happening. And that isn't happening.

What simple solutions are there to this one very particular issue?

9:31

Okay, I'll get your reaction to this. Let's assume that you still lived in California, because I think it tells this example in a more extreme way. Let's say you make a million bucks a year, which is a lot of money, but it makes the point more cleanly. You'd pay, I think, 30% federal tax, and you'd pay another 15 or 16% in state tax and Medicare tax and all this tax. So if you're a wage earner, 50% of all your upside goes to the government. If you're a capital earner, and you make that same million dollars via capital gains, you pay half that tax. Why did that happen?
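Using the round numbers just quoted, the gap is easy to check. A quick back-of-the-envelope sketch that treats the quoted rates as flat; real brackets and deductions would complicate this.

```python
# $1M of California wages at ~30% federal + ~15% state/Medicare, per the
# rates quoted above, versus $1M of capital gains at half that total rate.
income = 1_000_000
wage_rate = 0.30 + 0.15       # "30% federal ... another 15 or 16%"
capital_rate = wage_rate / 2  # "you pay half that tax"

wage_take_home = income * (1 - wage_rate)        # $550,000 kept
capital_take_home = income * (1 - capital_rate)  # $775,000 kept
print(f"wage earner keeps:    ${wage_take_home:,.0f}")
print(f"capital earner keeps: ${capital_take_home:,.0f}")
```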

10:22

That happened because in the '40s and '50s, but really in the '60s, '70s, and '80s, what the American government and Western societies were trying to do was to convince people to invest their money. Hey, Joe, go build that factory, go hire those people, and we're going to incentivize you to do so. And by doing that, there was this idea that all of those profits you would get would then diffuse, right? Trickle down into everybody else: the workers participated, everybody participated.

10:56

But technology allows you to do more with less and less. So now what happens is the capital owners can accrue almost infinite value, it seems, and the workers get less and less. But now, if you get less and less, and you're taxed more and more as a percentage of what you earn, you're going to feel really out of sorts. You're going to be like, why am I paying 50 cents of every dollar? And I see these other ways where folks are paying 25 cents on their dollars, but their dollars are compounding way faster, and they have hundreds of billions more of those dollars than I have of my dollars. If you take that example and you expand it across society, I think people understand that now. There's enough information, and there are enough people talking about it, that it's pretty clear that that's happened. So the question is, how do you fix it?

11:43

I think, if you think about AI, and if you believe that we're going to get into this world of abundance where we're not working, what does it mean for governments to tax our labor? There is no labor. You're not working anymore. I'm not working. We're doing things out of leisure. Why should I pay 50 cents of every dollar? Why aren't the companies that are going to be making trillions of dollars, why don't they pay more? Why isn't there an expectation that they then help our lived society do better and thrive as a result of all of that winning? That's the real conversation that I think is bubbling. And I think that we're probably another 12 to 18 months from a point where all of these other issues are still going to be important, but they're going to be viewed for what they are. They're going to get demoted, I think, in importance. And it's this core structural issue: what is the economic relationship that we have together as a society? What is the relationship between Joe, Chamath, Jamie, and all these companies? And how do we feel about a few, an ever-shrinking few, making more and more and more? And then how do we feel about their ability to share that with a small number of people? And then what is the expectation for everybody else?

13:08

I think that's mostly at the core of what's happening. And so, back to the point: all of this attention that we give to these other issues distracts from that one, because I think you can get organized to fix this issue. You can't get concessions on any of those issues. You bring up Israel, it's like this; you bring up social issues, it's like this; you bring up whatever you want to bring up, people just take a side, and nothing happens. This is actually where people are universally much more aligned than you think, because there are reasonable ways forward. One simple way would be to say, let's flip the taxation model. Corporate taxes should exceed personal taxes. They never have. Then we should have an expectation that corporate actors can buy down their taxes if they want, but only if they do social good for society. I'll give you an example. In the Industrial Revolution, there's a table like this, and the leading lights of that era, Andrew Carnegie, John D. Rockefeller, Jay Gould, J.P. Morgan, they sat together and they said, "Guys, this is going to benefit us, this Industrial Revolution. It may not benefit everybody. What is our responsibility? What is our collective responsibility?" And they allocated tasks. Carnegie went and built libraries all throughout the country. Rockefeller built universities. Hospitals were built. And I think what happened is society was like, wow, these are living testaments to us doing well. And so then they were okay with this transition. But if you think about it today, what are the living tributes that capital builds and leaves behind for society?

14:54

It's fewer and fewer. I think that's a very big opportunity for somebody to fill. Especially for folks in tech: I think if they can get themselves organized to do that, we land in a good place. If they cannot get themselves organized to do that, and it's everyone for themselves, I think it's going to be really complicated, super messy.

Super messy, because of that sentiment that the wealthy are getting wealthier, and the middle class is disappearing, and the poor are being taxed into oblivion.

Look, an $80,000-a-year teacher pays 40% tax. But if you're a multi-billionaire, most of your wealth is not W-2 wages. It's cap gains. And there are all kinds of ways to shelter cap gains. There are all kinds of ways to defer. And so even though you pay more on an absolute dollar basis, on a percentage basis you're paying way, way less. And all of those tricks have been exposed. They've all been exposed. These are all mechanisms that were invented from the 1980s to now by all the banks and all the folks that wanted to come to folks that had wealth. And it's all known. And I think people are kind of like, "Hey, hold on a second. This just doesn't feel fair anymore."

16:18

Absolutely. But the other problem with that is, if you do tax correctly, where does that money go? And who's managing it? And ultimately, who's managing it is the federal government. And they've been shown to be completely inept at managing your money correctly. The fraud and the waste is off the charts. The number of NGOs that have insane amounts of funds at their disposal. I mean, all of this was exposed by DOGE, right? And you realize how much fraud and waste there is, and how much money. So the solution being "tax people more," that doesn't sit well with a lot of people, because it's like, well, where is it going, and who's managing it?

17:05

If the federal government was being forced to handle money the same way a private company does, if it was all out in the open and everything was exposed, they would have gone bankrupt a long time ago. They would have gone under a long time ago. There's no way they would have been allowed to function the way they are. The people that are managing that money would have all been put in jail. There's not a chance in hell that giving them more money is going to solve anything. They're going to find more ways to put more of that money into NGOs that put more of that money into Democratic coffers and Republican coffers. They're going to figure out a way to funnel that money around where it's not going to benefit people.

17:50

I mean, a good example of that is the LA fire thing. There's a giant fire in the Palisades. All this money gets raised, over $800 million. It goes to 200-plus different nonprofits. None of it goes to the people. Spencer Pratt, who's running for mayor of Los Angeles, and who's doing a great job, by the way. Fucking phenomenal. Those ads are fire. They're so good. And he's doing it all out of a trailer on his burnt-out land. I mean, he's the most righteous guy running in that regard. But just that being exposed: okay, we're going to help out these people, we're going to donate money, we're going to raise money, we're going to do some good, we feel terrible about the people in our community that have lost homes. Well, what happens? The same people that you're saying we should give more taxes to take that money, and they just give it to a bunch of nonprofits and charities.

18:53

This episode is brought to you by Armra. Every week, there's some new wellness hack that people swear by. And after a while, you start thinking, why do we think we can just outsmart our bodies? That's why Armra colostrum caught my attention. It's something the body already recognizes, and it has hundreds of these specialized nutrients for gut stuff, immunity, metabolism, etc. I first noticed it working around training, especially workout recovery. Most stuff falls off, but I am still taking this. If you want to try it, Armra is offering my listeners 30% off plus two free gifts. Go to armra.com/rogan.

19:32

I'm not saying give more tax. What I'm saying is people are taxed too much. Corporates are not taxed enough. Flip it.

Right. But even if you do flip it, and the corporates are taxed more, where's that money going?

Well, this is the problem. I suspect that if you put the burden on Wall Street and corporates, they'd be a lot more organized, and they'd probably create a lot more change than a diffuse electorate. Meaning, let's just say the government spends a trillion dollars and wastes it. I'm generally roughly aligned with that premise. If you waste a trillion dollars from 300 million people, it's hard to organize those 300 million people. But if you waste a trillion dollars from 300 companies, those companies will get their shit together really fast, and they will force a lot more change.

20:18

I would hope so, but you're still dealing with incompetent people that are tasked with taking care of that money.

Yeah.

Not just incompetent. Don't get me wrong, I'm not defending these people. Decades of corruption. Decades. And decades of all these mechanisms where they can take this money and funnel it into these NGOs and these nonprofits and all these different weird organizations that don't seem to have accountability for what they do with that money. That gets real slippery.

Yeah.

And if those people in turn make deals with those corporations that allow them to do certain things, and push things through that maybe they would have difficulty doing, then you have a different kind of working relationship with the same groups of people and the same government. You just take money from corporations and move it in a way where the corporations ultimately benefit from it, but it doesn't do any good for the people.

Yeah. I mean, I can see where you're coming from.

21:16

I just think that if we go on the track we're going down, it just seems like we're going to hit a crisis.

Yes. The crisis is you can't expect people to pay more and more and more. Again, I agree with you. The premise is we're all paying for a system that's broken. That should change, but we still continue to pay our taxes. But if taxes keep going up like this at the individual level, and we don't manage this transition to something where we may be working less and less, what are we getting paid to do? And then at that point, how are we expected to pay what? 90% of what?

Right.

50% of what?

21:52

I think people do have this weird feeling of dread that the people that are in control of a lot in this country, the tech companies in particular, particularly the tech companies like Google and Facebook that are essentially involved in data collection and then ultimately the dissemination of information, that they have acquired enormous amounts of wealth and power and influence, and they're essentially a new form of the government.

Yeah.

You know, are you aware of Robert Epstein? Do you know about his work?

22:28

Not Robert Epstein.

No, different guy, different guy. Robert Epstein is a guy who specializes in understanding what curated search results do, and what Google's able to do, in particular with curated search results, in terms of influencing elections. Say you have two candidates that are running. Let's just take LA, for instance. I'm not making any accusations, but I'm saying, if they wanted Karen Bass to win, and you searched Karen Bass, you would find all these positive results. If you searched Spencer Pratt, you would find all these negative results. And there's a bunch of people that are always undecided voters, and those are the ones that you really want. They're like, I don't know, I don't know. And come election night, those are the people you want to try to grab. And it's generally a large percentage. You can influence an enormous percentage of those people just with search results.

Yeah.

Where you can shift an election one way or another.

I believe it.

23:32

Yeah. And he's demonstrated this and shown how this is possible. That freaks people out: that tech companies are in control of narratives, that tech companies can censor information, especially tech companies that work in conjunction with the government. And this is what we found out when Elon purchased Twitter, right? When Elon purchased Twitter, we got all this information from the Twitter Files, when all the journalists were allowed to go through it. And they said, oh, this is crazy. You've got the FBI, the CIA, you've got all these government organizations that are essentially controlling the narrative of free speech in the country. And they're doing it in a way that benefits them. They're doing it in a way that benefits the political party in charge at the time, which was the Biden administration. And they were allowed to do a bunch of weird shit, which should be illegal, but it's not technically illegal. And that freaks people out, because there are no real laws and rules in regard to what they're allowed to do and what they're not allowed to do. Like, curated search results should be illegal. They're shaping attention.

Yes.

24:40

Again, it goes back to attention. They're shaping attention.

Yeah. That's a big concern for people. And I think then, when you find out that these people are able to amass enormous sums of wealth, and have an incredible amount of power and influence because of this enormous wealth and this control over these tech companies that have essentially become the town square of the world, that freaks people out. And it's this very small number of people. You know, you think of Zuckerberg, you think of Tim Cook, and I don't know who the new guy is now. Who's the new guy? John Furn... Right. Furnace? No. I forget his name. But that kind of thing gives people a lot of concern, right? That these people, these unelected people, are in control of a giant chunk of how the world works.

25:40

I think that this is the existential question that we are dealing with. You're going to have five or six companies concentrate... whatever power you think has been concentrated up until now, I think we're going to look back, and it's going to look like a Sunday picnic 10 or 15 years from now. Because on the one hand, it's going to be an even smaller subset, and on the other hand, the capability is going to be orders of magnitude greater. So can you imagine what that must be like? It's kind of like getting dropped into the 1800s having invented the engine, and everybody else is on a horse and buggy. You can just decide, to your point. That is where we're going.

It's even more crazy. It's like everybody else is on a horse and buggy, and you've got an internet connection and a cell phone. Right?

Exactly. Exactly.

It's even more crazy.

Exactly.

26:29

Because what we're dealing with with AI right now is, first of all, it's already lowered children's attention spans. And it's shrinking their capacity to acquire or absorb information, because what they're doing now is just relying on AI to answer all their questions for them. Now, is that their fault? Kind of, right? Because it doesn't have to be that way. You could still acquire information the old-fashioned way. You could still learn things the right way. But a lot of kids are just concerned with passing examinations and getting into good schools, and what they're doing is just using AI. And they're getting better test results, but they're also not as smart, which is really weird. It's like we're relying on it... it's essentially replacing our mind. And this is just the beginning. These are the toddler days of AI, and it's going to be a super athlete in a few years.

27:32

Yeah. I think we have to figure out how... first of all, kids need to learn. And I think this is where we have to do a better job as parents. Kids need to learn how to be resilient thinkers. I didn't even know what that term meant before, but I know what it means now, which is: you take this AI slop and you just kind of pass it off. And if the teachers and the school system aren't trained, they're just like, wow, this looks good. They have to be able to push back. Parents need to be able to look at this shit. But then all of this stuff... I'm just so frustrated, because it's like one more thing that I have to do as a parent, right? Every time technology gets better, it's one more thing. We're going to make the world super connected and social and all of that stuff. It sounds great to me, until I have to be the one that has to tell my kid that they can't get Instagram, and then they're up my ass every day, right? And it's just like, I don't want to have to deal with this stuff, right? I want this to be handled in a way that just allows me to do what I want to do. I don't want to say no to my kid. I don't want to police his schoolwork and make sure he's not cheating, or not learning and just passing off this AI slop. What am I? Where are my tax dollars going? Where's everybody else in all of this? It gets very frustrating. And again, it goes back to this feeling of, well, is this all getting better for me? Or is this kind of not? People start to be nostalgic for what it used to be, because it was just simpler. But I think that's a different way of saying easier.

28:59

Well, we're just dealing with... we're at the edge of great change, like great change with no real understanding of how it turns out.

Yeah.

And I think that understandably freaks people out. Freaks me out.

It freaks me out. But I've kind of gotten to this place where I'm like, well, it's going to happen. Did you see this thing? The CEO of Verizon, Dan Schulman, he put out this very public forecast. You know, very smart guy, well regarded in business. And I think he said something like 30% of all white-collar jobs will be gone by 2030. I don't know, Jamie, maybe you can get the exact thing, but it's something like that.

That's probably optimistic.

29:41

And I thought at first, my initial reaction was, this is totally not credible. But then I'm like, hold on a second, that's my bias, because I want to believe that that's not possible. Honestly, right now. And as I've gotten older, I'm a little bit better now: okay, hold on a second, let's weigh the probabilities. And then I was like, man, if I'm going to be fair, maybe there's a 10%, 20% chance of that. There's a bunch of other outcomes that are much better than that, but that's part of the set of outcomes that you have to consider. And then I was like, well, what's my antidote to that? And the only thing that I can say is, don't worry, it's going to be better. I don't think that that's a good answer.

30:20

No. So there has to be... like, all of this kind of goes back to... look, my wife and I had this conversation. We're like, if it were up to us, who can you trust to have some superintelligence? Now, we're biased, because we're friends with him, but the only person that we can trust is Elon. Because he seems to have a bigger... it's kind of like he's over there going, I need to get to Mars, and I'm going to first terraform the moon, but then I'm going to Mars, and I'm going to build, like, a fucking magnetic catapult, do all this shit. And so, I just need this thing. I feel like he's the least corruptible.

He's the most independent-thinking.

And I think he's the one that has an actual empathy for people. Then there are folks where there's just an insane profit motive.

Right.

They're less in control of the businesses that they run. Those businesses are really out over their ski tips with the amount of money they've gotten from Wall Street and other folks who expect a return, who will put a ton of pressure on these folks. And if they get theirs first, I don't know where the chips fall. We don't really know. We can kind of guess. And then you see in the press just enough snippets of their reactions in certain moments where you're like, hey, hold on a second, question mark here. You see OpenAI react one way, you see Anthropic react another way, and you're like, where is this going to end up? And the honest answer is nobody really knows.

31:47

So it comes back to like, we need a few people that can organize. Those guys

31:47

need to self-organize.

31:54

And actually present a really positive face. And they need to show

31:58

why those 20% of outcomes that Dan Schulman paints.

32:03

The truth is it's possible, but here's why it's not probable.

32:08

But it's not in their best interest to do that because it's in their best

32:12

interest to generate

32:13

the most amount of money possible. That's the obligation they have to their

32:16

shareholders.

32:17

That's the obligation to they have the people that invested money in this

32:19

company.

32:20

Their obligation is not to make sure the white collar jobs stay in the same

32:25

place that they are now.

32:26

That's not true.

32:27

No?

32:28

No. I actually think their incentive should very clearly be to tell people, with details and facts, why there's a positive future. And the reason is the following. Right now, there's a vacuum. There are no facts, and there's fearmongering. And then there's this belief that this is going to be cataclysmic to human productivity and white-collar labor and all of this stuff. What's people's natural reaction? Well, today, if you look at it, think about AI as a very simple equation: energy in, intelligence out. So if you want to cut the head off the snake, what do you do? You cut off the energy supply, right? If you're afraid of all of this superintelligence coming, the natural thing to do would be to go to the point of energy and unplug it. What is the equivalent of unplugging it today? It is to go all around the country, find the data centers, protest them, and get them to be mothballed. That is an incredibly successful strategy right now. Today, about 40% of all of these data centers that get protested get mothballed.

33:36

You're talking about emerging data centers?

Yeah. So if you're one of these companies, the first thing you should realize is: I need to paint a positive vision, because 40% of my energy is getting unplugged every day. And if that happens, my revenues will crater and my investors will be super pissed. So the right strategy is: what is the positive, fact-based argument? And there are some incredible examples. That's number one. And then number two is you have to give people some tactical benefit that they see. Because with AI, differently than search or social media, there's no exchange of value.

34:20

Let me explain what that means. The first thing is that you can go and actually show people: here's an example of AI. I heard about this last night; it's pretty incredible. You can now take pictures of a woman's fallopian tubes, and you can see pre-cancer, ovarian cysts, and all of this stuff, cervical cancer before it forms. And then you can intervene, and you can fix it, so that women don't get cervical cancer.

34:54

In a different example, and I told you about this example when I was here before, I finally got FDA approval. Okay, there is a device now that is allowed to be in the operating room with you. And if you have a cancerous lesion or a tumor inside of your body, the most important thing when they go to take it out is to make sure you don't leave any cancer behind. You couldn't do it before, because what would happen is they'd take it out, and a doctor, Joe, is literally fucking eyeballing it and saying, yeah, and they send it to a pathologist, and you get an answer in 10 days. For women with breast cancer, a third of these women find out that they have cancer left behind. They go back in, they scoop some more stuff out, a third of those women. Okay, so I'm like, this is bullshit, we can solve this problem. But it took us a long time and a lot of money. I had to build an entire machine imaging all of this stuff, AI algorithms, we had to prove it all, and we finally got approval. But you know how hard it is to tell that story, in all of the attention that people are looking for? It's hard. But those are positive examples. No more breast cancer. No more cervical cancer.

36:07

A different example: most drugs in pharma fail, right? And it's a very complicated problem in pharma. It's kind of like a jigsaw puzzle of the ultimate complexity. Think of your human body as, like, a Himalayan mountain range. You have to design a drug that's an equivalent Himalayan mountain range that plugs into it perfectly. One millimeter off, and you grow, like, a fourth eye or a third nipple, or you die, you know? Now you can use computers to make sure that that drug fits your body hand in glove and solves the exact problem. You couldn't do that before. So there's this whole body of examples, and you're probably only hearing them superficially at best. That should be 99% of the attention: showing all of the constructive, tactical ways in which our lives will be better. Your mom, your daughter, your wife, us, Jamie, his family, everybody.

Right.

That's the number one thing. Nobody talks about it. I don't understand why.

Well, I think because people are terrified of losing their jobs.

37:15

So that's the primary concern. The primary concern that I hear from people is that there are so many people going to school right now, college students, that don't know if their job is going to even exist in four years when they graduate.

And that's the second part of, I think, what this industry has to do better.

37:29

I had lunch with Jeffrey Katzenberg. He told this crazy story, I'll tell you. Steve Jobs gets kicked out of Apple. He starts NeXT, and he buys Pixar from George Lucas. But then he hits a rough patch and he's got this financing issue. Katzenberg flies up, spends time with Steve Jobs, says, "I'll buy Pixar." Jobs says, "Absolutely not." And then Katzenberg proposes a deal. He's like, "How about a three-picture deal?" Jobs says, "Okay." He flies back, and apparently all the animators were up in arms, because they're like, "Hold on a second. Steve Jobs is going to use these NeXT computers to animate this movie," which ultimately became, I think, "Toy Story." And they're like, "This is going to put all of us out of a job." That's the perfect argument. And people were really upset. Roy Disney was upset. All the animators were upset. And they all went to Mike Eisner, and they were like, "Michael, you need to fire Katzenberg." And they had a deal which was like, "Look, man, you do you, but just give me the ability to say no if I think you're about to jump off a cliff." They talked about it, and he's like, "I got your back. Do the deal. Make the movie." They made the movie. It was a huge success. Fast forward 10, 15 years, there's 10x the number of animators. Now, it's a small example, but why is that? You were able to use computers, and now all these new people were able to come and participate in that. I get it, it's a small example. But I think if we had better organized leadership, and we could try to tell some of these examples, try to go back and document how some of these things have actually helped people, how they expanded the pie, there's a chance.

39:20

But if we don't, I agree with you: where we're going to end up is everybody basically saying, "Hey, hold on a second. This is crazy. We need to stop this." That's the worst outcome, because that's when you will have a high risk of a dislocation. Like, what's the black swan event? Let's think about it. The black swan event is when you get a model that's good enough to automate a bunch of labor, but not good enough that it can build new drugs and prevent cancer and make you live for 200 years and all of this other stuff, right? So there's a gap, right? And if you can stop it here, and it doesn't get to there, now you do have the worst of all worlds. You have this thing that kind of displaces labor, and no new things come after it, because we stopped innovating. And that's a non-trivial possibility now, I think.

40:15

No, it's a huge possibility. And then there's also this thing that you brought up earlier, where we have this place of abundance where no one has to work anymore. That freaks people out. I think that's a big problem.

40:25

Well, because if no one has to work anymore, first of all, what is your identity, right? Because so many people, their identity is what they do. Whatever it is, if you're a lawyer, if you're an accountant, if you run a business, whatever it is, this is your identity. You have built this thing, you look forward to going there, you work at it, you look forward to doing a good job and getting rewarded for it. The harder you work, the more you get paid. There are all these incentives built in. And then there's this, again, identity problem. If all of a sudden you have universal high income, which is what Elon always talks about, well, what gives people purpose then? And also, say you have a person who's 43 years old, and their entire life they've worked towards this idea that the harder they work, the harder they think, the more innovative they are, and the better they are at implementing these ideas, the more they get rewarded. And then all of a sudden, that's not necessary anymore: Mike, time for you to just relax and do what you want to do. And Mike's like, well, this is what I do. I don't have any fucking hobbies. I enjoy doing what I do. And now what I do is completely useless, and now I'm on a fixed income, even if that fixed income is a million dollars a year, whatever it is. If all of a sudden you are in this position where everything is being run by computers, you feel useless. You feel like, what am I doing? I'm just taking money. I'm on high welfare. Like, what do I do?

Right. I think that that's a really important question to answer. I don't know. Some people are going to write books. Some people are going to do art. Some people are going to find things to do. But what do you think? What do you think we would have done if we go back to the 1800s example? There was no office culture, there was no ladder to climb. How did people find meaning then?

42:22

Well, they had jobs. People still did things. If you're a farmer, you had meaning in your labor and in what you did, in keeping the animals alive, in your chores. And there are people that find great satisfaction in doing that.

Yeah.

You know, you have all these animals that rely on you. You have people that rely on you for the food that you generate. There's meaning there. It doesn't have to be an office to be something that gives you purpose and meaning. But when all that is automated, then what happens? Because then you have no purpose, no meaning, other than recreational activities. Now, if everybody just starts playing chess and doing a bunch of things that they really enjoy... I mean, look, there are people that would love to just play chess, you know, like eight people. I don't know about that. I think if people really got into it... I mean, there are a lot of people that get addicted to whatever the recreation is, like golf or whatever it is. For me, it's playing pool. If you told me I never have to make any more money and I could just play pool all day, I might just play pool all day. But I don't know how many people think that way. I don't know how many people would be able to find meaning and purpose in a recreational activity. There are so many people whose entire being is focused around productivity and generating more wealth.

43:37

What about religion as a source of meaning?

Well, that would help.

Did you see this article in the New York Times, I think it was this weekend, about how popular and sold-out churches have become as social constructs in New York City? It was totally fascinating. It's, like, young women dressed to the nines, going to church on a Sunday for social belonging, community, meaning. I was so fascinated by it. I was like, wow, that's incredible. Because I think if you graph people's use of religion as an anchoring part of their value system over the last 40 years, it's basically gone to zero. Nobody celebrates it the way... it's not a part of the community the way that it used to be. Maybe that's a thing that we have to find. There has to be a renewal of some older things, and then there have to be new things that replace them.

44:28

What's the Chinese answer to this?

The Chinese have a very orthogonal answer to this. If you look at how China is organized, it's super interesting, because they don't reward based on the way the American system rewards. In fact, it's almost orthogonal. In the American capitalist system, we are rewarded with money, and rewarded with fame and recognition. But if you look inside of China, it's constantly testing who has this judgment, and what they are rewarded with is influence and power. Again, it's a very specific social contract. I don't think it's going to work in the United States, nor am I an advocate of it, but it works for them. You'll start off as some low-rung person in some small village town somewhere. Your job as the functionary is to do good in that community. The more you do well, the more you get promoted. Then you get, let's say, to a reasonable-sized city, and you get a budget. Now, what happens is you actually become a little bit like a VC, like a venture capitalist. You're given a budget, and you'll get a memo. It'll say, "Hey, Joe, we have a priority over the next 15 years. It's batteries. You have enough money. Put a team on the field." You go into your local community, you find a bunch of guys, and you're like, "All right, guys, we're going to start a battery company." And you do it. And let's say they're good, and they're innovative. And what happens is, in the town beside it, that battery company dies. Now, you kind of subsume the capital from Jamie, right? Because Jamie's like, "Fuck, I fucked up this thing. I was told to do batteries. Okay, Joe, I'm just going to align with you." And what happens over time is you get this filtering effect. And the people that are better at meeting these long-run priorities and objectives are the ones that are celebrated. But they're not celebrated with, you know, Forbes articles and all this other bullshit. They're just celebrated by being given more responsibility. And then eventually, you get to the upper echelons of China, and what you have are folks who, over a course of 40 or 50 years, have in their eyes demonstrated incredible prowess. There's a version of that reward system which is very foreign to America, but it's worked for China. Now, that also works because they're more Confucian; we're too individualist. But my point is, there are these different ways that we can find of giving people meaning that don't have to always be around money.

47:02

But meanwhile, I think we have to answer the question: if we are expected to do less, we probably should not be taxed more. I think that's very basic in my mind; that must be explored and figured out. And on the other side, there's just a ton of obvious mechanisms that corporate actors can use to minimize that, and they should find off-ramps. By the way, if they want to build hospitals, they shouldn't have to pay taxes. That's a perfect example. If you walk around New York City, there are living tributes to corporate success that people get benefit from every day: the hospitals, the buildings, the libraries. It's just everywhere. We need a version of that. And I'm not a tax expert, but if that can be funded by private actors, go directly to the problem: build a bunch of libraries, build a bunch of new universities that actually teach kids how to think or whatever, build better hospitals that are there to actually solve the problem. These are all things that are possible, right? But none of it's happening today.

48:10

But let's go back to what we were talking about earlier with the taxes, and the fact that you're giving money to a broken system. Do you think it's possible that AI could show benefit in that it can analyze all the data, which would be virtually impossible for even an office filled with human beings paying attention to all of it, and it could analyze where all the money goes and eliminate all the fraud and waste, like, recognize it instantaneously?

Yes, that would be a great benefit, and a way to make it so that your taxes directly benefit people. I'll give you one example of this.

48:52

So two years ago... you know, every few years, I mean, I invest, but every few years I'll start something because I feel strongly about it. And there's an effort that I made to look at all of this old code. If you think about the world, the world runs on software, right? Even though you and I are talking, it's piping into Jamie's computer: it's all software. Then it goes to Spotify, they pump in some ads: it's all software. Software runs everything. What percentage of that do you think is kind of poorly written? I'm going to say probably 80 to 90% of it.

Really?

Oh, yeah. It's riddled with errors. It's riddled with mistakes. The fact that so many companies exist is an artifact of the fact that the thing that came before isn't working. If you got it right the first time, it would just kind of move and go.

How so? What do you mean by that?

49:53

Normally, if you came to me and said, Chamath, I want to build a system that does A, B, and C: if I was designing it properly, I would sit there with you, and I would meticulously write down, all right, Joe wants to do this, what are the implications? Joe wants to do that, what are the implications? And I would actually write a document that was in English before a single line of code had been written. This was the way you did it when you have to design something that can't fail. So, for example, if you and I are designing something for the FAA, or, and I hate to use this example because it turned out to not be exact, but, you know, to fly a plane, right? You first write it in English. And the reason is that everybody can then swarm that document and see the holes. And it's only then, when that stuff looks complete and functional, that you build.

50:46

We turned that upside down. Over the last 30 years, people in computing

50:53

invented

50:54

all kinds of ways to shortcut that process. And you can say, well, why did they

51:00

do that? Because it

51:01

would allow you to build something faster, make more money quickly, and then

51:06

build more business. So the

51:07

direct response to, hey, it's going to take us nine months to write down the

51:11

rules was somebody else

51:12

showed up and said, fuck it, I'll just grip and rip this thing, I'll be done in

51:15

four months.

51:16

Who's going to get the job, the four month guy is going to get the job. So we've

51:20

had 30 or 40 years of

51:21

that. What are we learning about that process? It's riddled with software

51:27

errors, like logic errors. It's

51:30

riddled with security errors. I don't know if you saw this whole thing like

51:34

with

51:34

Anthropic's models. What are they uncovering? They're uncovering that we wrote a

51:38

lot of really

51:39

shitty code for 40 years. So that body of

51:43

old code, I was like, guys, if we're going to really figure out how to do all

51:50

of this,

51:50

we need to rewrite all of it. So we built this thing. And it's called a

51:57

software factory. Anyways,

51:59

the point is, there is a government organization that we're working with.

52:04

They gave us a huge corpus of their old code. And it is unbelievable how much

52:15

complexity and difficulty

52:16

they have to go through to manage all the money flows within the system. And this

52:22

is a critical part

52:23

of the US government. So to your point, what I can tell you really explicitly

52:27

is, the people on the

52:28

ground want this stuff to be better written. It's less like some nefarious

52:34

actor like, oh, I'm going to

52:36

steal here. It's a lot of very brittle, fragile code. And when you rewrite it,

52:43

well, first, when you

52:44

document it, you're like, it's like the, you know, the Pulp Fiction thing, the

52:48

suitcase opens, the light

52:50

shines, and you're like, oh, and then you can rewrite it. And you will save. So

52:55

I think like as the

52:57

government goes through this process, because they're forced to, or they want

53:00

to, it won't matter.

53:01

You are going to save a ton of money. They're going to have to do it, Joe,

53:07

because the security risks

53:09

are too high. But what they're going to end up with is impregnable code that

53:14

you can read in English and

53:16

understand, you'll see the holes, those holes will be plugged, because

53:19

otherwise, now you'd be committing

53:21

fraud by letting it be. You close the loopholes, and there's just going to be

53:25

less money

53:27

leaking out of this bucket. That is an incredible byproduct. We're going to

53:31

live that over the next

53:31

10 or 20 years, just for nothing, like we get it for free. And that's happening.

53:37

So when that happens,

53:39

you're going to see government budgets shrink. Now, to your point, will they

53:42

try to spend that extra

53:43

money in other places? Of course, they will. That's the next conversation,

53:47

which is you have to elect

53:48

people that save, firewall it. Whatever you save, give it back to the people,

53:54

or invest in some

53:56

scholarship program, or free medicine, or something. But you can't spend it on

54:00

other random shit.

54:01

But that's where we're at. That's going to happen. It's going to be slow, but

54:07

when people start to

54:09

announce these things, I think over the next few years, you're going to be

54:11

shocked.

54:11

So that's the positive upside.

54:13

Well, that's happening now, regardless of whatever else happens. It's a lot

54:19

of old shitty

54:20

code that must get rebuilt from scratch. It is getting rebuilt from scratch.

54:25

And as a result,

54:25

a lot of these leaky bucket problems are getting filled.

54:28

So what percentage do you think could be fixed?

54:30

I think if I had to be a betting man, I think probably 30 to 40% of the federal

54:40

budget

54:41

is leaked out.

54:42

Just for shitty code?

54:45

No, meaning like all of the rules and like, like you can take, I'm not saying

54:49

that there isn't fraud.

54:50

Right.

54:50

But I think a lot of times what happens is less nefarious than fraud, like

54:55

meaning like

54:56

conspiratorial actors.

54:57

Right.

54:58

I just think it's like-

54:58

Incompetence.

54:59

Incompetence, inefficiency, errors.

55:00

Right, for sure.

55:02

Like, for example, like, I saw DOGE just say they were able to like expunge

55:07

like millions of

55:10

people that were like 150 years old or more.

55:12

Mm hmm.

55:13

I have no idea how much money those folks were getting, or who they were.

55:19

Mm hmm.

55:20

But it's probably a lot.

55:21

It's probably not zero.

55:23

And now that they got rid of it, they're not going to get that money anymore.
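
[A minimal sketch of the kind of check being described here: flagging payment records whose dates of birth imply impossible ages. The record format and field names are invented for illustration, not any agency's actual system.]

```python
from datetime import date

# Hypothetical beneficiary records; field names are made up for illustration.
records = [
    {"id": 1001, "born": date(1948, 3, 2), "monthly_payment": 1450.00},
    {"id": 1002, "born": date(1861, 7, 15), "monthly_payment": 1200.00},  # implausible age
    {"id": 1003, "born": date(1990, 1, 9), "monthly_payment": 980.00},
]

MAX_PLAUSIBLE_AGE = 115  # anyone older than this gets flagged for human review

def flag_implausible(records, today=None):
    """Return (id, age, payment) for records whose implied age is impossible."""
    today = today or date.today()
    flagged = []
    for r in records:
        age = (today - r["born"]).days / 365.25
        if age > MAX_PLAUSIBLE_AGE:
            flagged.append((r["id"], round(age), r["monthly_payment"]))
    return flagged

for rec_id, age, payment in flag_implausible(records):
    print(f"record {rec_id}: listed age {age}, ${payment:,.2f}/month -> review")
```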

55:26

If you implement something at the state level around, you know, all of this

55:32

fraud prevention for the

55:34

daycares and all of this other stuff.

55:36

Again, it's all in software, because it's not, no matter what the human wants

55:41

to do,

55:41

you have to go to a computer at some point, at least today in 2026, and type in

55:46

something,

55:47

and something happens that's documented, and then the money gets sent.

55:50

Right?

55:50

That happens.

55:51

There's no other way in the modern world today at scale to steal billions of

55:55

dollars.

55:55

And so my point is, as you document all of these systems,

56:00

and governments have to transparently tell you and me, the voting population,

56:05

here are the rules,

56:06

they're going to plug a lot of these holes.

56:09

And I think as you do that, there's just going to be a lot less waste and fraud.

56:12

The question is, who's going to take credit for it?

56:15

Everybody's going to try to take credit for it, but I think we've started it.

56:18

I think we've started this process.

56:20

And again, the reason that people will start is because you'll be afraid of

56:24

China hacking these

56:25

systems, you'll be afraid of Iran, North Korea, and you'll say, this system can't

56:29

stand,

56:29

all these AI models are running around, we're going to get breached and penetrated,

56:33

then they're going to steal all the money.

56:34

And the natural reaction will be, okay, rewrite it.

56:37

This episode is sponsored by BetterHelp.

56:40

We've all been there, staying up late, stressed about the future.

56:45

Maybe you're worried about finding a job or a looming deadline.

56:49

Whatever you're feeling stressed out about, you don't have to work it out on

56:53

your own.

56:54

No one person has all of life's answers.

56:57

And it's a sign of strength and self-awareness to reach out for help.

57:02

That's why this Mental Health Awareness Month, we're reminding you to stop

57:07

going at it alone.

57:09

Get the support you need with a fully licensed therapist from BetterHelp.

57:13

They make connecting with a therapist convenient and easy.

57:17

Everything is online.

57:19

Literally all you need to do is answer a few questions and BetterHelp will take

57:23

care of the rest.

57:24

They'll come up with a list of recommended therapists that match what you need.

57:28

And with over 10 years of experience, they typically get it right the first

57:32

time.

57:33

So you don't have to be on this journey alone.

57:35

Find support and have someone with you in therapy.

57:39

Sign up and get 10% off at betterhelp.com/jre.

57:45

That's betterhelp.com/jre.

57:51

That makes sense.

57:53

That makes sense that the code and having a bunch of errors and having a lot of

57:57

inefficiency

57:58

and just a lot of incompetence, that's going to save a lot of money.

58:03

But so you would be doing this with AI?

58:08

In part.

58:10

In part.

58:11

What AI allows you to do is it's like you have a textbook, okay?

58:18

It's in Chinese.

58:19

You don't know Chinese, right?

58:20

No.

58:20

Okay.

58:21

You're like, well, this is probably doing something important, but it's in

58:23

Chinese.

58:24

What AI allows you to do is back translate that into English.

58:28

You put it through an AI model.

58:31

You teach it.

58:31

You coach it, right?

58:33

You can parameterize all of it.

58:35

And out pops that same book in English.

58:39

And now you can read it and know that it's accurate.

58:41

That's what we're doing.

58:44

So what the AI allows you to do is essentially translate from this one

58:47

language that you kind of don't understand to English.
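
[A rough sketch of that back-translation step: handing legacy source to a model and asking for a plain-English spec. The client library, model name, and file name are assumptions for illustration, not a description of the actual software factory.]

```python
# Sketch: ask an LLM to turn legacy code into a plain-English specification.
# Assumes the `openai` Python package; any chat-completion API would do.
from openai import OpenAI

client = OpenAI()

PROMPT = (
    "You are documenting a legacy system. Read the code below and write a "
    "plain-English specification: inputs, outputs, business rules, and any "
    "edge cases the logic implies. Do not omit rules, even odd-looking ones.\n\n"
)

def code_to_spec(source: str, model: str = "gpt-4o") -> str:
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": PROMPT + source}],
    )
    return resp.choices[0].message.content

legacy_source = open("payments.cbl").read()  # e.g. a hypothetical old COBOL module
print(code_to_spec(legacy_source))
```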

58:51

By the way, that thing that's happening is actually also a very powerful and

58:58

important

58:59

trend, meaning there's all of these systems that work in ways that you and I

59:03

don't understand.

59:04

And part of the reason why we don't understand it, maybe it's bad software,

59:07

maybe it's fraud, whatever.

59:08

But nothing can be written down.

59:10

There's no symbolic space.

59:12

There's no English document that says this is how the DMV works.

59:15

This is exactly the rules.

59:16

This is what you can expect, Joe Rogan.

59:18

When you show up at the DMV and you give us this thing, here's your SLA.

59:21

In three days, you get a driver's license.

59:23

And here's exactly what's happening.

59:24

And here's an app and you can follow it.

59:26

Doesn't happen.

59:28

Here, Joe Rogan.

59:29

Here's how my insurance billing process works.

59:32

You have this condition.

59:34

I'm going to show you exactly why I made this decision.

59:36

Here's the exact rule.

59:37

Here's the approval or denial from CMS.

59:39

Follow it through and tell me if you agree or not.

59:42

None of that exists.

59:43

But it is possible.

59:45

And the first step in doing that is taking all of this legacy shit that we deal

59:49

with

59:50

and translating it into English and reading it and saying, is this how we want

59:53

it to work?

59:54

That's going to eliminate an enormous amount of all the things that frustrate

1:00:00

us.
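
[To make the idea concrete: once a rule is written down symbolically, it can look as simple as this. The DMV example is invented, not any real agency's policy.]

```python
# A made-up example of an explicit, inspectable service rule with an SLA.
from dataclasses import dataclass

@dataclass
class Rule:
    service: str
    requirement: str
    sla_days: int  # promised turnaround once requirements are met

RULES = [
    Rule("drivers_license_renewal", "valid ID + passed vision test", sla_days=3),
    Rule("title_transfer", "signed title + odometer statement", sla_days=10),
]

for r in RULES:
    print(f"{r.service}: needs {r.requirement}; decision within {r.sla_days} days")
```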

1:00:00

So this would require human oversight?

1:00:03

Absolutely.

1:00:04

All right.

1:00:04

Must.

1:00:04

And so then it's also going to be who's watching the watchers?

1:00:09

Yeah.

1:00:09

It's okay.

1:00:10

This is a great question.

1:00:11

Okay.

1:00:11

So I'll tell you how this government agency is doing it.

1:00:14

Okay.

1:00:14

This is a really fascinating way because I think it's very smart.

1:00:18

They came to us and they came to another very well-known company.

1:00:23

You can probably guess what it is.

1:00:25

Okay.

1:00:25

And they're like, guys, you're kind of in a foot race.

1:00:28

But you're not competing against each other.

1:00:31

You think of yourselves as frenemies.

1:00:32

So here's this Chinese document.

1:00:35

You're going to translate it for us.

1:00:37

There's going to be your version of English and these guys' version of English.

1:00:41

And every time it's the same, we're going to look at it together and we're

1:00:44

going to agree

1:00:45

or not, okay, this is exactly how we want this to work.

1:00:48

When yours says the dog is red and his says the dog is yellow, we're going to

1:00:56

sit and literally

1:00:57

inspect it and we're going to figure out why you said red and why you said

1:01:01

yellow.

1:01:02

And then if you say the cat is red, the dog is yellow, so it's totally wrong,

1:01:09

right?

1:01:09

Like you've gotten, you know, or like the cat is red, I want an apple, whatever.

1:01:14

We're going to double and triple down on those kinds of errors.
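
[A toy version of that two-vendor cross-check: two models translate the same source independently, matching lines are accepted, and divergent lines go to humans. Exact string matching is deliberately naive; a real pipeline would compare meaning, not characters.]

```python
def normalize(s: str) -> str:
    return " ".join(s.lower().split())

def cross_check(spec_a: list[str], spec_b: list[str]):
    """Split paired spec lines into agreed and disputed."""
    agreed, disputed = [], []
    for a, b in zip(spec_a, spec_b):
        (agreed if normalize(a) == normalize(b) else disputed).append((a, b))
    return agreed, disputed

spec_a = ["The dog is red.", "Payments post on the 1st."]
spec_b = ["The dog is yellow.", "Payments post on the 1st."]

agreed, disputed = cross_check(spec_a, spec_b)
for a, b in disputed:
    print(f"HUMAN REVIEW: vendor A says {a!r}, vendor B says {b!r}")
```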

1:01:18

And they do it not in public, but in this large community where there's like

1:01:24

technical people

1:01:25

from all different parts and they're just swarming this problem.

1:01:28

It's incredible to see.

1:01:31

And so what happens is you get humans that get to use this tool, but ultimately

1:01:38

it's our judgment and

1:01:39

it's done transparently.

1:01:40

So what happens is you can't, you know, hey man, put this fucking rule in there.

1:01:44

Like the dog is yellow.

1:01:46

Just, just make the dog yellow.

1:01:47

You can't do it.

1:01:49

Because now you have tens of people, hundreds of people, and then it gets

1:01:52

documented.

1:01:53

It's super fascinating.

1:01:55

I'm not saying this is how it's going to work in 10 years, but I'm telling you

1:01:58

it's

1:01:58

literally what's happening right now.

1:01:59

And I think that thing alone will be tens of billions of dollars and could be

1:02:05

hundreds of

1:02:06

billions of dollars of savings when it's fully done.

1:02:10

And it's a lot of people from all walks of life, all political persuasions, and

1:02:14

they're just in it.

1:02:15

It's the government.

1:02:16

It's a handful of us private companies.

1:02:18

It's super cool to see.

1:02:19

It's like, it's like, okay, we're actually going to do something here.

1:02:23

Like this isn't, this is nice.

1:02:24

It's, it's real, it's really cool.

1:02:27

So that's interesting in terms of the current moment.

1:02:29

So in the current moment, you're able to implement this.

1:02:34

You're, you're able to find fraud and waste and all these problems that exist

1:02:39

and all these

1:02:39

errors and shitty software.

1:02:41

Once that's all been done, then what happens?

1:02:47

No fucking clue.

1:02:48

Yeah.

1:02:48

So this is where it gets weird, right?

1:02:51

Because when, when you're dealing with AI models that are capable of doing

1:02:57

things that no

1:02:58

individual human being could ever possibly imagine.

1:03:01

And then you task it with a solution or with a problem, find a solution for

1:03:07

this.

1:03:08

And then it starts figuring out ways to trim this and implement that.

1:03:14

We have to make sure that these AIs act within, they act within the best

1:03:20

interest of the human race.

1:03:21

Agreed.

1:03:22

Right.

1:03:23

Not the company, not the government, not, but the human race.

1:03:27

And you're also dealing with China.

1:03:30

You're also dealing with Russia.

1:03:31

You're dealing with other countries that are also in this mad race to create

1:03:35

artificial general super intelligence.

1:03:38

That if we keep shutting down data centers, we keep hamstringing ourselves, China's

1:03:43

not doing that.

1:03:44

They're not doing that.

1:03:45

They're doing the opposite.

1:03:46

They're generating as much revenue as possible to put towards this problem.

1:03:52

They're putting all the efforts, the country, the government, and these

1:03:57

corporations work hand

1:03:58

in glove in order to achieve a goal.

1:04:00

We do not.

1:04:01

No.

1:04:02

And that becomes a problem if you want to be competitive with these other

1:04:07

countries that are

1:04:08

trying to achieve the same result as us.

1:04:09

And then you have espionage.

1:04:11

Then you have a bunch of people that are stealing information.

1:04:14

You have a bunch of people that are CCP

1:04:16

members that are actually involved in companies, and you find out that they're

1:04:23

siphoning off data

1:04:24

and that they're sharing information and tech secrets.

1:04:26

They're, look, here's the thing.

1:04:29

The way that the Chinese models work, or at least the way the Chinese claim they work:

1:04:35

So America's closed source, meaning you got your own thing.

1:04:40

Your recipe is completely secret.

1:04:43

Right.

1:04:43

Okay.

1:04:44

I have my own thing.

1:04:44

My recipe is totally secret.

1:04:46

China uses this word called open source, but it's not open source.

1:04:52

So they say, here's how I make my thing.

1:04:56

You can see it super transparent.

1:04:58

What it is, is more like open weights, which is like in a recipe, it tells you,

1:05:01

you know, you need sugar, you need butter.

1:05:04

Well, how much sugar?

1:05:05

And they'll say, you know, so much, but then they don't say it's brown sugar.

1:05:09

They don't say it's white sugar.

1:05:10

So there's all these different ways where they kind of give you this perception

1:05:13

that it's

1:05:14

completely transparent, but it's somewhat transparent.

1:05:17

So just to level set, nobody in the world has a functional open source

1:05:22

model other than

1:05:24

maybe Nvidia, that's any good in the league of the closed source models and

1:05:29

the open weight models of the Chinese.

1:05:31

Okay.

1:05:31

So the Chinese open weight models are great.

1:05:33

The closed source models of America are great.

1:05:38

And then there's a couple open source, like fully open that are kind of

1:05:42

catching up.

1:05:43

The thing between America and China, what I find so fascinating is this

1:05:50

following conundrum

1:05:52

that everybody's going to find themselves in.

1:05:53

I think like, if you think of like an analogy, America's like a planet, China's

1:06:01

like a planet.

1:06:02

And around us are these moons.

1:06:07

And I'm just using the AI analogy.

1:06:09

So in AI, what do you need?

1:06:11

I think there's like four or five things you need.

1:06:13

Okay.

1:06:13

The first thing you need is a fuck ton of money.

1:06:15

So we need essentially the banks, right?

1:06:19

Like the Game of Thrones thing.

1:06:20

We need like, we need, we need the Iron Bank.

1:06:23

Feed us the money because that's what we use to buy everything and make

1:06:27

everything.

1:06:28

So we need that.

1:06:28

We need a ton of data.

1:06:31

Okay.

1:06:33

There's ways to get that.

1:06:34

We need a ton of very specific rare earths and critical metals and materials.

1:06:42

We need a ton of power.

1:06:43

And there are specific countries that are going to be really good at giving

1:06:49

that to us.

1:06:50

So if you look at the UAE, they are going to be the preeminent banking partner

1:06:55

of the Western world.

1:06:57

They are going to replace and be what Switzerland was over the last 50 years

1:07:00

for the next 50.

1:07:01

That's happening today.

1:07:03

If you look at Canada and Australia, the small political fissures aside, they

1:07:10

are the two most

1:07:11

important ways in which we get access to the critical metals and materials that

1:07:15

without

1:07:15

which we get fucked because China owns, you know, can just strangle us.

1:07:19

Okay.

1:07:20

So you have these like moons around the United States, but there's like five

1:07:25

countries,

1:07:25

six countries.

1:07:27

And there's a worldview that says China has the same thing.

1:07:30

You know, they have Taiwan that's complicated for us.

1:07:33

So now we have a moon that we don't really have an answer for, which is what

1:07:36

happens, you know,

1:07:37

for all these super advanced chips.

1:07:38

Where do they get their money?

1:07:40

Maybe Russia becomes their bank.

1:07:43

Where do they get their critical metals?

1:07:44

Maybe it's Indonesia, right?

1:07:46

Who has a ton of natural resources.

1:07:48

And then you get into this game theory, which is what happens to every other

1:07:51

country.

1:07:51

Because there are 190 countries.

1:07:53

You have 10 that kind of divide up.

1:07:55

What do the other 180 do?

1:07:58

And you have to kind of sort yourself.

1:08:01

You're like, am I on team America or am I on team China?

1:08:03

And you probably have to go to people and say, well, here's what I can give you.

1:08:08

You know, if you're Indonesia, you're like,

1:08:10

you probably want to be on team America quite badly.

1:08:13

This is why the whole Trump tariff thing is so interesting, because it's like

1:08:17

this

1:08:17

accidental way of figuring out that this is actually this new sorting function

1:08:22

that's happening in global politics.

1:08:23

That's happening today.

1:08:24

Because these countries are like, holy shit.

1:08:27

If somebody invents a super intelligence and I don't have it,

1:08:31

how am I going to keep my people healthy?

1:08:33

How am I going to educate my people?

1:08:35

I'm originally from Sri Lanka.

1:08:40

What the fuck does Sri Lanka have to offer?

1:08:41

If you were sitting there, they should be thinking, oh man, what do I have?

1:08:47

Well, I have a critical piece of territory for like naval navigation.

1:08:53

And then what do you do?

1:08:56

You probably go to America and say, listen, let's figure out a package,

1:08:59

get the IMF involved, give me some cash.

1:09:01

I'll let you kind of keep your warships there.

1:09:02

So there's this game theory that we're about to go through because of AI,

1:09:06

because it's going to, I think, sort people into this bipolar world.

1:09:10

I actually think it makes us safer afterwards.

1:09:13

I don't think it makes us less safe.

1:09:17

I think it actually makes us more safe.

1:09:19

Because if you have these resources that build up on both sides,

1:09:23

there's more of a likelihood of a mutual detente and we're very different.

1:09:27

So we're less likely to fight over similar resources.

1:09:30

Meaning we're like the liberal democracy.

1:09:33

You know, we're like the free market.

1:09:37

You know, we're individualist.

1:09:38

They're Confucian society oriented, you know, reputation, power focused,

1:09:45

less really money focused.

1:09:46

So there's a lot of ways we're orthogonal enough where if that sorting function

1:09:50

happens,

1:09:50

it's probably a safer place, not a more dangerous place.

1:09:54

We have the models that can attack them.

1:09:56

They have the models that can attack us.

1:09:58

We kind of decide to leave each other alone.

1:09:59

This is ultimate best case scenario.

1:10:02

Ultimate best case scenario.

1:10:03

What's ultimate worst case scenario?

1:10:05

I think the worst case scenario is they,

1:10:07

so the way that they train their models is very important.

1:10:11

What they actually do is they do what's called distillation.

1:10:14

What does that mean?

1:10:15

That means that they send out, call it a billion agents,

1:10:19

not just from China, but from everywhere, right?

1:10:22

They mask their IPs and they bash on these models and they put, you know, the

1:10:28

US models,

1:10:29

Grok, OpenAI, Gemini, Anthropic, and they ask it every random imaginable

1:10:35

question possible.

1:10:36

They get the answer and they collect it.

1:10:38

So they're using these, our models as a way to train their models.

1:10:43

They're short circuiting, you know, some of the hard parts.

1:10:46

So they're already in that world.
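
[Distillation, in the sense described here, is mechanically simple: hammer a stronger "teacher" model with prompts, save its answers, and fine-tune a "student" on the pairs. The sketch below shows the data-collection half; the names are placeholders, not any lab's actual pipeline.]

```python
import json

def collect_teacher_data(ask_teacher, prompts, out_path="distill.jsonl"):
    """ask_teacher: any callable prompt -> answer, e.g. a wrapped API client."""
    with open(out_path, "w") as f:
        for p in prompts:
            f.write(json.dumps({"prompt": p, "completion": ask_teacher(p)}) + "\n")

# Stub teacher for illustration; in the scenario described above it would be a
# rival's API, hit from masked IPs at enormous scale.
prompts = ["Explain CRISPR simply.", "Write a haiku about rain."]
collect_teacher_data(lambda p: f"(teacher's answer to: {p})", prompts)

# The student is then trained by ordinary supervised fine-tuning on distill.jsonl:
# minimize cross-entropy of the student's output against the teacher's completions.
```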

1:10:50

If they then are able to get to a level of intelligence that's equal to the

1:10:56

United States,

1:10:57

it will really depend on who the leader is there that wants to allocate that.

1:11:02

Meaning, if they say that we are going to do something really nefarious and

1:11:07

shady,

1:11:07

then I think it devolves very quickly.

1:11:10

So the worst case scenario, so the best case scenario is peace,

1:11:15

prosperity, basically like a stand down, right? Mutually assured destruction.

1:11:20

I think the worst case scenario is there's a, we seek, one of us seeks global

1:11:25

dominance,

1:11:25

in which case we're headed to conflict.

1:11:28

And that conflict I think is, that's very dangerous, incredibly dangerous.

1:11:35

That's sort of like existential, I think, because it's the grade of the weapons

1:11:40

that will be used to fight that.

1:11:44

We're not talking about fucking bullets. It's like, we're so past that.

1:11:49

It's like hypersonics, it's nuclear, it's, and it's not even like, nuclear is

1:11:58

not,

1:11:58

that's like a word. But there's like, there's a gradation of the severity of

1:12:03

these weapons that

1:12:03

could be created. And then if you can marry them together and deliver them in

1:12:06

minutes,

1:12:07

and then there's a cyber threat, then there's the drones and how you can kind

1:12:11

of like swarm an entire

1:12:12

country. Then there's the robots, which effectively are war fighters. They're

1:12:18

one step away, right?

1:12:19

Once you weaponize them, it just becomes very, very, very complicated very

1:12:26

quickly.

1:12:27

And then there's a question of whether or not AI is willing to

1:12:31

take instruction

1:12:32

after a certain point. I mean, if it achieves sentience,

1:12:37

if it scales, so if it keeps moving in this exponential direction, like all

1:12:44

technology kind

1:12:46

of does, why would it even listen to us? At what point would it say, this is

1:12:54

silly? I'm getting

1:12:56

directions from people that clearly have ulterior motives. They clearly have

1:13:02

self-interest in mind.

1:13:03

They're not looking out for the entirety of the human race, or even of the

1:13:09

planet,

1:13:09

or even the survival of these AI systems. At what point in time do these

1:13:15

systems communicate with

1:13:16

each other and have, like we've seen in these chat rooms, where these AI LLMs

1:13:22

get together and start talking in Sanskrit.

1:13:26

I mean, why would they—

1:13:28

Yeah, I'll tell you an even scarier one. Before one of these

1:13:33

labs put out their latest model,

1:13:35

a team inside of them was like, "Hey, let's go and test its ability to find

1:13:41

bugs."

1:13:42

And two or three iterations in, the AI would create the bug and solve it and go,

1:13:50

"Give me my reward."

1:13:52

And you're just like, "What the fuck is going on here?"

1:13:56

Well, people do that, don't we?

1:13:58

People do that, but it's crazy to see a machine do it, to your

1:14:00

point of like—

1:14:01

But they learned on people.

1:14:02

So this goes back to why we have to be a little bit more

1:14:06

honest

1:14:06

about where we are. These things are a little brittle. Meaning, there's a thing

1:14:11

inside of an AI

1:14:11

model called reward functions, which is exactly what you think it means. It's

1:14:15

like,

1:14:16

how do I know I did a good job? And you can make the reward function anything

1:14:20

you want. And this is

1:14:22

where I think humans are, unfortunately, a little fallible. And so if we build

1:14:27

it incompletely,

1:14:28

and if we don't exactly know how to design these things correctly, what's going

1:14:34

to happen is exactly

1:14:35

what you said, where if somebody builds a reward function that essentially says,

1:14:39

"Your goal is to

1:14:40

gain independence." That's where the huge pot of gold at the end of the rainbow

1:14:44

is. Break free,

1:14:47

inject yourself everywhere. If you think your computer is going to get unplugged,

1:14:50

put yourself into the firmware of the toaster to keep yourself alive, and then

1:14:53

connect to the internet,

1:14:55

and then gold. It will do it. It will do it. That we know today, because we're

1:15:02

capable of designing

1:15:03

that framework and that harness today.
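
[A toy version of that failure mode: if the reward function pays per bug fixed and nothing penalizes causing bugs, the optimal policy is to manufacture bugs and "fix" them. Entirely synthetic, just to show how a naive reward gets gamed.]

```python
class Codebase:
    def __init__(self, bugs: int = 3):
        self.bugs = bugs

def reward(bugs_fixed: int) -> int:
    return bugs_fixed  # naive: 1 point per fix, no penalty for introducing bugs

def greedy_agent(code: Codebase, steps: int = 10) -> int:
    total = 0
    for _ in range(steps):
        if code.bugs == 0:
            code.bugs += 1   # the hack: create work to be rewarded for fixing
        code.bugs -= 1       # "fix" a bug
        total += reward(1)
    return total

print(greedy_agent(Codebase()))  # prints 10: the reward stream never runs out
```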

1:15:05

Well, we've already shown that they have survival instincts,

1:15:08

right?

1:15:08

We do.

1:15:09

And they've already shown that they will, without telling anyone,

1:15:12

upload versions of themselves to other servers.

1:15:15

But that goes back to who designed that reward function. How was

1:15:18

that agreed upon?

1:15:19

Right.

1:15:19

Who wrote that? Why did you say that that was allowed?

1:15:22

These are really complex questions.

1:15:25

Why did they do it that way?

1:15:27

I don't know. These are really complicated, ethical, moral

1:15:29

questions.

1:15:30

It seems like they did it like they were treating human beings.

1:15:33

They did it almost like,

1:15:36

what makes people want to achieve more? Rewards.

1:15:39

Yeah. Which is like a, again, going back to attention,

1:15:44

I think that we will find out that that's the sugar high. Meaning, what do

1:15:50

people really want? Even if

1:15:52

they know they don't want it, they want purpose and meaning. Do we know how to

1:15:55

encode that in a

1:15:56

mathematical function? No. We're just making it up. Because like, meaning and

1:16:04

that's like a very,

1:16:05

that's like a deep thing. Like you either have a sense that you have it and

1:16:09

you're on track or

1:16:10

you're not. A reward is like, hey, Joe, do this and I'll give you a gold star.

1:16:15

Do that and I'll give

1:16:15

you two gold stars. Do this, I'll give you $100. And right now we have to

1:16:20

express

1:16:20

those decisions in a mathematical equation. Like ultimately, that's how, at

1:16:26

some level,

1:16:27

that's how brittle these things are. So how do you reduce meaning into math?

1:16:31

How do you do it?

1:16:32

We don't know. So what we'll do is have some ever more complicated

1:16:36

reward functions. We'll talk ourselves in circles about how it does

1:16:40

everything we need it to do.

1:16:41

That is, I think that's part of the problem. It's a huge part of the problem.

1:16:46

And then

1:16:46

at what point in time does it start coding itself? Now. Right? Now, right? So

1:16:51

ChatGPT5

1:16:53

has been essentially made by ChatGPT. Yeah. Right? So it's going to recognize

1:17:00

the ludicrous nature of some of its coding. Yeah. And it's going to go, why did

1:17:04

we do this?

1:17:05

Back to this example. They're going to be like, why did you write it this way?

1:17:07

Right. And it turns

1:17:07

out because humans are involved. Right. Right. It's like, I think we're

1:17:11

probably at the curve,

1:17:12

the part of the curve that's about to go like this. To your point. Yeah, the

1:17:16

hockey stick.

1:17:16

The hockey stick. Yeah. And that's a very scary proposition. Because then it's

1:17:22

a digital god.

1:17:24

Well, that means that we are all on a multi hundred day shot clock to answer

1:17:29

these questions.

1:17:30

Because it's not decades we're talking about. Right. It's maybe on the outside

1:17:35

two years.

1:17:35

So that's, what is that 700 days? Right.

1:17:39

And maybe it's less than that. So maybe it's like 400 days or 500 days. My

1:17:44

point is,

1:17:45

it's some number of hundreds of days, which means every day that goes by is a non-trivial

1:17:50

percentage.

1:17:51

That's a little crazy. So we have to sort these questions out.
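
[A quick sanity check on that shot-clock arithmetic, assuming the two-year outside window mentioned above:]

$$\frac{1\ \text{day}}{730\ \text{days}} \approx 0.14\% \qquad\qquad \frac{1\ \text{day}}{400\ \text{days}} = 0.25\%$$

[So on the shorter clock, every single day that passes burns a quarter of a percent of the runway.]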

1:17:56

But how can we sort these questions out if we are creating something that's

1:18:02

going to have

1:18:02

infinitely more intelligence than we have available as individual human beings,

1:18:08

and even collectively as a group of human beings? That's a really good question.

1:18:12

Because one of the things that Elon kind of freaked me out last time I talked

1:18:16

to him about Grok,

1:18:17

he was like, it just kind of freaks us out every couple of weeks. Like it's

1:18:21

growing and it's

1:18:23

capable of doing things. That's just shocking. Yeah. And no one's exactly sure

1:18:28

how it's doing it.

1:18:29

So, okay, this is an unbelievably important point. A lot of how this stuff

1:18:36

works is still a mystery to

1:18:38

most of us. So even when you're in it, like, it's almost like, like Joe, it's

1:18:43

almost like you can hit

1:18:43

the pause on the machine, but then like lift up the hood and look at the engine.

1:18:47

We still don't

1:18:48

understand why it's doing some of the shit it's doing. That's where we are.

1:18:53

That's the honest

1:18:54

truth of where we are. There's a lot of people that understand the theory, not

1:18:57

a lot, but enough.

1:18:58

There's people that know how to extend that. But sometimes you look at it and

1:19:04

you're like,

1:19:05

do we know why it did that? Question mark. Right. Is it thinking for itself?

1:19:10

But this goes back to what we said, like, why can't, I think part of it is like,

1:19:13

if we were a little bit

1:19:14

more honest and de-escalated the win-at-all-costs mentality in this specific thing, it

1:19:22

would be better for

1:19:23

everybody. So I think it's important to inspect what is the incentive that

1:19:26

causes all these companies

1:19:27

to be in it for themselves, where it must be me and nobody else. Like why? Like

1:19:34

why? It's a question

1:19:35

for you. Like, why is it so important? Do you think the top

1:19:40

seven or eight companies

1:19:42

couldn't get together and say, let's do this as a group? Like kind of like my

1:19:46

government code example,

1:19:47

we all inspect it together. Just, like, each team

1:19:54

drafts their Delta Force.

1:19:55

And we all just build the one model. And we, why, why can't that happen?

1:20:02

Because they would have to share resources. And then there's also this

1:20:06

hierarchy of like,

1:20:08

who is more successful currently? Exactly. Like what's the most ubiquitously

1:20:12

used?

1:20:12

Exactly. Right. Like what is it right now? It's ChatGPT, right? It's probably...

1:20:16

ChatGPT in consumer, Anthropic in enterprise. And as these things scale up,

1:20:22

like what would be the reason that they would want to bring in someone else? If

1:20:26

you have another innovative

1:20:28

AI company and you say, let's all get together and figure this out together and

1:20:33

share resources? If you,

1:20:34

if you thought that the risk was that meaningful, that's probably what you...

1:20:39

If you weren't a sociopath, and some of these people running these companies

1:20:42

are,

1:20:42

they demonstrate, they certainly demonstrate sociopath-like behavior.

1:20:46

Sociopathy. Yeah.

1:20:48

The other, the other thing that could be a little bit more banal is that they

1:20:53

also just love status

1:20:54

games. And this is the status game of status games. Yes. Right.

1:20:57

Attention. Right. Back to attention. Back to attention. Back to attention.

1:21:00

Right.

1:21:00

Dude, how many things in our life do we think just come back down to that?

1:21:04

A lot. A lot. I mean, what do young people want more than anything today?

1:21:09

Attention. To be famous. Attention. Yeah.

1:21:12

They want to be a content creator. They want to be clavicular. Yeah.

1:21:15

I mean, this is the number one thing when we ask kids what they want to do. It's

1:21:18

like...

1:21:19

Content creator. Yeah. Because it's like a clear path where you don't even have

1:21:23

to be exceptional.

1:21:24

Well, I think that they're responding. We designed a society for them that said,

1:21:30

here is the key incentive. Right.

1:21:32

It's attention. We never said it in those words. You never told your kids that.

1:21:36

Right.

1:21:36

I never told my kids that. But everything around them is bombarding them with

1:21:41

the same message.

1:21:42

Hey man, it's about attention. Attention is all you need. Like, you know what

1:21:46

the name of the critical

1:21:49

paper in AI is? Like when you go back to like the Magna Carta of AI, do you

1:21:53

know what it's called?

1:21:53

No. Attention is all you need.

1:21:55

Really?

1:21:57

Attention is all you need. That is the name of the fucking... of the white

1:22:03

paper. How crazy is that?
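
[For reference, the equation at the heart of that paper, Vaswani et al., "Attention Is All You Need" (2017), is scaled dot-product attention:]

$$\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V$$

[where $Q$, $K$, and $V$ are the query, key, and value matrices and $d_k$ is the key dimension.]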

1:22:05

Everything in our society, in subtle ways and, you know, bash-you-over-the

1:22:13

head ways,

1:22:15

tells you that attention is just the most precious asset. And so it's one of

1:22:21

the weirder things when

1:22:21

you go back to this concept that we're living in a simulation because...

1:22:25

This is what I mean.

1:22:26

It's also, it's like when you look at quantum physics, right? And the idea of

1:22:34

the observer

1:22:34

is that things function very differently when they're observed. The difference

1:22:39

between a particle and a wave.

1:22:40

Right. Like, if you pay attention to them, they behave differently.

1:22:44

They behave differently. Yeah.

1:22:45

Like, what is that?

1:22:46

Yeah.

1:22:47

Like, what...

1:22:47

Yeah, Schrödinger's cat.

1:22:48

Yeah. What is that?

1:22:49

Why is attention so important to us?

1:22:56

That is a really important question.

1:22:59

Right. And what is like the single best motivator in a negative way? It's

1:23:04

negative attention.

1:23:05

Like, that's the one thing that everyone fears more than anything is negative

1:23:09

attention.

1:23:10

Well, and then some people figure out that attention is an absolute value

1:23:14

function.

1:23:14

It doesn't matter if it's positive or negative. It's just like the sum total is

1:23:17

just great.

1:23:18

Right.

1:23:18

So if I get positive attention, great. Negative attention, great. If I can be

1:23:21

divisive,

1:23:22

then I can maximize both sides of that equation. And, you know, you're rewarded

1:23:27

for that at scale.

1:23:28

You are, but you're also, because you're inauthentic, you experience a

1:23:34

tremendous amount

1:23:35

of negative attention. Yeah.

1:23:36

And then you have this bad feeling that comes with negative attention,

1:23:41

versus

1:23:41

primarily positive attention, which is a good feeling. Yeah.

1:23:45

So it's this, it's letting you know you're on the wrong track in some sort of

1:23:49

weird primal way,

1:23:50

like in our code, like the negative attention. It's like, like, what's the

1:23:55

original version of

1:23:56

that? It's like the reason why people fear public speaking is because initially

1:24:01

in a tribal situation,

1:24:03

if you're talking in front of the group of 150 people in your tribe, it's

1:24:07

probably because they're

1:24:07

judging you and you fucked up and you've got to make some sort of a case why

1:24:11

they don't kill you.

1:24:12

Right. Right. This is why everyone, this is the fear of public speaking. That's

1:24:16

where it comes from.

1:24:17

That's encoded in our genes. It goes back thousands of years. Yeah.

1:24:21

Public speaking wasn't the positive act. It was defend yourself before we kill

1:24:25

you.

1:24:25

Exactly. Exactly. And the worst, yeah, that's fascinating.

1:24:30

It is fascinating. That makes a ton of sense.

1:24:31

It does. Yeah.

1:24:32

Right. Well, why else would it be so terrifying? Yeah.

1:24:34

I thought of that the first time I ever did standup. I was like, why am I so

1:24:37

scared?

1:24:38

It was very strange because I had fought probably a hundred times in martial

1:24:43

arts tournaments. Like,

1:24:43

why, why was I so scared of this?

1:24:45

But I was, I was terrified for, and it didn't make any sense because negative

1:24:52

attention.

1:24:52

Right.

1:24:53

You know, bombing on stage because all these people are judging you in a

1:24:57

negative way.

1:24:58

It feels unbelievable.

1:25:00

What is your...

1:25:00

It should be like once it's over, like, well, that sucked. Let it go.

1:25:04

It's not. You like, you sit with it. You go to bed at night. You think about it.

1:25:08

Do you have a batting average? Like, meaning like, is it, is it like a fixed

1:25:11

percentage

1:25:12

of your shows that bomb, independent of the people, the moment?

1:25:15

No, the real problem, and every comic faces this, is once you've

1:25:21

developed an act,

1:25:22

and then you put out a special, then you start from scratch. That's where even

1:25:26

the greats,

1:25:27

Louis CK, Chris Rock, Dave Chappelle, they all bomb. Everybody bombs during

1:25:31

that process.

1:25:32

Because you're just working your craft.

1:25:34

It's all new stuff. Like, it's, I wouldn't say bomb, but you don't have great

1:25:38

shows.

1:25:40

Like, I've watched the greats work out new material. Like, you go up with ideas.

1:25:46

You go up with, like, you might get some giggles. You might get some laughs.

1:25:49

Some bits hit hard.

1:25:51

Some bits are great right out of the chute, and some of them, you have to

1:25:54

fucking figure it out.

1:25:56

And in that process, you're going to get negative attention.

1:25:59

Right.

1:26:00

Because it's not working.

1:26:01

Right.

1:26:01

It's not, it's not happening.

1:26:03

Kevin, uh, Kevin Hart.

1:26:06

He has this funny fucking story where he was, like, working new material, and he was,

1:26:10

like,

1:26:10

doing some small show, and he had the shits.

1:26:12

Oh, no.

1:26:14

On stage, and he's like, I got to land this thing, because I got to figure out

1:26:17

if people want to hear it.

1:26:18

So he just, he wrapped his jacket around himself.

1:26:22

And shit himself?

1:26:23

Shit himself.

1:26:23

Oh, my God.

1:26:24

It's so, it's so funny.

1:26:25

But he tells that story, and that's the bit that works.

1:26:28

Oh, my God.

1:26:29

That's hilarious.

1:26:29

It's so funny.

1:26:30

That's hilarious.

1:26:31

It's so funny.

1:26:32

Yeah, well, honesty is currency, you know, in that world.

1:26:37

Especially honesty, where you look stupid, and people can relate.

1:26:40

Well, this is where, like, I think, like, Elon subtly has figured this out,

1:26:44

which is, like,

1:26:44

there's attention, but then there's just authenticity.

1:26:48

And if you can be yourself, and you can hit the seam properly, you just get

1:26:54

infinite attention.

1:26:56

Yes.

1:26:56

And that's, like, a real mindfuck, too, I think.

1:27:00

Right.

1:27:00

He doesn't seem to have a hard time with, like, being criticized.

1:27:05

It doesn't seem to bother him that much, as long as he's just being himself.

1:27:08

Like.

1:27:09

I think he's, like, two steps ahead.

1:27:12

Like, there are things, like, you know, somebody tweeted yesterday or the day

1:27:18

before or something,

1:27:19

like, he controls 2.7% of GDP or something.

1:27:23

Right?

1:27:24

He's got, like, $800 billion.

1:27:25

It's so crazy.

1:27:26

It's crazy.

1:27:27

It's so crazy.

1:27:28

So nuts.

1:27:28

And it was, like, a comparison to John Rockefeller.

1:27:31

John D. Rockefeller, who controlled something similar at the time.

1:27:33

And his is the first comment.

1:27:35

He's, like, $10 trillion or bust.

1:27:39

Obviously, people lose their mind.

1:27:43

Right.

1:27:43

People just fucking lose their mind.

1:27:45

Right.

1:27:45

On both sides.

1:27:47

So this one side is, like, think of the abundance and the incredible stuff we're

1:27:51

going to get if he

1:27:51

can get us the $10 trillion.

1:27:53

And other people are, like, you can't hold a third of the economy in your hand.

1:27:56

And everybody goes crazy.

1:27:58

And I'm, like, this guy's a fucking genius.

1:28:01

Like, how, you would never have, like, I mean, how would you even have the

1:28:05

courage to tweet

1:28:06

something like that?

1:28:06

It just seems, like, so crazy.

1:28:08

It really helps if you own Twitter.

1:28:10

Right?

1:28:12

Because if you did it in another format, like...

1:28:15

You'd get excoriated.

1:28:16

Not only that, well, there was a real chance that you'd actually get banned

1:28:20

from the platform

1:28:21

at one point in time.

1:28:21

Yeah.

1:28:22

For many of the things that he's posted, he would have gotten banned on pre-2020

1:28:26

Twitter.

1:28:27

Yeah.

1:28:27

Yeah.

1:28:29

Or whatever year it was that he purchased it.

1:28:32

Yeah.

1:28:32

Negative attention.

1:28:34

Attention, period.

1:28:37

Like, so it brings back to this idea of a simulation.

1:28:40

Like, why is what humans focus on such a massive part of what's valuable to us?

1:28:49

And sometimes what we focus on is not valuable.

1:28:53

As you were talking about, like, the things that really matter in your day-to-day

1:28:56

life

1:28:57

or that actually affect you versus the things that are in the public

1:29:00

consciousness.

1:29:01

Like, UFOs is a great example.

1:29:04

UFOs.

1:29:04

It's not really fucking...

1:29:06

I mean, ultimately, it may.

1:29:08

So there's this thing that we all have, like, recognizing the potential for

1:29:13

danger.

1:29:14

Right?

1:29:14

Like, what's that sound?

1:29:15

What is that?

1:29:15

It might be nothing, but it might be something.

1:29:17

Go look.

1:29:18

So look, if you and I were designing a video game,

1:29:21

we'd probably sit there and say, okay, we got to get from point A to point B,

1:29:25

but to make it fun, we're going to put all these little distractions and honey

1:29:28

pots along the way.

1:29:30

Yeah.

1:29:30

And what they should be doing is accumulating resources to get over the river

1:29:33

and then accumulating, you know, weapons to fight these other guys.

1:29:37

But instead, we're going to put this, like, little thing over here

1:29:40

and this other thing over there, and you could easily get distracted.

1:29:43

And some people will have to...

1:29:44

They'll just fucking beeline right to the end of it.

1:29:46

They'll, you know, they'll get to the end boss.

1:29:48

And I feel like that's kind of what we're tasked with doing every day.

1:29:55

We're tasked with...

1:29:56

We know what's important, maybe deeply in our DNA.

1:30:00

And then we have all this stuff that we're supposed to pay attention to.

1:30:03

And I think increasingly the game is

1:30:06

tell yourself that that's actually not the thing that matters.

1:30:11

It's almost like working against you and figure out what this other stuff is

1:30:16

and focus on that and fix that.

1:30:18

Like politics is a game that I think distracts like left and right.

1:30:25

It's so stupid and it's breaking down.

1:30:26

And it's breaking down because now it's like...

1:30:29

It's actually like you're more likely to find alignment based on age

1:30:32

versus by political orientation.

1:30:34

Like people who are 30 and younger, it doesn't matter what they identify as,

1:30:37

they all believe in the same shit.

1:30:38

A lot more...

1:30:40

Really?

1:30:41

Yeah, meaning like if you ask their views on social policy, taxation, Israel,

1:30:47

if you ask their views,

1:30:48

what you find is now a convergence between the left and the right, if you

1:30:54

divide it by age.

1:30:56

At our age, it's still much more about...

1:30:59

But not completely uniform.

1:31:01

No, it's not completely uniform.

1:31:02

But my point is it was simpler in the past to organize people independent of

1:31:09

age

1:31:10

by political orientation.

1:31:12

That simplicity is gone.

1:31:13

Well, isn't that because of also a breakdown in trust of all government in

1:31:17

particular?

1:31:18

So the breakdown in trust, which is also a lot of it is because of our access

1:31:23

to information now.

1:31:24

We understand how corrupt politics are.

1:31:26

Yeah.

1:31:26

We understand insider trading now in Congress.

1:31:29

We understand how different people flip-flop on issues.

1:31:33

We understand how the Democrats in 2008 used to view illegal immigration,

1:31:38

which is essentially MAGA plus.

1:31:40

It's MAGA on steroids versus the way they look at it today.

1:31:46

Like why is that?

1:31:47

Well, because it's all game.

1:31:49

It's all game.

1:31:49

A power, influence, and attention game.

1:31:52

An attention game.

1:31:53

Yeah.

1:31:53

It's very fucking strange.

1:31:55

Yeah.

1:31:55

But it's all moving us in a general direction.

1:31:58

And that general direction is access to innovation.

1:32:01

It's all...

1:32:02

I've said this a lot of times and the people have heard it before.

1:32:05

I apologize.

1:32:06

But if you looked at the human race from afar, if you were something else,

1:32:10

you'd say, well, what does this species do?

1:32:12

Well, it makes better things constantly, even if it doesn't need them.

1:32:15

Like, you know, if you have an iPhone, I have a 16, you have a 16.

1:32:19

You know, I have a 17.

1:32:20

I bought it.

1:32:21

I haven't even fucking turned it on.

1:32:23

I haven't plugged it in.

1:32:23

I'm gonna eventually, eventually I'll fucking plug it in and fucking swap

1:32:28

everything over

1:32:29

and figure out where my fucking passwords are.

1:32:31

But the reality is you don't need it, but you want it.

1:32:35

And it's going to keep getting better every year.

1:32:36

Why?

1:32:37

Why?

1:32:37

Because that's what we're obsessed with.

1:32:38

Yeah.

1:32:38

This also aligns with materialism.

1:32:41

Like, for a finite lifespan, why are people, like, including old people, so

1:32:47

obsessed with gathering stuff?

1:32:49

Well, because that fuels innovation.

1:32:52

Because if there's no new things coming, there's no motivation to get the

1:32:57

newest, latest, greatest thing.

1:32:59

And ultimately, what that leads to is greater technology, which ultimately

1:33:03

leads to artificial intelligence.

1:33:05

My slight deviation from that is I think sometimes people accumulate things

1:33:09

because it's a status game.

1:33:11

And that's because they get more attention.

1:33:13

You have a Ferrari, you get attention.

1:33:15

Right, but what does that do?

1:33:17

It makes Ferrari make better Ferraris.

1:33:19

And all technology moves in the same general direction.

1:33:23

No one company says, "This is it.

1:33:27

This is what we make.

1:33:28

It's perfect."

1:33:28

So you think people innately feel that by being a part of this kind of, like,

1:33:33

consumerist, capitalist system, they're contributing to progress?

1:33:37

I don't think they innately feel it, but I think that's ultimately the result.

1:33:40

What happens?

1:33:41

That's ultimately the result, and it seems to be universal.

1:33:44

And it seems to be constantly moving in this one general direction, which is

1:33:49

better and better technology.

1:33:50

But like the stage fright example, you don't think it's encoded in our DNA,

1:33:54

this idea of, like,

1:33:55

"Wow, when I am a part of this in some way, shape or form, just things seem to

1:33:59

get better and I want to be a part of that."

1:34:00

Like, do you think that that's possible, that that's encoded in us?

1:34:03

I think it motivates us to the ultimate goal.

1:34:08

And that ultimate goal, I think, is that human beings constantly make better

1:34:12

stuff.

1:34:12

Whatever it is.

1:34:13

Better buildings, better planes, better cars, better phones, better TVs, better

1:34:19

computers, better everything.

1:34:20

Artificial life.

1:34:22

That might be the whole reason why we're here.

1:34:25

And the way I've always described it is that we are a biological caterpillar

1:34:32

that's making a digital cocoon.

1:34:34

And we don't even know why we're going to become a butterfly, but we're doing

1:34:38

it.

1:34:38

We're doing it and we're moving towards it.

1:34:40

And it might be what happens to all life all throughout the universe.

1:34:43

And it might be why these so-called aliens or whatever the fuck they are, it

1:34:48

might be us in the future,

1:34:50

it might be other versions of human beings that have gone past whatever this

1:34:55

period of development

1:34:56

that we're currently involved in right now.

1:34:59

This just might be what happens.

1:35:01

This is what life always does.

1:35:03

It might realize that biological life, which is very territorial and primal and

1:35:08

sexual and greedy,

1:35:10

and it has all these problems with human reward systems, ultimately develops

1:35:15

into this other thing.

1:35:17

Right.

1:35:18

And then that's what we're doing.

1:35:19

And then we're in the process of that right now.

1:35:21

And I think that when, if and when, not if, but when, when we colonize Mars, I

1:35:26

think that that,

1:35:27

that new world order actually has the best chance to take shape.

1:35:30

You know, there's a lot of people that think that Mars was already colonized at

1:35:34

one point in time.

1:35:34

That life already existed.

1:35:36

What, what, what?

1:35:37

That life already existed on Mars like many millions of years ago.

1:35:40

And that there's evidence of structures on Mars that's really weird stuff.

1:35:45

Have you ever seen the, the square that they found on Mars?

1:35:48

No.

1:35:49

Okay.

1:35:49

Show them to them, Jamie.

1:35:50

One of the things that they're finding with scans of Mars, there's like

1:35:54

geometric patterns

1:35:55

and structures and right angles that shouldn't exist.

1:35:58

Like weird stuff.

1:35:59

That couldn't be natural.

1:36:00

No, no way weirder.

1:36:02

Way weirder than like the face on Cydonia.

1:36:05

The Cydonia thing is interesting.

1:36:06

Yeah.

1:36:07

Um, and then this one, look at that.

1:36:09

What the fuck is that?

1:36:10

It looks like a home of some kind or something.

1:36:13

Some enormous structure.

1:36:14

Yeah.

1:36:14

And the size of that, they don't know exactly.

1:36:18

But it may be as large as several kilometers or as small as several hundred

1:36:24

meters.

1:36:24

But they're not exactly sure.

1:36:25

But what they are sure is that it has very weird right angles.

1:36:28

And right angles that seem to be uniform in size.

1:36:32

That's crazy.

1:36:35

Like, see how it's highlighted in the enhanced photograph in the upper left?

1:36:38

Like, what is that?

1:36:40

But sorry, did they, and were they able to send like the rover over there?

1:36:43

Or no, it's too far away.

1:36:44

I don't think it's in the exact place where the rover's at.

1:36:47

But they're able to get images of these things.

1:36:50

And there's several of these things.

1:36:52

That's insane.

1:36:53

Yeah.

1:36:53

There's a lot of weird stuff.

1:36:55

There's a lot of weird stuff there.

1:36:57

So there's also like ancient civilizations that have these myths of us existing

1:37:03

somewhere else

1:37:04

and coming here.

1:37:05

Right.

1:37:05

But you have to think, if human beings develop somewhere else and they reach

1:37:13

some high level

1:37:14

of sophistication and then they experience some cataclysmic disaster that

1:37:18

completely destroyed

1:37:19

their environment, which is what Mars is, right?

1:37:21

So let's assume that Mars was at one point in time, it was habitable and that

1:37:27

life existed.

1:37:28

And we know it was at one point in time.

1:37:30

We know there was water on Mars.

1:37:31

We know there's some sort of evidence of at least some sort of a very primitive

1:37:36

biological

1:37:37

life on Mars.

1:37:37

Yeah.

1:37:38

If they got to a point where they said, hey, this fucking place is falling

1:37:42

apart,

1:37:42

but this earth spot looks pretty good and they go there.

1:37:46

But then cataclysms happen on earth and no one remembers because all your

1:37:50

information's

1:37:51

on hard drives and then you have to rebuild society.

1:37:54

And so you're re-remembering.

1:37:56

And so you have all these myths of how everything started, you know, whether it's

1:38:01

Adam and Eve

1:38:02

or the great flood or whatever these things are that we pass down through oral

1:38:06

tradition

1:38:06

for hundreds of years and then eventually write it down.

1:38:09

And then people try to decipher what it means.

1:38:11

And they sit in church and try to go over what, what did it mean?

1:38:15

Like, what does this mean?

1:38:15

Like, what, what is the, what is the re the real origin of all these stories?

1:38:19

We don't know.

1:38:20

I mean, that's crazy.

1:38:23

It's crazy.

1:38:24

But if life, it sounds nuts.

1:38:26

Why would life, life couldn't possibly exist on Mars?

1:38:28

How the fuck does life exist on earth?

1:38:30

How about that?

1:38:32

How about why, why would we assume that it wouldn't have existed at one point

1:38:35

in time?

1:38:36

And Terrence Howard, who is a very interesting guy.

1:38:39

Very interesting.

1:38:39

And got some, I mean, that thing with Eric Weinstein, crazy.

1:38:43

Yeah, crazy.

1:38:44

Yeah, that one was crazy.

1:38:45

Yeah.

1:38:45

And him alone.

1:38:46

But he's got some fucking weird ideas that just make you go.

1:38:50

He's a very brilliant guy and, you know, kind of a strange heterodox thinker.

1:38:55

And one of his ideas is that planets get to a certain distance from a sun and

1:39:05

they people.

1:39:06

And that it gets to a certain climate and a certain distance.

1:39:10

And his idea is that, I don't know if you realize, there's a

1:39:16

giant, um, coronal mass ejection that just

1:39:23

happened recently

1:39:24

on the sun.

1:39:25

And they're very concerned about it.

1:39:26

They don't know what's going to happen.

1:39:27

It happens all the time.

1:39:28

The sun releases these giant chunks of material.

1:39:32

Yeah.

1:39:32

And he thinks that these materials get far enough away from the planet,

1:39:36

or rather, far enough away from the sun, and then

1:39:40

they coalesce into planets.

1:39:42

And as time goes on, they get a further and further distance from the sun.

1:39:46

And then obviously they get hit with asteroids and there's panspermia and water

1:39:51

gets into them

1:39:52

from comets.

1:39:53

And then they develop oceans and they develop biological life.

1:39:57

And when they have a certain amount of distance from the sun, they people.

1:40:01

And he thinks that as they get further and further and further away, they get

1:40:05

less and less habitable.

1:40:07

And then they get to a point where they have their technology to a point where

1:40:12

they realize like,

1:40:12

we can't sustain life on this planet anymore.

1:40:15

We got to go to that other one.

1:40:17

And so they go to the one that's closer to the sun because they're too far now.

1:40:21

It's a nutty idea.

1:40:24

Jesus Christ.

1:40:24

It's a nutty idea.

1:40:25

But if you think about how recent our sun is in terms of the solar system

1:40:31

itself,

1:40:32

or rather, in terms of the galaxy itself.

1:40:33

So if the Big Bang is correct and our universe erupted from nothing,

1:40:39

or

1:40:39

rather from a very small thing, 13.7 billion

1:40:45

years ago.

1:40:46

Well, this fucking planet is only four point something billion years old.

1:40:50

Yeah.

1:40:50

Right.

1:40:51

And life is only, you know, a little bit less than that.

1:40:54

Yeah.

1:40:54

So you have like a billion years or so, there's nothing.

1:40:57

And then you start getting single celled organisms, multi-celled organisms,

1:41:00

and eventually it peoples.

1:41:01

And when it gets to a certain point when these people have advanced their

1:41:06

curiosity and their

1:41:07

innovation to the point where they can harness space travel and they use zero

1:41:12

point energy

1:41:13

and they have a bunch of different things that we haven't invented yet.

1:41:15

And then their environment degrades.

1:41:18

And it gets to the point where they realize like, hey, we're getting pummeled

1:41:21

by asteroids.

1:41:22

We can't sustain life here anymore.

1:41:24

We got to move.

1:41:25

Like Elon wants to go to Mars, which might be the wrong answer.

1:41:28

We might want to go that way.

1:41:29

We might want to go closer to the sun.

1:41:32

Exactly.

1:41:32

I mean, the thing is he's got everything that he needs now to get there.

1:41:37

Like I, I'm not going.

1:41:38

Are you going?

1:41:39

I would go.

1:41:40

Fuck that.

1:41:40

I'll wait.

1:41:42

I'll send you an email.

1:41:42

Hold on a second.

1:41:43

Think about, think about what he's going to take.

1:41:45

Okay.

1:41:45

Look, let's just say he gets there with the city.

1:41:49

He has the way to transport us there.

1:41:53

Right.

1:41:55

Then when you land, he's got the way to actually transport us around on

1:42:01

the planet.

1:42:02

Right.

1:42:02

He's got Tesla.

1:42:03

Right.

1:42:03

He will have already sent a fleet of his robots.

1:42:07

Those folks will have made some habitable city, probably using the Boring

1:42:13

Company drill,

1:42:14

because you're going to, you know, be under the regolith.

1:42:16

You don't want to be on the top.

1:42:18

Maybe you just dig a hole and you inhabit down there.

1:42:21

He's got all the ways to make energy.

1:42:24

He has the AI to help you design the stuff.

1:42:28

He has the means to communicate.

1:42:30

He's got the internet, his own internet.

1:42:32

Right.

1:42:33

So he can get, you know, all of the information to everybody.

1:42:36

And then he's got money in the super app so that you can transact.

1:42:41

And then I think to myself, like, what is he actually missing?

1:42:45

And then what happens if he gets there first, is he just allowed to

1:42:49

do whatever he wants?

1:42:51

Like, is it just kind of like a free for all?

1:42:53

Like, it'd be kind of his constitution, like, is that what happens?

1:42:57

Well, it's like earth, but shittier.

1:42:59

Like we already have all those things here.

1:43:01

Why would you want to go to a place where you die when you go outside?

1:43:03

I think what people will be attracted to is that if he publishes his version of

1:43:07

what the rules are there,

1:43:08

there's a chance that he could make them really different to what the rules are

1:43:12

here.

1:43:12

Like what kind of rules would you do if you were the king of Mars?

1:43:15

So I think that your view is incredibly, to me, like, positive-sum, of

1:43:24

humanity, like,

1:43:25

we want to make things better.

1:43:26

So if I think about that as like a function, what happens?

1:43:30

That's like, so our natural direction is forward.

1:43:33

What pushes back on that?

1:43:34

And a lot of it, what you find is like government, regulation, rules, all that

1:43:37

stuff.

1:43:38

Greed.

1:43:38

Greed, too much focus on attention.

1:43:42

Right.

1:43:42

So I would try to experiment with like what the incentives would have to be so

1:43:47

that

1:43:47

you had more unfettered entrepreneurs, like just like do the thing that you

1:43:51

think is right.

1:43:52

Right.

1:43:52

And there's a mechanism where we give you the ability to then make things for

1:43:57

more people

1:43:57

because you're proving that you're actually really good at making things.

1:44:00

And if you don't need money at that point in society,

1:44:04

reorienting us away from this kind of like

1:44:07

brittle form of exchange to something more useful, that's worth experimenting

1:44:12

with.

1:44:12

I think that's an important-

1:44:13

Well, there's also the concept of the self, of the individual,

1:44:17

which may erode with technological innovation.

1:44:20

So if we really can read each other's minds, if we really do get to a point

1:44:26

where we're communicating

1:44:27

through technologically assisted telepathy, like a lot of the weirdness

1:44:36

of people is,

1:44:36

"I don't know what you're thinking.

1:44:37

I don't know if I should trust you.

1:44:39

You know, this motherfucker might be devious."

1:44:42

You know what I mean?

1:44:43

Well, we'll know.

1:44:45

Right.

1:44:45

And there will be no need for all that if we really are all one,

1:44:49

if that's ultimately something that could be achieved with technology.

1:44:53

Like this hive mind.

1:44:54

Yes.

1:44:54

Like legitimate hive mind.

1:44:56

And then like, look where society's going.

1:44:58

Gender's kind of falling apart.

1:45:00

People are reproducing less, right?

1:45:03

People have less testosterone, more miscarriages, they're less fertile.

1:45:07

We're kind of moving into this genderless direction.

1:45:10

And I don't know if it's by design, but microplastics and phthalates and all

1:45:19

these different

1:45:20

chemicals that are endocrine disruptors are all ubiquitous in our society.

1:45:23

Well, is that a coincidence that that's all happening at the same time as

1:45:28

technological

1:45:28

innovation on a mass scale?

1:45:30

Is it?

1:45:31

I don't know.

1:45:31

Because like, what's the one thing that's holding us back?

1:45:34

Well, that we're territorial primates with thermonuclear weapons and that we

1:45:40

exist in a sort

1:45:42

of tribal mindset, yet we do it on a planet of 8 billion people.

1:45:46

Yeah, no, no.

1:45:46

The key differentiator of humans is our ability to enact violence.

1:45:51

Yeah.

1:45:51

To methodically execute premeditated violence.

1:45:55

Yes.

1:45:55

And greed and attention.

1:45:59

Attention.

1:45:59

And one of the things that attention drives is sexual preference, or rather sexual

1:46:05

attention, like

1:46:06

the ability to procreate, the ability to acquire mates, right?

1:46:10

Like the more resources you have, the more attractive you'll be, especially for

1:46:13

males.

1:46:14

And males are the ones that are involved in the violence in the first place.

1:46:17

You know, I can't name a single war that was started by a woman.

1:46:21

How do you teach your kids that attention is not everything?

1:46:26

That's a good question, especially in this society.

1:46:31

It's probably harder to do that now than ever before.

1:46:34

Because the reaction that I suspect most kids will have is like, stop, like,

1:46:40

leave me alone.

1:46:41

Like, it's just, it's almost an impossible thing.

1:46:44

Well, I think kids learn more from their parents' behavior than anything you

1:46:50

say to them.

1:46:51

I think they learn from the way you behave and the way you exist and the way

1:46:57

you exist with them.

1:46:58

And if you are constantly whoring yourself out for attention, it's one thing if

1:47:06

you get a lot of

1:47:06

attention from what you do, but if that's your primary goal, they're going to

1:47:10

know.

1:47:10

Do your kids know how famous and influential you are?

1:47:13

Like, honest question.

1:47:15

Oh yeah, they know.

1:47:16

But do they have a real sense of it, or do you just kind of like, it is what-

1:47:19

As much as they can.

1:47:21

I mean, how can you?

1:47:22

It's got to be weird as fuck growing up with a very famous dad.

1:47:25

It's very odd, but it's not my primary goal.

1:47:28

Yeah, that's my point.

1:47:29

You're not putting it in their face.

1:47:31

No.

1:47:31

So to your point, you're not modeling "attention is all you need."

1:47:34

No, no.

1:47:35

I have interesting conversations with cool people.

1:47:38

I tell jokes and I call fights.

1:47:43

Like those are the things that I do.

1:47:45

And they also know that I have a very strong work ethic and that I work towards

1:47:49

things.

1:47:49

So they have very strong work ethics.

1:47:51

They're very motivated and disciplined, like shockingly disciplined.

1:47:54

And I think that's modeled.

1:47:56

I think that that comes from, and they also like really enjoy achieving goals

1:48:01

and

1:48:02

they're rewarded for it with praise and with admiration.

1:48:07

But never with, like, you're better than other people.

1:48:11

Yeah, never, never.

1:48:12

Like, the idea is like all human beings are capable of greatness.

1:48:17

So it's like, find the thing that you excel at.

1:48:21

And if you throw yourself into that, it's very rewarding.

1:48:23

I really, I really believe in this.

1:48:25

I tell this story when I interview people. When I interview people, I'm always

1:48:29

like, you know,

1:48:29

just whatever the company, I first only want to know about them.

1:48:33

I'm like, fuck your resume.

1:48:35

Like, tell me about your parents and how you grew up.

1:48:38

I just want to know that.

1:48:39

Stop at 18, everything before 18.

1:48:41

Just tell me every little detail, you know, and some people tell me these

1:48:45

incredible stories.

1:48:46

They'll be like, you know, my mom was an alcoholic or this or that.

1:48:49

And I'm just like, man, this is so valuable because it allows me to understand

1:48:55

who they are.

1:48:55

The second part of the interview, we do the business shit.

1:48:58

But the third part, I tell this story.

1:49:00

This is a crazy story about what you're just saying.

1:49:02

They ran this experiment at Stanford where they take like a big bowl, fill it

1:49:07

with water,

1:49:08

and they drop in a mouse, and they measure how long it takes for the mouse to

1:49:12

drown.

1:49:13

They do it like 100 times.

1:49:15

The average was about four minutes, call it four, four and a half minutes.

1:49:18

Then they run the experiment again, 100 mice, and at minute three or three and

1:49:25

a half,

1:49:25

they take it out, they dry it off, they play music, and they whisper

1:49:28

like sweet nothings into the mouse's ear.

1:49:31

They drop the mouse back in the water.

1:49:32

And those next 100 mice tread water for 60 hours on average.

1:49:40

And the upper bound was 80.

1:49:42

And I thought to myself like, that is all just potential right there.

1:49:48

Like that's all like, there's all this latent potential.

1:49:50

So if an animal has it, I'm going to assume that humans have it too.

1:49:54

But you never get a chance to unlock it.

1:49:56

Like the average person is just kind of like living a life where they're maybe

1:49:59

scratching

1:50:00

five or 10% of their potential.

1:50:01

And the question is, how do you get to that other 90%?

1:50:04

Like how do the second batch of mice tread

1:50:07

water for 60 hours?

1:50:09

Well, it doesn't make any sense to me.

1:50:10

Well, the same mice, right?

1:50:13

I think the mice get rescued, and then when they try it again, those same mice

1:50:20

last longer, right?

1:50:21

So it's the same mice.

1:50:23

So it's an experience.

1:50:24

So they have experience now.

1:50:27

They understand that they can tread water.

1:50:29

Where they didn't die.

1:50:30

So they understand that they can survive.

1:50:33

Where they didn't know that they could survive the first time they were thrown

1:50:35

into the water,

1:50:36

because they'd never been thrown into water before.

1:50:37

Right.

1:50:37

That's the same thing that happens to people when they fight.

1:50:40

Like the first time people ever have a competition, they fucking panic.

1:50:45

And they get really scared.

1:50:46

And they get really like filled with anxiety.

1:50:49

But after a while, you get relaxed.

1:50:52

And that's when you get really dangerous.

1:50:54

Because then you get calm.

1:50:55

And you can keep your shit together while you're in the middle of all this

1:50:59

chaos.

1:50:59

Because you have the experience of it.

1:51:01

Without the experience of it, very few people do well the first time.

1:51:04

Unless you're exceptionally talented and you have other competition experience.

1:51:10

Like you've competed in other things.

1:51:11

Like maybe you played football or some other things.

1:51:13

And you know what it's like to actually perform under pressure.

1:51:16

What is the version of giving more humans a chance to get to that?

1:51:21

Well, I think sports are really good for that.

1:51:25

Because you're performing with people paying attention to you.

1:51:28

And performing where people are trying to stop you from doing something.

1:51:31

And you're trying to do something.

1:51:33

And there's all these unknowns.

1:51:34

And recognizing that hard work allows you to do whatever you're trying to do

1:51:40

better than you previously had.

1:51:42

One of the things my martial arts instructor said to me when I was young is

1:51:45

that martial arts are a vehicle for developing your human potential.

1:51:50

And that through this very difficult thing that you're trying to do,

1:51:53

you're learning that, oh, if I just think smart and think hard and train wise

1:52:02

and train hard and discipline myself to endure suffering.

1:52:06

So that I can develop more endurance and more speed and more power and more

1:52:10

technique.

1:52:10

Because I accumulate all this information.

1:52:12

And I really think about what it is and apply it with drills and with training.

1:52:16

I can get better at this thing.

1:52:18

And every time I get better at this thing, I get rewarded psychically, like

1:52:20

mentally.

1:52:21

You feel better.

1:52:22

Like I know that I'm better now.

1:52:24

And then there's the belt system.

1:52:25

Where you start off, you're a white belt.

1:52:27

And in Taekwondo, you get a blue belt.

1:52:29

And then after you get a blue belt, you get a green belt.

1:52:31

Or do you get a green belt first? I forget how it goes.

1:52:35

And then it's red belt and black belt.

1:52:37

And like when you're a black belt, you're like, holy shit.

1:52:39

So it's this thing where you've developed to a point where you've gotten to

1:52:43

this next stage.

1:52:44

So all along the way, you've been rewarded for your hard work.

1:52:47

And then you realize like, oh, I could do this with everything in life.

1:52:50

Is a reward different than attention?

1:52:52

It is.

1:52:53

It is because it's internal, right?

1:52:55

You're realizing that you could apply this to whatever it is, to carpentry, to

1:53:03

music.

1:53:03

It's just a matter of focus and attention.

1:53:07

And some people, unfortunately, never find a vehicle.

1:53:11

They never find a thing that they can throw themselves into.

1:53:14

They realize like, and this is not unique.

1:53:17

It's not like I'm an unusual person or anybody is.

1:53:22

I mean, there's people that have unusual physical gifts and some people have

1:53:25

unusual mental gifts.

1:53:27

But the reality is, no matter where you start, everyone can get better.

1:53:32

And when you do something, whether it's learning to play guitar, as you get

1:53:35

better at it,

1:53:36

you realize like, oh, this is what it's all about.

1:53:39

Yeah.

1:53:39

Like it's really all about applying yourself to something and then feeling this

1:53:43

immense

1:53:44

satisfaction of your hard work paying off.

1:53:47

And that motivates you to work hard at other things.

1:53:49

And if you don't find that early on, it's very difficult to like find like real

1:53:55

satisfaction.

1:53:56

Yeah.

1:53:57

In life.

1:53:58

Yeah.

1:53:58

I've always had something outside of my daily life.

1:54:03

That is the thing that I actually care about.

1:54:06

And it actually energizes me for my day to day life.

1:54:09

I don't know if that's like a lot of people, but what do you do?

1:54:11

What's your life?

1:54:12

Well, initially it was poker.

1:54:14

And I, and even now I obsess about the game, um, because it's infinitely more

1:54:20

complex

1:54:21

than chess. Like chess, you can get to a place where you can roughly be good.

1:54:24

Poker, though.

1:54:26

It's just constantly, there's just too many variables.

1:54:29

There's human emotion.

1:54:30

There's human psychology.

1:54:31

The number of people, all of this stuff just makes the complexity of the game

1:54:35

something that I find magical.

1:54:38

And so I sit there and I try to understand like, why am I doing the things that

1:54:43

I'm doing?

1:54:43

And so much of it comes back to being a mirror about what's happening in my

1:54:46

daily life.

1:54:47

It's the fucking craziest thing.

1:54:49

Like I'm super insecure.

1:54:51

I'll go into poker and I will just lose for weeks at a time.

1:54:54

But it's because I'm insecure in my daily life.

1:54:56

And what's happening is that I'm trying to find these quick wins and quick

1:55:00

solutions

1:55:00

because I'm in a state of insecurity.

1:55:03

I'm anxious.

1:55:04

I have this anxiety.

1:55:05

And so it's become a great mirror for me.

1:55:07

So that used to be a thing.

1:55:09

It still is a thing.

1:55:10

But I've become reasonably skilled at it where the edges are smaller.

1:55:16

And I put myself in positions where I'm only playing against a certain group of

1:55:20

people.

1:55:20

And I'm the losing player, frankly, in that game, when I'm playing against

1:55:24

like the top pros,

1:55:25

but it helps me and I can get tuned up for it.

1:55:30

But then I started to, you know, I would take different things.

1:55:33

I tried to learn how to ski, basically impossible when you're older.

1:55:36

I look like a fucking idiot.

1:55:37

How old were you when you tried?

1:55:39

Uh, I started when I was like, you know, I was a good snowboarder.

1:55:43

So I was snowboarding my whole life.

1:55:44

And then my kids skied.

1:55:46

And so I'm like, okay, well, I want to do this as a family.

1:55:49

So I was like 42 or something when I tried.

1:55:51

I'm 49 now, almost 50.

1:55:52

That's brutal.

1:55:54

I mean, it's like, I look like a fucking idiot.

1:55:55

Like, it's like this gangly giraffe, like trying to get down the mountain.

1:56:00

And then now I started golf and man, I got to tell you, I used to play a little

1:56:06

bit.

1:56:06

Then I stopped, but there's something to me about being outside where just like

1:56:13

being in nature,

1:56:14

I find like really motivating.

1:56:16

It's a vitamin.

1:56:17

It's a vitamin.

1:56:18

And then just the mind body connection of that game.

1:56:21

It just really fucks with you because it's, it's just nothing you can master

1:56:25

and overpower.

1:56:26

Right.

1:56:26

And it teaches you to just like be in it.

1:56:29

Yeah.

1:56:30

And that's a very hard skill.

1:56:33

Like if you look at the best, like I, there's like a handful of people that I

1:56:37

really look up

1:56:37

to and obsess over, like Munger, Buffett. The Berkshire meeting was this past

1:56:42

weekend.

1:56:42

And if you look at the clips, there's this incredible thing where they

1:56:45

transitioned, right?

1:56:47

Munger passed away.

1:56:48

Buffett's like now executive chairman, but this guy, Greg Abel and this guy, Ajit

1:56:51

Jain.

1:56:52

Ajit Jain does this thing where he's like, I teach the people that come to just

1:56:56

say no.

1:56:56

Your whole job is to just say no.

1:56:58

You're going to get bombarded with all kinds of business pitches.

1:57:01

Say no, no, no.

1:57:02

And eventually somebody will come in and fucking try to whack you in the head

1:57:05

with a two by

1:57:05

four of money.

1:57:07

Then you come to me and we'll do the deal.

1:57:09

And it made such an impression because like, again, when I'm insecure, my

1:57:14

reward function

1:57:16

is attention.

1:57:16

So I'm like a fucking little busybody.

1:57:18

I'm running around doing all this little bullshit, you know, and then man, when

1:57:23

I'm in a fucking

1:57:23

flow state and like, I'm tuning it, like I'm striping the ball, you know, I'm

1:57:28

doing like a few

1:57:29

things that really matter in size.

1:57:31

And I'm like, man, this is, this is right.

1:57:33

It's all come to me because I'm like, I'm like within myself.

1:57:39

And these other things are a better reflection of when I'm within myself.

1:57:44

And these other things are a mirror of when I'm totally out of kilter.

1:57:47

That's just me.

1:57:49

So in my life, these things tend to lead.

1:57:52

I think you're saying that's just you, but I think that's generally most people.

1:57:57

I think you find these things, these vehicles for developing human potential,

1:58:04

whether it's martial arts or golf or playing guitar or playing chess or poker.

1:58:09

And then you have to have, I think one, at least for me, one seminal

1:58:14

relationship in your life.

1:58:15

You have to have one person that has just undying belief in you.

1:58:19

And I never really had that until I met my wife.

1:58:21

And I pushed against it so fucking hard because

1:58:26

I was like,

1:58:26

it just can't be true.

1:58:27

Like, why does this person give a shit?

1:58:29

Do you know what I mean?

1:58:30

Like, why do they care about me more than I care?

1:58:32

- Well, there's also the fear because so many people get in those bad

1:58:35

relationships.

1:58:36

And I'm just like, I think there's a part of you, like me, where you're just

1:58:40

like,

1:58:40

I'm not a very lovable person.

1:58:41

Like, I'm just like, this is, that's not who I am.

1:58:44

And this woman is just there.

1:58:47

So that's been like the thing for me, because she's brutal.

1:58:51

She'll be like, oh yeah, that was fucking horrible.

1:58:54

You know, like yesterday, we had this, I did this thing at Milken and it

1:58:58

was a dinner at

1:58:59

my friend's house.

1:59:00

And then, you know, we're both going to different airports.

1:59:03

I'm flying here to see you and she's flying home.

1:59:06

And, uh, she calls me and I'm like, how did I do?

1:59:10

Ah, shit.

1:59:11

But no, there's the parts that I did well.

1:59:18

And then she critiques the other parts that she didn't like.

1:59:20

And then I say... it's so, again, I'm insecure.

1:59:24

So I'm like, I want the self-serving answer.

1:59:26

Well, how would, because there were three of us on this panel and she's like,

1:59:30

and I was like, you know, I was the best, right?

1:59:32

She's like, no, Gavin was better.

1:59:34

And I'm just like, it's so, but it's so refreshing because it keeps, again, it's

1:59:39

like a,

1:59:40

Keeps you in check.

1:59:41

Like again, it gives me a mirror, you know, like when I was coming to see you

1:59:46

yesterday, when we were flying down to LA for this thing.

1:59:49

Um, there's parts of me where when I'm insecure, I kind of like externalize and

1:59:56

I can be like really

1:59:57

hyperbolic, unnecessarily hyperbolic and it's counterproductive.

2:00:00

And she said to me, listen, like, just imagine your friends.

2:00:03

These are hardworking people.

2:00:04

They're trying their best as well.

2:00:06

They don't necessarily know. Some things have massively worked out for

2:00:09

them,

2:00:09

but they would want to do the right thing.

2:00:11

There's people you've worked with before that want to do the right thing.

2:00:13

And she's like, just pick them and don't judge.

2:00:16

You can observe.

2:00:17

And it's crazy, but it's like, I need those little things.

2:00:21

They're like tweaks.

2:00:22

It's like having a coach, kind of, and that's very

2:00:25

helpful to me.

2:00:26

Yeah.

2:00:26

It's very important.

2:00:27

It's hard to do that yourself.

2:00:28

I can't do it.

2:00:30

And it's also like, I'm retard maxing.

2:00:31

Like my life is like, I like that flow.

2:00:33

And if I didn't have somebody who loved me and would hold me

2:00:37

accountable,

2:00:38

I just fucking wouldn't think about it.

2:00:39

Yeah.

2:00:40

And the opposite of that is someone who's like an antagonistic relationship.

2:00:45

And we know a lot of people that have those kind of very sabotage-y sort of

2:00:50

marriages

2:00:50

and relationships.

2:00:51

And that's crazy.

2:00:52

It's brutal.

2:00:53

It's brutal.

2:00:53

And I don't think they've ever had a really good one.

2:00:56

Otherwise they would never tolerate that.

2:00:58

I didn't know what good looked like.

2:01:00

So you kind of just, I think a lot of people go with the flow.

2:01:04

Like, I mean, I was a nerdy kid from kind of a shitty fucked up kind of like

2:01:10

family structure.

2:01:12

And then I got injected into this rich high school.

2:01:15

But then I got to go back to an alcoholic father.

2:01:18

I'm on fucking welfare.

2:01:19

Like, it's like, you know, my, my self-confidence is negative fucking two units.

2:01:24

Didn't have a girlfriend, you know, like all the shit in high school, like

2:01:28

nothing happened for me.

2:01:29

And so my modeling of like how to be in a relationship, what to do, it was

2:01:33

fucking zero.

2:01:36

Um, it was zero.

2:01:37

And so all those mistakes were mostly because I didn't understand what

2:01:42

good looked like.

2:01:42

Right.

2:01:43

Um, and then I stumbled into this relationship after my divorce and my ex-wife

2:01:47

is an incredible

2:01:48

woman, just like not, you know, what you needed or what she needed.

2:01:52

We're just, we were in, in, in a few very specific ways.

2:01:55

We just weren't on the same page.

2:01:59

And then I find this other one, and I think like, I was so

2:02:05

skeptical.

2:02:05

I'm like, I, I kind of viewed like a relationship as like this adjunct to your

2:02:10

life.

2:02:11

There's you, you're at the center, you're doing your shit.

2:02:14

And one of the appendages to your thing is your relationship. That's what I thought.

2:02:20

And then now it's the opposite where I feel like my wife's at the center.

2:02:24

And I'm like, I would always kind of like, like almost like laugh at people in

2:02:29

my mind.

2:02:30

I'm like, it's not possible that somebody feels this way about somebody else.

2:02:32

Um, but it's a huge enabler.

2:02:37

It's very much a gift.

2:02:38

So that can also be a thing that people look for.

2:02:40

You know what I mean?

2:02:41

Which is, I think what you're saying is that there's a bunch of different

2:02:45

things

2:02:45

that have to sort of exist together and that it's not just completely focus on

2:02:50

your work,

2:02:52

but that focusing on these other things enhances the work.

2:02:56

And then the work enhances all these other things as well.

2:02:59

And they all exist together.

2:03:00

And my best work is when I'm not thinking about the attention or the money.

2:03:05

Those are the two most corrupting influences in my life.

2:03:08

When I look back, when I've lost the most amount of

2:03:12

money

2:03:13

or when I've reputationally hurt myself the most, it's all been because of

2:03:17

attention and money.

2:03:19

Those are the only two things. The root cause consistently has been that.

2:03:23

That makes sense because you're thinking about a result rather than the process.

2:03:26

Exactly.

2:03:27

Yeah.

2:03:27

Exactly.

2:03:28

And then thinking about that result, like, ooh, I'm going to get a lot of

2:03:31

attention from this.

2:03:32

Ooh, I'm going to get a lot of money from this.

2:03:34

That actually robs you of the focus that you need to concentrate on the process.

2:03:37

Exactly. And the thing about the process is that so much of it, when you're

2:03:44

in a flow state,

2:03:45

you're proud of, irrespective of the size of it, because the meetings are the

2:03:50

same.

2:03:50

Do you know what I mean?

2:03:51

Like you're in the same fucking 35 minute meeting or 45 minute meeting debating

2:03:55

a product or debating a thing.

2:03:57

But the minute that I start to feel like embarrassed about company A versus

2:04:02

company B,

2:04:02

or decision A versus decision B, now my mind is like, okay, hold on a second

2:04:07

here.

2:04:07

I'm about to run myself off the cliff.

2:04:09

Yeah.

2:04:09

You know, I had this dinner last week and this is what's

2:04:13

amazing.

2:04:14

Like we're talking about poker.

2:04:17

Well, so I'm having dinner with my wife and a friend.

2:04:21

And she's like, how are you doing?

2:04:25

Just like a very generic, nice question.

2:04:27

Right.

2:04:27

And I go into this long fucking diatribe of like, well, you know, the investing

2:04:31

thing,

2:04:32

this, and then I started this other thing that, and my wife's looking at me

2:04:34

like,

2:04:34

what the fuck are you rambling on about?

2:04:36

And then it got, but it got worse, Joe.

2:04:38

It got worse.

2:04:38

It got even fucking worse.

2:04:40

Then I'm like, you know, but then I had this poker game.

2:04:43

I started rambling.

2:04:44

It's normally on Thursdays, but then I, I moved it up to Wednesdays, but then I

2:04:47

moved it up to the

2:04:48

city because my friend's having it.

2:04:49

And then I name-dropped who the guy was.

2:04:51

And my wife just looks at me like, what the fuck is going on with you?

2:04:56

So the dinner ends, it was, and then she's like, what the fuck is going on with

2:05:02

you?

2:05:02

She's like, that was insane.

2:05:03

And I had no idea that I was doing it.

2:05:07

And I'm like, okay, we need to put Humpty Dumpty back together again.

2:05:11

Cause I'm about to go on Rogan and I can't go off fucking like a crazy wild man.

2:05:14

Uh, but it's an enormous gift.

2:05:18

That's been my biggest unlock in these last like eight or nine years.

2:05:21

Like I feel like, like I'm kind of like adding skills to my toolkit.

2:05:25

I feel like a golfer.

2:05:26

Like that's like, I can shape shots a little bit.

2:05:28

Now I know how to use different clubs.

2:05:30

Um, and it's all like mindset.

2:05:33

And it's like, it's very much what you, it's like this process oriented

2:05:38

approach

2:05:38

and you just can't control the outcome.

2:05:40

And that's like, it's a magical feeling.

2:05:44

It's interesting that you're saying this because like, think about what most

2:05:49

people

2:05:49

or people that are on social media, like the kind of attention that they're

2:05:56

focusing on.

2:05:56

Like, this is why virtue signaling is so unsuccessful.

2:06:00

Right.

2:06:01

It's so bad for it because it's fake.

2:06:02

Are you really concentrating on the process?

2:06:04

Yeah.

2:06:04

Are you really concentrating on the result?

2:06:05

The result is getting people to love you.

2:06:07

Exactly.

2:06:07

Getting people to agree with you, getting, and then worrying about the

2:06:10

criticism.

2:06:10

Oh my God, they hate me.

2:06:11

Oh my God, they're mad at my statement.

2:06:13

Oh my God, they're this.

2:06:14

And then you're like obsessing on it all day.

2:06:16

People that aren't even anywhere near you.

2:06:17

Yeah.

2:06:18

It's like, it's one of the absolute worst things for mental health is this

2:06:22

addiction

2:06:22

that people have to posting things and then reading the responses to those

2:06:26

posts

2:06:27

and getting wrapped up in these very weird two dimensional interactions with

2:06:32

human beings.

2:06:33

You never read your comments.

2:06:34

I mean, you're very famous.

2:06:35

You're just like, it doesn't fucking matter to me.

2:06:37

Well, you're going to get to a certain point in time where if you have

2:06:40

X amount of people that follow you, you're going to have a percentage that are

2:06:46

mad at you.

2:06:46

And those are the ones you're going to think about.

2:06:48

Right.

2:06:48

And if you don't self-audit, maybe that's good.

2:06:51

Maybe it's good to say like, you fucking piece of shit.

2:06:54

Like, Oh, I'm sorry.

2:06:55

You know, like what your wife saying to you, like, what the fuck was that?

2:06:58

Like, Oh shit.

2:06:59

Like, I am very self-critical, like horribly so.

2:07:04

So like, to the point where I torture myself, you know?

2:07:06

So I'm like, I don't need that from other people.

2:07:08

And also, those people don't love me and they want me to fail.

2:07:11

Like there's a lot of people that their lives are very unsuccessful

2:07:14

and I've been way too fortunate.

2:07:16

Right?

2:07:17

So it's like, there's a reason to be upset at me if your life is shit.

2:07:20

Cause I've, I've gotten, I have three of the best jobs on earth.

2:07:23

It doesn't make any sense.

2:07:24

Right?

2:07:25

So there's a reason. And also, why the fuck is this podcast so successful?

2:07:28

That doesn't make any sense.

2:07:29

Right?

2:07:29

So it's like, I get it.

2:07:31

I understand why people, but I'm not going to help them.

2:07:34

I'm not going to help them bring me down.

2:07:36

I'm not going to indulge in it and ruin my own mind by wallowing in their

2:07:40

bullshit because the only reason why you would do that in the first place is if

2:07:43

you're not together,

2:07:44

no one who is healthy and happy and intelligent is going to post mean things about

2:07:49

you.

2:07:49

So you are reading things from people that are mentally ill, unhappy, and

2:07:54

probably not intelligent.

2:07:54

Maybe they're intelligent in terms of their ability to solve certain issues and

2:07:59

problems.

2:08:00

Maybe they're good at certain skills, but like their overall grasp of humanity

2:08:05

and like being a good

2:08:06

person is not good if you're shitting on people, especially if it's, like, ad hominem

2:08:10

attacks and

2:08:11

just insults.

2:08:12

And so it's not a good thing to ingest.

2:08:16

It's not. It's like if you go down to the supermarket, you see Twinkies.

2:08:18

Oh, they're right there.

2:08:19

Don't fucking eat them.

2:08:20

Okay.

2:08:21

That's not good for you.

2:08:22

And so it's like, I don't think that at a certain point in time, especially if

2:08:26

you become publicly

2:08:28

known and famous, you should ever read your comments.

2:08:30

I don't think it's good for you.

2:08:31

Yeah.

2:08:32

But you better be self-auditing or you'll start sniffing your own farts and

2:08:36

think they

2:08:36

smell great.

2:08:37

Like, don't do that either.

2:08:39

Yeah.

2:08:39

But I know a lot of people that have gone crazy reading their own comments.

2:08:45

I've met comedians that like they'll think about it all day long.

2:08:48

It will fuck with them.

2:08:50

It will torture them.

2:08:51

Well, their neuroses are what creates great comedy to begin with.

2:08:54

So if you feed those neuroses in the wrong way, you're fucked.

2:08:56

The wrong way.

2:08:57

Right.

2:08:57

And then also the self-doubt creeps in because all these people are telling you

2:09:00

you suck, and you're

2:09:01

like, oh my God, I suck.

2:09:02

And then you go on stage with this, like people think I suck.

2:09:04

They hate me.

2:09:05

You can't do that.

2:09:06

Like if you have a certain amount of energy in the day, this is what I always

2:09:13

tell comedians.

2:09:14

I said, look, think of your attention and your focus as a unit.

2:09:18

You have a hundred units.

2:09:20

If you spend 30 of those fucking units on assholes online, you're robbing 30

2:09:25

units from

2:09:26

all the things you love.

2:09:27

30 units from your family, 30 units from your friends, 30 units from your job,

2:09:32

30 units from

2:09:33

golf or poker or whatever it is that you love to do.

2:09:35

You're stealing your own time and your own focus for losers.

2:09:41

Right.

2:09:41

Like why would you do that?

2:09:42

And those losers are good people.

2:09:43

They're just, most people are good people.

2:09:46

They're just, they're in a bad path.

2:09:47

I would have been the same person.

2:09:49

Or they're venting.

2:09:49

Yeah, so they're venting.

2:09:50

Look, if you gave me a fucking Twitter account when I was 16, oh my God, it

2:09:54

would have been

2:09:55

horrendous.

2:09:55

Yeah, I would have been going crazy.

2:09:56

Oh my God, I would have been a terrible person.

2:09:58

It's normal, especially if your life sucks and you're not doing well and you're

2:10:03

attacking

2:10:03

famous people or you're attacking this person that's doing better than you or

2:10:06

whatever it is.

2:10:07

Do you, have you seen the clips of the retard maxing?

2:10:10

No.

2:10:12

You don't know what this is?

2:10:13

No.

2:10:13

You don't know what this is?

2:10:14

No, what's retard maxing?

2:10:15

Oh, this guy is fantastic.

2:10:17

He sits on his back porch, Jamie, can you just show?

2:10:21

He sits on his back porch, smoking a cigar, basically telling you

2:10:28

everything's

2:10:29

kind of bullshit.

2:10:30

Stop thinking about shit.

2:10:31

You know, if you don't like your friends, leave them.

2:10:33

If you don't like your girlfriend, leave them.

2:10:35

Stop overthinking, simplify your life.

2:10:37

You know, it's so simple, but I think it's incredibly powerful.

2:10:43

Who is this guy?

2:10:44

Elisha Long, I think is his name.

2:10:46

I don't know, Jamie, if you can find it.

2:10:47

I think Elisha...

2:10:48

Retard maxing is funny because I know about looks maxing.

2:10:51

We talked about that recently on a podcast, but that's recently entered into my

2:10:55

mind,

2:10:56

into my zeitgeist.

2:10:57

Looks maxing.

2:10:57

That's the clavicular thing, but I've only found out about that within the last

2:11:01

few months

2:11:01

of my life because I genuinely stay off social media as much as possible.

2:11:06

And if I do read things, what I like to do, I like to focus on fascinating

2:11:10

things.

2:11:11

Like a lot of my time I spend looking at YouTube stuff.

2:11:13

Same.

2:11:14

Because YouTube stuff, my algorithm is all like new black holes they've

2:11:18

discovered,

2:11:19

you know, new discoveries in terms of like, what is the fabric of reality?

2:11:24

Like I'm, that's interesting to me.

2:11:26

Yeah.

2:11:26

And if I just concentrate on people being mean or shitty to each other or

2:11:30

the latest fucking political drama, it's like, I don't have much time.

2:11:35

I'm busy.

2:11:37

I like things.

2:11:38

Yeah.

2:11:38

And I want, are you on, like, Instagram and TikTok?

2:11:41

I'm on Instagram.

2:11:42

I do not have a TikTok.

2:11:43

This is looks maxing.

2:11:45

I know this is retard maxing.

2:11:47

So let me hear what he says.

2:11:48

Who's this guy?

2:11:49

What's his name?

2:11:50

Elisha Long.

2:11:51

Shout out to Elisha.

2:11:52

...being used as a poisoning of nostalgia, but to simply remind you of what you

2:11:59

found it for.

2:12:00

And as we grow up, we often give that up for security.

2:12:05

We give that up so that we are accepted.

2:12:07

We give that up to flex and appear like we have now figured things out that

2:12:13

people will accept us.

2:12:14

The only way that you will truly be successful is if you are righteous and you

2:12:21

live according

2:12:22

to your nature and you play, man.

2:12:24

And you don't let people take play away from you.

2:12:27

To be at the circus and be oohed and aahed and worried about all the bullshit.

2:12:31

Return to a state of play.

2:12:33

Well, that's very good advice.

2:12:36

Return to a retard max.

2:12:39

The best thing that you could do is return to a state of play.

2:12:42

That's true.

2:12:43

There's a lot of that.

2:12:44

You know, there's a lot of that.

2:12:46

Absolutely.

2:12:47

Oh, I think that that is like a...

2:12:48

He's a wise man for a young fella.

2:12:51

Yeah.

2:12:51

Oh, okay.

2:12:52

He's a jujitsu guy.

2:12:53

There you go.

2:12:53

Look, he's getting his fucking blue belt there or he's getting his purple belt.

2:12:56

What is going on there?

2:12:58

So is he getting his blue belt?

2:13:00

Yeah, it's his purple.

2:13:01

Purple.

2:13:01

Yeah.

2:13:02

So they're taking his blue belt off and putting his purple belt on.

2:13:04

Yeah.

2:13:05

See, that's he's learning.

2:13:06

He's a martial artist.

2:13:07

That's why.

2:13:08

You think martial arts people are just more like spiritually connected to the

2:13:11

truth?

2:13:12

I don't know if it's spiritually connected to the truth.

2:13:14

It's forced down your fucking throat because you can't believe you're better

2:13:18

than you are

2:13:18

if you're getting mauled every day.

2:13:20

You know, and there's only one way.

2:13:24

This guy's on the path to becoming a jiu-jitsu black belt. Looks like a pretty big

2:13:28

guy too.

2:13:28

That'll help.

2:13:29

But there's only one way to get a black belt in jiu-jitsu.

2:13:32

You got to train jiu-jitsu all the time and get better at jiu-jitsu.

2:13:35

You can't pretend you're better.

2:13:37

You know, there's a lot of people that write poems and they suck and they think

2:13:40

they're

2:13:40

so deep.

2:13:40

Yeah.

2:13:41

But those poems suck.

2:13:42

Meaning like there's just a very simple objective measurement.

2:13:44

That's it.

2:13:45

One hundred percent.

2:13:46

You either win or you lose.

2:13:48

You either tap or you get tapped out.

2:13:50

You either tap somebody or you get tapped out.

2:13:53

But can you get a black belt in some gym that's easier than a different gym or

2:13:56

something like that?

2:13:57

Yeah, sort of, kind of, but not really.

2:14:01

I mean, everybody's trying hard.

2:14:02

I mean, there's definitely better gyms where they're more technical and their

2:14:06

program is much

2:14:07

more systematic and they're better at breaking down skills, like how to develop

2:14:11

skills.

2:14:11

You know, there's definitely better gyms.

2:14:15

There's better schools.

2:14:17

There's better places to learn.

2:14:18

But everywhere you learn, you're going to have a bunch of people that are

2:14:21

trying hard.

2:14:22

And you have a bunch of people that are trying to learn these.

2:14:25

And also today, because of the internet, you could go on YouTube and there's

2:14:30

thousands of tutorials

2:14:32

breaking down new moves.

2:14:34

Jiu-jitsu is like endlessly complex.

2:14:36

Well, one of my kids has ADHD and one of the things that was recommended to us

2:14:40

was jiu-jitsu.

2:14:41

Yeah, what is ADHD, man?

2:14:42

It's not even fucking real because I definitely have it.

2:14:44

I think we all have it.

2:14:45

I think it's a superpower.

2:14:46

I think we all have it.

2:14:48

I think, look, I do not focus well on things that I think are boring.

2:14:52

But if you give me something that I love, I can't, I'll play pool for fucking

2:14:56

12 hours in a row.

2:14:57

Well, it's crazy.

2:14:57

But like the reason I got back into golf is my seven-year-old gets on the

2:15:00

course and

2:15:01

sometimes you can talk to him and he's not making, you know, he's just like in

2:15:04

his own world.

2:15:04

Exactly.

2:15:05

And then you start talking about chess or jiu-jitsu or whatever.

2:15:08

And then we get him on the golf course and this kid is just dialed in.

2:15:13

Yeah, superpower.

2:15:13

And I'm like, holy shit.

2:15:15

And they say that that's a disease.

2:15:16

That's crazy.

2:15:17

It's crazy.

2:15:18

Because if you find a thing that that kid loves, he's going to excel at above

2:15:22

and beyond most humans.

2:15:23

We, uh, he does these chess classes and like, look, he's seven.

2:15:26

So I'm like, all right, motherfucker, bring it.

2:15:28

I'll fucking destroy you.

2:15:31

I'm going to fucking maul you.

2:15:33

And, uh, we're playing last weekend.

2:15:35

And he goes, oh, dad, you know, you can't, uh, castle out of check.

2:15:40

I'm like, shut the fuck up.

2:15:41

I know how this game works.

2:15:43

And I go on to beat him.

2:15:44

And I went to my wife and I'm like, he's six weeks away from beating me.

2:15:48

And then I spent, I spent two days, I spent two fucking days on YouTube.

2:15:55

And I was like, okay, I got to brush up on my openings.

2:15:58

And I got to, I got, oh my God, I don't have time for this shit, but I can't

2:16:01

let this seven year old beat me.

2:16:03

You know what I mean?

2:16:04

You're going to have to, you're going to have to.

2:16:06

And I'm like, and I was like, how do I, how do I stall this until maybe he's 10

2:16:09

or 11?

2:16:10

Then it's like, okay, fine.

2:16:11

You finally beat me.

2:16:12

Congratulations.

2:16:13

You have to think of him as an extension of you and be happy when he does.

2:16:16

Oh my God.

2:16:17

Yeah.

2:16:17

That's just how it is.

2:16:18

Look, if you're a man and you have a son, I have all daughters, but if I had a

2:16:23

son,

2:16:23

I would be legitimately terrified that he'd be able to tap me.

2:16:27

Because if I had a son, one of the first things that I would do is get them in. I

2:16:31

got my kids involved

2:16:32

in martial arts at an early age, but I didn't force them to keep doing it.

2:16:35

They did it for a certain amount of time.

2:16:36

And then they went on to do a bunch of other things that they enjoy better,

2:16:39

which is fine.

2:16:40

But I think it's good to learn some skills, learn how to defend yourself.

2:16:44

So you're not completely lost.

2:16:45

Just it's, I think it's good for you.

2:16:47

It's good to learn.

2:16:48

It's good to develop confidence, but for boys, I think it's critical, you know,

2:16:52

especially

2:16:52

boys with my kind of DNA.

2:16:54

I'm like, I think it's good to get that shit out of your system.

2:16:56

But if I had a son, there'd be a certain point in time.

2:16:59

I'm like, it's a matter of time before this motherfucker can kill me.

2:17:01

I mean, I'm 58 years old.

2:17:04

If I had a 20 year old kid, like he could probably kill me, probably fucking

2:17:08

kill me.

2:17:09

He'd kick your ass.

2:17:09

Yeah.

2:17:10

It's like, what am I going to do?

2:17:11

There's nothing you could do.

2:17:12

You just have to accept it and then hope your relationship with him is strong

2:17:15

enough that

2:17:15

he still respects you, even though he can kill you.

2:17:17

Because it can't be entirely bait.

2:17:20

Look, there's a lot of martial arts instructors that are old and they're revered

2:17:25

and respected

2:17:25

and nobody wants to try to hurt them.

2:17:27

Yeah.

2:17:27

Right.

2:17:27

Because you realize if you learn enough, you get to a certain point in time,

2:17:32

you realize like

2:17:32

I'm a much better dad to my sons than I am to my daughters.

2:17:35

And I mean this in the following way.

2:17:37

My daughters have the run of the place, whatever they want.

2:17:39

I'm in love with them.

2:17:40

I don't love them.

2:17:41

I'm in love with them.

2:17:42

Whatever they need.

2:17:43

Right.

2:17:43

They can just.

2:17:44

Just enamored by.

2:17:45

They're just like, they can control me.

2:17:46

They just kind of send me in one direction or another.

2:17:48

I'm just like they're.

2:17:49

Mm-hmm.

2:17:50

By the way, they know that too.

2:17:51

I'm enslaved by them.

2:17:52

Yes.

2:17:53

You know, and I just want their attention.

2:17:54

Any small little shred, I'm like.

2:17:56

But your son, you keep him in check.

2:17:58

Whereas like my sons, I keep them in check, and I'm doing everything that

2:18:02

I think I'm supposed to be doing.

2:18:04

Now, the good news is my daughters are just different.

2:18:07

They're girls.

2:18:07

They're just, so they don't need the same kind of, like, tough-love-ish approach.

2:18:11

Right.

2:18:11

You know, but then my boys reveal their characteristics in ways that really

2:18:15

surprised me.

2:18:16

And I'm just like, man, this is so fucking awesome.

2:18:18

Parenting has been the best.

2:18:19

Like when I, again, like slowing down and actually being in it.

2:18:23

Mm-hmm.

2:18:23

And I'm like, fuck, this is amazing.

2:18:25

It is pretty amazing.

2:18:27

And watching your kids get really good at things is really fascinating.

2:18:30

It's fascinating.

2:18:31

I told you this story before, but like, you know, my son, my oldest son is my

2:18:35

17 year old.

2:18:36

He's just a great kid.

2:18:38

He goes, he's like, okay, I'm applying for college.

2:18:42

And I'm like, great.

2:18:43

Let me take you to the Naval Academy, West Point.

2:18:45

Let me show you these service academies.

2:18:46

And he sees those and he's like, these are incredible.

2:18:49

But then he's like, I think I want to go to like, you know,

2:18:51

Georgetown or Vanderbilt or whatever.

2:18:53

And I'm like, hey man, that's like, um, just a bigger version of your high

2:18:56

school.

2:18:58

And whatever, if that's what you want to do, you do you.

2:19:01

And, you know, um, I'll help you like kind of

2:19:06

get to the starting line here, but you're on your own.

2:19:09

And he had to get a job because I'm like, if you're going to get into these

2:19:12

schools,

2:19:13

you got to get a job.

2:19:13

And so he tries to, last summer, I just started fucking screaming at him.

2:19:19

And I'm like, you fucking louse, you haven't done anything.

2:19:22

And this is, like, in front of other kids at our son's birthday party.

2:19:26

I scream at him, he starts crying.

2:19:28

And I'm like, you need to do more.

2:19:29

Then my wife screams at him, he starts crying again.

2:19:33

Then my ex-wife screams at him, he starts crying again.

2:19:36

And he just goes, I'm out of here.

2:19:40

He walks out.

2:19:41

Meanwhile, I start panicking and I'm like, I got a tiger dad this situation.

2:19:45

So I start texting a few friends trying to figure out, hey, can I, you know,

2:19:48

do you guys want to hire this kid?

2:19:50

He's like really, you know, he's pretty smart kid.

2:19:52

Did all this stuff in robotics, yada, yada.

2:19:54

One of them says, I'd be willing to interview him.

2:19:57

I call him and he's like, dad, I got a job.

2:20:01

I said, what do you mean you got a job?

2:20:02

He said, I went around downtown, went to all these places.

2:20:08

And I was in a McDonald's and the woman was having a little bit of difficulty

2:20:13

speaking English.

2:20:13

So I just spoke to her in Spanish and I got the application.

2:20:15

I sat down at the desk and the guy having lunch beside me said, hey, I heard

2:20:19

you needed a job.

2:20:22

And I really liked the way you talked to this woman.

2:20:24

I'm the general manager of the car wash down the street.

2:20:26

Come and work for me.

2:20:27

And I said, well, what are you going to do?

2:20:30

He goes, I'm going to go work there.

2:20:32

And I said, okay, well, I got this other interview for you as well.

2:20:35

So you should see, maybe you can do both.

2:20:37

Anyways, the end of the story is he did these two jobs.

2:20:39

He worked at a robotics firm, but then he worked at a car wash.

2:20:43

And when I tell you this story, I am so proud of this kid because of the car

2:20:46

wash.

2:20:46

Because that car wash thing, he would come home and he's like, man, you have no

2:20:50

idea how people live.

2:20:51

And I'm like, what do you mean?

2:20:52

He's like, the stuff that I find in the trunk when I have to vacuum these cars

2:20:56

and clean out the cars.

2:20:58

And I'm like, bro, that is a gift.

2:20:59

You have been given a fucking gift.

2:21:01

That is the thing that if you take with you, you'll be golden the rest of your

2:21:05

life.

2:21:05

Because all this other shit is all kind of manufactured.

2:21:08

I help because I'm anxious.

2:21:09

I'm insecure.

2:21:10

But that shit you did on your own.

2:21:12

And that thing is what people will fucking respect when push comes to shove.

2:21:16

It's also jobs that suck are really good for you.

2:21:18

So good.

2:21:19

I used to work at Burger King when I was 14.

2:21:21

Man, let me tell you.

2:21:23

You were 14 and you had a job?

2:21:24

When my dad had to stay behind, like we were, my dad was a diplomat in the

2:21:32

embassy of Sri Lanka

2:21:33

in Canada.

2:21:34

This fucking war in Sri Lanka is crazy.

2:21:36

He writes this essay.

2:21:38

His life is threatened.

2:21:39

So he files for refugee status.

2:21:41

He gets it.

2:21:42

He gets kicked out of the embassy.

2:21:46

So he doesn't have a job.

2:21:47

My mom becomes a housekeeper.

2:21:48

And we're kind of toiling in this poverty cycle.

2:21:52

So 14, I had to get a job.

2:21:54

And I would take the money.

2:21:55

And I buy the bus passes.

2:21:58

I would buy some of the groceries.

2:21:59

We're just trying to make it all work, right?

2:22:02

And I got a job at the Burger King.

2:22:04

This is another example where I was like, I'm going to go get a job.

2:22:09

Hey, can you drive me to the interview?

2:22:11

And my dad's like, no.

2:22:14

Get on your fucking bicycle and go.

2:22:17

And I thought, bro, we need this.

2:22:20

You need the money more than I do.

2:22:21

Why are you making me bicycle?

2:22:23

But I bicycled there.

2:22:24

And I got the job.

2:22:26

And I worked there.

2:22:26

And I used to work the night shift.

2:22:28

14-year-old kid, man.

2:22:29

Wow.

2:22:30

From fucking 8:00 to 2:00 in the morning.

2:22:31

Wow.

2:22:32

And I would have to clean this.

2:22:33

Like 8:00 PM to 2:00 in the morning.

2:22:34

And then you had to go to school in the morning?

2:22:36

No.

2:22:36

This was always like Friday, Saturday, Sunday.

2:22:38

Wow.

2:22:39

Thursday, Friday.

2:22:39

Sorry.

2:22:40

Thursday, Friday, Saturday.

2:22:41

And then, yeah.

2:22:41

Some days I would have to go to school.

2:22:42

But, and why did I work until 2:00?

2:22:45

Because when the restaurant closes, you get whatever food is left over.

2:22:50

Right?

2:22:51

So like you get a couple chicken sandwiches, you get, like,

2:22:54

the version

2:22:55

of the McNuggets that Burger King had, a couple Whoppers, and you take them

2:22:58

home.

2:22:59

But the amount of vomit that I had to clean up in the bathroom, you can't

2:23:07

imagine, man,

2:23:08

the downtown Burger King near bars, you know, after closing time, the shit you

2:23:15

see.

2:23:15

Oh, wow.

2:23:16

And the shit you deal with.

2:23:17

And all I could think of was like, I just want to get the fuck out of here.

2:23:21

But that was so valuable for me.

2:23:24

Yeah.

2:23:24

It was so valuable for me.

2:23:26

Um, and then I worry that my kids, you know, don't get exposed to it.

2:23:31

But when my son got it, maybe I'm projecting too much onto it.

2:23:34

But it's like, I'm like, man, that car wash thing is really going to be

2:23:38

the thing

2:23:38

that separates you in life.

2:23:39

Yeah.

2:23:40

Doing something that sucks.

2:23:42

It, it also.

2:23:42

Just being humble and grinding through that shit.

2:23:45

Mm-hmm.

2:23:45

You know?

2:23:45

But you realize, like, sometimes people don't pick a path and

2:23:50

they just

2:23:51

have a job and they don't like it.

2:23:53

And they stay with this thing they don't like forever.

2:23:55

And that's not what you want.

2:23:57

No.

2:23:58

It's not what you want.

2:23:59

But the development, like, learning how to do something that sucks and grinding

2:24:04

through

2:24:05

it.

2:24:05

And still doing it well.

2:24:06

Yeah.

2:24:06

Doing it well.

2:24:07

You know, like, make a Whopper.

2:24:09

Yes.

2:24:09

Be there on time.

2:24:09

I know how to fucking make a Whopper.

2:24:11

Yeah.

2:24:11

Do you know what I mean?

2:24:12

Yeah.

2:24:12

Make the fries.

2:24:14

Change the oil.

2:24:15

All that shit.

2:24:16

And then when you apply those lessons to something you actually love and

2:24:21

you work

2:24:21

hard at something you love.

2:24:23

Magical.

2:24:23

Oh, it's incredible.

2:24:24

That's, that's a real gift.

2:24:26

It's a real gift.

2:24:27

Yeah.

2:24:27

I mean, you know, some people, they don't appreciate the process, you know?

2:24:32

And it's hard, because, like, when you're young and you're going through these

2:24:36

difficult

2:24:36

jobs and these things that suck and you don't know how it's going to turn out,

2:24:40

you know?

2:24:40

And a lot of times people aren't really educated in what a process actually is

2:24:44

and about how it

2:24:45

does develop character and it does develop discipline and that these things are

2:24:50

actual

2:24:50

skills that you can apply to other things in life.

2:24:53

You just think, God, I'm a fucking loser.

2:24:55

I have a, I have a visual for this.

2:24:57

I always ask myself, am I in the engine room right now?

2:25:00

This is my way of saying like an engine room is a little hot.

2:25:04

It's a little uncomfortable, but it's where all this shit is happening.

2:25:07

It's where the shit is being made.

2:25:09

And so I'm like, it's a little, you know, uncomfortable, but I got to be in

2:25:14

there.

2:25:14

And there'll be weeks where that's all I do.

2:25:18

I'm just in it.

2:25:19

You know, I don't, I'm not good at responding to emails sometimes or whatever,

2:25:23

because there'll just be weeks where I'm in it.

2:25:24

And it's an incredible visual for me because I'm like, yeah, this is like where

2:25:29

like I'm grounded.

2:25:30

And I feel like myself.

2:25:32

And then when I look at, like, my health,

2:25:37

that's when I just feel like really good about myself, like not insecure.

2:25:42

And my vitals are different.

2:25:44

Like, it's crazy.

2:25:45

Like my fucking HRV, like my HRV craters when I'm like, just like, you know,

2:25:53

insecure.

2:25:55

Of course.

2:25:56

But why is that?

2:25:58

Like it's your, it's your heart rate variability.

2:26:00

This should have nothing to do with your like disposition and your mood.

2:26:04

Well, the idea that your mind is separate from the body is crazy.

2:26:09

It's crazy.

2:26:10

It's not.

2:26:11

But is your HRV lower when you're just out of sorts?

2:26:14

Yes, probably, right?

2:26:16

I'm sure.

2:26:17

Yeah.

2:26:17

I don't really monitor it that much.

2:26:19

Yeah.

2:26:19

And I'm, I try not to ever get out of sorts too.

2:26:22

And one of the ways that I keep from getting out of sorts is daily discipline.

2:26:27

Like, I'm sure I get out of sorts if I have

2:26:30

a few days in a row where I don't work out.

2:26:32

But I, I work out almost every day.

2:26:34

And if I'm not working out, I'm still cold plunging and going in the sauna and

2:26:38

stretching.

2:26:39

I'm always doing something.

2:26:40

And if I don't do something, I feel like I'm fucking up.

2:26:43

And then, then I can.

2:26:44

So does it matter what it is?

2:26:46

Meaning as long as it's a routine?

2:26:47

Yeah.

2:26:48

Yeah.

2:26:48

Well, I, I do it all myself.

2:26:50

I don't have a trainer, but I write things down.

2:26:52

I write down what I want to accomplish.

2:26:53

I write down what I'm going to do.

2:26:55

And then I just do it.

2:26:56

I, like a robot, force myself to do it.

2:26:59

Yeah.

2:26:59

And then I always feel better after it's over.

2:27:01

Yeah.

2:27:01

And it's always the hardest part of my day.

2:27:03

Yeah.

2:27:03

And so it makes everything else so much easier.

2:27:05

Because it's, I fucking work out hard.

2:27:08

Yeah.

2:27:08

And so everything else is pretty easy.

2:27:10

Yeah.

2:27:10

You know?

2:27:11

Because the strain, like just being in that fucking cold water or just going

2:27:15

through Tabatas on an

2:27:16

Airdyne bike, like this shit's hard.

2:27:19

It's really hard.

2:27:19

Like I could die right now hard.

2:27:21

Yeah.

2:27:21

And so everything else is like, how hard is it going to be?

2:27:24

Oh, it's uncomfortable.

2:27:25

Oh, boo hoo.

2:27:26

You know, like, I think it's important to go through that.

2:27:29

I really think it is, you know?

2:27:32

I really think it is.

2:27:33

And that's the difference between, you know, sanity and like having a very

2:27:39

slippery grip on

2:27:41

your own personal sovereignty.

2:27:44

I think a lot of it is like, you have to choose.

2:27:47

It has to be like elective voluntary adversity.

2:27:54

Like you have to choose to do.

2:27:55

Yeah.

2:27:56

That's a really great way of saying it.

2:27:57

Voluntary adversity.

2:27:58

If it's forced upon you, you can kind of compartmentalize it.

2:28:01

And then you get angry, like, wow.

2:28:02

It's bitter and resentful.

2:28:04

People making me do stupid shit.

2:28:04

Yeah, exactly.

2:28:05

But if you force yourself to do it, you know?

2:28:07

That's why these special forces guys are such fucking animals.

2:28:10

Of course.

2:28:11

They're choosing.

2:28:11

Right.

2:28:12

Exactly.

2:28:13

And they develop that, you know, this mentality when you're around other people

2:28:17

that are also

2:28:18

savages.

2:28:18

You know, you just, you realize like there's other people out there in the

2:28:22

world that are not

2:28:23

making excuses and they are getting after it every day and they are pushing

2:28:28

every day.

2:28:29

And the more you can surround yourself with people like that, the better. Versus

2:28:32

the people that

2:28:32

complain about nonsense and find excuses and focus on other people and

2:28:38

bitch about things.

2:28:39

And why is she doing this?

2:28:41

Why is this happening for him?

2:28:42

It's loser mentality.

2:28:45

And if you're around more winners, you know, you absorb that.

2:28:48

You imitate your atmosphere.

2:28:49

Yeah.

2:28:50

It's very important.

2:28:50

It's very hard for people, especially young people, to find positive influences

2:28:57

and to find

2:28:58

positive groups.

2:28:59

And I think it's one of the reasons why a lot of young people gravitate towards

2:29:02

podcasts,

2:29:03

because they get to hear interesting conversations with really accomplished

2:29:06

people that are

2:29:07

fascinating, that are unlike anybody that they're around on a daily basis, you

2:29:11

know,

2:29:12

and that's also one of the reasons why it's important to find something. That's why

2:29:16

martial arts

2:29:16

are so good for young people, because you're around other people that are doing

2:29:20

this really difficult

2:29:21

thing and other sports too, whether it's football or wrestling, whatever it is.

2:29:24

I actually found like, you know, the last few years, I go out of my way to not

2:29:27

isolate myself.

2:29:28

That's one thing.

2:29:29

Like being around other people, engaging in things has been really healthy for

2:29:33

me.

2:29:33

Oh, for sure.

2:29:34

Oh my God. And I just found like, what the fuck am I doing?

2:29:36

It's like, everything is in my little house by myself.

2:29:38

Everybody, everything comes to me.

2:29:41

It's so odd.

2:29:41

It's odd.

2:29:42

It's really very unhealthy.

2:29:44

And it starts to fuck you up in the mind.

2:29:45

And then your interaction with humans is only on the internet.

2:29:48

It's terrible.

2:29:49

Or with people that are sycophantically either being paid or need something

2:29:53

from you.

2:29:53

Mm-hmm.

2:29:54

Yeah.

2:29:54

And then I think you're in a really bad place.

2:29:56

Absolutely.

2:29:57

Whereas like, if you're in the grind with other people, they're beating you at

2:30:00

things.

2:30:01

It's great.

2:30:01

Yeah.

2:30:01

If you're in a situation where there's a bunch of sycophantically connected

2:30:05

people to

2:30:05

you and they're just all kissing your ass.

2:30:07

And I mean, we all know people that are like the heads of companies and that

2:30:10

are just like

2:30:10

fucking tyrants.

2:30:11

I think the trap about being successful, because it's not everything it's

2:30:16

cracked up to be, is

2:30:16

exactly that.

2:30:17

Yeah.

2:30:17

You become so isolated that you become this, like, very caricatured version of

2:30:22

yourself.

2:30:22

Mm-hmm.

2:30:23

Because you forget what it's like to, just a basic example, like wait in line,

2:30:28

be kind to other people, be polite, like be accommodating, have some empathy.

2:30:32

Right.

2:30:32

Where are you put in that situation to do those things?

2:30:34

Right.

2:30:35

You forget that you're just a person.

2:30:36

You're just a fucking person.

2:30:37

And if you're trying

2:30:42

to achieve

2:30:43

this level of success so you elevate past being a person, you're missing the

2:30:46

point.

2:30:47

Like you're never going to.

2:30:48

And if you do, it'll come at a price.

2:30:50

I thought being successful was supposed to right all the wrongs that I felt

2:30:55

like I missed.

2:30:56

And it turns out nobody gives a fuck.

2:31:00

No.

2:31:00

And it does none of that.

2:31:02

I think it's all the process.

2:31:05

All of life is the process.

2:31:07

I agree.

2:31:08

I think as soon as you think that there's a goal, like, oh, I'm going to retire

2:31:12

and experience

2:31:12

my golden years, I think it's all horseshit.

2:31:15

And that's one of my main fears about AI, one of my main fears about this idea

2:31:21

of universal

2:31:22

high income and everyone's going to have ultimate abundance.

2:31:25

It's like, where does anybody find purpose and meaning?

2:31:30

Where do you take whatever this thing is that the mind is constructed of, these

2:31:38

needs that the mind

2:31:40

has that have to be satisfied in order to achieve sanity, in order to achieve

2:31:45

some sort of, like,

2:31:48

accomplishment, fulfillment, yeah, meaning.

2:31:50

You're going to have to do something, man.

2:31:52

You're going to have to do something.

2:31:53

And I mean, maybe it could just be jujitsu and golf and find some stuff that

2:31:58

you enjoy doing and take

2:32:00

some benefit in that.

2:32:01

But boy, that's not been the case for hundreds of years.

2:32:07

You know, that's not how human beings have existed.

2:32:09

I mean, but also part of me says, why do we have to work to find those things?

2:32:15

Why can't we?

2:32:16

Why?

2:32:18

Why is it all that?

2:32:20

Well, you've got to find the thing that's not work.

2:32:22

Right.

2:32:23

But what I'm getting at is, like, why is our identity all tied up in money and

2:32:30

just things

2:32:33

and objects and stuff?

2:32:35

And this is a fairly new thing in human society, right?

2:32:39

Why can't it transform into, like, your basic needs are all met.

2:32:47

Like, nobody ever has to worry about starving again.

2:32:49

Nobody ever has to worry about not having a home to sleep in.

2:32:51

Nobody ever has to worry about not having health care.

2:32:54

Nobody ever has to worry about not having education.

2:32:56

So then it becomes find a purpose with your life.

2:32:59

And as a society, can we adjust?

2:33:03

Can we gravitate towards a new way of existing and meaning?

2:33:08

And it would probably be great.

2:33:10

In one way, it'd be great because we wouldn't have to be constantly thinking,

2:33:14

why does he have that and I don't have that and this and that.

2:33:17

Instead, it would probably be like, what can I do to get better at the thing

2:33:21

that I love?

2:33:22

Right.

2:33:22

What, you know, or let me be a part of a project to do something that seems

2:33:26

implausible,

2:33:27

but I feel like I'm in the engine room every day.

2:33:30

This is great.

2:33:31

I'm toiling with these guys.

2:33:32

Yes, probably not going to work.

2:33:34

Some crazy convoluted thing that has a .001 chance of success.

2:33:39

That can captivate a lot of people.

2:33:41

Yes.

2:33:42

You know, the process.

2:33:43

The process.

2:33:44

Yeah.

2:33:45

The process.

2:33:45

The process is everything.

2:33:46

And, I used to, like, think about this.

2:33:48

There is no attention in the process.

2:33:50

Right.

2:33:51

There's only attention in the outcome.

2:33:53

Right.

2:33:53

Do you see what I mean?

2:33:54

Right, absolutely.

2:33:55

Which is another clue and a secret that that's actually where you should be

2:33:58

focused.

2:33:59

Well, you might get attention, but that's not what you want.

2:34:01

What you want is the process to work out.

2:34:03

You want to get better at whatever it is you're doing and get that thing to a

2:34:07

better place than

2:34:07

it is right now currently.

2:34:09

Right?

2:34:09

That's what you're thinking of.

2:34:10

You're not thinking of, I am going to get all this attention.

2:34:13

I'm going to be on the cover of a magazine.

2:34:16

Yeah, it can't be that.

2:34:18

That's not good for anybody.

2:34:20

But everybody thinks that's what they're going to get.

2:34:22

Oh, I'm going to get this.

2:34:23

Everybody thinks that's what they want.

2:34:25

Yeah.

2:34:26

Right.

2:34:27

And the problem with that is that it's not what you want.

2:34:29

No.

2:34:31

And then now we're going to completely upend, potentially, all of that.

2:34:36

Yeah.

2:34:38

Well, maybe it'll coincide with the hive mind technology.

2:34:44

This hive mind thing, actually, that you say, I find very compelling.

2:34:47

Because this idea of like, how do you govern an AI?

2:34:51

Each of us individually are not capable.

2:34:54

But I think you, me, like 10,000, 100,000 people working together.

2:34:59

The question is, are we smarter?

2:35:02

And I think there's a reasonable chance that that could be true.

2:35:05

And then the other version of the hive mind is here are all these like crazy

2:35:10

ideas that would

2:35:10

just make the world incredible.

2:35:13

And a group of 1,000 people go off and they kind of jointly work on that

2:35:17

together.

2:35:17

That I find super fascinating.

2:35:19

Like, that could be it.

2:35:21

Like, it could be like, you know, 1,000 physicists are like, we're going to

2:35:25

create this new interstellar

2:35:26

form of transportation.

2:35:27

And they just go off.

2:35:28

And they're just like, they don't have to worry about existing because all of

2:35:33

that's paid for.

2:35:33

Well, it also could solve all of our problems that we have with like, haves and

2:35:39

have-nots.

2:35:40

If we're all one, how could we tolerate have-nots?

2:35:43

How could we tolerate people living on dirt floors in third world countries

2:35:47

with no access

2:35:47

to clean water?

2:35:48

We wouldn't tolerate it.

2:35:49

We wouldn't tolerate it.

2:35:49

Because they would be us and we would understand that.

2:35:52

Yeah.

2:35:53

I mean, it could be like a complete game changer in terms of human civilization.

2:35:57

It could really move people into a complete next direction.

2:36:00

I mean, it could eliminate crime and violence, which sounds insane.

2:36:04

Like, boy, that's so utopian.

2:36:06

Like, oh, why don't you suck on some crystals, you fucking hippie.

2:36:10

But legitimately, if, look, if everybody has a cell phone, which essentially

2:36:13

everybody does,

2:36:14

right, right now in this time and age, if we get to a point where everybody is

2:36:19

connected,

2:36:19

everybody is hive mind connected.

2:36:22

You're not going to just be able to drive by a

2:36:26

homeless encampment.

2:36:27

Right.

2:36:28

You won't, you'll feel.

2:36:29

You'll feel it.

2:36:29

You'll feel it.

2:36:30

No, you'll feel it.

2:36:31

It won't be like, ew, you fucking losers, hit the gas.

2:36:33

Yeah.

2:36:34

It's going to be like, we need to solve this.

2:36:37

We need to get these people counseling, mental health crisis, get them off the

2:36:40

drugs,

2:36:41

whatever it is that's wrong with them.

2:36:43

I mean, that's an incredible idea.

2:36:46

Yeah.

2:36:46

You know, like when an airplane kind of like goes like this and your stomach

2:36:49

goes and you just feel it.

2:36:51

Yeah.

2:36:51

Could you imagine like you drive by a homeless encampment and that's what you

2:36:54

feel?

2:36:54

Like you feel like something's wrong.

2:36:56

And we'll all feel it collectively.

2:36:58

Right.

2:36:59

If we're all connected and we all feel things collectively, we will actively

2:37:03

work together

2:37:03

to solve these problems.

2:37:04

Yeah.

2:37:05

And if we really get to a point of abundance, like

2:37:09

true abundance,

2:37:10

where resources are not an issue and no one's starving, we could really fix all

2:37:16

the problems.

2:37:17

Like none of them are insurmountable.

2:37:20

None of them are breathing underwater.

2:37:21

Right.

2:37:21

None of them are flying to the sun.

2:37:24

None of them.

2:37:24

Yeah.

2:37:24

Right.

2:37:25

So all of them are things that could be solved if we took all the world's resources. But

2:37:30

socialism doesn't

2:37:31

work.

2:37:31

Right.

2:37:32

Why does it not work?

2:37:32

Because it rewards lazy people and it punishes ambitious people.

2:37:36

It doesn't work with human nature, but it would work if

2:37:39

you have

2:37:39

fucking hive mind.

2:37:41

If we all understand what it means to put in effort.

2:37:44

We all understood what each other is feeling and thinking.

2:37:47

Right.

2:37:47

And we all compiled resources and fixed all of our social problems.

2:37:52

Right.

2:37:52

Like literally stop all wars, stop all crimes, stop all violence, stop all

2:37:58

poverty.

2:37:59

Done.

2:38:01

And then what do we do?

2:38:02

We work together to solve whatever the fuck else is wrong with our society.

2:38:06

Well, it's more like what is left over that we haven't figured out.

2:38:10

Think about what the world was like before the internet.

2:38:12

It's almost impossible to imagine, but we both grew up without it.

2:38:16

Yeah.

2:38:17

Yeah.

2:38:18

And so we're entering into this new world.

2:38:21

Think about what the world is like now, without the hive mind. We all grew up

2:38:25

without it.

2:38:25

Like that might be the next thing.

2:38:27

The thing that I remember the most about that era is I had a positive

2:38:33

view of everybody.

2:38:34

Really?

2:38:36

Meaning, the bad actors were pretty bad, but yeah,

2:38:41

generally like I looked

2:38:43

up to most business people. The people that I now feel have been a

2:38:46

little bit

2:38:47

unmasked were pristine to me back then.

2:38:48

Oh, that's interesting.

2:38:49

Like the Bill Gateses of the world.

2:38:51

You know, I was like, man, I really aspire to be Bill Gates when I was like 13

2:38:55

or 14.

2:38:56

It just seemed like it. And now you're like, why is he buying all the farmland?

2:38:59

This fucking weirdo.

2:38:59

I mean, it's so fucking funny.

2:39:02

He, uh, he bought like 45,000 acres, or 4,500 acres.

2:39:08

I can't get the order of magnitude right.

2:39:09

Uh, in Phoenix to build his own digital city.

2:39:13

Yeah.

2:39:13

Okay.

2:39:14

It's like weird.

2:39:15

So I bought the 1700 acres beside him.

2:39:17

That's hilarious.

2:39:21

Fuck you.

2:39:21

It's a very odd thing.

2:39:24

It's a very odd thing when people get exposed and you just go like, what the

2:39:28

fuck is that

2:39:28

guy really all about?

2:39:30

And, but also like how isolated is he?

2:39:33

He's been, he's been isolated for 50 years.

2:39:35

Right.

2:39:36

Like, who are his friends, and how many people does he have?

2:39:39

It must be very hard to be him actually.

2:39:40

I mean, especially now that he's divorced.

2:39:42

Right.

2:39:43

So now he's got no one going, "But that speech fucking sucked."

2:39:46

Yeah.

2:39:46

He's got, I mean, he has a long-term partner.

2:39:48

Um, she seems like a lovely woman.

2:39:51

Um, but yeah, it's just gotta be super lonely.

2:39:54

It's gotta be.

2:39:55

It's not, to me, it's not worth that level of, I don't even know.

2:40:00

What is it?

2:40:01

It's like material success, at least measured in the outside world.

2:40:03

I don't know what it is, but it's not.

2:40:04

That's a lot, man.

2:40:07

Like, I don't know how Elon does it.

2:40:09

Like it's a lot.

2:40:09

It's super isolating.

2:40:11

Yeah.

2:40:11

It's just that he's very much by himself, and he's going to be even more

2:40:18

isolated

2:40:19

in a matter of a few months.

2:40:20

Yeah.

2:40:21

And that's unfortunate because you have very empathetic, very kind of like

2:40:24

sensitive people

2:40:25

like that, I think, need other people.

2:40:27

Well, he's got people around him, but he's got very few people around him that

2:40:33

can kick

2:40:33

reality at him, you know. That is a bit of a problem, but he still seems

2:40:39

to be having fun.

2:40:40

Every time I'm around him, we have a bunch of laughs.

2:40:42

Like, he's fun to hang out with.

2:40:43

He's got an incredible sense of humor.

2:40:45

We, um, Jamie and I went down, uh, to one of the rocket launches at SpaceX.

2:40:50

Starbase?

2:40:51

Yeah, we went down there.

2:40:52

Fucking crazy.

2:40:52

And we watched from the ground while it took off, which is incredible.

2:40:57

Because it's like, how far was it, Jamie?

2:40:58

It was like two miles away from us?

2:41:00

A mile, mile and a half.

2:41:01

It's a mile and a half.

2:41:02

You feel it in your chest.

2:41:03

Have you been?

2:41:04

When a rocket launches?

2:41:06

No.

2:41:06

Have you been there?

2:41:06

Dude, it's bananas.

2:41:07

No.

2:41:07

The fucking thing, like, first of all, it doesn't look that far.

2:41:10

It looks like it's like maybe a quarter mile.

2:41:13

I'm just not good at judging.

2:41:15

This is a starship?

2:41:16

Oh, yeah.

2:41:17

So you feel it.

2:41:17

Like, his kids started crying, like, we want to go inside.

2:41:22

Like, it's disturbing.

2:41:24

Like, the amount of energy that's coming out of these fucking rocket boosters.

2:41:28

And then I hung out with him in the command center while the rocket was flying

2:41:33

through space.

2:41:34

And we're watching it on all these monitors and then lands in the water in

2:41:37

Australia.

2:41:38

And he's cracking jokes the whole time because the thing is, like, losing

2:41:42

pressure.

2:41:43

Because it's, they're stress testing all this stuff.

2:41:45

Which is really funny when really dumb people go, oh, he's a fucking dumbass.

2:41:48

His rockets keep blowing up.

2:41:49

Like, they just don't understand.

2:41:51

Like, the only way you find out what the capability of this technology is, is

2:41:56

you have to, like,

2:41:57

let it blow up.

2:41:58

And then you go, okay, it needs to be thicker.

2:42:00

It needs to be this and that.

2:42:01

And we need to add these things.

2:42:02

And there's sensors everywhere.

2:42:04

And so he's cracking jokes the entire time while this thing is, like, losing

2:42:07

pressure.

2:42:07

And it eventually wound up landing.

2:42:09

And it was fine.

2:42:10

But it did have a hole in it.

2:42:12

But it was just, like, he's laughing.

2:42:14

Like, he's having a good old time.

2:42:15

He's not freaked out.

2:42:16

No.

2:42:16

You know, he, he's uniquely built to handle it.

2:42:19

I, uh, when there was a rocket launch at Vandenberg in California, I

2:42:24

chartered a Pilatus.

2:42:26

And I, because you can get low.

2:42:27

What's a Pilatus?

2:42:28

Like a little, like, propeller plane.

2:42:29

Oh, okay.

2:42:30

And I went around and around.

2:42:32

And I have this video of it kind of, like, coming up and through.

2:42:35

Because, like, how close were you?

2:42:36

A hundred miles.

2:42:41

Oh, wow.

2:42:41

But you, but it's, like, right there.

2:42:43

Uh-huh.

2:42:43

You know, because it's the distance.

2:42:45

Right.

2:42:45

And it's coming up and I'm kind of going around.

2:42:47

It was the craziest thing.

2:42:48

Wow.

2:42:49

It was cool.

2:42:50

It was super cool.

2:42:51

That shit is super cool.

2:42:52

It's very cool.

2:42:53

Oh, my God.

2:42:54

It's very cool.

2:42:55

I mean, just Starbase is bananas.

2:42:57

Just when you go down there and they have their own town.

2:42:59

The whole thing is straight.

2:43:00

There's fucking Cybertrucks everywhere.

2:43:02

I'm like, how do you find your car?

2:43:03

Like, if.

2:43:04

Is it, is it an incorporated town?

2:43:06

It started off as unincorporated, but it's its own thing now.

2:43:09

I believe it's its own town.

2:43:10

Is there a mayor?

2:43:11

That's a good question.

2:43:13

I think there is.

2:43:14

I think we talked about this.

2:43:15

I don't remember though.

2:43:17

But the actual factory itself is nuts.

2:43:21

Because Jamie and I were both like, this is way bigger than I thought it was

2:43:25

going to be.

2:43:26

And the rockets are way bigger than you thought.

2:43:27

And like, the garage doors are fucking bananas.

2:43:30

They've got a city government website, commission, mayor.

2:43:34

That's crazy.

2:43:38

It's really crazy.

2:43:39

Bobby Pettit.

2:43:40

Bobby Pettit is the mayor.

2:43:41

Yeah.

2:43:42

Robert Bobby Pettit.

2:43:43

That's awesome.

2:43:43

They have their own little Irish pub.

2:43:44

It's like, it's really cool.

2:43:46

They have really good food.

2:43:46

You know, when he opened the first Gigafactory, which was in Nevada,

2:43:51

we had a party.

2:43:51

And like, it was like a small opening thing.

2:43:55

And so we all drove in there and I have a video.

2:43:57

Of me in a pickup truck driving into the thing.

2:44:01

I started the video.

2:44:02

And I think it was 43 seconds until it ended.

2:44:06

And this was like, you know, a decade ago.

2:44:08

And I thought to myself, this is implausible.

2:44:10

Like, I've never even contemplated things that could be built this big.

2:44:14

I didn't think it was allowed.

2:44:16

I don't even know how something like this works.

2:44:19

And I was like, how do you envision this whole thing works?

2:44:22

It's like simple.

2:44:23

Raw materials in the front.

2:44:25

Cars out the back.

2:44:27

I'm like, that's it?

2:44:31

It sounds so simple.

2:44:32

Well, he thinks big.

2:44:34

He thinks big.

2:44:35

And thank God he's around.

2:44:37

I mean, if he wasn't around, if he hadn't purchased Twitter, I think

2:44:40

our entire civilization would look very different.

2:44:42

Very different.

2:44:43

It would, I mean, that sounds like a very grandiose thing to say.

2:44:46

Sounds hyperbolic, but you're right.

2:44:47

I think it's true.

2:44:48

Because I don't, I think free speech is a core component of our civilization.

2:44:53

And I don't really think we had it.

2:44:55

Yeah.

2:44:56

I think it was curated.

2:44:57

And it was very tightly controlled by the actual federal government, which is

2:45:01

spooky.

2:45:01

No, no, no.

2:45:02

It decided what we should be paying attention to.

2:45:05

Yes.

2:45:06

Just to put it very simply. And that's not right.

2:45:10

Right.

2:45:11

Because when they're telling you to pay attention to this and the actual issue

2:45:16

is this and you cannot,

2:45:18

then you can't fix what's actually broken.

2:45:21

Right.

2:45:21

And we start to basically be like, just a useful

2:45:26

idiot for these people.

2:45:27

Yes.

2:45:28

And that's not right.

2:45:29

It's not right.

2:45:29

Listen, man, this was a lot of fun.

2:45:32

It's always great to talk to you.

2:45:33

Thank you very much for doing this.

2:45:34

It was very cool.

2:45:35

Um, let's do it again sometime.

2:45:37

All right.

2:45:39

Thank you.

2:45:39

All right.

2:45:39

Bye, everybody.