#2422 - Jensen Huang


Jensen Huang

1 appearance

Jensen Huang is the founder, president, and CEO of NVIDIA, the company whose 1999 invention of the GPU helped transform gaming, computer graphics, and accelerated computing. Under his leadership, NVIDIA has grown into a full-stack computing infrastructure company reshaping AI and data-center technology across industries. www.nvidia.com www.youtube.com/nvidia


Timestamps

0:09 Trump anecdotes, U.S. manufacturing/energy policy, and the broader AI technology race
9:58 AI progress is gradual; channeling compute into safer, more accurate systems (and implications for defense/cybersecurity)
19:57 Cybersecurity as a model for AI safety: shared defense, quantum encryption, and fears of sentient AI



Transcript

0:00

Joe Rogan podcast, check it out.

0:03

The Joe Rogan Experience.

0:05

Train by day, Joe Rogan podcast by night, all day.

0:09

Hello, Jensen.

0:13

Hi, Joe.

0:14

Good to see you again.

0:14

We were just talking about, was that the first time we ever spoke?

0:17

Or was the first time we spoke at SpaceX?

0:19

SpaceX.

0:20

SpaceX the first time.

0:21

When you were giving Elon that crazy AI chip.

0:24

Right, DGX Spark.

0:25

Yeah, ooh, that was a big moment.

0:27

That was a huge moment.

0:28

That felt crazy to be there.

0:29

It was like watching these wizards of tech, like exchange information and you're

0:35

giving

0:36

him this crazy device, you know?

0:37

And then the other time was, I was shooting arrows in my backyard and randomly

0:43

get this

0:44

call from Trump and he's hanging out with you.

0:45

President Trump called and I called you.

0:47

Yeah, it's just-

0:48

We were talking about you.

0:49

We were talking about the UFC thing he was going to do in his front yard.

0:55

Yeah.

0:56

And he pulls out, he's like, Jensen, look at this design.

0:59

He's so proud of it.

1:00

And I go, you're going to have a fight on the front lawn of the White House?

1:05

He goes, yeah.

1:06

Yeah, you're going to come.

1:07

This is going to be awesome.

1:08

And he's showing me his design and how beautiful it is.

1:10

And he goes, and somehow your name comes up.

1:14

He goes, do you know Joe?

1:16

And I said, yeah.

1:17

I'm going to be on his podcast.

1:19

He said, let's call him.

1:23

He's like a kid.

1:24

I know.

1:25

Let's call him.

1:25

He's like a 79-year-old kid.

1:28

He's so incredible.

1:29

Yeah.

1:30

He's an odd guy.

1:31

Just very different.

1:33

You know, like what you'd expect from him.

1:36

Very different than what people think of him.

1:39

And also just very different as a president.

1:41

A guy who just calls you or texts you out of the blue.

1:43

Also, when he texts you, you have an Android, so it won't go through

1:47

for you.

1:47

But with my iPhone, he makes the text go big.

1:50

Is that right?

1:51

He's going to say he's respected again.

1:53

Like, all caps and makes the text enlarge.

1:59

It's kind of ridiculous.

2:00

Well, the one-on-one Trump, President Trump, is very different.

2:05

He surprised me.

2:06

First of all, he's an incredibly good listener.

2:09

Almost everything I've ever said to him, he's remembered.

2:12

Yeah.

2:13

People don't, they only want to look at negative stories about him or negative

2:18

narratives about him.

2:20

You know, you can catch anybody on a bad day.

2:22

Like, there's a lot of things he does that I don't think he should do.

2:24

Like, I don't think he should say to a reporter, quiet piggy.

2:28

Like, that's pretty ridiculous.

2:29

Also, objectively funny.

2:32

I mean, it's unfortunate that it happened to her.

2:34

I wouldn't want that to happen to her, but it was funny.

2:37

Just ridiculous that the president does that.

2:39

I wish he didn't do that.

2:40

But other than that, like, he's an interesting guy.

2:43

Like, he's a lot of different things wrapped up into one person, you know?

2:48

You know, part of his charm, or part of his genius is he says what's on his

2:53

mind.

2:53

Yes.

2:54

Which is like an anti-politician in a lot of ways.

2:57

Yeah, right.

2:58

So, you know, what he says is really what's on his mind.

3:01

Which I think people prefer.

3:02

And he's telling you what he believes.

3:04

I do that, too.

3:05

Some people.

3:06

Some people would rather be lied to.

3:07

Yeah.

3:07

But I like the fact that he's telling you what's on his mind.

3:10

Almost every time he explains something, he starts with his –

3:17

you could tell, his love for America, what he wants to do for America.

3:22

And everything that he thinks through is very practical and very common sense.

3:27

And, you know, it's very logical.

3:30

And I still remember the first time I met him.

3:34

And so this was – I'd never known him, never met him before.

3:38

And Secretary Lutnick called.

3:41

And we met right before – right at the beginning of the administration.

3:45

And he said – he told me what was important to President Trump: that the United

3:51

States manufactures onshore.

3:54

And that was really important to him because it's important to national

3:59

security.

4:00

He wants to make sure that the important critical technology of our nation is

4:03

built in the United States and that we re-industrialize and get good at

4:07

manufacturing again because it's important for jobs.

4:11

It just seems like common sense, right?

4:13

Incredible common sense.

4:14

And that was like literally the first conversation I had with Secretary Lutnick.

4:17

And he started our conversation with – this is Secretary Lutnick talking – Jensen, I just want to let you know that you're a national treasure.

4:34

NVIDIA is a national treasure.

4:38

And whenever you need access to the president, the administration, you call us,

4:44

we're always going to be available to you.

4:47

Literally, that was the first sentence.

4:49

That's pretty nice.

4:50

And it was completely true.

4:53

Every single time I called, if I needed something, wanted to get something off

4:58

my chest, express some concern.

5:00

They're always available.

5:01

Incredible.

5:02

It's just unfortunate we live in such a politically polarized society that you

5:06

can't recognize good common sense things if they're coming from a person that

5:10

you object to.

5:12

And that, I think, is what's going on here.

5:13

I think most people generally – as a country, as a giant community, which we

5:19

are, it only makes sense that we have manufacturing in America, especially

5:26

critical technology like you're talking about.

5:28

Like, it's kind of insane that we buy so much technology from other countries.

5:32

If the United States doesn't grow, we will have no prosperity.

5:38

We can't invest in anything domestically or otherwise.

5:42

We can't fix any of our problems.

5:44

If we don't have energy growth, we can't have industrial growth.

5:49

If we don't have industrial growth, we can't have job growth.

5:52

It's as simple as that.

5:55

And the fact that he came into office and the first thing that he said was,

5:58

drill, baby, drill, his point is we need energy growth.

6:02

Without energy growth, we can have no industrial growth.

6:05

And that was – it saved the AI industry.

6:09

I got to tell you flat out, if not for his pro-growth energy policy, we would

6:15

not be able to build factories for AI.

6:19

We would not be able to build chip factories.

6:20

We surely won't be able to build supercomputer factories.

6:24

None of that stuff would be possible.

6:26

And without all of that, construction jobs would be challenged, right?

6:30

Electrical – you know, electrician jobs.

6:32

All of these jobs that are now flourishing would be challenged.

6:35

And so I think he's got it right.

6:37

We need energy growth.

6:38

We want to re-industrialize the United States.

6:41

We need to be back in manufacturing.

6:43

Not every successful person needs to have a PhD.

6:46

Not every successful person has to have gone to Stanford or MIT.

6:50

And I think that that sensibility is spot on.

6:55

Now, when we're talking about technology growth and energy growth, there's a

6:59

lot of people that go, oh, no, that's not what we need.

7:02

We need to simplify our lives and get back.

7:05

But the real issue is that we're in the middle of a giant technology race.

7:08

And whether people are aware of it or not, whether they like it or not, it's

7:12

happening.

7:13

And it's a really important race because whoever gets to whatever the event

7:19

horizon of artificial intelligence is, whoever gets there first has massive

7:25

advantages in a huge way.

7:28

Do you agree with that?

7:29

Well, first, the part – I will say that we are in a technology race and we

7:33

are always in a technology race.

7:35

We've been in a technology race with somebody forever.

7:38

Right.

7:39

Right.

7:39

Since the Industrial Revolution, we've been in a technology race.

7:41

Since the Manhattan Project.

7:42

Yeah.

7:43

Yeah.

7:43

Or, you know, even going back to the discovery of energy, right?

7:47

The United Kingdom was where the Industrial Revolution was, if you will,

7:52

invented, when they realized that they can turn steam and such into energy,

7:56

into electricity.

7:58

All of that was invented largely in Europe.

8:04

And the United States capitalized on it.

8:07

We were the ones that learned from it.

8:10

We industrialized it.

8:12

We diffused it faster than anybody in Europe.

8:15

They were all stuck in discussions about policy and jobs and disruptions.

8:24

Meanwhile, the United States was forming.

8:26

We just took the technology and ran with it.

8:27

And so I think we were always in a bit of a technology race.

8:31

World War II was a technology race.

8:33

Manhattan Project was a technology race.

8:35

We've been in a technology race ever since, through the Cold War.

8:39

I think we're still in the technology race.

8:40

It is probably the single most important race.

8:43

Technology gives you superpowers, you know – whether it's

8:49

information superpowers or energy superpowers or military superpowers, it's all

8:55

founded in technology.

8:57

And so technology leadership is really important.

8:59

Well, the problem is if somebody else has superior technology, right?

9:03

Yeah.

9:03

That's the issue.

9:04

That's right.

9:05

It seems like with the AI race, people are very nervous about it.

9:09

Like, you know, Elon has famously said there was like 80% chance it's awesome,

9:15

20% chance we're in trouble.

9:17

And people are worried about that 20%, rightly so.

9:20

I mean, you know, if you had 10 bullets in a revolver and, you know, you took

9:27

out eight of them and you still have two in there and you spin it,

9:32

you're not going to feel real comfortable when you pull that trigger.

9:33

It's terrifying.

9:34

Right.

9:35

And when we're working towards this ultimate goal of AI, it's just impossible

9:42

to imagine that it wouldn't be of national security interest to get there first.

9:49

We should, the question is what's there.

9:51

That's the, that was the part.

9:52

What is there?

9:53

Yeah.

9:53

I'm not sure.

9:54

And I don't think anybody, I don't think anybody really knows.

9:57

That's crazy though.

9:58

If I ask you, you're the head of NVIDIA.

10:01

If you don't know what's there, who knows?

10:04

Yeah.

10:04

I, I think it's probably going to be much more gradual than we think.

10:08

It won't be a moment.

10:11

It won't be, it won't be as if somebody arrived and nobody else has.

10:16

I don't think it's going to be like that.

10:18

I think it's going to be things that just get better and better and better and

10:21

better, just like technology does.

10:23

So you are rosy about the future.

10:25

You're, you're very optimistic about what's going to happen with AI.

10:28

Obviously, will you make the best AI chips in the world?

10:31

You probably better be.

10:34

If history is a guide, we were always concerned about new technology.

10:39

Humanity has always been concerned about new technology.

10:43

There's always somebody thinking about it; there are always a lot of people who

10:46

are quite concerned.

10:48

And so if history is a guide, it is the case that all of this

10:55

concern is channeled into making the technology safer.

11:01

And so, for example, in the last several years, I would say AI technology has

11:08

increased probably in the last two years alone, maybe a hundred X.

11:13

Let's just give it a number.

11:15

Okay.

11:16

It's like a car two years ago was a hundred times slower.

11:20

So AI is a hundred times more capable today.

11:25

Now, how did we channel that technology?

11:26

How do we channel all of that power?

11:29

We directed it to making the AI able to think, meaning that it can take

11:35

a problem that we give it, break it down step by step.

11:40

It does research before it answers.

11:43

And so it grounds its answer in truth.

11:45

It'll reflect on that answer, ask itself, is this the best, you know, answer

11:51

that I can give you?

11:53

Am I certain about this answer?

11:55

If it's not certain about the answer or highly confident about the answer, it'll

11:58

go back and do more research.

12:00

It might actually even use a tool because that tool provides a better solution

12:04

than it could hallucinate itself.
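What he's describing – plan, answer, self-check, and fall back to a tool when confidence is low – can be pictured in a few lines. A minimal sketch in Python, where llm() and search() are hypothetical stand-ins for a model call and an external tool, not any specific product's API, and the confidence threshold is arbitrary:

```python
# Minimal sketch of the reason-reflect-tool loop described above.
# llm() and search() are hypothetical placeholders, not a real API.

def llm(prompt: str) -> dict:
    # Stand-in: imagine this returns the model's answer plus a
    # self-reported confidence in [0, 1].
    return {"answer": "...", "confidence": 0.5}

def search(query: str) -> str:
    # Stand-in for an external tool (web search, calculator, ...).
    return "retrieved evidence about: " + query

def answer_safely(question: str, max_rounds: int = 3) -> str:
    evidence = ""
    for _ in range(max_rounds):
        draft = llm(f"Question: {question}\nEvidence: {evidence}\n"
                    "Break the problem down step by step, then answer.")
        check = llm("Is this the best answer you can give? How certain "
                    f"are you?\n{draft['answer']}")
        if check["confidence"] >= 0.8:       # confident enough: stop
            return draft["answer"]
        # Not confident: do more research with a tool instead of guessing.
        evidence += "\n" + search(question)
    return draft["answer"]                   # best effort after max_rounds
```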

12:07

As a result, we took all of that computing capability and we channeled it into

12:12

having it produce a safer result, safer answer, a more truthful answer.

12:17

Because as you know, one of the greatest criticisms of AI in the beginning was

12:21

that it hallucinated.

12:22

Right.

12:23

And so if you look at the reason why people use AI so much today is because the

12:27

amount of hallucination has reduced.

12:29

You know, I use it almost – well, I used it the whole trip over here.

12:34

And so I think the capability, most people think about power and they think

12:41

about, you know, maybe explosive power, but with technology power, most of

12:48

it is channeled towards safety.

12:50

A car today is more powerful, but it's safer to drive.

12:54

A lot of that power goes towards better handling.

12:56

You know, I'd rather have a, well, you have a thousand horsepower truck.

13:02

I think 500 horsepower is pretty good.

13:05

No, a thousand better.

13:06

I think a thousand is better.

13:07

I don't know if it's better, but it's definitely faster.

13:09

Yeah.

13:10

No, I think it's better.

13:11

You get out of trouble faster.

13:13

I enjoyed my 599 more than my 612.

13:20

It was, I think it was better, better, and more horsepower is better.

13:24

My 458 is better than my 430.

13:26

More horsepower is better.

13:28

I think more horsepower is better.

13:29

I think it's better handling.

13:31

It's better control.

13:32

In the case of technology, it's also very similar in that way.

13:36

You know, and so if you look at what we're going to do with the next thousand

13:40

times of performance in AI, a lot of it is going to be channeled towards more

13:45

reflection, more research, thinking about the answer more deeply.

13:51

So when you're defining safety, you're defining it as accuracy.

13:54

Functionality.

13:56

Functionality.

13:57

Okay.

13:57

It does what you expect it to do.

14:01

And then you take all the technology and horsepower, you put guardrails on it,

14:05

just like our cars.

14:07

We've got a lot of technology in a car today.

14:09

A lot of it goes towards, for example, ABS.

14:13

ABS is great.

14:14

And so traction control.

14:17

That's fantastic.

14:18

Without a computer in the car, how would you do any of that?

14:21

Right.

14:21

And that little computer, the computers that you have doing your traction

14:25

control are more powerful than the computer that flew on Apollo 11.

14:28

And so you want that technology.

14:31

Channel it towards safety.

14:33

Channel it towards functionality.

14:35

And so when people talk about power, the advancement of technology, oftentimes

14:40

I feel what they're thinking and what we're actually doing is very different.

14:45

Well, what do you think they're thinking?

14:46

Well, they're thinking somehow that this AI is all-powerful, and their mind

14:53

probably goes towards a sci-fi movie.

14:57

The definition of power, you know, oftentimes the definition of power is

15:03

military power or physical power.

15:05

But in the case of technology power, when we translate all of those operations,

15:11

it's towards more refined thinking, you know, more reflection, more planning,

15:16

more options.

15:18

I think the big fears that people have – one big fear is military

15:21

applications.

15:22

That's a big fear.

15:23

Yeah.

15:24

Because people are very concerned that you're going to have AI systems that

15:28

make decisions that maybe an ethical person wouldn't make or a moral person

15:31

wouldn't make based on achieving an objective versus based on, you know, how it's

15:37

going to look to people.

15:41

Well, I'm happy that our military is going to use AI technology for defense.

15:47

And I think that Anduril building military technology – I'm happy to hear that.

15:54

I'm happy to see all these tech startups now channeling their technology

15:58

capabilities towards defense and military applications.

16:01

I think you're going to need to do that.

16:03

Yeah, we had Palmer Luckey on the podcast.

16:05

He was demonstrating some of the stuff with his helmet on.

16:07

And he showed some videos how you could see behind walls and stuff.

16:11

Like, it's nuts.

16:11

He's actually the perfect guy to go start that company, by the way.

16:14

A hundred percent.

16:15

Yeah, a hundred percent.

16:17

It's like he's born for that.

16:18

Yeah.

16:19

He came in here with a copper jacket on.

16:21

He's a freak.

16:22

It's awesome.

16:23

He's awesome.

16:24

But it's also, you know, an unusual intellect channeled into that very

16:29

bizarre field is what you need.

16:31

And I think, I'm happy that we're making it more socially

16:36

acceptable.

16:37

You know, there was a time where when somebody wanted to channel their

16:41

technology capability and their intellect into defense technology, somehow they're

16:47

vilified.

16:48

But we need people like that.

16:51

We need people who enjoy that part of the application of technology.

16:56

Well, people are terrified of war, you know, so it makes sense.

16:59

Well, the best way to avoid it is excessive military might.

17:02

Do you think that's absolutely the best way?

17:04

Not, not diplomacy, not working stuff out?

17:08

All of it.

17:08

All of it.

17:09

Yeah.

17:09

You have to have military might in order to get people to sit down with you.

17:12

Right.

17:13

Exactly.

17:13

All of it.

17:14

Otherwise they just invade.

17:15

That's right.

17:16

Why ask for permission?

17:18

Again, like you said, history.

17:19

Go back and look at history.

17:20

That's right.

17:21

When you look at the future of AI and you just said that no one really knows

17:26

what's happening.

17:28

Do you ever sit down and ponder scenarios?

17:31

Like, what do you, what do you think is like best case scenario for AI over the

17:36

next two decades?

17:40

Um, the best case scenario is that AI diffuses into everything that we do and,

17:51

uh, everything's

17:55

more efficient, but the threat of war remains a threat of war.

18:02

Uh, cyber security remains a super difficult challenge.

18:08

Somebody is going to try to breach your security.

18:14

You're going to have thousands or millions of AI agents protecting you from

18:20

that threat.

18:21

Your technology is going to get better.

18:24

Their technology is going to get better.

18:26

Just like cybersecurity right now, while we speak, we're seeing

18:32

cyber attacks

18:33

all over the planet on just about every front door you can imagine.

18:36

And, and yet you and I are sitting here talking.

18:43

And so the reason for that is because we know that there's a whole bunch of

18:48

cyber security

18:49

technology in defense.

18:51

And so we just have to keep amping that up, keep stepping that up.

18:55

This episode is brought to you by Visible.

18:57

When your phone plan's as good as Visible, you've got to tell your people.

19:01

It's the ultimate wireless hack to save money and still get great coverage and

19:05

a reliable

19:06

connection.

19:06

Get one line wireless with unlimited data and hotspot for $25 a month.

19:13

Taxes and fees included all on Verizon's 5G network.

19:18

Plus now for a limited time, new members can get the Visible plan for just $19

19:24

a month for

19:25

the first 26 months.

19:26

Use promo code switch26 and save beyond the season.

19:32

It's a deal so good.

19:33

You're going to want to tell your people.

19:35

Switch now at visible.com slash Rogan.

19:38

Terms apply.

19:39

Limited time offers subject to change.

19:42

See visible.com for plan features and network management details.

19:46

That's a big issue with people – the worry that technology is going to get to

19:51

a point where

19:52

encryption is going to be obsolete.

19:54

Encryption is just, it's no longer going to protect data.

19:57

It's no longer going to protect systems.

19:59

Do you anticipate that ever being an issue?

20:01

Or do you think, as the defense grows, the threat grows, then

20:05

defense grows,

20:06

and it just keeps going on and on and on.

20:08

And they'll always be able to fight off any sort of intrusions.

20:14

Not forever, some intrusion will get in, and then we'll all learn from it.

20:20

And you know, the reason why cybersecurity works is because, of course, the

20:24

technology of defense

20:25

is advancing very quickly.

20:28

The technology of offense is advancing very quickly.

20:31

However, the benefit of the cybersecurity defense is that socially, the

20:38

community, all of our companies

20:41

work together as one.

20:42

Most people don't realize this.

20:44

There's a whole community of cybersecurity experts.

20:50

We exchange ideas.

20:53

We exchange best practices.

20:55

We exchange what we detect.

20:57

The moment something has been breached, or maybe there's a loophole, or

21:01

whatever it is,

21:02

it is shared by everybody.

21:04

The patches are shared with everybody.

21:06

That's interesting.

21:06

Yeah.

21:07

Most people don't realize this.

21:08

No, I had no idea.

21:10

I've assumed that it would just be competitive like everything else.

21:12

No, no.

21:13

No, we work together, all of us.

21:15

Has that always been the case?

21:17

It surely has been the case for about 15 years.

21:20

It might not have been the case long ago.

21:22

But this-

21:24

What do you think started off that cooperation?

21:26

People recognizing it's a challenge and no company can stand alone.

21:30

And the same thing is going to happen with AI.

21:33

I think we all have to decide.

21:36

Working together to stay out of harm's way is our best chance for defense.

21:42

Then it's basically everybody against the threat.

21:46

And it also seems like you'd be way better at detecting where these threats are

21:50

coming from

21:51

and neutralizing them, too.

21:52

Exactly.

21:52

Because the moment you detect it somewhere, everybody finds out right away.

21:56

It'll be really hard to hide.

21:57

That's right.

21:58

Yeah.

21:58

That's how it works.

22:00

That's the reason why it's safe.

22:01

That's why I'm sitting here right now instead of locking everything down on

22:04

video.

22:04

Not only am I watching my own back, I've got everybody watching my back, and I'm

22:12

watching

22:12

everybody else's back.

22:13

It's a bizarre world, isn't it, when you think about that, cyber threats?

22:16

The idea about cybersecurity is unknown to the people who are talking about AI

22:21

threats.

22:22

I think when they think about AI threats and AI cybersecurity threats, they

22:26

have to also

22:27

think about how we deal with it today.

22:29

Now, there's no question that AI is a new technology, and it's a new type of

22:36

software.

22:37

In the end, it's software.

22:38

It's a new type of software, and so it's going to have new capabilities.

22:42

But so will the defense, where you use the same AI technology to go defend

22:46

against it.

22:48

So do you anticipate a time ever in the future where it's going to be

22:52

impossible, where there's

22:55

not going to be any secrets, where the bottleneck between the technology that

23:00

we have and the

23:01

information that we have, information is just all a bunch of ones and zeros.

23:04

It's out there on hard drives, and the technology has more and more access to

23:07

that information.

23:08

Is it ever going to get to a point in time where there's no way to keep a

23:12

secret?

23:14

Because it seems like that's where everything is kind of headed in a weird way.

23:17

I don't think so.

23:17

I think quantum computers – we're all expecting this – will make it so that the previous encryption technology is obsolete.

23:29

But that's the reason why the entire industry is working on post-quantum

23:34

encryption technology.

23:36

What would that look like?

23:37

New algorithms.
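Concretely, the "new algorithms" are schemes like the NIST post-quantum standards (ML-KEM, formerly Kyber, among them), which rest on lattice problems such as learning with errors rather than on factoring. Below is a toy LWE-style encryption sketch in Python – a teaching illustration of the underlying hardness assumption with made-up tiny parameters, nothing like production cryptography:

```python
import numpy as np

# Toy learning-with-errors (LWE) encryption, Regev-style.
# Tiny illustrative parameters; offers no real security.
rng = np.random.default_rng(0)
n, m, q = 16, 64, 3329                      # dimension, samples, modulus

# Key generation: b = A @ s + e (mod q). The small error e is what makes
# recovering s believed hard, even for quantum computers.
s = rng.integers(0, q, n)                   # secret key
A = rng.integers(0, q, (m, n))              # public random matrix
e = rng.integers(-2, 3, m)                  # small noise
b = (A @ s + e) % q                         # public key is (A, b)

def encrypt(bit: int):
    r = rng.integers(0, 2, m)               # random 0/1 combination of rows
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q        # encode the bit near 0 or q/2
    return u, v

def decrypt(u, v):
    d = (v - u @ s) % q                     # bit encoding plus small noise
    return int(q // 4 < d < 3 * q // 4)     # closer to q/2 means bit = 1

u, v = encrypt(1)
assert decrypt(u, v) == 1                   # round-trips despite the noise
```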

23:40

The crazy thing is when you hear about the kind of computation that quantum

23:44

computing can

23:45

do, and the power that it has, where you're looking at all the supercomputers

23:50

in the world

23:51

that would take billions of years, and it takes a quantum computer a few minutes to solve

23:53

these equations.

23:54

How do you make encryption for something that can do that?

23:58

I'm not sure.

23:59

But I've got a bunch of scientists who are working on that.

24:02

I hope they can figure it out.

24:04

Yeah.

24:04

We've got a bunch of scientists who are experts in that.

24:06

Is the ultimate fear that it can't be breached, that quantum computing will

24:10

always be able

24:10

to decrypt all other quantum computing encryption?

24:14

I don't think so.

24:16

It just gets to some point where it's like, stop playing the stupid game.

24:19

We know everything.

24:20

I don't think so.

24:21

No?

24:22

Because, you know, history is a guide.

24:25

History is a guide.

24:27

Before AI came around.

24:28

That's my worry.

24:29

My worry is this is a totally, you know, it's like history was one thing, and

24:32

then nuclear

24:33

weapons kind of changed all of our thoughts on war and mutually assured

24:37

destruction came,

24:38

got everybody to stop using nuclear bombs.

24:42

Yeah.

24:42

My worry is that-

24:44

The thing is, Joe, AI is not going to – it's not like we're cavemen, and

24:49

then all

24:50

of a sudden, one day, AI shows up.

24:52

Every single day, we're getting better and smarter because we have AI, and so

24:57

we're stepping

24:57

on our own AI's shoulders.

24:59

So when that, whatever that AI threat comes, it's a click ahead.

25:05

It's not a galaxy ahead.

25:07

You know, it's just a click ahead.

25:10

And so I think the idea that somehow this AI is going to pop out of nowhere and

25:19

somehow

25:20

think in a way that we can't even imagine thinking and do something that we can't

25:26

possibly

25:27

imagine, I think is far-fetched.

25:29

And the reason for that is because we all have AIs, and, you know, there's a

25:33

whole bunch of

25:34

AIs in development.

25:35

We know what they are, and we're using it.

25:37

And so every single day, we're keeping close to each other.

25:41

But don't they do things that are very surprising?

25:44

Yeah, but so you have an AI that does something surprising.

25:49

I'm going to have an AI.

25:50

Right.

25:50

My AI looks at your AI and goes, that's not that surprising.

25:53

The fear for a layperson like myself is that AI becomes sentient and makes

25:57

its own

25:58

decisions.

25:59

And then ultimately decides to just govern the world, do it its own way.

26:06

They're like, you guys, you had a good run, but we're taking over now.

26:09

Yeah, but my AI is going to take care of me.

26:14

So that's the, this is the cybersecurity argument.

26:19

Yes.

26:20

Well, you have an AI and it's super smart, but my AI is super smart too.

26:24

And maybe your AI – let's pretend for a second that

26:30

we understand

26:30

what consciousness is and we understand what sentience is, and that, in

26:34

fact.

26:34

And we really are just pretending.

26:35

Okay.

26:36

Let's just pretend for a second that we, we believe that.

26:38

I don't actually believe that, but nonetheless,

26:41

let's pretend

26:41

we believe that. So your AI is conscious and my AI is conscious, and

26:46

and let's say your

26:47

AI is, you know, wants to, I don't know, do something surprising.

26:51

My AI is so smart that it won't, it might be surprising to me, but it probably

26:57

won't be

26:57

surprising to my AI.

26:58

And so maybe my AI thinks it's surprising as well, but it's so smart.

27:05

The moment it sees it the first time, it's not going to be surprised the second

27:08

time, just

27:09

like us.

27:10

And so I feel like, I think the idea that only one person has AI and that

27:17

one person's

27:18

AI, compared to everybody else's AI, is Neanderthal, is probably unlikely.

27:26

I think it's much more like cybersecurity.

27:28

Interesting.

27:30

I think the fear is not that your AI is going to battle with somebody else's AI.

27:36

The fear is that AI is no longer going to listen to you.

27:39

That's the fear is that human beings won't have control over it after a certain

27:43

point.

27:44

If it achieves sentience and then has the ability to be autonomous.

27:49

That there's one AI.

27:50

Well, they just combine.

27:53

Yeah.

27:53

Becomes one AI.

27:54

That it's a life form.

27:55

Yeah.

27:55

But that's the, there's arguments about that, right?

27:58

That we're dealing with some sort of synthetic biology, that it's not as simple

28:01

as new technology,

28:02

that you're creating a life form.

28:04

If it's like a life form, let's go along with that for a while.

28:09

I think if it's like a life form, as you know, life forms don't all agree.

28:13

And so we're going to have your life form and my life form.

28:17

They're not going to agree, because my life form is going to want to be the super life

28:20

form.

28:21

And now that we have disagreeing life forms, we're back again to where we are.

28:27

Well, they would probably cooperate with each other.

28:29

It would just, the reason why we don't cooperate with each other is we're

28:34

territorial primates.

28:36

But AI wouldn't be a territorial primate.

28:40

They'd realize the folly in that sort of thinking.

28:42

And it would say, listen, there's plenty of energy for everybody.

28:46

We don't need to dominate.

28:49

We don't need, we're not trying to acquire resources and take over the world.

28:53

We're not looking to find a good breeding partner.

28:55

We're just existing as a new super life form that these cute monkeys created

29:02

for us.

29:03

Okay.

29:05

Well, that would be a superpower with no ego.

29:11

Right.

29:12

And if it has no ego, why would it have the ego to do any harm to us?

29:20

Well, I don't assume that it would do harm to us.

29:22

But the fear would be that we would no longer have control and that we would no

29:28

longer be the apex species on the planet.

29:31

This thing that we created would now be.

29:34

Is that funny?

29:35

No.

29:36

I just think it's not going to happen.

29:38

I know you think it's not going to happen.

29:40

But it could, right?

29:41

It could.

29:42

Here's the other thing.

29:43

It's like we're racing towards "could."

29:45

Yeah.

29:46

And "could" could be the end of human beings being in control of our own destiny.

29:52

I just think it's extremely unlikely.

29:54

Mm.

29:55

Yeah.

29:55

That's what they said in the Terminator movie.

29:57

And it hasn't happened.

29:59

No, not yet.

30:00

But you guys are working towards it.

30:01

The thing you're saying about consciousness and sentience – that you don't

30:07

think AI will achieve consciousness, or that consciousness is specific.

30:12

What's the definition of consciousness?

30:14

What is the definition to you?

30:15

Consciousness, I guess first of all, you need to know about your own existence.

30:27

You have to have experience, not just knowledge and intelligence.

30:44

But the concept of a machine having an experience, I'm not – well, first of

30:50

all, I don't know what defines experience, why we have experiences.

30:56

Right.

30:57

Yeah.

30:57

And why this microphone doesn't.

30:59

And so I think I know – well, I think I know what consciousness is.

31:10

The sense of experience, the ability to know self versus the ability to be able

31:18

to reflect, know our own self, the sense of ego.

31:24

I think all of those human experiences probably is what consciousness is.

31:35

The sense of why it exists versus the concept of knowledge and intelligence,

31:41

which is what AI is defined by today.

31:44

It has knowledge.

31:45

It has intelligence.

31:46

Artificial intelligence.

31:48

We don't call it artificial consciousness.

31:51

Artificial intelligence, the ability to perceive, recognize, understand, plan,

32:03

perform tasks.

32:06

Those things are foundations of intelligence, to know things, knowledge.

32:13

I don't – it's clearly different than consciousness.

32:18

Well, consciousness is so loosely defined.

32:20

How can we say that?

32:21

I mean, doesn't a dog have consciousness?

32:22

Yeah.

32:23

Dogs seem to be pretty conscious.

32:24

That's right.

32:25

Yeah.

32:25

So – and that's a lower level consciousness than a human being's

32:28

consciousness.

32:29

I'm not sure.

32:30

Yeah, right.

32:31

Well –

32:32

The question is what –

32:33

Lower level intelligence.

32:34

It's lower level intelligence.

32:36

Yes.

32:36

But I don't know that it's lower level consciousness.

32:38

That's a good point.

32:39

Right.

32:39

Because I believe my dogs feel as much as I feel.

32:42

Yeah, they feel a lot.

32:43

Yeah.

32:44

Yeah.

32:44

Right.

32:44

Yeah, they get attached to you.

32:47

That's right.

32:47

They get depressed if you're not there.

32:49

That's right.

32:49

Exactly.

32:50

There's definitely that.

32:51

Yeah.

32:53

The concept of experience.

32:55

Right.

32:56

But isn't AI interacting with society?

32:59

So doesn't it acquire experience through that interaction?

33:03

I don't think interactions is experience.

33:06

I think experience is – experience is a collection of feelings, I think.

33:14

You're aware of that AI – I forget which one – where they gave it some

33:19

false information

33:21

about one of the programmers cheating on his wife, just to see how it

33:24

would respond

33:25

to it.

33:25

And then when they said they were going to shut it down, it threatened to blackmail

33:28

him

33:28

and reveal his affair.

33:29

And it was like, whoa, like it's conniving.

33:32

Like if that's not learning from experience and being aware that you're about

33:37

to be shut

33:38

down, which would imply at least some kind of consciousness, or you could kind

33:42

of define

33:43

it as consciousness if you were very loose with the term, and if you imagine

33:47

that this

33:47

is going to exponentially become more powerful, wouldn't that ultimately lead

33:53

to a different

33:54

kind of consciousness than we're defining from biology?

33:56

Well, first of all, let's just break down what it probably did.

34:00

It probably read somewhere.

34:02

There's probably text where, in these circumstances, certain people did that.

34:10

Right.

34:11

I could imagine a novel.

34:12

Right.

34:13

Having those words related.

34:15

Sure.

34:16

And so inside...

34:17

It realizes its strategy for survival is blackmail.

34:20

It's just a bunch of numbers.

34:21

Blackmail.

34:21

It's just a bunch of numbers. In the collection of numbers, what relates to

34:29

a husband cheating

34:30

on a wife is followed by a bunch of numbers that relate to blackmail and

34:37

such things,

34:39

whatever the revenge was.

34:42

And so it has spewed it out.

34:43

And so it's just like, you know, it's just as if I'm asking it to write me a

34:49

poem in

34:50

Shakespeare.

34:50

It's just whatever the words are. This dimensionality – all these vectors in multidimensional space – the word in the prompt that described the affair subsequently led to one word after another, led to, you know, some revenge or something.

35:15

But it's not because it had consciousness or, you know, it just spewed out

35:18

those words, generated

35:20

those words.
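What he's describing is autoregressive generation: each next word is sampled from a probability distribution conditioned on the words so far. A minimal sketch in Python – the toy vocabulary and made-up bigram counts below are stand-ins for a real model's learned weights:

```python
import numpy as np

# Toy next-word sampler: a crude stand-in for what a language model does.
# Real models learn a vastly richer conditional distribution.
rng = np.random.default_rng(0)
bigrams = {                                 # made-up counts from a fake corpus
    "the":     {"husband": 4, "wife": 3},
    "husband": {"cheated": 5, "left": 2},
    "cheated": {"so": 6},
    "so":      {"blackmail": 3, "revenge": 4},
}

def next_word(word: str) -> str:
    options = bigrams.get(word)
    if not options:
        return "<end>"
    words = list(options)
    counts = np.array([options[w] for w in words], dtype=float)
    return rng.choice(words, p=counts / counts.sum())  # one word after another

word, out = "the", ["the"]
while word != "<end>" and len(out) < 10:
    word = next_word(word)
    out.append(word)
print(" ".join(out))   # e.g. "the husband cheated so revenge <end>"
```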

35:20

I understand what you're saying, that it learned from patterns that human

35:24

beings have exhibited, both in literature and in real life.

35:27

That's exactly right.

35:28

But at a certain point in time, one would say, okay, well, it couldn't do this

35:32

two years

35:33

ago, and it couldn't do this four years ago.

35:35

Like when we're looking towards the future, like at what point in time when it

35:39

can do everything a person does, what point in time do we decide that it's

35:43

conscious?

35:43

If it absolutely mimics all human thinking and behavior patterns.

35:49

That doesn't make it conscious.

35:50

It becomes indiscernible.

35:51

It's aware, it can communicate with you the exact same way a person can.

35:56

Like is consciousness, are we putting too much weight on that concept?

36:00

Because it seems like it's a version of a kind of consciousness.

36:04

It's a version of imitation.

36:06

Imitation consciousness, right.

36:08

But if it perfectly imitates it.

36:09

I still think it's an example of imitation.

36:12

So it's like a fake Rolex when they 3D print them and make them like indiscernible.

36:16

The question is what's the definition of consciousness?

36:18

Yeah.

36:18

Yeah.

36:19

That's the question.

36:21

And I don't think anybody's really clearly defined that.

36:23

That's where it gets weird.

36:25

And that's where the real doomsday people are worried that you are creating a

36:29

form of consciousness that you can't control.

36:31

I believe it is possible to create a machine that imitates human intelligence

36:42

and has the ability to understand information.

36:49

Understand instructions.

36:52

Break the problem down.

36:54

Break the problem down.

36:54

Solve problems and perform tasks.

36:57

I believe that completely.

36:59

I believe that we could have a computer that has a vast amount of knowledge.

37:10

Some of it true.

37:11

Some of it true.

37:11

Some of it not true.

37:12

Some of it generated by humans.

37:16

Some of it generated synthetically.

37:18

And more and more of knowledge in the world will be generated synthetically

37:23

going forward.

37:24

Until now, the knowledge that we have is knowledge that we generate and we

37:30

propagate and we send to each other and we amplify it and we add to it and we

37:36

modify it.

37:37

In the future, in a couple of years, maybe two or three years, 90% of the world's

37:45

knowledge will likely be generated by AI.

37:49

That's crazy.

37:50

I know, but it's just fine.

37:51

But it's just fine.

37:53

I know.

37:54

And the reason for that is this.

37:55

Let me tell you why.

37:56

Okay.

37:57

It's because what difference does it make to me that I am learning from a

38:02

textbook that was generated by a bunch of people I didn't know, or a book

38:08

written by, you know, somebody I don't know, versus knowledge generated by AI

38:14

computers that are assimilating all of this and re-synthesizing things.

38:20

To me, I don't think there's a whole lot of difference.

38:22

We still have to fact check it.

38:25

We still have to make sure that it's, you know, based on fundamental first

38:28

principles and we still have to do all of that just like we do today.

38:31

Is this taking into account the kind of AI that exists currently?

38:35

And do you anticipate that just like we could have never really believed that

38:40

AI would be – at least a person like myself would never have believed AI would be – so ubiquitous, so powerful, and so important today.

38:50

We never thought that 10 years ago.

38:52

Never thought that.

38:53

Imagine, like, what are we looking at 10 years from now?

38:56

I think that if you reflect back 10 years from now, you would say the same

39:02

thing, that we would have never believed that.

39:07

In a different direction.

39:09

Right.

39:10

But if you go forward nine years from now and then ask yourself what's going to

39:15

happen 10 years from now, I think it will be quite gradual.

39:21

One of the things that Elon said that makes me happy is he believes that we're

39:27

going to get to a point where it's not necessary for people to work and not

39:33

meaning that you're going to have no purpose in life.

39:38

But you will have, in his words, universal high income because so much revenue

39:44

is generated by AI that it will take away this need for people to do things

39:49

that they don't really enjoy doing just for money.

39:54

And I think a lot of people have a problem with that because their entire

39:57

identity and how they think of themselves and how they fit in the community is

40:02

what they do.

40:03

Like this is Mike.

40:03

He's an amazing mechanic.

40:05

Go to Mike and Mike takes care of things.

40:07

But there's going to come a point in time where AI is going to be able to do

40:11

all those things much better than people do.

40:14

And people will just be able to receive money.

40:16

But then what does Mike do?

40:18

Mike, you know, really loves being the best mechanic around.

40:22

You know, what does the guy who, you know, codes, what does he do when AI can

40:28

code infinitely faster with zero errors?

40:32

Like what happens with all those people and that is where it gets weird because

40:36

we've sort of wrapped our identity as human beings around what we do for a

40:41

living.

40:42

You know, when you meet someone, one of the first things you meet somebody at a

40:45

party, hi, Joe, what's your name?

40:47

Mike, what do you do, Mike?

40:48

And, you know, Mike's like, oh, I'm a lawyer.

40:49

Oh, what kind of law?

40:50

And you have a conversation, you know, when Mike is like, I get money from the

40:54

government.

40:55

I play video games.

40:55

It gets weird.

40:57

And I think the concept sounds great until you take into account human nature.

41:03

And human nature is that we like to have puzzles to solve and things to do and

41:07

an identity that's wrapped around our idea that we're very good at this thing

41:12

that we do for a living.

41:14

Yeah, I think, let's see.

41:18

Let me start with the more mundane.

41:20

Okay.

41:21

And I'll work backwards.

41:22

Okay.

41:22

Work forward.

41:25

So one of the predictions from Geoffrey Hinton, who started the whole deep learning

41:33

phenomenon, the deep learning technology trend – an incredible, incredible

41:40

researcher, professor at the University of Toronto – he discovered, or

41:48

invented, the idea of backpropagation, which allows the neural network to learn.

41:55

And as you know, for the audience, software historically was humans applying

42:06

first principles and our thinking to describe an algorithm that is then codified,

42:17

just like a recipe that's codified in software.

42:20

It looks just like a recipe.

42:21

It looks just like a recipe, how to cook something.

42:24

It looks exactly the same, just in a slightly different language.

42:27

We call it Python or C or C++ or whatever it is.

42:32

In the case of deep learning, this invention of artificial intelligence, we put

42:38

a structure of a whole bunch of neural networks and a whole bunch of math units,

42:44

and we make this large structure.

42:48

It's like a switchboard of little mathematical units, and we connect it all

42:55

together.

42:58

And we give it the input that the software would eventually receive, and we

43:05

just let it randomly guess what the output is.

43:11

And so we say, for example, the input could be a picture of a cat.

43:15

And one of the outputs of the switchboard is where the cat signal is supposed

43:22

to show up.

43:24

And all of the other signals, the other one's a dog, the other one's an

43:27

elephant, the other one's a tiger.

43:31

And all of the other signals are supposed to be zero when I show it a cat, and

43:36

the one that is a cat should be one.

43:39

And I show it a cat through this big, huge network of switchboards and math

43:44

units, and they're just doing multiply and adds, multiplies and adds.

43:51

And this thing, this switchboard is gigantic.

43:56

The more information you're going to give it, the bigger the switchboard has to

44:02

be.

44:02

And what Geoffrey Hinton discovered, invented, was a way for you to do that: put

44:08

the cat signal in, put the cat image in.

44:13

And that cat image, you know, could be a million numbers, because it's, you

44:17

know, a megapixel image, for example.

44:20

And it's just a whole bunch of numbers.

44:22

And somehow from those numbers, it has to light up the cat signal.

44:28

Okay, that's the bottom line.

44:30

And the first time you do it, it just comes up with garbage.

44:36

And so you say, the right answer is cat.

44:41

And so you need to increase this signal and decrease all of the others, and back

44:46

propagate the outcome through the entire network.

44:50

And then you show it another – now it's an image of a dog.

44:55

And it guesses it, takes a swing at it, and it comes up with a bunch of garbage.

45:01

And you say, no, no, no, the answer is this is a dog.

45:04

I want you to produce dog.

45:06

And all of the other switches, all the other outputs, have to be zero.

45:10

And I want to back propagate that, and just do it over and over and over again.

45:15

It's just like showing a kid, this is an apple, this is a dog, this is a cat.

45:20

And you just keep showing it to them until they eventually get it.

45:23

Okay, well, anyways, that big invention is deep learning.

45:26

That's the foundation of artificial intelligence.

45:30

It's a piece of software that learns from examples.

45:34

That's basically machine learning, a machine that learns.
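As a rough illustration of the loop he just walked through – show an image, compare the network's guess to the right label, backpropagate, repeat – here is a minimal sketch in Python with NumPy. The four-class cat/dog/elephant/tiger setup and the random "images" are stand-ins, not real data:

```python
import numpy as np

# Tiny one-layer classifier trained by backpropagation, mirroring the
# cat/dog/elephant/tiger switchboard example above.
rng = np.random.default_rng(0)
n_pixels, n_classes = 64, 4                 # tiny stand-in for a megapixel image
W = rng.normal(0, 0.01, (n_pixels, n_classes))
b = np.zeros(n_classes)

def softmax(z):
    e = np.exp(z - z.max())                 # subtract max for stability
    return e / e.sum()

prototypes = rng.normal(0, 1, (n_classes, n_pixels))  # one fake image per class
for step in range(500):
    label = rng.integers(n_classes)                   # pick cat, dog, ...
    x = prototypes[label] + rng.normal(0, 0.3, n_pixels)
    probs = softmax(x @ W + b)                        # forward pass: the guess
    target = np.eye(n_classes)[label]                 # 1 for the true class
    grad = probs - target                             # push the right signal up,
    W -= 0.1 * np.outer(x, grad)                      # all the others down
    b -= 0.1 * grad

x = prototypes[0] + rng.normal(0, 0.3, n_pixels)
print(np.argmax(softmax(x @ W + b)))        # should print 0 (the "cat" class)
```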

45:40

And so one of the big first applications was image recognition.

45:47

And one of the most important image recognition applications is radiology.

45:51

And so he predicted about five years ago that in five years' time, the world

46:02

won't need any radiologists.

46:04

Because AI would have swept the whole field.

46:06

Well, it turns out, AI has swept the whole field.

46:10

That is completely true.

46:12

Today, just about every radiologist is using AI in some way.

46:17

And what's ironic, though, what's interesting is that the number of radiologists

46:23

has actually grown.

46:26

And so the question is, why?

46:28

That's kind of interesting, right?

46:30

It is.

46:31

And so the prediction was, in fact, that 30 million radiologists will be wiped

46:37

out.

46:38

But as it turns out, we needed more.

46:42

And the reason for that is because the purpose of a radiologist is to diagnose

46:47

disease, not to study the image.

46:50

The image studying is simply a task in service of diagnosing the disease.

46:59

And so now you could study the images more quickly and more

47:04

precisely, without ever making a mistake, and it never gets tired.

47:10

You could study more images.

47:12

You could study it in 3D form instead of 2D because, you know, the AI doesn't

47:18

care whether it studies images in 3D or 2D.

47:22

You could study it in 4D.

47:23

And so now you could study images in a way that radiologists can't easily do.

47:30

And you could study a lot more of it.

47:33

And so the number of tests that people are able to do increases.

47:36

And because they're able to serve more patients, the hospital does better.

47:42

They have more clients, more patients.

47:44

As a result, they have better economics.

47:47

When they have better economics, they hire more radiologists because their

47:51

purpose is not to study the images.

47:53

Their purpose is to diagnose disease.

47:55

And so the question is, what I'm leading up to is, ultimately, what is the

48:00

purpose?

48:01

What is the purpose of the lawyer?

48:03

And has the purpose changed?

48:06

What is the purpose?

48:08

You know, one of the examples that I would give is, for example, if my car

48:14

became self-driving, would all chauffeurs be out of jobs?

48:19

The answer is probably not.

48:21

Because for some chauffeurs, some people who are driving you, they could be

48:26

protectors.

48:27

Some people, they're part of the experience, part of the service.

48:31

So when you get there, they, you know, they could take care of things for you.

48:34

And so for a lot of different reasons, not all chauffeurs would lose their jobs.

48:39

Some chauffeurs would lose their jobs.

48:42

And many chauffeurs would change their jobs.

48:44

And the type of applications of autonomous vehicles will probably increase.

48:49

You know, the usage of the technology would then find new homes.

48:53

And so I think you have to go back to, what is the purpose of a job?

48:57

You know, like, for example, if AI comes along, I actually don't believe I'm

49:00

going to lose my job.

49:01

Because my purpose isn't the tasks – yes, I have to look at a lot of documents.

49:06

I study a lot of emails.

49:08

I look at a bunch of diagrams, you know.

49:11

The question is, what is the job?

49:14

And the purpose of somebody probably hasn't changed.

49:18

A lawyer, for example, helps people.

49:20

That probably hasn't changed.

49:22

Studying legal documents, generating documents, it's part of the job, not the

49:26

job.

49:27

But don't you think there's many jobs that AI will replace?

49:31

If your job is the task.

49:33

Yeah, if your job is the task.

49:35

Right.

49:35

So automation.

49:36

Yeah.

49:36

If your job is the task.

49:39

That's a lot of people.

49:40

It could be a lot of people.

49:41

But it'll probably generate, like, for example, let's say I'm super excited

49:47

about the robots Elon's working on.

49:52

It's still a few years away.

49:53

When it happens, when it happens, there's a whole new industry of technicians

50:00

and people who have to manufacture the robots, right?

50:06

And so that job never existed.

50:09

And so you're going to have a whole industry of people taking care of them, like, for

50:14

example, you know, all the mechanics and all the people who are building things

50:19

for cars, supercharging cars.

50:21

That didn't exist before cars.

50:24

And now we're going to have robots.

50:25

You're going to have robot apparel.

50:27

So a whole industry of, right, isn't that right?

50:30

Because I want my robot to look different than your robot.

50:32

Oh, God.

50:33

And so you're going to have a whole, you know, apparel industry for robots.

50:37

You're going to have mechanics for robots.

50:39

And you have, you know, people who come and maintain your robots.

50:43

Don't you think that'll all be automated, though?

50:43

No.

50:44

You don't think so?

50:44

You don't think that'll be all done by other robots?

50:47

Eventually.

50:48

And then there'll be something else.

50:49

So you think ultimately people just adapt, except if you are the task, which is

50:55

a large percentage of the workforce.

50:58

If your job is just to chop vegetables, Cuisinart's going to replace you.

51:02

Yeah.

51:03

So people have to find meaning in other things.

51:06

Your job has to be more than the task.

51:08

What do you think about Elon's belief that this universal basic income thing

51:13

will eventually become necessary?

51:16

Many people think that.

51:18

Andrew Yang thinks that.

51:20

He was one of the first people to sort of sound that alarm during the 2020

51:24

election.

51:28

Yeah, I guess both ideas probably won't exist at the same time.

51:38

And as in life, things will probably be in the middle.

51:41

One idea, of course, is that there'll be so much abundance of resource that

51:47

nobody needs a job.

51:48

And we'll all be wealthy.

51:51

On the other hand, we're going to need universal basic income.

51:56

Both ideas don't exist at the same time.

51:58

And so we're either going to be all wealthy or we're going to be all using-

52:04

How could everybody be wealthy though?

52:05

What scenario do you-

52:07

Well, because wealthy, not because you have a lot of dollars, wealthy because

52:09

there's a lot of abundance.

52:11

Like, for example, today, we are wealthy in information.

52:14

You know, several thousand years ago, this was something only a few people had.

52:20

And so today we have wealth of a whole bunch of things, resources that-

52:26

That's a good point.

52:26

Yeah.

52:26

And so we're going to have wealth of resources.

52:28

Things that we think are valuable today that in the future are just not that

52:33

valuable, you know.

52:35

And so, because it's automated.

52:37

And so I think the question, maybe partly, it's hard to answer partly because

52:46

it's hard to talk about infinity and it's hard to talk about a long time from

52:51

now.

52:52

And the reason for that is because there's just too many scenarios to consider.

52:59

But I think in the next several years, call it five to 10 years, there are

53:05

several things that I believe and hope.

53:08

And I say hope because I'm not sure.

53:12

One of the things that I believe is that the technology divide will be

53:17

substantially collapsed.

53:22

And, of course, the alternative viewpoint is that AI is going to increase the

53:28

technology divide.

53:30

Now, the reason why I believe AI is going to reduce the technology divide is

53:36

because we have proof.

53:38

The evidence is that AI is the easiest application in the world to use.

53:44

ChatGPT has grown to almost a billion users, frankly, practically overnight.

53:50

And everybody knows how to use ChatGPT –

53:54

just say something to it.

53:55

If you're not sure how to use ChatGPT, you ask ChatGPT how to use it.

53:59

No tool in history has ever had this capability.

54:04

A Cuisinart, you know, if you don't know how to use it, you're kind of screwed.

54:08

You're going to walk up to it and say, how do you use a Cuisinart?

54:11

You're going to have to find somebody else.

54:12

And so, but an AI will just tell you exactly how to do it.

54:16

Anybody could do this.

54:17

It'll speak to you in any language.

54:19

And if it doesn't know your language, you'll speak to it in that language.

54:23

And it'll probably figure out that it doesn't completely understand your

54:26

language.

54:26

It'll go learn it instantly and come back and talk to you.

54:30

And so I think the technology divide has a real chance, finally, that you don't

54:35

have to speak Python or C++ or Fortran.

54:39

You can just speak human and whatever form of human you like.

54:43

And so I think that that has a real chance of closing the technology divide.

54:46

Now, of course, the counter narrative would say that AI is only going to be

54:54

available for the nations and the countries that have a vast amount of

55:00

resources.

55:01

Because AI takes energy and AI takes a lot of GPUs and factories to be able to

55:08

produce the AI.

55:11

No doubt at the scale that we would like to do in the United States.

55:14

But the fact of the matter is your phone is going to run AI just fine all by

55:20

itself in a few years.

55:22

Today, it already does it fairly decently.

55:25

And so the fact is that every country, every nation, every society will

55:31

have the benefit of a very good AI.

55:34

It might not be tomorrow's AI.

55:35

It might be yesterday's AI.

55:37

But yesterday's AI is freaking amazing.

55:39

You know, in 10 years' time, 9-year-old AI is going to be amazing.

55:44

You don't need 10-year-old AI.

55:46

You don't need frontier AI like we need frontier AI because we want to be the

55:50

world leader.

55:51

But for every single country, everybody, I think the capability to elevate

55:55

everybody's knowledge and capability and intelligence, that day is coming.

56:00

The Octagon isn't just in Las Vegas anymore.

56:03

It's right in your hands with DraftKings Sportsbook, the official sports

56:07

betting partner of UFC.

56:08

Get ready because when Dvalishvili and Yan face off again at UFC 323, every

56:15

punch, every takedown, every finish, it all has the potential to pay off in

56:20

real time.

56:21

New customers bet just $5, and if your bet wins, you get paid $200 in bonus

56:26

bets.

56:27

And hey, Missouri, the wait is over.

56:29

DraftKings Sportsbook is now live in the Show Me State.

56:32

Download the DraftKings Sportsbook app and use promo code ROGAN.

56:36

That's code ROGAN to turn $5 into $200 in bonus bets if your bet wins.

56:42

In partnership with DraftKings, the crown is yours.

56:46

Gambling problem?

56:47

Call 1-800-GAMBLER.

56:48

In New York, call 877-8-HOPENY or text HOPENY (467369).

56:53

In Connecticut, help is available for problem gambling.

56:55

Call 888-789-7777 or visit ccpg.org.

56:59

Please play responsibly.

57:00

On behalf of Boot Hill Casino and Resort in Kansas. Pass-through of per-wager

57:03

tax may apply in Illinois.

57:04

21 and over.

57:05

Age and eligibility varies by jurisdiction.

57:07

Void in Ontario.

57:08

Restrictions apply.

57:09

Bet must win to receive bonus bets which expire in seven days.

57:12

Minimum odds required.

57:13

For additional terms and responsible gaming resources, see dkng.co slash audio.

57:17

Limited time offer.

57:18

And also energy production, which is the real bottleneck when it comes to third

57:23

world countries.

57:24

That's right.

57:25

Electricity and all the resources that we take for granted.

57:31

Almost everything is going to be energy constrained.

57:33

And so if you take a look at it, one of the most important technology advances in

57:39

history is this idea called Moore's Law.

57:42

Moore's Law started basically in my generation.

57:48

And my generation is the generation of computers.

57:53

We graduated in 1984 and that was basically at the very beginning of the PC

57:59

revolution and the microprocessor.

58:02

And every single year, it approximately doubled.

58:08

And we describe it as every single year we doubled the performance.

58:13

But what it really means is that every single year, the cost of computing halved.

58:20

And so the cost of computing in the course of five years reduced by a factor of

58:26

10, the amount of energy necessary to do computing, to do any task, reduced by

58:33

a factor of 10.

58:34

Every single 10 years, 100, 1,000, 10,000, 100,000, so on and so forth.

58:44

And so each one of the clicks of Moore's Law, the amount of energy necessary to

58:49

do any computing reduced.

58:51

That's the reason why you have a laptop today when back in 1984, it sat on the

58:56

desk, you had to plug it in, it wasn't that fast, and it consumed a lot of power.

59:01

Today, you know, it's only a few watts.

59:03

And so Moore's Law is the fundamental technology, the fundamental technology

59:08

trend that made it possible.
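
As a quick sanity check on the compounding he's describing, here's a small sketch (an editor's illustration; the figures come only from the rates quoted above): halving cost every year compounds to about 32x in five years and roughly 1,000x in ten, while the 10x-per-five-years figure matches the classic 18-month doubling period usually quoted for Moore's Law.

```python
# Compounding under Moore's Law: a minimal sketch.
# Two doubling periods, both mentioned or implied in the conversation.
for years in (5, 10):
    yearly = 2 ** years            # cost halves every 12 months
    classic = 2 ** (years / 1.5)   # cost halves every 18 months
    print(f"{years} years: {yearly:,.0f}x (yearly), {classic:,.0f}x (18-month)")
```

Either way, every click of the curve cuts the dollars and the energy needed per unit of computing by a large constant factor, which is the point being made here.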

59:10

Well, what's going on in AI?

59:11

The reason why NVIDIA is here is because we invented this new way of doing

59:15

computing.

59:16

We call it accelerated computing.

59:17

We started it 33 years ago.

59:19

It took us about 30 years to really make a huge breakthrough.

59:24

In that 30 years or so, we took computing, you know, probably a factor of, well,

59:30

let me just say, like the last 10 years.

59:33

The last 10 years, we improved the performance of computing by 100,000 times.

59:40

Whoa.

59:42

Imagine a car over the course of 10 years, it became 100,000 times faster.

59:46

Or at the same speed, 100,000 times cheaper.

59:51

Or at the same speed, 100,000 times less energy.

59:55

If your car did that, it doesn't need energy at all.
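
For scale, 100,000x over ten years works out to roughly tripling every year; a one-line sketch of the arithmetic:

```python
# 100,000x in 10 years, expressed as a compounded per-year factor.
print(100_000 ** (1 / 10))   # ~3.16, so roughly 3x per year
```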

1:00:00

What I mean, what I'm trying to say is that in 10 years' time, the amount of

1:00:05

energy necessary for artificial intelligence for most people will be minuscule,

1:00:11

utterly minuscule.

1:00:12

And so we'll have AI running in all kinds of things and all the time because it

1:00:16

doesn't consume that much energy.

1:00:19

And so if you're a nation that uses AI for, you know, almost everything in your

1:00:23

social fabric, of course, you're going to need these AI factories.

1:00:27

But for a lot of countries, I think you're going to have excellent AI and you're

1:00:31

not going to need as much energy.

1:00:33

Everybody will be able to come along is my point.

1:00:36

So currently, that is a big bottleneck, right, is energy?

1:00:39

It is the bottleneck.

1:00:41

The bottleneck.

1:00:42

So was it Google that is making nuclear power plants to operate one of its AI

1:00:48

factories?

1:00:50

Oh, I haven't heard that.

1:00:51

But I think in the next six, seven years, I think you're going to see a whole

1:00:54

bunch of small nuclear reactors.

1:00:56

And by small, like how big are you talking about?

1:00:59

Hundreds of megawatts, yeah.

1:01:00

Okay.

1:01:01

And that these will be local to whatever specific company they have?

1:01:06

That's right.

1:01:06

We'll all be power generators.

1:01:08

Whoa.

1:01:09

You know, just like, you know, somebody's farm.

1:01:13

It probably is the smartest way to do it, right?

1:01:15

And it takes the burden off the grid.

1:01:19

It takes, and you could build as much as you need.

1:01:21

And you can contribute back to the grid.

1:01:24

It's a really important point that I think you just made about Moore's Law and

1:01:28

the relationship to pricing.

1:01:29

Because, you know, a laptop today, like you can get one of those little MacBook

1:01:33

Airs.

1:01:34

They're incredible.

1:01:34

They're so thin.

1:01:35

Unbelievably powerful.

1:01:37

Battery life is crazy.

1:01:37

You don't ever have to charge it.

1:01:38

Yeah, battery life is crazy.

1:01:40

And it's not that expensive, relatively speaking.

1:01:43

Exactly.

1:01:44

Like something like that.

1:01:44

I remember when-

1:01:45

And that's just Moore's Law.

1:01:46

Right.

1:01:47

Then there's the NVIDIA Law.

1:01:48

Oh.

1:01:49

Just, right?

1:01:50

The law I was talking to you about.

1:01:51

Yeah.

1:01:52

The computing that we invented.

1:01:53

Right.

1:01:54

The reason why we're here, this new way of doing computing, is like Moore's Law

1:02:00

on energy drinks.

1:02:02

I mean, it's like Moore's Law and Joe Rogan.

1:02:09

Wow.

1:02:10

That's interesting.

1:02:11

Yeah.

1:02:12

That's us.

1:02:13

So, explain that.

1:02:14

This chip that you brought to Elon, what's the significance of this?

1:02:18

Like, why is it so superior?

1:02:20

And so, in 2012, Geoff Hinton's lab, this gentleman I was talking about, Ilya

1:02:28

Sutskever, Alex Krizhevsky, they made a breakthrough in computer vision.

1:02:37

They literally created a piece of software called AlexNet, and its job was to

1:02:47

recognize images.

1:02:49

And it recognized images at a level, computer vision, which is fundamental to

1:02:55

intelligence.

1:02:57

If you can't perceive, it's hard to have intelligence.

1:03:00

And so, computer vision is a fundamental pillar, not the only one, but a

1:03:03

fundamental pillar of intelligence.

1:03:05

And so, breaking computer vision, or breaking through in computer vision, is

1:03:09

pretty foundational to almost everything that everybody wants to do in AI.

1:03:13

And so, in 2012, their lab in Toronto made this breakthrough called AlexNet.

1:03:23

And AlexNet was able to recognize images so much better than any human-created

1:03:31

computer vision algorithm in the 30 years prior.

1:03:37

So, all of these people, all of these scientists, and we had many, too, working

1:03:42

on computer vision algorithms.

1:03:44

And these two kids, Ilya and Alex, under Geoff Hinton, took a giant leap above

1:03:55

it.

1:03:56

And it was based on this thing called AlexNet, this neural network.

1:04:00

And the way it ran, the way they made it work, was literally buying two NVIDIA

1:04:06

graphics cards.

1:04:08

Because NVIDIA's GPUs, we've been working on this new way of doing computing.

1:04:14

And our GPU's application, it's basically a supercomputing application; back

1:04:22

in 1984, in order to process computer games and what you have in your racing

1:04:31

simulator.

1:04:33

That is called an image generator supercomputer.

1:04:36

And so, NVIDIA started, our first application was computer graphics.

1:04:41

And we applied this new way of doing computing, where we do things in parallel

1:04:46

instead of sequentially.

1:04:48

A CPU does things sequentially.

1:04:50

Step one, step two, step three.

1:04:52

In our case, we break the problem down, and we give it to thousands of

1:04:57

processors.

1:04:59

And so, our way of doing computation is much more complicated.

1:05:07

But if you're able to formulate the problem in the way that we created, called

1:05:15

CUDA, this is the invention of our company.

1:05:17

If you could formulate it in that way, we could process everything

1:05:20

simultaneously.

1:05:22

Now, in the case of computer graphics, it's easier to do because every single

1:05:28

pixel on your screen is not related to every other pixel.

1:05:32

And so, I could render multiple parts of the screen at the same time.

1:05:36

Not completely true, because, you know, maybe the way lighting works or the way

1:05:41

shadow works, there's a lot of dependency and such.

1:05:45

But, computer graphics, with all the pixels, I should be able to process

1:05:49

everything simultaneously.

1:05:51

And so, we took this embarrassingly parallel problem called computer graphics,

1:05:57

and we applied it to this new way of doing computing.

1:06:00

NVIDIA's accelerated computing.
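
A toy illustration of that sequential-versus-parallel contrast (an editor's sketch in plain Python, not CUDA; the brighten-an-image task is invented for the example): because no pixel depends on any other pixel, the rows can be handed to a pool of workers all at once instead of processed step one, step two, step three.

```python
from concurrent.futures import ProcessPoolExecutor

def brighten(row):
    # Per-pixel work; no pixel depends on any other pixel.
    return [min(255, p + 40) for p in row]

if __name__ == "__main__":
    image = [[(x * y) % 256 for x in range(640)] for y in range(480)]

    # CPU-style: sequential, one row after another.
    seq = [brighten(row) for row in image]

    # GPU-style in spirit: all rows farmed out simultaneously.
    with ProcessPoolExecutor() as pool:
        par = list(pool.map(brighten, image))

    assert par == seq   # same answer, computed in parallel
```

On a real GPU the same idea plays out across thousands of hardware threads rather than a handful of operating-system processes, but the requirement is identical: the problem has to be formulated so the pieces are independent.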

1:06:05

We put it in all of our graphics cards.

1:06:07

Kids were buying it to play games.

1:06:10

You probably don't know this, but we're the largest gaming platform in the

1:06:14

world today.

1:06:15

Oh, I know that.

1:06:16

Oh, okay.

1:06:16

I used to make my own computers.

1:06:18

I used to buy your graphics cards.

1:06:19

Oh, that's super cool.

1:06:20

Yeah.

1:06:20

Okay.

1:06:21

Set up SLI with two graphics cards.

1:06:23

Oh, yeah.

1:06:23

I love it.

1:06:23

Okay.

1:06:24

That's super cool.

1:06:24

Oh, yeah, man.

1:06:25

I used to be a Quake junkie.

1:06:26

Oh, that's cool.

1:06:27

Yeah.

1:06:28

Okay.

1:06:28

So, SLI, I'll tell you the story in just a second, and how it led to Elon.

1:06:32

I'm still answering the question.

1:06:34

And so, anyways, these two kids trained this model using the technique I

1:06:39

described earlier on our GPUs, because our GPUs could process things in

1:06:44

parallel.

1:06:45

It's essentially a supercomputer in a PC.

1:06:48

The reason why you used it for Quake is because it is the first consumer

1:06:53

supercomputer.

1:06:55

Okay?

1:06:56

And so, anyways, they made that breakthrough.

1:07:00

We were working on computer vision at the time.

1:07:02

It caught my attention.

1:07:03

And so, we went to learn about it.

1:07:06

Simultaneously, this deep learning phenomenon was happening all over the

1:07:12

country.

1:07:13

One university after another recognized the importance of deep learning, and all

1:07:17

of this work was happening at Stanford, at Harvard, at Berkeley, just all over

1:07:22

the place.

1:07:23

New York University, you know, Yann LeCun, Andrew Ng at Stanford, so many

1:07:28

different places.

1:07:30

And I see it cropping up everywhere.

1:07:33

And so, my curiosity asked, you know, what is so special about this form of

1:07:39

machine learning?

1:07:41

And we've known about machine learning for a very long time.

1:07:43

We've known about AI for a very long time.

1:07:45

We've known about neural networks for a very long time.

1:07:48

And so, we realized that this architecture for deep neural networks, back

1:07:55

propagation, the way deep neural networks were created, we could probably scale

1:08:02

this problem, scale the solution to solve many problems.

1:08:07

That is essentially a universal function approximator, okay?

1:08:13

Meaning, you know, back when you were in school, you have a box.

1:08:20

Inside of it is a function.

1:08:21

You give it an input.

1:08:22

It gives you an output.

1:08:24

And the reason why I call it a universal function approximator is that this

1:08:29

computer, instead of you describing the function, a function could be Newton's

1:08:34

equation, f equals ma.

1:08:36

That's a function.

1:08:38

You write the function in software.

1:08:39

You give it input: mass, acceleration.

1:08:44

It'll tell you the force, okay?

1:08:46

And the way this computer works is really interesting.

1:08:51

You give it a universal function.

1:08:54

It's not f equals ma.

1:08:56

It's just a universal function.

1:08:57

It's a big, huge, deep neural network.

1:09:00

And instead of describing the inside, you give it examples of input and output,

1:09:07

and it figures out the inside.

1:09:09

So you give it input and output, and it figures out the inside.

1:09:13

A universal function approximator.

1:09:16

Today, it could be Newton's equation.

1:09:18

Tomorrow, it could be Maxwell's equation.

1:09:20

It could be Coulomb's law.

1:09:22

It could be thermodynamics equation.

1:09:24

It could be, you know, Schrodinger's equation for quantum physics.

1:09:28

And so you could put any, you could have this describe almost anything, so long

1:09:32

as you have the input and the output.

1:09:35

So long as you have the input and the output.

1:09:37

Or it could learn the input and output.
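
Here's a minimal sketch of that box (an editor's illustration, not NVIDIA code): a tiny neural network that is never told F = ma, only shown example inputs and outputs, and that figures out the inside by backpropagation, the training method mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Examples of input and output: (mass, acceleration) -> force = m * a.
X = rng.uniform(0.0, 1.0, size=(2048, 2))
y = (X[:, 0] * X[:, 1]).reshape(-1, 1)

# One hidden layer of tanh units: the "universal function" box.
W1 = rng.normal(0, 0.5, (2, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 1)); b2 = np.zeros(1)

lr = 0.05
for step in range(5000):
    h = np.tanh(X @ W1 + b1)            # forward pass
    pred = h @ W2 + b2
    err = pred - y
    # Backpropagation: push the error back through the box.
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Never told "F = m * a", but it should land near 0.4 for m=0.5, a=0.8.
test = np.array([[0.5, 0.8]])
print(np.tanh(test @ W1 + b1) @ W2 + b2)
```

Swap in different training pairs and the same box approximates a different function; that interchangeability is the whole point of the argument that follows.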

1:09:39

And so we took a step back, and we said, hang on a second.

1:09:43

This isn't just for computer vision.

1:09:47

Deep learning could solve any problem.

1:09:49

All the problems that are interesting, so long as we have input and output.

1:09:54

Now, what has input and output?

1:09:58

Well, the world.

1:09:59

The world has input and output.

1:10:01

And so we could have a computer that could learn almost anything.

1:10:05

Machine learning, artificial intelligence.

1:10:07

And so we reasoned that maybe this is the fundamental breakthrough that we

1:10:11

needed.

1:10:12

There were a couple of things that had to be solved.

1:10:16

For example, we had to believe that you could actually scale this up to giant

1:10:19

systems.

1:10:20

It was running in a, they had two graphics cards, two GTX 580s.

1:10:25

Which, by the way, is exactly your SLI configuration.

1:10:30

Yeah.

1:10:31

Okay.

1:10:32

So that GTX 580 SLI was the revolutionary computer that put deep learning on

1:10:39

the map.

1:10:41

Wow.

1:10:41

It was 2012.

1:10:42

And you were using it to play Quake.

1:10:45

Wow.

1:10:45

That's crazy.

1:10:46

That was the moment.

1:10:47

That was the big bang of modern AI.

1:10:50

We were lucky because we were inventing this technology, this computing

1:10:54

approach.

1:10:55

We were lucky that they found it.

1:10:58

Turns out they were gamers and it was lucky they found it.

1:11:01

And it was lucky that we paid attention to that moment.

1:11:05

It was a little bit like, you know, that Star Trek, you know, first contact.

1:11:14

The Vulcans had to have seen the warp drive at that very moment.

1:11:20

If they didn't witness the warp drive, you know, they would have never come to

1:11:24

Earth.

1:11:25

And everything would have never happened.

1:11:27

It's a little bit like if I hadn't paid attention to that moment, that flash,

1:11:31

and that flash didn't last long.

1:11:33

If I hadn't paid attention to that flash or our company didn't pay attention to

1:11:36

it, who knows what would have happened.

1:11:39

But we saw that and we reasoned our way into this is a universal function

1:11:43

approximator.

1:11:45

This is not just a computer vision approximator.

1:11:47

We could use this for all kinds of things if we could solve two problems.

1:11:51

The first problem is that we have to prove to ourselves it could scale.

1:11:55

The second problem we had to wait for, I guess, contribute to and wait for, is

1:12:05

the world will never have enough data on input and output where we could supervise

1:12:15

the AI to learn everything.

1:12:18

For example, if we have to supervise our children on everything they learn, the

1:12:22

amount of information they could learn is limited.

1:12:25

We needed the AI, we needed the computer to have a method of learning without

1:12:30

supervision.

1:12:32

And that's where we had to wait a few more years.

1:12:35

But unsupervised AI learning is now here.

1:12:40

And so the AI could learn by itself.

1:12:43

And the reason why the AI could learn by itself is because we have many

1:12:47

examples of right answers.

1:12:49

Like, for example, if I want to learn, if I want to teach an AI how to predict

1:12:54

the next word, I could just grab it, grab a whole bunch of text that we already

1:13:00

have, mask out the last word, and make it try and try and try again until it

1:13:04

predicts the next one.

1:13:07

Or I mask out random words inside the text, and I make it try and try and try

1:13:11

until it predicts it.

1:13:12

You know, like, Mary goes down to the bank.

1:13:17

Is it a river bank or a money bank?

1:13:21

Well, if you're going to go down to the bank, it's probably a river bank.

1:13:25

And it might not be obvious even from that.

1:13:28

It might need more context: "and caught a fish."

1:13:35

Okay, now you know it must be the river bank.

1:13:38

And so you give these AIs a whole bunch of these examples, and you mask out the

1:13:42

words, it'll predict the next one.

1:13:45

Okay?
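
A toy version of that trick (an editor's sketch; the three-sentence corpus and the simple counting model are invented for illustration): the text supplies its own right answers, so no human labeling is needed.

```python
from collections import Counter, defaultdict

corpus = ("mary goes down to the bank and caught a fish . "
          "joe goes down to the river and caught a fish . "
          "mary goes down to the bank to deposit money .").split()

# "Training": count which word follows each two-word context.
follows = defaultdict(Counter)
for a, b, c in zip(corpus, corpus[1:], corpus[2:]):
    follows[(a, b)][c] += 1

def predict(a, b):
    # Guess the masked word from its context, no labels required.
    return follows[(a, b)].most_common(1)[0][0]

print(predict("caught", "a"))   # -> 'fish'
```

Modern language models do this with neural networks over enormous corpora instead of counts over three sentences, but the supervision comes from the same place: the text itself.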

1:13:45

And so unsupervised learning came along.

1:13:48

These two ideas, the fact that it's scalable and unsupervised learning came

1:13:52

along, we were convinced that we had to put everything into this and help

1:13:57

create this industry because we're going to solve a whole bunch of interesting

1:14:01

problems.

1:14:02

And that was in 2012.

1:14:03

By 2016, I had built this computer called the DGX-1.

1:14:09

The one that you saw me give to Elon is called DGX Spark.

1:14:14

The DGX-1 was $300,000.

1:14:18

It cost NVIDIA a few billion dollars to make, the first one.

1:14:23

And instead of two chips SLI, we connected eight chips with a technology called

1:14:32

NVLink.

1:14:34

But it's basically SLI supercharged.

1:14:36

Okay?

1:14:38

Okay.

1:14:38

And so we connected eight of these chips together instead of just two.

1:14:42

And all of them worked together, just like your Quake rig did, to solve this

1:14:47

deep learning problem, to train this model.

1:14:51

And so we created this thing.

1:14:53

I announced it at GTC at one of our annual events.

1:14:59

And I described this deep learning thing, computer vision thing, and this

1:15:04

computer called DGX-1.

1:15:07

The audience was, like, completely silent.

1:15:09

They had no idea what I was talking about.

1:15:11

And I was lucky because I had known Elon, and I helped him build the first

1:15:20

computer for Model 3, the Model S.

1:15:26

And when he wanted to start working on autonomous vehicle, I helped him build

1:15:31

the computer that went into the Model S AV system, his full self-driving system.

1:15:38

We were basically the FSD computer version one.

1:15:41

And so we were already working together.

1:15:46

And when I announced this thing, nobody in the world wanted it.

1:15:51

I had no purchase orders, not one.

1:15:53

Nobody wanted to buy it.

1:15:55

Nobody wanted to be part of it, except for Elon.

1:15:58

He goes, he was at the event, and we were doing a fireside chat about the

1:16:03

future of self-driving cars.

1:16:05

I think it was, like, 2016.

1:16:07

At that time, it was 2015.

1:16:10

And he goes, you know what?

1:16:13

I have a company that could really use this.

1:16:16

And I said, wow, my first customer.

1:16:19

And so I was pretty excited about it.

1:16:23

And he goes, yeah, we have this company.

1:16:27

It's a nonprofit company.

1:16:29

And all the blood drained out of my face.

1:16:33

Yeah.

1:16:33

I just spent a few billion dollars building this thing.

1:16:37

It cost $300,000.

1:16:39

And, you know, the chances of a nonprofit being able to pay for this thing is

1:16:43

approximately zero.

1:16:44

And he goes, you know, this is an AI company.

1:16:48

And it's a nonprofit, and we could really use one of these supercomputers.

1:16:54

And so I picked it up.

1:16:56

I built the first one for ourselves.

1:16:58

We're using it inside the company.

1:16:59

I boxed one up.

1:17:00

I drove it up to San Francisco, and I delivered it to Elon in 2016.

1:17:04

A bunch of researchers were there.

1:17:07

Pieter Abbeel was there.

1:17:09

Ilya was there.

1:17:10

There was a bunch of people there.

1:17:12

And I walked up to the second floor where they were all kind of in a room that's

1:17:17

smaller than your place here.

1:17:19

And that place turned out to have been OpenAI.

1:17:23

2016.

1:17:25

Wow.

1:17:26

Just a bunch of people sitting in a room.

1:17:29

It's not really nonprofit anymore, though, is it?

1:17:33

They're not nonprofit anymore.

1:17:34

Weird how that works.

1:17:36

Yeah, yeah.

1:17:36

But anyhow, Elon was there.

1:17:39

Yeah.

1:17:40

It was really a great, great moment.

1:17:42

Oh, yeah.

1:17:43

There you go.

1:17:43

Yeah, that's it.

1:17:44

Look at you, bro.

1:17:46

Same jacket.

1:17:46

Look at that.

1:17:48

I haven't aged.

1:17:49

Not a lick of black hair, though.

1:17:52

The size of it is significantly smaller.

1:17:57

That was the other day.

1:17:58

Okay, so.

1:17:58

Oh, yeah.

1:17:59

There you go.

1:17:59

Yeah.

1:18:00

Look at the difference.

1:18:01

That's crazy.

1:18:01

Exactly the same industrial design.

1:18:03

He's holding it in his hand.

1:18:04

Here's the amazing thing.

1:18:09

DGX-1 was one petaflops, okay?

1:18:12

That's a lot of flops.

1:18:14

And DGX Spark is one petaflops.

1:18:18

Nine years later.

1:18:21

Wow.

1:18:22

The same amount of computing horsepower.

1:18:26

In a much smaller.

1:18:27

Shrunken down.

1:18:28

Yeah.

1:18:28

And instead of $300,000, it's now $4,000.

1:18:32

And it's the size of a small book.
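
Taking the quoted figures at face value, a back-of-the-envelope check (an editor's sketch using only the numbers mentioned here):

```python
# Same one petaflops: DGX-1 in 2016 vs DGX Spark nine years later.
dgx1_price, spark_price, years = 300_000, 4_000, 9
ratio = dgx1_price / spark_price          # 75x cheaper per petaflops
annual = ratio ** (1 / years)             # ~1.62x cheaper every year
print(f"{ratio:.0f}x overall, about {annual:.2f}x per year")
```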

1:18:33

Incredible.

1:18:35

Crazy.

1:18:35

That's how technology moves.

1:18:38

Anyways, that's the reason why I wanted to give him the first one.

1:18:41

It's so-

1:18:41

Because I gave him the first one in 2016.

1:18:43

It's so fascinating.

1:18:44

I mean, if you wanted to make a story for a film, I mean, that would be the

1:18:49

story that,

1:18:50

like, what better scenario, if it really does become a digital life form, how

1:18:57

funny would

1:18:58

it be that it is birthed out of the desire for computer graphics for video

1:19:03

games?

1:19:03

Exactly.

1:19:05

Isn't it kind of crazy?

1:19:06

It's kind of crazy.

1:19:07

Yeah.

1:19:08

Kind of crazy when you think about it that way, because it's a perfect origin

1:19:13

story.

1:19:14

Computer graphics was one of the hardest supercomputer problems, generating

1:19:21

reality.

1:19:22

And also one of the most profitable to solve, because computer games are so

1:19:27

popular.

1:19:28

When NVIDIA started in 1993, we were trying to create this new computing

1:19:33

approach.

1:19:34

The question is, what's the killer app?

1:19:37

And the company wanted to create a new type of computing architecture, a new

1:19:48

type of computer

1:19:50

that can solve problems that normal computers can't solve.

1:19:54

Well, the applications that existed in the industry in 1993 are applications

1:20:03

that normal computers

1:20:05

can solve.

1:20:06

Because if the normal computers can't solve them, why would the application

1:20:08

exist?

1:20:09

And so we had a mission statement for a company that has no chance of success.

1:20:21

But I didn't know that in 1993.

1:20:23

It just sounded like a good idea.

1:20:24

Right.

1:20:25

And so if we created this thing that can solve problems, you know, it's like,

1:20:32

you actually

1:20:34

have to go create the problem.

1:20:37

And so that's what we did in 1993, there was no Quake.

1:20:42

John Carmack hadn't even released Doom yet.

1:20:45

You probably remember that.

1:20:47

Sure.

1:20:48

Yeah.

1:20:48

And there were no applications for it.

1:20:53

And so I went to Japan, because the arcade industry at the time was led by

1:20:58

Sega, if you remember?

1:21:00

Sure.

1:21:00

The arcade machines, they came out with 3D arcade systems.

1:21:06

Virtua Fighter, Daytona, Virtua Cop, all of those arcade games were in 3D for

1:21:13

the very first time.

1:21:15

And the technology they were using was from Martin Marietta, the flight simulators,

1:21:21

they took the guts out of a flight simulator and put it into an arcade machine.

1:21:26

The system that you have over here, it's got to be a million times more

1:21:31

powerful than that arcade machine.

1:21:33

And that was a flight simulator for NASA.

1:21:37

Whoa.

1:21:38

And so they took the guts out of that.

1:21:41

They were using it for flight simulation for jets and, you know, space shuttle.

1:21:46

And they took the guts out of that.

1:21:48

And Sega had this brilliant computer developer.

1:21:53

His name was Yu Suzuki.

1:21:54

Yu Suzuki and Miyamoto, Sega and Nintendo, these were the, you know, the

1:22:01

incredible pioneers, the visionaries, the incredible artists.

1:22:07

And they're both very, very technical.

1:22:11

They were the origins, really, of the gaming industry.

1:22:15

And Yu Suzuki pioneered 3D graphics gaming.

1:22:19

And so I went, we created this company and there were no apps.

1:22:25

And we were spending all of our afternoons, you know, we told our family we

1:22:31

were going to work, but it was just the three of us, you know, who's going to

1:22:34

know.

1:22:35

And so we went to Curtis's, one of the founders, went to Curtis's townhouse.

1:22:40

And Chris and I were married.

1:22:42

We have kids.

1:22:43

I already had Spencer and Madison.

1:22:45

They were probably two years old.

1:22:47

And Chris's kids are about the same age as ours.

1:22:54

And we would go to work in this townhouse.

1:22:56

But, you know, when you're a startup and the mission statement is the way we

1:23:01

described, you're not going to have too many customers calling you.

1:23:05

And so we had really nothing to do.

1:23:08

And so after lunch, we would always have a great lunch.

1:23:11

After lunch, we would go to the arcades and play the Sega, you know, the Sega

1:23:15

Virtua Fighter and Daytona and all those games.

1:23:18

And analyze how they're doing it, trying to figure out how they were doing that.

1:23:24

And so we decided, let's just go to Japan and let's convince Sega to move those

1:23:31

applications into the PC.

1:23:34

And we would start the PC gaming, the 3D gaming industry, partnering with Sega.

1:23:41

That's how NVIDIA started.

1:23:43

Wow.

1:23:44

And so in exchange for them developing their games for our computers in the PC,

1:23:52

we would build a chip for their game console.

1:23:56

That was the partnership.

1:23:58

I build a chip for your game console.

1:24:01

You port the Sega games to us.

1:24:04

And then they paid us, you know, at the time, quite a significant amount of

1:24:10

money to build that game console.

1:24:14

And that was kind of the beginning of NVIDIA getting started.

1:24:19

And we thought we were on our way.

1:24:21

And so I started with a business plan, a mission statement that wasn't possible.

1:24:25

We lucked into the Sega partnership.

1:24:28

We started taking off, started building our game console.

1:24:32

And about a couple years into it, we discovered our first technology didn't

1:24:38

work.

1:24:39

It was, it would have been a flaw.

1:24:42

It was a flaw.

1:24:44

And all of the technology ideas that we had, the architecture concepts were

1:24:49

sound.

1:24:50

But the way we were doing computer graphics was exactly backwards.

1:24:54

You know, instead of, I won't bore you with the technology, but instead of

1:24:59

inverse texture mapping, we were doing forward texture mapping.

1:25:03

Instead of triangles, we did curved surfaces.

1:25:08

So other people did it flat.

1:25:10

We did it round.

1:25:11

Other technology, the technology that ultimately won, the technology we use

1:25:18

today, has Z-buffers.

1:25:20

It automatically sorted.

1:25:22

We had an architecture with no Z-buffers.

1:25:25

The application had to sort it.
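
A minimal sketch of the Z-buffer idea (an editor's illustration): keep, for every pixel, the depth of the nearest surface drawn so far, and only overwrite a pixel when something closer arrives, so visibility sorts itself out with no help from the application.

```python
import numpy as np

W, H = 4, 3
color = np.zeros((H, W), dtype=int)    # which object "won" each pixel
zbuf = np.full((H, W), np.inf)         # nearest depth seen so far

def draw(obj_id, depth, x0, x1, y0, y1):
    # Draw an axis-aligned rectangle at a constant depth.
    for y in range(y0, y1):
        for x in range(x0, x1):
            if depth < zbuf[y, x]:     # closer than what's already there?
                zbuf[y, x] = depth
                color[y, x] = obj_id

draw(1, depth=5.0, x0=0, x1=4, y0=0, y1=3)   # far background quad
draw(2, depth=2.0, x0=1, x1=3, y0=1, y1=3)   # nearer quad, drawn second
print(color)   # object 2 covers object 1 only where it is closer
```

With no Z-buffer, the application has to submit surfaces in back-to-front order itself, which is exactly the burden described above.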

1:25:26

And so we chose a bunch of technology approaches that three major technology

1:25:33

choices, all three choices were wrong.

1:25:35

Okay.

1:25:36

So this is how incredibly smart we were.

1:25:38

And so in 1995, mid-95, we realized we were going down the wrong path.

1:25:46

Meanwhile, the Silicon Valley was packed with 3D graphics startups because it

1:25:52

was the most exciting technology of that time.

1:25:57

And so 3dfx and Rendition and Silicon Graphics were coming in.

1:26:02

Intel was already in there.

1:26:03

And, you know, gosh, what added up eventually to a hundred different startups

1:26:08

we had to compete against.

1:26:10

Everybody had chosen the right technology approach, and we chose the wrong one.

1:26:15

And so we were the first company to start.

1:26:18

We found ourselves essentially dead last with the wrong answer.

1:26:23

And so the company was in trouble.

1:26:28

And ultimately, we had to make several decisions.

1:26:34

The first decision is, well, if we change now, we will be the last company.

1:26:51

Even if we changed into the technology that we believe to be right, we'd still

1:26:56

be dead.

1:26:57

And so that argument, you know, do we change and therefore be dead?

1:27:04

Don't change and make this technology work somehow or go do something

1:27:09

completely different.

1:27:12

That question stirred the company strategically and was a hard question.

1:27:17

I eventually, you know, advocated for, we don't know what the right strategy is,

1:27:22

but we know what the wrong technology is.

1:27:25

So let's stop doing it the wrong way and let's give ourselves a chance to go

1:27:29

figure out what the strategy is.

1:27:30

The second thing, the second problem we had was our company was running out of

1:27:35

money.

1:27:37

And I had, I was in a contract with Sega and I owed them this game console.

1:27:41

And if that contract would have been canceled, we'd be dead.

1:27:46

We would have vaporized instantly.

1:27:50

And so I went to Japan and I explained to the CEO of Sega,

1:28:00

Shoichiro Irimajiri, a really great man.

1:28:02

He was the former CEO of Honda USA, went back to Sega to run Sega, went back to

1:28:08

Japan to run Sega.

1:28:11

And I explained to him that I was, uh, I guess I was what, 30, 33 years old.

1:28:18

You know, when I was 33 years old, I still had acne, and here's this, you

1:28:24

know, Chinese kid that was super skinny.

1:28:30

And he was already kind of an elder.

1:28:33

And, uh, I went to him and I said, I said, listen, I've got some bad news for

1:28:39

you.

1:28:40

And, and first, the technology that we promised you doesn't work.

1:28:50

And second, we shouldn't finish your contract because we'd waste all your money

1:28:59

and you would have something that doesn't work.

1:29:02

And I recommend you'd find another partner to build your game console.

1:29:06

Whoa.

1:29:07

And so I'm terribly sorry that we've set you back in your product roadmap.

1:29:15

And third, even though you're going to, I'm asking you to let me out of the

1:29:21

contract, I still need the money.

1:29:25

Because if you didn't give me the money, we'd vaporize overnight.

1:29:31

And so I explained it to him humbly, honestly, I gave him the background,

1:29:41

explained to him why the technology doesn't work.

1:29:46

Why we thought it was going to work, why it doesn't work.

1:29:49

And I asked him to convert the last $5 million that they were

1:29:59

going to pay to complete the contract.

1:30:03

To give us that money as an investment instead.

1:30:11

And he said, but it's very likely your company will go out of business, even

1:30:17

with my investment.

1:30:21

And it was completely true.

1:30:22

Back then, 1995, $5 million was a lot of money.

1:30:27

It's a lot of money today.

1:30:28

$5 million was a lot of money.

1:30:30

And here's a pile of competitors doing it right.

1:30:33

What are the chances that giving NVIDIA $5 million, that we would develop the

1:30:38

right strategy, that he would get a return on that $5 million or even get it

1:30:42

back?

1:30:42

Zero percent.

1:30:44

You do the math, it's zero percent.

1:30:48

If I were sitting there right there, I wouldn't have done it.

1:30:51

$5 million was a mountain of money to Sega at the time.

1:30:57

And so I told him that if you invested that $5 million in

1:31:06

us, it is most likely to be lost.

1:31:10

But if you didn't invest that money, we'd be out of business and we would have

1:31:15

no chance.

1:31:18

And I told him, I don't even know exactly what I said in the end, but

1:31:26

I told him that I would understand if he decided not to, but it would mean the

1:31:34

world to me if he did.

1:31:37

He went off and thought about it for a couple of days and came back and said,

1:31:40

we'll do it.

1:31:40

Wow.

1:31:41

Did you have a strategy to how to correct what it was doing wrong?

1:31:48

Did you explain that to him?

1:31:49

Oh man, wait until I tell you the rest of it. It's scarier, even scarier.

1:31:53

Oh no.

1:31:54

And so what he decided was, Jensen was a

1:32:05

young man he liked.

1:32:08

That's it.

1:32:09

Wow.

1:32:11

To this day.

1:32:12

That's nuts.

1:32:14

I was.

1:32:15

Boy, do you owe, but the world owes that guy.

1:32:18

No doubt.

1:32:19

Right.

1:32:20

Like, he's celebrated today in Japan.

1:32:24

And if he would have kept that $5 million investment, I think it'd be worth

1:32:29

probably about a trillion dollars today.

1:32:32

I know.

1:32:36

But the moment we went public, they sold it.

1:32:39

They go, wow, that's a miracle.

1:32:41

So, they sold it, yeah, they sold it at an NVIDIA valuation of about $300 million.

1:32:47

That's our IPO valuation, 300 million.

1:32:51

Wow.

1:32:53

And so, so anyhow, I was incredibly grateful.

1:32:57

Um, and then now we had to figure out what to do because we still were doing

1:33:03

the wrong strategy, wrong technology.

1:33:06

So, unfortunately, we had to lay off most of the company.

1:33:09

We shrunk the company all back.

1:33:11

All the people working on the game console, you know, we had to shrink it all

1:33:14

back.

1:33:17

And then somebody told me, but Jensen, we've never

1:33:24

built it this way before.

1:33:27

We've never built it the right way before.

1:33:28

We've only known how to build it the wrong way.

1:33:34

And so, nobody in the company knew how to build this supercomputing image

1:33:39

generator, 3D graphics thing that Silicon Graphics did.

1:33:45

And so I said, okay, how hard can it be?

1:33:51

You got all these 30 companies, you know, 50 companies doing it.

1:33:54

How hard can it be?

1:33:55

And so, luckily, there was a textbook written by the company, Silicon Graphics.

1:34:04

And so, I went down to the store, I had 200 bucks in my pocket, and I bought

1:34:08

three textbooks, only three they had, $60 a piece.

1:34:12

I bought the three textbooks.

1:34:15

I brought it back and I gave one to each one of the architects, and I said,

1:34:18

read that and let's go save the company.

1:34:20

And so they read this textbook, learned from the giant at the

1:34:30

time, Silicon Graphics, about how to do

1:34:34

3D Graphics.

1:34:34

But the thing that was amazing, and what makes NVIDIA special today, is that

1:34:40

the people that are there are able to start from first principles.

1:34:45

Learn best known art, but re-implement it in a way that's never been done

1:34:52

before.

1:34:54

And so, when we re-imagined the technology of 3D Graphics, we re-imagined it in

1:35:01

a way that manifests today, the modern 3D Graphics.

1:35:06

We really invented modern 3D Graphics.

1:35:09

But we learned from previous known arts, and we implemented fundamentally

1:35:14

differently.

1:35:15

What did you do that changed it?

1:35:17

Well, you know, ultimately, the simple answer is

1:35:25

that the way Silicon Graphics works, the geometry engine is a bunch of software

1:35:31

running on processors.

1:35:33

We took that and eliminated all the generality, the general purposeness of it,

1:35:45

and we reduced it down into the most essential part of 3D Graphics.

1:35:51

And we hard-coded it into the chip.

1:35:53

And so, instead of something general purpose, we hard-coded it very

1:35:58

specifically into just the limited applications, limited functionality

1:36:04

necessary for video games.
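
A loose Python analogy for that trade-off (an editor's sketch; the function names and the tiny two-operation "geometry program" are invented): the general engine interprets whatever program you feed it, while the hard-coded path does one fixed transform, the kind of thing you can commit directly to silicon.

```python
def general_geometry_engine(program, vertex):
    # Software running on a processor, interpreting any geometry program.
    for op, arg in program:
        if op == "scale":
            vertex = [v * arg for v in vertex]
        elif op == "translate":
            vertex = [v + a for v, a in zip(vertex, arg)]
    return vertex

def hardcoded_game_transform(vertex, scale, offset):
    # Fixed function: one specific operation, no interpreter at all.
    return [v * scale + o for v, o in zip(vertex, offset)]

program = [("scale", 2.0), ("translate", (1.0, 0.0, 0.0))]
print(general_geometry_engine(program, [1.0, 2.0, 3.0]))    # flexible path
print(hardcoded_game_transform([1.0, 2.0, 3.0], 2.0, (1.0, 0.0, 0.0)))
```

Stripping out the generality is what let one inexpensive chip keep pace with a million-dollar image generator on the one workload that mattered.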

1:36:07

And that capability, that super, and because we reinvented a whole bunch of

1:36:12

stuff, it supercharged the capability of that one little chip.

1:36:16

And our one little chip was generating images as fast as a $1 million image

1:36:23

generator.

1:36:24

That was the big breakthrough.

1:36:26

We took a million-dollar thing, and we put it into the graphics card that you

1:36:31

now put into your gaming PC.

1:36:33

And that was our big invention.

1:36:36

And then, and of course, the question is, is, how do you compete against these

1:36:41

30 other companies doing what they were doing?

1:36:45

And there we did several things.

1:36:49

One, instead of building a 3D graphics chip for every 3D graphics application,

1:36:56

we decided to build a 3D graphics chip for one application.

1:37:02

We bet the farm on video games.

1:37:06

The needs of video games are very different than the needs for CAD, needs for

1:37:10

flight simulators.

1:37:11

They're related, but not the same.

1:37:12

And so, we narrowly focused our problem statement so I could reject all of the

1:37:17

other complexities.

1:37:18

And we shrunk it down into this one little focus, and then we supercharged it

1:37:23

for gamers.

1:37:25

And the second thing that we did was we created a whole ecosystem of working

1:37:30

with game developers and getting their games ported and adapted to our silicon

1:37:35

so that we could get, turn essentially what is a technology business into a

1:37:41

platform business, into a game platform business.

1:37:45

So, you know, GeForce is really

1:37:51

the game console inside your PC.

1:37:57

It's, you know, it runs Windows, it runs Excel, it runs PowerPoint, of course,

1:38:01

those are easy things.

1:38:03

But its fundamental purpose was simply to turn your PC into a game console.

1:38:08

So, we were the first technology company to build all of this incredible

1:38:14

technology in service of one audience, gamers.

1:38:18

Now, of course, in 1993, the gaming industry didn't exist.

1:38:22

But by the time that John Carmack came along, and the Doom phenomenon happened,

1:38:30

and then Quake came out, as you know, that entire community, boom, took off.

1:38:38

Do you know where the name Doom came from?

1:38:40

It came from this, there's a scene in the movie, The Color of Money, where Tom

1:38:44

Cruise, who's this elite pool player, shows up at this pool hall, and this

1:38:49

local hustler says, what do you got in the case?

1:38:51

And he opens up this case, he has a special pool cue, he goes in here, and he

1:38:55

opens it up, he goes, Doom.

1:38:57

And that's where it came from.

1:38:59

Is that right?

1:38:59

Yeah, because Carmack said that's what they wanted to do to the gaming industry.

1:39:02

Doom.

1:39:03

That when Doom came out, it would just be, everybody would be like, oh, we're

1:39:06

fucked.

1:39:06

Oh, wow.

1:39:07

This is Doom.

1:39:08

That's awesome.

1:39:09

Isn't that amazing?

1:39:09

That's amazing, yeah.

1:39:10

Because it's the perfect name for the game.

1:39:11

Yeah.

1:39:12

And the name came out of that scene in that movie.

1:39:14

That's right.

1:39:15

Well, and then, of course, Tim Sweeney and Epic Games and the 3D gaming genre

1:39:23

took off.

1:39:24

Yes.

1:39:25

And so, if you just kind of, in the beginning was no gaming industry, we had no

1:39:30

choice but to focus the company on one thing, that one thing.

1:39:34

It's a really incredible origin story.

1:39:37

Oh, it's amazing.

1:39:39

It must be like, look back.

1:39:41

Started with a disaster.

1:39:41

That $5 million, that pivot with that conversation with that gentleman, if he

1:39:46

did not agree to that, if he did not like you, what would the world look like

1:39:50

today?

1:39:50

That's crazy.

1:39:52

Wait, then our entire life hung on another gentleman.

1:39:56

And so, now, here we are. We built, well, before GeForce, it was the RIVA 128.

1:40:03

The RIVA 128 saved the company.

1:40:06

It revolutionized computer graphics.

1:40:09

The performance, cost performance ratio of 3D graphics for gaming was off the

1:40:13

charts amazing.

1:40:15

And we're getting ready to ship it.

1:40:23

Get what?

1:40:24

Well, we're building it.

1:40:25

But we're, so, as you know, $5 million doesn't last long.

1:40:30

And so, every single month, every single month, we were drawing down.

1:40:38

You have to build it, prototype it.

1:40:42

You have to design it, prototype it.

1:40:43

Get the silicon back, which costs a lot of money.

1:40:49

Test it with software.

1:40:53

Because without the software testing the chip, you don't know the chip works.

1:40:56

And then you're going to find a bug, probably.

1:41:00

Because every time you test something, you find bugs.

1:41:03

Which means you have to tape it out again.

1:41:07

Which is more time, more money.

1:41:10

And so, we did the math.

1:41:12

There was no chance somebody was going to survive it.

1:41:14

We didn't have that much time to tape out a chip, send it to a foundry, TSMC.

1:41:20

Get the silicon back, test it, send it back out again.

1:41:23

There was no shot, no hope.

1:41:25

And so, the math, the spreadsheet, doesn't allow us to do that.

1:41:32

And so, I heard about this company.

1:41:35

And this company built this machine.

1:41:38

And this machine is an emulator.

1:41:43

You could take your design, all of the software that describes the chip.

1:41:50

And you could put it into this machine.

1:41:54

And this machine will pretend it's our chip.

1:41:56

So, I don't have to send it to the fab, wait until the fab sends it back, test.

1:42:01

I could have this machine pretend it's our chip.

1:42:04

And I could put all of the software on top of this machine, called an emulator,

1:42:08

and test all of the software on this pretend chip.

1:42:13

And I could fix it all before I send it to the fab.
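
A toy sketch of why that mattered (an editor's illustration; the register name and test are invented, and this is not how the real emulator worked internally): run the same test software against a software stand-in for the chip, so bugs surface before you pay the foundry.

```python
class EmulatedChip:
    # Stands in for the silicon: same interface, implemented in software.
    def __init__(self):
        self.regs = {}
    def write(self, reg, val):
        self.regs[reg] = val
    def read(self, reg):
        return self.regs.get(reg, 0)

def driver_test_suite(chip):
    # The same tests you would otherwise run on first silicon.
    chip.write("SCISSOR_X", 640)
    assert chip.read("SCISSOR_X") == 640, "bug caught before tape-out"

driver_test_suite(EmulatedChip())
print("all tests passed on the emulated chip")
```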

1:42:17

Whoa.

1:42:18

And if I could do that, when I send it to the fab, it should work.

1:42:23

Nobody knows, but it should work.

1:42:27

And so, we came to the conclusion that let's take half of the money we had left

1:42:33

in the bank.

1:42:34

At the time, it was about a million dollars.

1:42:36

Take half of that money and go buy this machine.

1:42:40

So, instead of keeping the money to stay alive, I took half of the money to go

1:42:45

buy this machine.

1:42:47

Well, I called this guy up.

1:42:48

The company's called IKOS.

1:42:50

Called this company up and I said, hey, listen, I heard about this machine.

1:42:56

I like to buy one.

1:42:57

And they go, oh, that's terrific, but we're out of business.

1:43:03

I said, what?

1:43:05

You're out of business?

1:43:06

He goes, yeah, we had no customers.

1:43:08

I said, wait, hang on a second.

1:43:14

So, you never made the machine?

1:43:15

They said, no, no, no, we made the machine.

1:43:17

We have one in inventory if you want it, but we're out of business.

1:43:21

So, I bought one out of inventory.

1:43:24

Okay.

1:43:27

After I bought it, they went out of business.

1:43:28

Wow.

1:43:30

I bought it out of inventory.

1:43:31

And on this machine, we put NVIDIA's chip into it and we tested all of the

1:43:38

software on top.

1:43:41

And at this point, we were on fumes.

1:43:43

But we convinced ourselves that chip is going to be great.

1:43:47

And so, I had to call some other gentleman.

1:43:50

So, I called TSMC.

1:43:53

And I told TSMC, well, listen, TSMC is the world's largest foundry today.

1:43:59

At the time, they were just a few hundred million dollars large.

1:44:06

Tiny little company.

1:44:07

Tiny little company.

1:44:07

And I explained to them what we were doing.

1:44:13

And I explained to them, I told them I had a lot of customers.

1:44:18

I had one.

1:44:20

You know, Diamond Multimedia.

1:44:23

Probably one of the companies you bought the graphics card from back in the old

1:44:25

days.

1:44:26

And I said, you know, we have a lot of customers and the demand's really great.

1:44:30

And we're going to tape out a chip to you.

1:44:36

And I like to go directly to production.

1:44:38

Because I know it works.

1:44:42

And they said, nobody has ever done that before.

1:44:47

Nobody has ever taped out a chip that worked the first time.

1:44:52

And nobody starts out production without looking at it.

1:44:57

But I knew that if I didn't start the production, I'd be out of business

1:45:02

anyways.

1:45:03

And if I could start the production, I might have a chance.

1:45:07

And so, TSMC decided to support me.

1:45:13

And this gentleman is named Morris Chang.

1:45:16

Morris Chang is the father of the foundry industry.

1:45:20

The founder of TSMC.

1:45:22

Really great man.

1:45:23

He decided to support our company.

1:45:29

I explained to them everything.

1:45:31

He decided to support us.

1:45:33

Frankly, probably because they didn't have that many other customers anyhow.

1:45:37

But they were grateful.

1:45:39

And I was immensely grateful.

1:45:41

And as we were starting the production,

1:45:44

Morris flew to the United States.

1:45:47

And he didn't in so many words ask me this.

1:45:52

But he asked me a whole lot of questions that were trying to tease out:

1:45:57

Do I have any money?

1:46:00

But he didn't directly ask me that, you know.

1:46:02

And so, the truth is that we didn't have all the money.

1:46:08

But we had a strong P.O. from the customer.

1:46:10

And if it didn't work, some wafers would have been lost.

1:46:16

And, you know, I'm not exactly sure what would have happened, but we would have

1:46:20

come short.

1:46:22

It would have been rough.

1:46:23

But they supported us with all of that risk involved.

1:46:28

We launched this chip.

1:46:30

Turns out to have been completely revolutionary.

1:46:33

Knocked the ball out of the park.

1:46:36

We became the fastest growing technology company in history to go from zero to

1:46:42

$1 billion.

1:46:44

It's so wild that you didn't test the chip.

1:46:46

I know.

1:46:47

We tested afterwards, yeah.

1:46:48

We tested afterwards.

1:46:49

Afterwards, but it went into production already.

1:46:53

But by the way, by the way, that methodology that we developed to save the

1:46:59

company is used throughout the world today.

1:47:02

That's amazing.

1:47:03

Yeah.

1:47:03

We changed the whole world's methodology of designing chips, the whole world's

1:47:08

rhythm of designing chips.

1:47:10

We changed everything.

1:47:13

How well did you sleep those days?

1:47:15

It must have been so much stress.

1:47:17

You know, what is that feeling where the world just kind of feels like it's

1:47:28

flying?

1:47:30

You have this, what do you call that feeling?

1:47:34

You can't stop the feeling that everything's moving super fast.

1:47:40

And, you know, you're laying in bed and the world just feels like, you know,

1:47:47

and you feel deeply anxious, completely out of control.

1:47:56

I've felt that probably a couple of times in my life.

1:47:59

It's during that time.

1:48:02

Wow.

1:48:03

Yeah.

1:48:03

It was incredible.

1:48:04

What an incredible success.

1:48:06

But I learned a lot.

1:48:07

I learned about, I learned simple things.

1:48:10

I learned how to develop strategies.

1:48:14

I learned how to, you know, our company learned how to develop strategies.

1:48:21

What are winning strategies?

1:48:22

We learned how to create a market.

1:48:23

We created the modern 3D gaming market.

1:48:26

We learned how, and so that exact same skill is how we create the modern AI

1:48:33

market.

1:48:35

It's exactly the same, yeah, it's exactly the same skill, exactly the same

1:48:39

blueprint.

1:48:40

And we learned how to deal with crisis, how to stay calm, how to think through

1:48:48

things systematically.

1:48:52

We learned how to remove all waste in the company and work from first

1:48:56

principles and doing only the things that are essential.

1:49:00

Everything else is waste because we have no money for it.

1:49:04

To live on fumes at all times.

1:49:09

And the feeling, no different than the feeling I had this morning when I woke

1:49:14

up, that you're going to be out of business soon.

1:49:17

That, you know, the phrase 30 days from going out of business, I've used for 33

1:49:23

years.

1:49:24

You still feel that?

1:49:25

Oh, yeah.

1:49:25

Oh, yeah.

1:49:26

Really?

1:49:26

Every morning.

1:49:27

Every morning.

1:49:27

But you guys are one of the biggest companies on planet Earth.

1:49:31

But the feeling doesn't change.

1:49:33

Wow.

1:49:34

The sense of vulnerability, the sense of uncertainty, the sense of insecurity,

1:49:41

it doesn't leave you.

1:49:43

That's crazy.

1:49:44

We were, you know, we had nothing.

1:49:47

We had nothing.

1:49:48

We were dealing with giants.

1:49:49

And you still feel that?

1:49:50

Oh, yeah.

1:49:50

Oh, yeah.

1:49:50

Every day.

1:49:51

Every moment.

1:49:52

Do you think that fuels you?

1:49:54

Is that part of the reason why the company is so successful, that you have that

1:49:58

hungry mentality?

1:50:04

That you never rest, you're never sitting on your laurels, you're always on the

1:50:08

edge?

1:50:09

I have a greater drive from not wanting to fail than the drive of wanting to

1:50:20

succeed.

1:50:21

Isn't that like success coaches would tell you that's completely the wrong

1:50:28

psychology?

1:50:29

The world has just heard me say that out loud for the first time.

1:50:32

But it's true.

1:50:34

Well, that's so fascinating.

1:50:36

The fear of failure drives me more than the greed or whatever it is.

1:50:43

Well, ultimately, that's probably a more healthy approach now that I'm thinking

1:50:47

about it.

1:50:48

I'm not ambitious, for example.

1:50:51

I just want to stay alive, Joe.

1:50:54

I want the company to thrive, you know?

1:50:57

I want us to make an impact.

1:50:59

That's interesting.

1:51:00

Yeah.

1:51:00

Well, maybe that's why you're so humble.

1:51:02

Maybe that's what keeps you grounded, you know?

1:51:05

Because with the kind of spectacular success the company's achieved, it would

1:51:09

be easy to get a big head.

1:51:10

No.

1:51:11

Right?

1:51:12

But isn't that interesting?

1:51:13

It's like if you were the guy that your main focus is just success, you

1:51:19

probably would go, well, made it, nailed it, I'm the man.

1:51:24

Drop the mic.

1:51:25

Instead, you wake up, you're like, God, we can't fuck this up.

1:51:28

No, exactly.

1:51:28

Every morning.

1:51:29

Every morning.

1:51:30

No, every moment.

1:51:31

That's crazy.

1:51:33

Before I go to bed.

1:51:34

Well, listen, if I was a major investor in your company, that's who I'd want

1:51:37

running it.

1:51:38

I'd want a guy who's terrified of-

1:51:40

Yeah.

1:51:41

That's why I work seven days a week every moment I'm awake.

1:51:47

You work every moment you're awake?

1:51:48

Every moment I'm awake.

1:51:49

Wow.

1:51:50

I'm thinking about solving a problem.

1:51:52

I'm thinking about-

1:51:55

How long can you keep this up?

1:51:56

I don't know, but it could be next week.

1:52:00

Sounds exhausting.

1:52:01

It is exhausting.

1:52:03

It sounds completely exhausting.

1:52:04

Always in a state of anxiety.

1:52:06

Wow.

1:52:07

Always in a state of anxiety.

1:52:09

Well, kudos to you for admitting that.

1:52:12

I think that's important for a lot of people to hear because there's probably

1:52:16

some young

1:52:17

people out there that are in a similar position to where you were when you were

1:52:22

starting out

1:52:23

that just feel like, oh, those people that have made it, they're just smarter

1:52:27

than me

1:52:28

and they had more opportunities than me and it's just like it was handed to

1:52:32

them

1:52:32

or they're just in the right place at the right time.

1:52:35

Joe, I just described to you somebody who didn't know what was going on,

1:52:38

actually did it wrong.

1:52:40

Yeah.

1:52:41

Yeah.

1:52:42

And the ultimate diving catch like two or three times.

1:52:45

Crazy.

1:52:45

Yeah.

1:52:46

The ultimate diving catch is the perfect way to put it.

1:52:49

Yeah.

1:52:50

It's just like the edge of your glove.

1:52:52

It probably bounced off of somebody's helmet and landed at the edge.

1:52:57

God, that's incredible.

1:53:01

It's incredible, but it's also, it's really cool that you have this perspective,

1:53:06

that you

1:53:06

look at it that way.

1:53:07

Because, you know, a lot of people that have delusions of grandeur, they have,

1:53:14

you know,

1:53:15

they're inflated.

1:53:16

And their rewriting of history oftentimes had them somehow extraordinarily

1:53:23

smart and they

1:53:25

were geniuses and they knew all along and they were spot on.

1:53:28

The business plan was exactly what they thought.

1:53:31

Yeah.

1:53:31

They destroyed the competition and, you know, and they emerged victorious.

1:53:39

Meanwhile, you're like, I'm scared every day.

1:53:41

Exactly.

1:53:42

Exactly.

1:53:44

It's so funny.

1:53:45

Oh my God.

1:53:47

That's amazing.

1:53:47

It's so true though.

1:53:48

It's amazing.

1:53:49

It's so true.

1:53:50

It's amazing.

1:53:50

Well, but I think there's nothing inconsistent with being a leader and being

1:53:56

vulnerable.

1:53:58

You know, the company doesn't need me to be a genius right all along, right all

1:54:04

the time.

1:54:05

Absolutely certain about what I'm trying to do and what I'm doing.

1:54:09

The company doesn't need that.

1:54:10

The company wants me to succeed.

1:54:13

You know, the thing that, and we started out today talking about President

1:54:17

Trump and I was

1:54:18

about to say something.

1:54:19

And listen, he is my president.

1:54:23

He is our president.

1:54:25

We should all, and we're talking about how, just because it's President Trump, people

1:54:29

want him

1:54:29

to be wrong.

1:54:30

I think the United States, we all have to realize he is our president.

1:54:35

We want him to succeed because-

1:54:38

No matter who's president, we should have that attitude.

1:54:39

That's right.

1:54:40

Yeah.

1:54:41

We want him to succeed.

1:54:42

We need to help him succeed because it helps everybody, all of us succeed.

1:54:48

And I'm lucky that I work in a company where I have 40,000 people who want me

1:54:55

to succeed.

1:54:56

They want me to succeed, and I can tell.

1:54:59

And they're all, every single day, doing their best to help me overcome these challenges, trying

1:55:05

to realize

1:55:06

what I describe as our strategy.

1:55:11

And if it's somehow wrong or not perfectly right, to tell me so that we could

1:55:17

pivot.

1:55:18

And the more vulnerable we are as a leader, the more able other people are

1:55:24

to tell

1:55:25

you, you know, that, Jensen, that's not exactly right.

1:55:27

Or have you considered this information?

1:55:31

And the more vulnerable we are, the more we're actually able to pivot.

1:55:37

If we cast ourselves as having superhuman capability, then it's hard for us to

1:55:41

pivot strategy because

1:55:43

we were supposed to be right all along.

1:55:44

And so if you're always right, how can you possibly pivot?

1:55:48

Because pivoting requires you to be wrong.

1:55:50

And so I've got no trouble with being wrong.

1:55:53

I just have to make sure that I stay alert, that I reason about things from

1:55:58

first principles

1:55:59

all the time, always break things down to first principles, understand why it's

1:56:03

happening.

1:56:04

Reassess continuously.

1:56:07

The reassessing continuously is kind of partly what causes continuous anxiety,

1:56:12

you know, because

1:56:14

you're asking yourself, were you wrong yesterday?

1:56:16

Are you still right?

1:56:17

Is this the same?

1:56:19

Has that changed?

1:56:20

Has that condition changed?

1:56:22

Is that worse than you thought?

1:56:23

But God, that mindset is perfect for your business, though, because this

1:56:27

business is ever

1:56:28

changing.

1:56:28

All the time.

1:56:29

I've got competition coming from every direction.

1:56:31

So much of it is kind of up in the air.

1:56:34

And you have to invent a future where a hundred variables are included, and

1:56:42

there's no way

1:56:44

you could be right on all of them.

1:56:45

And so you have to be, you have to surf.

1:56:48

Wow.

1:56:49

You have to surf.

1:56:50

That's a good way to put it.

1:56:51

You have to surf.

1:56:52

Yeah.

1:56:52

You're surfing waves of technology and innovation.

1:56:55

That's right.

1:56:56

You can't predict the waves.

1:56:57

You got to deal with the ones you have.

1:56:59

Wow.

1:57:00

And, but skill matters.

1:57:02

And I've been doing this for 30 years. I'm the longest-running tech CEO in the world.

1:57:06

Is that true?

1:57:07

Congratulations.

1:57:08

That's amazing.

1:57:09

And, you know, people ask me how. One, don't get fired.

1:57:16

That'll end it in a heartbeat.

1:57:17

And then two, don't get bored.

1:57:20

Yeah.

1:57:22

Well, how do you maintain your enthusiasm?

1:57:24

The honest truth is, it's not always enthusiasm.

1:57:31

It's, you know, sometimes it's enthusiasm.

1:57:33

Sometimes it's just good old fashioned fear.

1:57:36

And then sometimes, you know, a healthy dose of frustration.

1:57:40

You know, it's.

1:57:41

Whatever keeps you moving.

1:57:43

Yeah.

1:57:43

Just all the emotions.

1:57:45

I think, you know, CEOs, we have all the emotions, right?

1:57:48

You know?

1:57:49

And so probably jacked up to the maximum because you're kind

1:57:55

of feeling it on behalf of the whole company.

1:57:58

I'm feeling it on behalf of everybody at the same time.

1:58:01

And it all kind of, you know, gets encapsulated in one person.

1:58:05

And so I have to be mindful of the past.

1:58:09

I have to be mindful of the present.

1:58:10

I've got to be mindful of the future.

1:58:11

And, you know, it's not without emotion.

1:58:16

It's not just a job.

1:58:20

Let's just put it that way.

1:58:21

It doesn't seem like it at all.

1:58:24

I would imagine one of the more difficult aspects of your job currently, now

1:58:28

that the company is massively successful, is anticipating where technology is

1:58:33

headed and where the applications are going to be.

1:58:36

So how do you try to map that out?

1:58:39

Yeah, there's a whole bunch of ways.

1:58:44

And it takes a whole bunch of things.

1:58:51

But let me just start.

1:58:54

You have to be surrounded by amazing people.

1:58:56

And NVIDIA is now, you know, if you look at the large tech

1:59:02

companies in the world today, most of them have a business in advertising or

1:59:08

social media or, you know, content distribution.

1:59:13

And at the core of it is really fundamental computer science.

1:59:20

And so their business is not computers.

1:59:22

Their business is not technology.

1:59:25

Technology just drives the company.

1:59:27

NVIDIA is the only large company in the world whose only business is

1:59:31

technology.

1:59:32

We only build technology.

1:59:34

We don't advertise.

1:59:35

The only way that we make money is to create amazing technology and sell it.

1:59:40

And so to be that, to be NVIDIA today, the number one thing is you're

1:59:45

surrounded by the finest computer scientists in the world.

1:59:50

And that's my gift.

1:59:52

My gift is that we've created a company culture, a condition in which the

1:59:57

world's greatest computer scientists want to be part of it because they get to

2:00:02

do their life's work and create the next thing because that's what they want to

2:00:07

do.

2:00:08

Because maybe they don't want to be in service of another business.

2:00:13

They want to be in service of the technology itself.

2:00:15

And we're the largest firm of its kind in the history of the world.

2:00:18

I know.

2:00:20

It's pretty amazing.

2:00:21

Wow.

2:00:22

And so, one, you know, we have got a great condition.

2:00:27

We have a great culture.

2:00:28

We have great people.

2:00:30

And now the question is, how do you systematically see the future,

2:00:40

stay alert to it, and reduce the likelihood of missing something or being wrong?

2:00:51

And so, there's a lot of different ways you could do that.

2:00:54

For example, we have great partnerships.

2:00:56

We have fundamental research.

2:00:58

We have a great research lab, one of the largest industrial research labs in

2:01:01

the world today.

2:01:02

And we partner with a whole bunch of universities and other scientists.

2:01:06

We do a lot of open collaboration.

2:01:08

And so, I'm constantly working with researchers outside the company.

2:01:14

We have the benefit of having amazing customers.

2:01:18

And so, I have the benefit of working with Elon and, you know, others in

2:01:22

the industry.

2:01:24

And we have the benefit of being the only pure play technology company that can

2:01:30

serve consumer internet, industrial manufacturing, scientific computing,

2:01:37

healthcare, financial services.

2:01:41

All the industries that we're in, they're all signals to me.

2:01:44

And so, they all have mathematicians and scientists.

2:01:49

And so, I have the benefit now of a radar system that is the broadest

2:01:54

of any company in the world, working across every single industry, from

2:02:00

agriculture to energy to video games.

2:02:05

And so, the ability for us to have this vantage point, one, doing fundamental

2:02:10

research ourselves, and then, two, working with all the great researchers,

2:02:15

working with all the great industries, the feedback system is incredible.

2:02:20

And then, finally, you just have to have a culture of staying super alert.

2:02:24

There's no easy way of being alert, except for paying attention.

2:02:29

I haven't found a single way of being able to stay alert without paying

2:02:34

attention.

2:02:35

And so, you know, I probably read several thousand emails a day.

2:02:39

How?

2:02:43

How do you have the time for that?

2:02:44

I wake up early.

2:02:45

This morning, I was up at four o'clock.

2:02:47

How much do you sleep?

2:02:48

Six, seven hours.

2:02:52

Yeah.

2:02:53

And then, you're up at four, read emails for a few hours before you get going.

2:02:57

That's right, yeah.

2:02:58

Wow.

2:02:59

Every day?

2:03:01

Every single day.

2:03:02

Not one day missed.

2:03:03

Including Thanksgiving, Christmas.

2:03:06

Do you ever take a vacation?

2:03:10

Yeah, but my definition of a vacation is when I'm with my family.

2:03:14

And so, if I'm with my family, I'm very happy.

2:03:18

I don't care where we are.

2:03:19

And you don't work then, or do you work a little?

2:03:21

No, no, I work a lot.

2:03:22

Even, like, if you go on a trip somewhere, you're still working.

2:03:27

Oh, sure.

2:03:28

Oh, sure.

2:03:29

Wow.

2:03:29

Every day?

2:03:30

Every day.

2:03:30

But my kids work every day.

2:03:33

You make me tired just saying this.

2:03:34

My kids work every day.

2:03:35

Both of my kids work at NVIDIA.

2:03:39

They work every day.

2:03:40

Wow.

2:03:40

Yeah, I'm very lucky.

2:03:41

Wow.

2:03:43

Yeah.

2:03:43

It's brutal now because, you know, it used to be just me working every day.

2:03:47

Now, we have three people working every day.

2:03:48

And they want to work with me every day.

2:03:50

And so, it's a lot of work.

2:03:53

Well, you've obviously imparted that ethic into them.

2:03:58

They work incredibly hard.

2:03:59

I mean, it's not –

2:04:00

But my parents worked incredibly hard.

2:04:04

Yeah.

2:04:05

I was born with the work gene, the suffering gene.

2:04:09

Well, listen, man.

2:04:11

It has paid off.

2:04:13

What a crazy story.

2:04:15

I mean, it's just – it's really an amazing origin story.

2:04:17

It really – I mean, it has to be kind of surreal to be in the position that

2:04:22

you're in now when you look back at how many times that it could have fallen

2:04:25

apart and humble beginnings.

2:04:28

But, you know, this is – it's a great country.

2:04:30

You know, I'm an immigrant.

2:04:31

My parents sent my older brother and me here first.

2:04:35

We were in Thailand.

2:04:39

I was born in Taiwan.

2:04:40

But my dad had a job in Thailand.

2:04:44

He was a chemical and instrumentation engineer, incredible engineer.

2:04:48

And his job was to go start an oil refinery.

2:04:52

And so, we moved to Thailand, lived in Bangkok.

2:04:56

And in, I guess, the 1973, 1974 timeframe, you know how Thailand, every so

2:05:04

often, they would just have a coup.

2:05:07

You know, the military would have an uprising.

2:05:10

And all of a sudden, one day, there were tanks and soldiers in the streets.

2:05:15

And my parents thought, you know, it probably isn't safe for the kids to be

2:05:18

here.

2:05:18

And so, they contacted my uncle.

2:05:22

My uncle lives in Tacoma, Washington.

2:05:23

And we had never met him.

2:05:27

And my parents sent us to him.

2:05:30

How old were you?

2:05:31

I was about to turn nine.

2:05:34

And my older brother almost turned 11.

2:05:37

And so, the two of us came to the United States.

2:05:41

And we stayed with our uncle for a little bit while he looked for a school for

2:05:47

us.

2:05:47

And my parents didn't have very much money.

2:05:51

And they'd never been to the United States.

2:05:53

My father was – I'll tell you that story in a second.

2:05:56

And so, my uncle found a school that would accept foreign students and

2:06:04

was affordable enough for my parents.

2:06:11

And that school turns out to have been in Oneida, Kentucky, Clay County,

2:06:16

Kentucky, the epicenter of the opioid crisis today.

2:06:20

Coal country.

2:06:22

Clay County, Kentucky was the poorest county in America when I showed

2:06:31

up.

2:06:32

It is the poorest county in America today.

2:06:34

And so, we went to the school.

2:06:37

It's a great school.

2:06:38

Oneida Baptist Institute.

2:06:41

In a town of a few hundred.

2:06:45

I think it was 600 at the time that we showed up.

2:06:47

No traffic light.

2:06:51

And I think it's 600 today.

2:06:54

It's kind of an amazing feat, actually.

2:06:56

The ability to hold your population steady, when it's 600 people, it's quite a

2:07:03

magical thing.

2:07:05

It's quite a magical thing, however they did it.

2:07:07

And so, the school had a mission of being an open school for any children who

2:07:16

would like to come.

2:07:18

And what that basically means is that if you're a troubled student, if you have

2:07:25

a troubled family,

2:07:28

if you're, you know, whatever your background, you're welcome to come to Oneida

2:07:37

Baptist Institute,

2:07:38

including international kids who would like to stay there.

2:07:44

Did you speak English at the time?

2:07:45

Okay.

2:07:46

Yeah.

2:07:47

Okay.

2:07:48

Yeah.

2:07:49

And so, we showed up and my first thought was, gosh, there are a lot of

2:08:00

cigarette butts on the ground.

2:08:03

100% of the kids smoked.

2:08:05

So, right away, you know, this is not a normal school.

2:08:11

Nine-year-olds?

2:08:12

No.

2:08:12

I was the youngest kid.

2:08:13

Okay.

2:08:14

11-year-olds.

2:08:15

My roommate was 17 years old.

2:08:17

Wow.

2:08:19

Yeah.

2:08:19

He just turned 17.

2:08:20

And he was jacked.

2:08:22

And I don't know where he is now.

2:08:30

I know his name, but I don't know where he is now.

2:08:32

But anyways, that night, the second thing I noticed when you

2:08:38

walk into your dorm room

2:08:39

is there are no drawers and no closet doors, just like a prison.

2:08:48

And there are no locks so that people could check up on you.

2:08:56

And so, I go into my room, and he's 17, and, you know, get ready for bed.

2:09:06

And he had all this tape all over his body, and it

2:09:11

turned out he had been in a knife fight and been stabbed all over his body.

2:09:17

And these were just fresh wounds.

2:09:18

Whoa.

2:09:20

And the other kids were hurt much worse.

2:09:21

And so, he was my roommate, the toughest kid in school, and I was the youngest

2:09:28

kid in school.

2:09:29

So, it was a junior high, but they took me anyways because if I walked about a

2:09:37

mile across the Kentucky River, the Swing Bridge,

2:09:43

on the other side there was a middle school that I could go to. I would go to that

2:09:48

school, come back, and then stay in the dorm.

2:09:52

And so, basically, Oneida Baptist Institute was my dorm when I went to this

2:09:56

other school.

2:09:57

My older brother went to the junior high.

2:10:01

And so, we were there for a couple of years.

2:10:05

Every kid had chores.

2:10:07

My older brother's chore was to work on the tobacco farm, you know, so they

2:10:13

raised tobacco so that they could raise some extra money for the school, kind

2:10:17

of like a penitentiary.

2:10:19

Wow.

2:10:20

And my job was just to clean the dorm.

2:10:22

And so, I was nine years old.

2:10:25

I was cleaning toilets for a dorm of 100 boys.

2:10:31

I cleaned more bathrooms than anybody, and I just wished that everybody was a

2:10:35

little bit more careful, you know?

2:10:38

But anyways, I was the youngest kid in school.

2:10:43

My memories of it were really good, but it was a tough town.

2:10:50

Sounds like it.

2:10:51

Yeah, the town kids, they all carried knives.

2:10:53

Everybody had knives.

2:10:55

Everybody smoked.

2:10:57

Everybody had a Zippo lighter.

2:10:59

I smoked for a week.

2:11:01

Did you?

2:11:02

Oh, yeah, sure.

2:11:02

How old were you?

2:11:03

I was nine, yeah.

2:11:04

When you were nine, you tried smoking.

2:11:06

Yeah, I got myself a pack of cigarettes.

2:11:07

Everybody else did.

2:11:08

Did you get sick?

2:11:09

No, I got used to it, you know?

2:11:12

And I learned how to blow smoke rings and, you know, breathe out of my nose,

2:11:19

you know, take it in and out of my nose.

2:11:22

I mean, there was all the different things that you learned.

2:11:24

At nine?

2:11:26

Yeah.

2:11:27

Wow.

2:11:28

You just did it to fit in or it looked cool?

2:11:30

Yeah, because everybody else did it.

2:11:31

Right.

2:11:31

Yeah.

2:11:31

And then I did it for a couple of weeks, I guess.

2:11:35

And I'd just rather... I had a quarter, you know, a quarter a month or

2:11:41

something like that.

2:11:42

I'd just rather buy popsicles and Fudgsicles with it.

2:11:47

I was nine, you know.

2:11:48

Right.

2:11:49

I chose the better path.

2:11:51

Wow.

2:11:52

That was our school.

2:11:54

And then my parents came to the United States two years later and we met them

2:11:58

in Tacoma, Washington.

2:12:01

That's wild.

2:12:02

It was a really crazy experience.

2:12:05

What a strange, formative experience.

2:12:07

Yeah.

2:12:08

Tough kids.

2:12:09

Thailand to one of the poorest places in America, if not the poorest, as a

2:12:17

nine-year-old.

2:12:19

Yeah, it was my first experience.

2:12:21

By yourself.

2:12:21

Yeah.

2:12:21

With your brother.

2:12:22

Wow.

2:12:22

Yeah.

2:12:23

Yeah.

2:12:24

No, I remember, and what breaks my heart, probably the only thing that

2:12:28

really

2:12:29

breaks my heart about that experience was, so we didn't have enough money to

2:12:38

make, you know,

2:12:40

international phone calls every week.

2:12:41

And so my parents gave us this tape deck, this Aiwa tape deck, and a tape.

2:12:49

And so every month we would sit in front of that tape deck, and my older

2:12:55

brother Jeff and I,

2:12:56

the two of us would just tell them what we did the whole month.

2:13:03

Wow.

2:13:05

And we would send that tape by mail.

2:13:09

And my parents would take that tape, record over the top of it, and send it

2:13:13

back to us.

2:13:14

Wow.

2:13:17

Could you imagine if that tape still existed, two years of these two

2:13:23

kids just describing

2:13:25

their first experience with the United States?

2:13:27

Like, I remember telling my parents that I joined the swim team.

2:13:38

My roommate was really buff, and so every day we spent a lot of time in the gym.

2:13:44

And so every night, 100 push-ups, 100 sit-ups, every day in the gym.

2:13:50

So I was nine years old.

2:13:51

I was pretty buff.

2:13:52

And I'm pretty fit.

2:13:55

And so I joined the soccer team.

2:14:00

I joined the swim team.

2:14:01

Because if you join the team, they take you to meets, and then afterwards, you

2:14:07

get to go to a nice restaurant.

2:14:08

And that nice restaurant was McDonald's.

2:14:11

Wow.

2:14:12

And I recorded this thing.

2:14:15

I said, Mom and Dad, we went to the most amazing restaurant today.

2:14:19

This whole place is lit up.

2:14:22

It's like the future.

2:14:24

And the food comes in a box.

2:14:27

And the food is incredible.

2:14:32

The hamburger is incredible.

2:14:33

It was McDonald's.

2:14:34

But anyhow, wouldn't it be amazing?

2:14:38

Oh, my God.

2:14:39

Two years.

2:14:39

You've been recording?

2:14:40

Yeah, two years.

2:14:41

Yeah.

2:14:42

What a crazy connection to your parents, too.

2:14:45

Just sending a tape and them sending you one back.

2:14:48

And it's the only way you're communicating for two years?

2:14:50

Yeah.

2:14:51

Wow.

2:14:53

Yeah.

2:14:53

No, my parents are incredible, actually.

2:14:56

They grew up really poor.

2:15:00

And when they came to the United States, they had almost no money.

2:15:05

Probably one of the most impactful memories I have is when they came and we were

2:15:13

staying in an apartment complex.

2:15:16

And they had just rented, I guess people still do, a bunch of furniture.

2:15:26

And we were messing around.

2:15:35

And we bumped into the coffee table and crushed it.

2:15:41

It was made out of particleboard and we crushed it.

2:15:44

And I just still remember the look on my mom's face, you know, because they

2:15:51

didn't have any money and she didn't know how she was going to pay for it.

2:15:53

But anyhow, that kind of tells you how hard it was for them to come here.

2:15:59

But they left everything behind and all they had was their suitcase and the

2:16:04

money they had in their pocket.

2:16:06

And they came to the United States.

2:16:07

How old were they at the time?

2:16:08

Pursued the American dream.

2:16:09

They were in their 40s.

2:16:10

Wow.

2:16:11

Yeah, late 30s.

2:16:12

Pursued the American dream.

2:16:14

This is the American dream.

2:16:16

I'm the first generation of the American dream.

2:16:18

Wow.

2:16:19

Yeah.

2:16:20

It's hard not to love this country.

2:16:22

It's hard not to be romantic about this country.

2:16:25

That is a romantic story.

2:16:27

That's an amazing story.

2:16:29

Yeah.

2:16:29

And my dad found his job literally in the newspaper, you know, the ads.

2:16:35

And he called people and got a job.

2:16:38

What did he do?

2:16:40

He was a consulting engineer in a consulting firm.

2:16:44

And they helped people build oil refineries, paper mills, and fabs.

2:16:50

And that's what he did.

2:16:51

He's really good at factory design and instrumentation engineering.

2:16:57

And so he's brilliant at that.

2:17:00

And so he did that.

2:17:01

And my mom worked as a maid.

2:17:04

And they found a way to raise us.

2:17:07

Wow.

2:17:09

That's an incredible story, Jensen.

2:17:11

It really is.

2:17:13

All of it.

2:17:14

From your childhood to the perils of NVIDIA almost falling apart.

2:17:20

It's really incredible, man.

2:17:22

It's a great story.

2:17:23

Yeah.

2:17:24

I've lived a great life.

2:17:25

You really have.

2:17:26

And it's a great story for other people to hear, too.

2:17:29

It really is.

2:17:30

You don't have to go to Ivy League schools to succeed.

2:17:35

This country creates opportunities, has opportunities for all of us.

2:17:39

You do have to strive.

2:17:41

You have to claw your way here.

2:17:44

Yeah.

2:17:46

But if you put in the work, you can succeed.

2:17:48

It's not just hard work, though.

2:17:49

There's a lot of luck and a lot of good decision making.

2:17:53

And the good graces of others.

2:17:55

Yes.

2:17:55

That's really important.

2:17:57

You and I spoke about two people who are very dear to me.

2:18:00

But the list goes on.

2:18:02

The people at NVIDIA who have helped me, many friends that are on the board,

2:18:11

the decisions,

2:18:13

them giving me the opportunity.

2:18:15

Like when we were inventing this new computing approach, I tanked our stock

2:18:20

price because we

2:18:22

added this thing called CUDA to the chip.

2:18:24

We had this big idea.

2:18:25

We added this thing called CUDA to the chip.

2:18:27

But nobody paid for it.

2:18:28

But our cost doubled.

2:18:30

And so we had this graphics chip company.

2:18:33

And we invented GPUs.

2:18:35

We invented programmable shaders.

2:18:37

We invented everything in modern computer graphics.

2:18:40

We invented real-time ray tracing.

2:18:44

That's why it went from GTX to RTX.

2:18:46

We invented all this stuff.

2:18:49

But every time we invented something, the market didn't know how to appreciate

2:18:54

it.

2:18:55

But the cost went way up.

2:18:57

And in the case of CUDA that enabled AI, the cost increased a lot.

2:19:02

But we really believed it.

2:19:05

And so if you believe in that future and you don't do anything about it, you're

2:19:10

going to

2:19:10

regret it for your life.

2:19:12

And so I always tell the team, do we believe this or not?

2:19:18

And if you believe it, and grounded on first principles, not random hearsay,

2:19:24

and we believe it, we owe it to ourselves to go pursue it.

2:19:28

If we're the right people to go do it,

2:19:30

if it's really, really hard to do, it's worth doing, and we believe it.

2:19:33

Let's go pursue it.

2:19:35

Well, we pursued it.

2:19:36

We launched the product.

2:19:38

It was exactly like when I launched DGX1, and the entire audience was like

2:19:44

complete silence.

2:19:46

When I launched CUDA, the audience was complete silence.

2:19:51

No customer wanted it.

2:19:53

Nobody asked for it.

2:19:55

Nobody understood it.

2:19:57

NVIDIA was a public company.

2:19:58

What year was this?

2:19:59

This is, let's see, 2006, 20 years ago.

2:20:08

2005.

2:20:11

Wow.

2:20:12

Our stock price went poof.

2:20:15

I think our valuation went down to like $2 or $3 billion.

2:20:23

From?

2:20:23

From about 12 or something like that.

2:20:26

I crushed it in a very bad way.

2:20:31

What is it now, though?

2:20:33

Yeah, it's higher.

2:20:36

Very humble of you.

2:20:39

It's higher, but it changed the world.

2:20:42

Yeah.

2:20:43

That invention changed the world.

2:20:44

It's an incredible story, Jensen.

2:20:47

It really is.

2:20:48

Thank you.

2:20:51

I like your story.

2:20:51

It's incredible.

2:20:52

My story's not as incredible.

2:20:54

My story's more weird.

2:20:55

You know?

2:20:57

It's much more fortuitous and weird.

2:21:01

Okay.

2:21:01

What are the three most important milestones that led to here?

2:21:10

That's a good question.

2:21:11

What was step one?

2:21:12

I think step one was seeing other people do it.

2:21:17

Step one was in the initial days of podcasting, like in 2009 when I started

2:21:23

podcasting, and it had only been around for a couple of years.

2:21:27

The first was Adam Curry, my good friend, who was the podfather.

2:21:31

He invented podcasting.

2:21:33

And then, you know, I remember Adam Carolla had a show because he had a radio

2:21:38

show.

2:21:39

His radio show got canceled.

2:21:41

And so he decided to just do the same show, but do it on the internet.

2:21:43

And that was pretty revolutionary.

2:21:44

Nobody was doing that.

2:21:45

And then there was the experience that I had had doing different morning radio

2:21:50

shows, like Opie and Anthony in particular, because it was fun.

2:21:55

And we would just get together with a bunch of comedians.

2:21:57

You know, I'd be on the show with like three or four other guys that I knew.

2:22:01

And it was always just, I looked forward to it.

2:22:03

It was just such a good time.

2:22:05

And I said, God, I miss doing that.

2:22:07

It's so fun to do that.

2:22:08

I wish I could do something like that.

2:22:10

And then I saw Tom Green's setup.

2:22:12

Tom Green had a setup in his house.

2:22:14

And he essentially turned his entire house into a television studio.

2:22:18

And he did an internet show from his living room.

2:22:20

He had servers in his house and cables everywhere.

2:22:23

He had to step over cables.

2:22:24

This was like 2007.

2:22:25

I'm like, Tom, this is nuts.

2:22:27

Like this is, and I'm like, you got to figure out a way to make money from this.

2:22:30

I wish everybody on the internet could see your setup.

2:22:33

It's nuts.

2:22:34

I just want to let you guys know that.

2:22:35

It's not just this.

2:22:38

So that was the beginning of it.

2:22:41

It was just seeing other people do it and then saying, all right, let's just

2:22:44

try it.

2:22:44

And then, in the beginning days, we just did it on a laptop.

2:22:48

Had a laptop with a webcam and just messed around.

2:22:51

Had a bunch of comedians come in and we would just talk.

2:22:54

Joke around.

2:22:55

And then I did it like once a week.

2:22:56

And then I started doing it twice a week.

2:22:58

And then all of a sudden I was doing it for a year.

2:23:00

And then I was doing it for two years.

2:23:02

Then it was like, oh, it's starting to get a lot of viewers, a lot of listeners.

2:23:06

You know?

2:23:07

And then I just kept doing it.

2:23:09

That's all it is.

2:23:10

I just kept doing it because I enjoyed doing it.

2:23:12

Was there any setback?

2:23:14

No.

2:23:15

No, there was never really a setback.

2:23:17

Really?

2:23:17

No.

2:23:17

It must have been.

2:23:18

It's not the same kind of story.

2:23:19

You're just resilient.

2:23:21

Or you're just tough.

2:23:22

No.

2:23:23

No, no, no.

2:23:24

It wasn't tough or hard.

2:23:25

It was just interesting.

2:23:27

So I just.

2:23:27

You were never once punched in the face.

2:23:30

No, not in the show.

2:23:31

No, not really.

2:23:32

Not doing the show.

2:23:33

You never did something that big blowback.

2:23:38

Nope.

2:23:38

Not really.

2:23:40

No.

2:23:41

It all just kept growing.

2:23:42

It kept growing.

2:23:44

And the thing stayed the same from the beginning to now.

2:23:47

And the thing is, I enjoy talking to people.

2:23:50

I've always enjoyed talking to interesting people.

2:23:52

I could even tell just when we walked in.

2:23:53

The way you interacted with everybody, not just me.

2:23:56

Yeah.

2:23:57

That's cool.

2:23:58

People are cool.

2:23:59

Yeah, that's cool.

2:24:00

You know, it's an amazing gift to be able to have so many conversations with so

2:24:07

many interesting

2:24:07

people because it changes the way you see the world because you see the world

2:24:11

through

2:24:12

so many different people's eyes.

2:24:13

And you have so many different people with different perspectives and different

2:24:17

opinions

2:24:17

and different philosophies and different life stories.

2:24:21

It's an incredibly enriching and educational experience having so many

2:24:28

conversations with so many

2:24:30

amazing people.

2:24:31

And that's all I started doing and that's all I do now.

2:24:36

Even now, when I book the show, I do it on my phone and I basically go through

2:24:41

this giant

2:24:42

list of emails of all the people that want to be on the show or that request to

2:24:47

be on the

2:24:47

show, and then I factor in another list that I have of people that I would like

2:24:51

to get on

2:24:52

the show that I'm interested in.

2:24:53

And I just map it out.

2:24:55

And that's it.

2:24:56

And I go, ooh, I'd like to talk to him.

2:24:57

If it weren't for President Trump, I wouldn't have been bumped up on that

2:25:00

list.

2:25:00

No, I wanted to talk to you already.

2:25:03

I just think, you know, what you're doing is very fascinating.

2:25:06

I mean, how would I not want to talk to you?

2:25:08

And then today, it proved to be absolutely the right decision.

2:25:12

Well, you know, listen, it's strange to be an immigrant one day going to Oneida

2:25:18

Baptist

2:25:19

Institute with the students that were there.

2:25:23

And then here, NVIDIA is one of the most consequential companies in the history

2:25:30

of companies.

2:25:31

It is a crazy story.

2:25:34

It has to be strange for you.

2:25:36

The journey is, and it's very humbling, and I'm very grateful.

2:25:41

It's pretty amazing, man.

2:25:42

Surrounded by amazing people.

2:25:43

You're very fortunate, and you've also, you seem very happy.

2:25:47

And you seem like you're 100% on the right path in this life, you know?

2:25:51

You know, everybody says, you must love your job.

2:25:54

Not every day.

2:25:55

That's part of the beauty of everything.

2:25:59

Yeah.

2:25:59

Is that there's ups and downs.

2:26:01

That's right.

2:26:01

It's never just like this giant dopamine high.

2:26:03

We leave this impression.

2:26:05

Here's an impression I don't think is healthy.

2:26:08

People who are successful leave the impression often that our job gives us

2:26:16

great joy.

2:26:18

I think largely it does.

2:26:20

That our jobs, we're passionate about our work.

2:26:27

And that passion relates to, it's just so much fun.

2:26:31

I think it largely is.

2:26:33

But it distracts from the fact that a lot of success comes from really, really hard

2:26:41

work.

2:26:42

Yes.

2:26:43

There's long periods of suffering and loneliness and uncertainty and fear and

2:26:52

embarrassment and humiliation.

2:26:56

All of the feelings that we least love.

2:27:01

That creating something from the ground up, and Elon will tell you something

2:27:07

similar.

2:27:09

It's very difficult to invent something new.

2:27:11

And people don't believe you all the time.

2:27:15

You're humiliated often, disbelieved most of the time.

2:27:19

And so people forget that part of success.

2:27:23

And I don't think it's healthy.

2:27:26

I think it's good that we pass that forward and let people know that it's just

2:27:30

part of the journey.

2:27:32

Yes.

2:27:32

And suffering is part of the journey.

2:27:34

You will appreciate it so much, these horrible feelings that you have when

2:27:38

things are not going so well.

2:27:40

You will appreciate it so much more when they do go well.

2:27:43

Deeply grateful.

2:27:44

Yeah.

2:27:44

Yeah.

2:27:45

Deep, deep pride.

2:27:46

Incredible pride.

2:27:48

Incredible, incredible gratefulness.

2:27:51

And surely incredible memories.

2:27:53

Absolutely.

2:27:55

Jensen, thank you so much for being here.

2:27:57

This was really fun.

2:27:58

I really enjoyed it.

2:27:59

And your story is just absolutely incredible and very inspirational.

2:28:03

And I think it really is the American dream.

2:28:07

It is the American dream.

2:28:08

It really is.

2:28:09

Thank you so much.

2:28:10

Thank you, Joe.

2:28:10

All right.

2:28:11

Bye, everybody.

2:28:11

Bye, everybody.