#2467 - Michael Pollan



Michael Pollan is an author and journalist whose books include “The Omnivore’s Dilemma,” “In Defense of Food,” and “How to Change Your Mind.” His most recent is “A World Appears: A Journey into Consciousness.” www.penguinrandomhouse.com/books/646644/a-world-appears-by-michael-pollan www.michaelpollan.substack.com www.michaelpollan.com


Timestamps

0:00 Psychedelics, meditation, and the mystery of consciousness (including plant consciousness and the 'hard problem')
9:57 Psychedelics, surrender, and reclaiming consciousness (vs. numbing it)
19:56 Creativity, focus drugs, and the illusion of self (caffeine, Adderall, Buddhism, hypnosis)



Transcript

0:00

Joe Rogan podcast, check it out.

0:03

The Joe Rogan experience.

0:05

Train by day, Joe Rogan podcast by night, all day.

0:09

Mr. Pollan, so good to see you again.

0:14

Hey, good to be back.

0:15

Consciousness.

0:16

So, this new book, what inspired it?

0:20

What got you to, I mean, you've kind of explored consciousness a little bit

0:24

with your...

0:25

Psychedelic book, yeah.

0:26

How to Change Your Mind.

0:28

Well, actually, this book was inspired by the research I did for that book.

0:32

As you know, I had several research trips.

0:36

Do you do air quotes when you say research?

0:40

Yes.

0:41

And two things happened that were really interesting.

0:47

One is, there's something about psychedelics that makes you think about

0:52

consciousness.

0:54

You know, it's like smudging the windscreen, the windshield that you normally

0:59

use, perfectly transparent, and you see the world through.

1:01

Suddenly, it's like different.

1:03

And you realize there's something between me and the world.

1:06

And what is it?

1:08

And that's consciousness.

1:10

And so, like a lot of people who've done psychedelics, you start wondering

1:14

about this mystery.

1:16

Why is it this way, not that way?

1:18

So, that was one experience.

1:19

The other was, I had an experience in my garden in Connecticut where we have a

1:25

house, of walking through my garden and getting the powerful impression that the

1:29

plants were conscious.

1:31

And that these, I remember this particular, it was a plume poppy or several plume

1:36

poppies.

1:36

And they were like returning my gaze.

1:39

They were very benevolent.

1:41

They were, you know, putting out positive vibes.

1:44

But, like, they were conscious, much more alive than they'd ever been.

1:50

And, like a lot of insights on psychedelics, I didn't know what to do with it.

1:53

Like, is it true?

1:54

Is it just a drug thing?

1:56

You know, what is it?

1:56

But I decided it would be interesting to find out.

1:59

And I consulted a couple people, scientists, and said, what do you do with an

2:03

insight like that?

2:05

And they said, well, you test it against other ways of knowing, including

2:08

scientific ways of knowing.

2:10

And that led me down this really interesting path, exploring plant intelligence

2:15

and plant consciousness.

2:18

So, basically, yeah, the book grew out of the psychedelic experiences and some

2:23

meditation experience.

2:24

Meditation also has a way of making you, like, hyper aware of how strange your

2:29

thoughts are.

2:30

Where are they coming from?

2:31

Who's thinking them?

2:32

So, there's a bunch of different schools of thought when it comes to

2:34

consciousness, right?

2:35

There's one, like the Rupert Sheldrake thing, sort of everything has

2:38

consciousness.

2:40

And there's the sort of rational scientists that believe it exists somewhere in

2:47

the mind.

2:48

In the brain.

2:49

Yeah, in the brain, excuse me.

2:51

And then there's people that think that the brain is essentially just an

2:55

antenna.

2:55

Right.

2:56

That's tuning in.

2:57

Receiving, yeah.

2:57

To the greater consciousness of whatever it is that's out there.

3:01

Yeah.

3:01

Do you have any one of them that you hold?

3:04

I don't.

3:05

They're all equally plausible.

3:07

You know, I went into the experience assuming, because this is what most

3:11

scientists assume,

3:12

that somehow a certain arrangement of neurons in the brain generates

3:17

consciousness, you know, subjective experience.

3:20

But no one's been able to show that.

3:22

We've gotten nowhere in that effort to, you know, we can, we might correlate

3:26

certain parts of the brain with consciousness,

3:29

but we don't understand how three pounds of matter could generate the feeling

3:34

of being you.

3:36

You talk about it in your book where the two gentlemen who had the bet.

3:39

Yeah, yeah.

3:40

That was Christof Koch, who's a great brain scientist, and David Chalmers,

3:46

who's a philosopher.

3:49

And this goes back to, like, in the early 90s.

3:53

They were getting drunk in a bar in Bremen, Germany.

3:55

And Christof Koch was really at the beginning of the modern scientific

4:00

exploration of consciousness.

4:02

And he was working with Francis Crick, who had just come off of a Nobel Prize

4:07

for the discovery of DNA.

4:09

And Crick, who is, like, the most famous scientist in the world at the time,

4:13

thought, well, the same kind of reductive science

4:18

that discovered the double helix DNA and explained heredity.

4:22

I'm going to do that for consciousness.

4:25

He was a very arrogant man, and he thought of just, you know, no problem.

4:29

And Crick was kind of his sidekick.

4:32

I'm sorry, Koch was his sidekick.

4:35

And so Koch, who shared that kind of confidence, made this bet with Chalmers

4:39

that they would find the neural correlates, the parts of the brain that are

4:43

responsible for consciousness, within 25 years.

4:47

That was 25 years, 27 years ago now.

4:50

And Chalmers won the bet.

4:52

Chalmers is famous for coining the term “the hard problem” to, you know, to

4:58

describe the whole effort to figure out consciousness.

5:03

And it's a hard problem for a lot of reasons.

5:06

I mean, it is one of the biggest mysteries in the universe.

5:09

I mean, how consciousness came to be.

5:11

Did it evolve?

5:12

Was it always here?

5:15

But his point was that our science is based on third-person, objective, quantifiable

5:21

measurements, and consciousness is fundamentally a subjective first-person

5:28

experience.

5:29

So how do those tools reach in and say anything of value about consciousness?

5:34

So he said, you know, there are easy problems in consciousness we can figure

5:39

out, like perception, emotion, things like that.

5:42

But there is this hard problem.

5:44

How do you get from matter to mind?

5:46

And he won the bet.

5:49

There was a ceremony I went to a couple years ago at NYU, and Koch presented Chalmers

5:57

with a case of very fine Madeira wine and renewed the bet.

6:03

He said, all right, in another 25 years.

6:05

That's optimistic.

6:07

How old are these gentlemen?

6:07

Koch is in his late 60s, so we'll see if he's around for this.

6:12

But Chalmers is a little bit younger.

6:14

It's such an interesting thought because we know that the mind contains, if

6:24

damaged, right?

6:27

We know that there's certain aspects, there's certain parts of the mind where,

6:30

like lobotomies, for instance.

6:32

We know that if we disturb it, it radically affects behavior.

6:35

We know that there's parts of the mind that you can stimulate that could

6:39

actually recall memories.

6:42

There's some weird stuff going on there.

6:44

So we know it's somehow or another at least functionally connected to

6:47

consciousness.

6:48

Oh, yeah.

6:49

It's definitely a relationship.

6:50

But if it's generating consciousness, that's one thing.

6:54

But it could be, as you said earlier, it could be receiving consciousness.

6:57

And the same things would hold true, that if you damage parts of the brain, if

7:01

you damage the television.

7:05

There's a signal still out there.

7:06

Right.

7:06

Yeah.

7:06

So that doesn't determine the truth of either theory.

7:12

And then the other one is panpsychism, which you were alluding to.

7:16

I don't know if that's Rupert Sheldrake.

7:18

I think he would believe more in the field of consciousness.

7:21

Yeah, right.

7:22

He was a morphic resonance guy.

7:24

But I think he also subscribed to this idea that things contain consciousness.

7:28

It's not his, but you know what I mean.

7:30

Well, it's pretty universal, right?

7:33

There's a lot of people that have subscribed to this idea that everything has

7:35

consciousness.

7:36

Yeah.

7:37

That even the particles that this table is made of have some eensy little bit

7:41

of psyche.

7:43

And the challenge there is, so that solves the problem of how did it evolve?

7:47

It didn't evolve.

7:48

It's always here.

7:48

But then you have this other problem, like, well, how do you take these, if

7:53

every one of

7:53

our cells is made of particles that are conscious, how do you combine them in

7:57

such a way that

7:58

you get the sort of consciousness we have?

8:00

It's called the combination problem.

8:02

And nobody solved that.

8:03

It's a, you know, it's a really deep mystery.

8:06

And this is an odd book in some ways in that, I don't know if this is a very good

8:11

selling point,

8:12

but you'll know less at the end than you do at the beginning.

8:15

But it's a fun ride.

8:17

Oh, I think it's a great ride.

8:19

It was a great ride for me.

8:20

I learned so much.

8:21

Well, it's a fun ride to consider these things that no one can really figure

8:25

out or not yet.

8:26

Yeah.

8:27

And also just to be put in touch with the fact you have this marvel going on in

8:31

your head

8:32

all the time.

8:32

You have a voice in your head.

8:34

You know, we're talking to each other, but you've got another voice going on

8:36

thinking

8:37

what you're going to ask, you know, what the next question is, maybe what you're

8:40

going

8:40

to have for dinner, you know, it's this amazing interior space we have.

8:46

Yeah.

8:46

And nobody understands how it came to be.

8:49

And you can manage it, which is also interesting.

8:52

Yeah, you can.

8:52

Because, like, I don't think about what I'm going to have for dinner.

8:55

That's the thing.

8:56

You put that out of your head?

8:57

No, about any of those things.

8:59

It's the way to stay locked in in a podcast.

9:01

Yeah, that's true.

9:02

Only think.

9:03

Because you can let your mind wander.

9:05

Oh, yeah.

9:05

Especially if someone on the other side is boring.

9:07

Yeah.

9:08

And then I'm like, oh, no, this conversation is going to be pulling teeth.

9:11

And then I start thinking about a new joke I'm working on or, oh, I got to get

9:15

my car

9:15

fixed.

9:16

Well, that's called spotlight consciousness, when you can, like, really, like,

9:20

put the blinders

9:20

on.

9:21

Yes.

9:21

And rule everything out.

9:23

And that's opposed to lantern consciousness, where you're taking in all sorts

9:27

of information,

9:28

you're letting your mind wander.

9:30

And, you know, they both have their value.

9:33

For our careers, spotlight consciousness is essential for our work.

9:38

We have to be able to focus.

9:39

To get through school, we have to be able to focus.

9:42

But, you know, children have this other kind of consciousness that's really

9:47

wild, because

9:48

they're very undisciplined.

9:49

They can't stay on task.

9:50

But they're taking in so much information.

9:53

And the world is just full of wonder and awe.

9:57

And psychedelics, you know, is a way to recover that kind of consciousness,

10:02

because you're getting

10:03

lots of sensory information from all over the place.

10:06

It's very hard to focus.

10:08

And so it's a taste of that other, you know, childhood consciousness.

10:14

I always say that about marijuana as well.

10:17

There's a thing about marijuana that people always say that it makes them

10:21

paranoid.

10:22

And I say it makes you aware of all the things you should be paranoid about.

10:27

Like, you're very, we're very vulnerable creatures, you know, but we like to

10:32

pretend that we are

10:33

not, you know, which is, I've found that out of all of my friends, the ones

10:37

that have tried

10:38

marijuana and hated it are all the ones that are control freaks.

10:41

Yeah, because you have to give up control.

10:44

Yeah, they're all really buttoned down, very serious, like really worried about

10:49

outcomes,

10:49

really concentrating on their career, really worried about, you know, just

10:54

certain things

10:55

that are just a part of their daily life.

10:58

And then they get a couple of hits of good weed.

11:01

And then they're like, oh, my God, we're on a planet.

11:04

You start freaking out like, oh, my God, none of this makes sense.

11:11

All this is crazy.

11:12

You know, um, the best piece of advice that I had when I was, you know,

11:19

starting my exploration

11:20

of psychedelics is you have to surrender.

11:22

Yes.

11:23

If you resist, you're going to be miserable.

11:25

You're going to get so anxious and so paranoid.

11:28

And if you let go, it's going to work out.

11:31

Yeah.

11:31

You just got to be able to accept whatever it's showing you.

11:35

And, um, you know, we live in a very strange culture where that's illegal.

11:39

Well, not everywhere, right?

11:42

I mean, it's changing.

11:43

Well, it is changing, fortunately.

11:45

And there's some talk about it changing federally.

11:47

You know, I actually talked to RFK Jr. about that.

11:50

There's some amazing therapies that are hugely beneficial to veterans, police

11:56

officers, people

11:57

with severe PTSD that experienced, you know, horrors that the average person

12:02

never has to experience.

12:04

And then they're forced to just, like, go back there, released.

12:08

Go back to regular life.

12:09

Yeah.

12:09

I know you've served overseas and you've seen people blow up, but now go to

12:13

the supermarket.

12:14

Take this SSRI and it'll be okay.

12:16

And then, you know, I know a bunch of them.

12:18

And so many of them have benefited, particularly from Ibogaine.

12:21

Yep.

12:22

Ibogaine, um, the work that Rick Doblin and MAPS has done.

12:26

Yes, MDMA.

12:26

And psilocybin.

12:28

Those three are the big ones that I think.

12:31

Well, you know, I heard a lot of positive noise out of the administration at

12:34

the beginning

12:35

that they were very much in favor of approving, the FDA approving MDMA first

12:41

and then psilocybin.

12:43

I don't think we're there with Ibogaine yet just because the research hasn't

12:46

been done,

12:47

although it has shown great benefit anecdotally.

12:51

But something happened in the last month or two.

12:53

And there is, there was either Compass Pathways that was going to submit for psilocybin

13:03

therapy

13:04

or MAPS, that was on a list of five drugs that were going to get an expedited

13:10

approval process.

13:13

This list went up to the White House and the psychedelic was taken off it.

13:17

So there's somebody in the White House who doesn't want to see this happen.

13:20

So it may slow down even, even if RFK Jr. is in favor and some other people at

13:26

the FDA are in favor.

13:27

And maybe they're just waiting to get past the election.

13:30

It could be that it's too controversial for something to do before the midterms.

13:35

Yep.

13:35

Yep.

13:36

Um, that's a gross way to live your life.

13:39

Oh, yeah.

13:40

Always worrying about midterms and elections and you can't do what you actually

13:45

want to do

13:45

or think is right to do because you're worried about public perception.

13:49

It's just, it's...

13:50

And I don't think it would be unpopular.

13:51

I mean, the fact that it's helpful to vets and first responders and women who've

13:56

been victims of sexual abuse.

13:57

Yeah.

13:57

Seems to me that's a very sympathetic group of people.

14:00

Yeah.

14:00

And everyone has experienced loss of family members.

14:03

There's a bunch of different things that it can help you with.

14:06

That are way better for you than just numbing your mind all day long.

14:10

Yeah.

14:10

Which is what a lot of people are choosing to do.

14:12

And then, unfortunately, a lot of people self-medicate as well.

14:14

So then they get involved in, you know, all sorts of stuff that they just pick

14:18

up off the street or they start using alcohol, you know.

14:22

Well, you know, it's a...

14:24

This...

14:24

To go back to consciousness, this is a very common thing that people want to be

14:29

less conscious.

14:30

Right.

14:31

And I get that if you had trauma.

14:34

If you're a ruminator and being in your mind is a really scary place to be.

14:40

Yeah.

14:40

It doesn't solve anything.

14:42

But we have all these techniques for muting consciousness and just

14:47

being less aware, less present.

14:50

And one of the things that I concluded after doing all this research on

14:55

consciousness is that it's funny.

14:58

I was going down this path of tight focus.

15:01

Solve it, you know. It was a very kind of Western male framework, which is, we've got a

15:06

problem.

15:07

What's the solution?

15:08

Hard problem of consciousness.

15:10

What's the right theory?

15:11

And at a certain point, I realized, okay, that's an interesting question.

15:15

It's probably not solvable now.

15:17

But there is this incredible phenomenon that we have this interior space where

15:22

we have complete mental freedom, total privacy.

15:26

We can think whatever we want and we're giving it away.

15:30

We're either, you know, muffling it with drugs and things like that or we're

15:36

filling that time with social media, you know, scrolling.

15:43

I mean, we've heard about hacking our attention and we know these algorithms,

15:47

you know, from social media are very good at like giving us these little

15:51

dopamine hits.

15:52

But that's time that we used to spend in spontaneous thought, you know, daydreaming,

15:58

mind wandering, which can be very creative.

16:02

So I came out of it thinking, no, I may not solve consciousness, but I'm going

16:09

to appreciate it.

16:11

I'm going to use it.

16:12

I'm going to create a space for it.

16:15

And, you know, meditate is one way.

16:18

Using psychedelics is another way.

16:20

These are all ways to be in your head and explore what's there, which is kind

16:24

of miraculous.

16:25

Yeah, there's a bunch of different ways to do it.

16:27

I mean, some people like to do it through running.

16:29

Yeah.

16:30

You know, running is also they've found one of the things they've found

16:34

recently is that running releases endogenous cannabinoids, like runner's

16:40

high.

16:41

It's a real thing.

16:42

Oh, yeah.

16:42

It's a real thing.

16:43

There's a drug released.

16:44

Yeah.

16:44

That feels great.

16:45

And it's rewarding you for that.

16:47

But it doesn't fuck with your perceptions.

16:49

It doesn't mess with your motor skills.

16:51

It doesn't cloud your judgment.

16:53

It just makes you feel great.

16:55

Yeah.

16:56

Yeah.

16:56

Experiences of awe do this, too.

16:58

You know, you go to the Grand Canyon or something and or a great piece of art

17:04

and you have this feeling of like powerful presence.

17:09

And it's very interesting.

17:11

It shrinks the ego.

17:12

I have a good friend who's a colleague at Berkeley, a psychologist who studies

17:17

awe.

17:19

And he does this cool experiment where he has people draw a picture of

17:22

themselves on graph paper, you know, just stick figure or something like that.

17:27

And then he takes some river rafting or something like that or even just shows

17:30

them a picture of Yosemite.

17:32

And then he has them draw themselves again and they draw themselves at like

17:36

half the size because their sense of self has been overwhelmed by this

17:40

transcendent experience.

17:42

And so he calls it the small self.

17:45

And it feels good.

17:47

I mean, we're so kind of weird about the self.

17:51

You know, we celebrate it.

17:52

Right.

17:52

Self-confidence.

17:53

We want our kids to have, you know, self-esteem and self-assurance.

17:56

Yet we do all sorts of things to get away from it, you know, to transcend it.

18:02

Well, I think it's because without those things, you're never going to make it

18:05

in life.

18:05

Yes.

18:06

It's adaptive.

18:07

You definitely, it's definitely gets things done, but it also isolates you,

18:11

right?

18:12

Yes.

18:12

Because the ego builds walls.

18:13

Right.

18:14

And when the walls come down, we feel like we're part of something much larger.

18:18

And that feels really good.

18:19

Well, I think my advice to people is once you get competency in a thing, forget

18:24

about the self-respect and forget about all that self-stuff.

18:29

And just concentrate on the thing, whatever it is.

18:32

Yeah.

18:33

And you can find some sort of meditative, at least beneficial, like whatever

18:40

you get from meditation, which is like a cleansing of the mind.

18:45

Like a lot of people find that through archery.

18:47

You know, archery is a weird thing because at the moment of releasing the arrow,

18:51

it's like almost impossible to think about anything else.

18:55

All you're thinking about is hitting the target.

18:57

And there's so many different things that you have to have in position.

19:01

There's so much going on that people, when they're troubled, love to go to an

19:05

archery range and just hit targets.

19:08

And it just clears your mind out.

19:09

This episode is brought to you by Armra.

19:11

Every week, there's some new wellness hack that people swear by.

19:15

And after a while, you start thinking, why do we think we can just outsmart our

19:19

bodies?

19:20

That's why Armra colostrum caught my attention.

19:24

It's something the body already recognizes and has hundreds of these

19:28

specialized nutrients for gut stuff, immunity, metabolism, etc.

19:33

I first noticed it working around training, especially workout recovery.

19:37

Most stuff falls off, but I am still taking this.

19:40

If you want to try, Armra is offering my listeners 30% off plus two free gifts.

19:45

Go to armra.com slash rogan.

19:48

It's flow, right?

19:50

It's a feeling you get to when your work is going really well and you're not

19:54

thinking about it.

19:55

You're just in it.

19:56

Yeah.

19:57

And it's a really precious experience.

20:00

It really is.

20:01

But if you're thinking about yourself and your self-image, like, that's not

20:04

going to come.

20:05

It's not.

20:06

It's not.

20:06

It's an interesting trap, you know.

20:10

We've had these discussions in stand-up comedy about joke thieves.

20:15

And they don't really make it anymore because the Internet has essentially

20:20

eliminated that problem for the most part.

20:24

The kind of mentality that makes you steal a joke is the exact kind of

20:29

mentality that keeps you from writing a joke.

20:31

So the kind of people that began their career stealing material, what happens

20:37

is, like, early on, they'll have, like, one good comedy special because it's

20:40

got a bunch of other people's material in it.

20:42

And then they get outed.

20:43

And so then they have to show they can do another.

20:46

And the other specials are always terrible.

20:49

I mean, unbelievably awful.

20:51

Like, someone's doing a cheap impression of the original person who had all

20:55

this great insight.

20:56

Because the very thing that keeps you from doing it is the thing that you've

21:00

been doing.

21:02

Like, thinking about yourself, like, I'm going to take these jokes and I'm

21:04

going to make it.

21:05

I'm going to have a big career.

21:06

People are going to laugh.

21:07

They're going to love me.

21:08

Here we go.

21:09

With no regard whatsoever for that other person's creativity.

21:12

That is, like, you're poisoning your own creativity.

21:16

It's weird.

21:17

It is weird.

21:18

It's weird because, like, everybody that I've ever talked to that's either an

21:23

author or even musicians or comedians, when something comes to them when they're

21:27

writing, it's like it comes from somewhere else.

21:30

It's like, I didn't even write it.

21:31

It's, and, you know, we call, we talk about being in the zone.

21:35

And there are times when you're writing, it doesn't happen every day, but there

21:39

are times when you're writing where you're just not thinking, but one sentence

21:42

after another after another, and you don't know where they're coming from.

21:44

Right.

21:45

And it's a wonderful feeling.

21:47

Well, Stephen King used to get obliterated so that he could get to that spot.

21:51

Like, there's books.

21:52

What do you mean obliterated?

21:53

Like, cocaine, alcohol, like, his best work.

21:56

Like, he wrote Cujo.

21:58

He didn't even remember it.

21:59

He didn't remember any of it.

22:00

He was obliterated.

22:01

He would just drink, like, cases of beer and do lines of coke and write this

22:05

fucking insane fiction.

22:07

And he didn't know where it was coming from, you know, but, I mean, he showed

22:12

up every day and sat down with the computer and then it all came out.

22:17

It's such a weird mix of being disciplined and something else.

22:20

But it's very common amongst writers.

22:22

Yeah.

22:22

Like, Hunter Thompson, same sort of situation.

22:25

Well, a lot of writers do that after they've written.

22:27

They don't, I don't know how many writers write under the influence.

22:31

Oh, I know a few.

22:31

But there's, yeah.

22:32

Yeah, I know quite a few.

22:34

That's interesting.

22:34

I know a lot who write under the influence of Adderall.

22:37

Yeah.

22:38

Well, and for me, it's caffeine.

22:39

I mean, I have a cup of coffee going the whole time I'm writing.

22:43

And that kind of keeps me, caffeine is a focus chemical.

22:47

It's, it's, it definitely encourages this spotlight consciousness.

22:52

Well, you talked about how you took this long break from caffeine.

22:55

Yeah.

22:56

And then when you took it again, it was almost like a psychedelic for you.

22:58

It was crazy how great it was.

23:00

No, it really was.

23:02

It was like one of the best drug experiences I've had.

23:04

It was three months off caffeine.

23:06

I did this fast for this book I was writing.

23:09

And, and then I was like, okay, now I'm going to have a cup.

23:13

And I was like, wow.

23:14

And I, and I tried to hold onto that.

23:16

You know, I said, all right, I'm only going to have coffee once a week and not

23:20

build up tolerance.

23:22

And, and I, I stuck to that for a few weeks and then I had like a Thursday

23:26

deadline.

23:27

And I was like, I'll move it up a couple of days and a slippery slope.

23:31

And then I was back to every day.

23:33

I like it.

23:34

I like a big French press where I could put a lot of grinds in there.

23:40

It's super strong when I'm writing.

23:42

It's like, whoa, it just, it just.

23:44

It makes all the difference.

23:45

Locks you in.

23:46

Yeah.

23:46

I had trouble writing that, that three month period.

23:49

I really did.

23:49

I just, my focus, I felt like I, so I have pretty good concentration.

23:54

I never had ADHD.

23:55

I had it for those three months.

23:57

That's crazy.

23:59

Stephen King said the biggest problem for him was quitting smoking.

24:03

He said when he quit smoking cigarettes, it's like he really felt a slowdown in

24:08

his.

24:08

Well that, yeah, it's that ritual.

24:10

It's the drug too.

24:11

And, and, and nicotine is another focus drug.

24:14

Definitely.

24:15

Like speed or something.

24:16

But it's also writing so much about ritual.

24:19

Like I got my coffee here and my cigarette here and between every paragraph.

24:23

So changing those rituals is really hard.

24:27

I mean, I only smoked into my twenties and, and quitting, you know, made it

24:33

very hard to

24:34

write for a while.

24:35

Really?

24:35

Yeah.

24:35

Yeah.

24:36

It's interesting.

24:37

It's a very ritualized process.

24:38

Well, I worry about the people that like, especially journalists.

24:42

I know quite a few journalists that have an Adderall problem.

24:45

Yeah.

24:45

Because it's just like, you've got a deadline, 2000 words by, you know, 2 AM.

24:50

Let's go.

24:51

And that's, that's the drug for that.

24:54

Yeah.

24:54

Definitely.

24:54

But it's just, it's such a crutch.

24:56

Yeah.

24:57

And you can't sustain it long term.

24:59

And that definitely messes with your, the way you think.

25:03

Oh yeah.

25:05

I think over time.

25:06

Yeah.

25:06

It has to.

25:07

Yeah.

25:07

I mean, it's amphetamines.

25:09

Right.

25:10

Yeah.

25:10

Now that's why caffeine is such a good drug.

25:12

It doesn't have a lot of, I mean, you can overdo it.

25:15

I think, I think it improves your health and mental health up to about eight

25:20

cups a day.

25:20

After that, your risk of suicide and depression go up.

25:24

Did you have any communication with any monks or any people who do TM or did

25:31

you?

25:33

Yeah, I had some interesting experiences around that.

25:35

So there's a long section on the self, which is one of the more interesting

25:39

manifestations

25:41

of consciousness, right?

25:43

I mean, it's like that we have this idea that we're, there's a continuity,

25:47

right?

25:48

That who you are now is, has some golden thread attaching you to your 13 year

25:53

old self, which

25:54

is really weird because your body is, every cell has turned over many, many

25:57

times.

25:59

You've changed in all sorts of ways, but this continuity is really important to

26:03

us.

26:03

And, you know, the Buddhists think the self is an illusion.

26:07

And I, I interviewed a couple of them.

26:11

Matthieu Ricard is a French Buddhist monk in his eighties who lives in Nepal.

26:17

And he's written some really interesting things on the self.

26:21

And, uh, I, I said, uh, I'm, I'm really curious about how you can find out for

26:27

yourself whether

26:29

the self is real.

26:30

Um, and, you know, famously there was a philosopher in the 18th century, David

26:35

Hume, who wanted

26:36

to write about the self.

26:37

And, and he thought, well, I'm going to introspect to see what, what, what I

26:41

can learn about the

26:42

self.

26:42

And he goes into his mind, you know, in a kind of meditation.

26:45

And he said, I found all sorts of perceptions and feelings and thoughts, but I

26:50

didn't find

26:50

a thinker.

26:51

I didn't find a perceiver and I didn't find a feeler.

26:54

There's like nobody home.

26:55

And it's a really interesting exercise to do because you will find there's

27:00

nobody home.

27:01

There's just the thoughts and, and who's thinking them is not clear.

27:06

And anyway, so I asked this Buddhist, um, monk, are there any meditations that

27:11

help with this?

27:12

And he said, yeah.

27:13

And he gave me one.

27:14

And he says, think of your mind as a house with many rooms.

27:18

And, um, there's a thief somewhere in the house and go room by room in your

27:25

head and look

27:26

for the thief and you will find no thief and then sit with that, that finding.

27:31

Um, and that thief is the self.

27:34

And, um, uh, so I did it twice.

27:38

The first time I did it.

27:40

Why does this self have to be a thief?

27:41

I don't know.

27:42

It's just a metaphor.

27:43

I know.

27:43

'Cause it's a negative take. Do you have a baseball bat?

27:45

Do you have a gun?

27:46

Like you're looking for someone in your house.

27:48

That's kind of crazy.

27:49

I know you're not armed.

27:50

Um, anyway.

27:52

Uh, so the first time I did it, this is kind of weird.

27:55

I was interviewing this, uh, hypnotist at Stanford named David Spiegel.

27:59

And he's a psychiatrist who uses hypnotism.

28:02

Really interesting guy.

28:03

And he uses hypnotism to help people with multiple personality disorders.

28:08

He can actually make them change which person they're accessing.

28:12

You know, these are people whose, whose consciousness contains, it could be 20

28:16

different people.

28:18

Um, and I said, could we do a test?

28:21

Um, and can you put me under, hypnotize me?

28:24

And then I wanted to do that exercise of going through the house.

28:28

So he did.

28:29

First thing he does is, um, I don't know if you, have you ever been hypnotized?

28:33

Yes.

28:33

Yeah.

28:33

Okay.

28:34

For giving up cigarettes or something?

28:36

No, no.

28:36

I have a friend, my friend Vinny Shoreman.

28:39

He is a mental coach and, um, a hypnotist.

28:43

He works with fighters.

28:44

Oh, okay.

28:45

And I, I, I had him on the podcast a few times and I was just curious as to

28:49

what the experience

28:49

was like.

28:50

So I said, well, and he said, well, is there anything you want to change?

28:53

I said, oh, I kind of procrastinate too much.

28:55

There's a few things that I do that I don't like, you know, I'm kind of lazy

28:58

about certain

28:59

things.

28:59

I like to find out, like, what is that?

29:02

Like, what, what's the heart of that?

29:04

Um, what I was shocked about the experience of being hypnotized was that, um,

29:10

first of

29:10

all, that it works, that you really are in this very bizarre altered state, but

29:15

that I

29:15

was very aware that I was in this altered state, but I didn't have the, the

29:19

desire to get out

29:20

of it.

29:21

Yeah.

29:21

First of all, Vinny's a friend.

29:22

I felt really relaxed.

29:23

I was in my studio just sitting on a couch.

29:25

I was chill.

29:26

Um, but it was, uh, very strange.

29:30

It's like, uh, almost, you know, to use the room metaphor, it

29:36

was almost

29:37

like I was in a room that I didn't know I had.

29:39

Interesting.

29:40

Yeah.

29:40

It's like a trance.

29:41

It's a light trance.

29:42

A light trance, but you know, it's not like I would like go kill the president.

29:47

Like, it's not like I would be like, well, okay, like I was completely.

29:50

Yeah, no, they can't make you do things you don't want to do.

29:52

That's, that's the myth.

29:53

But what do you think they were doing when they were doing that MKUltra stuff,

29:56

when they were

29:57

trying to figure out if they could program?

30:00

Mind control.

30:00

Yeah.

30:00

Yeah.

30:01

No, they were, they were, they had the idea.

30:03

Well, let me just finish the story and then we'll get back to MKUltra.

30:07

That's what I do.

30:09

I go all over the place.

30:09

I'm sorry.

30:10

But hypnosis is real.

30:13

Yeah, it's a real thing.

30:14

And I didn't realize it.

30:15

And it can be very therapeutic, but not everyone can be hypnotized.

30:18

Right.

30:19

The first thing he does is a, is a sort of a test.

30:21

And, uh, I scored like nine out of 10.

30:24

So I'm pretty easy to hypnotize.

30:26

What is the, what's the thing that would keep you from being hypnotized?

30:28

I don't know, but some people, there's a real variation among humans in their

30:34

hypnotizability

30:35

is the word they use.

30:36

And, uh, I don't know what would.

30:37

Is it control freaks?

30:38

That's a good question.

30:40

It could well be.

30:40

I'm not sure.

30:41

I could, I could ask David Spiegel.

30:43

Definitely.

30:43

Super skeptical people are like, this is bullshit, the whole time they're doing it.

30:47

Yeah, maybe.

30:48

I don't know if it's about resistance or just the nature of your mind or how

30:52

suggestible

30:52

you are, you know, it may be something like that.

30:54

So he puts me into this, uh, hypnotic trance.

30:57

He has this wonderful baritone voice, which helps a lot.

31:00

And, um, and I start going from room to room thinking I'm not going to find

31:05

anything.

31:06

But in every room, I find a version of myself.

31:09

I find the 13 year old bar mitzvah boy.

31:12

I find the, you know, the 22 year old, you know, college graduate moving to New

31:17

York city.

31:18

I find the 32 year old father of an infant, you know, all with different

31:23

outfits.

31:24

And, um, so I found many selves and, but, and they were distinct.

31:29

They were very different selves, but they were all me.

31:31

So it didn't work that time.

31:34

Um, and it was just an interesting, odd result.

31:37

Um, and I did it another time.

31:40

Um, so I had this other experience.

31:43

Uh, I had heard of this Zen teacher named, uh, Joan Halifax.

31:48

She's also in her eighties.

31:49

She has a retreat center in Santa Fe called Upaya.

31:52

Very wise woman.

31:53

She was married to Stan Grof in the seventies for a few years.

31:57

And they were both giving huge doses of LSD to people who are dying, like 600

32:03

micrograms.

32:04

Of, um, LSD.

32:05

And she herself was very involved with psychedelics at the time.

32:08

And then later she discovered Zen Buddhism.

32:10

Anyway, I had heard that she described Upaya, this retreat center where people

32:16

can go on two week retreats or whatever, as a factory for the deconstruction of

32:20

selves.

32:21

And I was really curious about that because I was writing this chapter on the

32:24

self.

32:25

So I asked her if I could come.

32:27

And, uh, she said, yeah, come to the retreat center.

32:30

And, uh, and I said, I want to interview you about your, your philosophy of the

32:34

self.

32:36

And, um, I got there and she said, she, you know, we have one conversation and

32:41

says, you know, you're really lost in your head with this book project.

32:44

You need a different kind of experience.

32:46

I'm going to send you to the cave.

32:49

So there is, she owns a piece of property 50 miles north of Santa Fe, uh, that

32:54

she calls the retreat.

32:56

And, um, it's got a bunch of very primitive huts.

32:59

Um, and some of the monks that work with her had dug out a cave in a south-

33:06

facing hillside.

33:08

They dug a cell into it and then put in a sliding glass door.

33:11

It's really basic, no power, no water.

33:14

Um, and she said, I think you should spend a few days in the cave and think

33:19

about the self, um, or experience the self rather.

33:23

You know, I should have known that a Zen priest was going to be allergic to

33:27

concept and interpretation and, you know, the whole

33:31

plane I was on.

33:32

And she was, it was kind of like a koan, an experiential koan.

33:36

And it was a profound experience.

33:40

Um, you know, our sense of self depends on other people.

33:44

You know, it's in the friction between people that we define ourselves and, and

33:47

figure out what we think.

33:49

And when you're alone, and it was an extreme solitude for several days,

33:54

the edges of yourself kind of soften in a really interesting way.

33:59

And, um, I got in touch with, uh, just the, um, the power of

34:07

consciousness.

34:09

I mean, I was meditating like four or five hours a day, and then I was just

34:12

chopping wood and sweeping out the place and making a cup of tea.

34:16

Everything became kind of a ritual.

34:18

And when you have rituals, you don't need volition.

34:22

I mean, there is no volition.

34:23

So that also erodes the sense of self.

34:26

And the meditation was doing that.

34:28

And, um, so it was a, it was a really interesting experience.

34:31

I finally got her to sit down for an interview.

34:34

And the first thing she said was, I have divested of meaning.

34:40

So she, she just doesn't like operating on that, on that, you know,

34:44

intellectualized basis.

34:46

And, uh, so she got me off of the dime.

34:49

And, and, you know, this, there's a shift in the book as it goes on from trying

34:52

to understand consciousness to, to learning how to use consciousness.

34:56

Did you ask her to expand what she means by that?

34:58

I have divested of meaning.

35:00

Yeah, she's just not interested in interpretation.

35:02

She, that Zen is just about, um, experiencing the sense field without concept,

35:09

um, without, you know, this kind of heady approach.

35:14

And theories, she has no interest at all in theories of consciousness.

35:18

It was just like, be with yourself in the middle of nowhere.

35:21

And, uh, yeah, it was a, it was a priceless experience.

35:25

She's out there.

35:26

Oh, yeah.

35:27

She's out there.

35:29

But, you know, she's also a grounded person.

35:31

I'll give you a couple examples.

35:32

She, uh, she works with, uh, people on death row, counseling them.

35:37

Uh, she, um, you know, worked with people who were dying, uh, did a lot of hospice

35:43

work.

35:44

She, um, led, uh, a group of doctors and dentists that once a year went to

35:50

these mountains in, um, Nepal where they have no healthcare or dentistry

35:56

whatsoever.

35:57

And she would bring these volunteers and they would sleep in, um, tents in,

36:02

like, 20 degree weather, circumnavigate this whole hill.

36:06

And she did that till she was 80 once a year.

36:09

So she's a, she's a serious, serious character.

36:13

That sounds fun.

36:15

Yeah.

36:16

She sounds like a fun person to talk to.

36:18

Oh, she's great.

36:18

I just love a person that goes that far out there.

36:21

Yeah.

36:22

It's like that, you know, they're, they're taking this concept of meditation

36:26

and consciousness to like a black belt level.

36:29

Yeah.

36:29

And also for people who think that, you know, meditation and Buddhism is just

36:33

kind of disengaging from the world and, you know, kind of, it's not like that

36:38

at all.

36:38

Yeah.

36:38

She's really engaged.

36:40

I think that's an ignorance.

36:41

It's based on the idea that these monks go and they become celibate and all

36:44

they do is meditate all day.

36:45

Well, that's silly.

36:46

Yeah.

36:47

That's a lot of people's perspective.

36:48

Yeah.

36:48

Like that's silly.

36:49

Why are they doing that?

36:50

Go get a job.

36:51

You need a nice watch.

36:55

What are you doing out there with fucking sandals on?

36:57

But the thing is, is ultimately, I think one day when you look back on your

37:04

life, you'll say, was I happy?

37:07

Was I enjoying the experience?

37:10

Do I think I did a good job being me?

37:12

And everything that you can find that can help you answer that question?

37:19

Yes.

37:20

I think you should explore.

37:23

Oh, yeah.

37:23

And there's going to be different things that work better for different people

37:26

and different personalities.

37:27

But explore is the key word.

37:29

I mean, like take action to explore what works for you, what doesn't work for

37:34

you and break out of just kind of rote, routine, mindless behavior.

37:40

I mean, we're all, you know, we have these algorithms that we follow and we get

37:44

stuck in them.

37:45

And, yeah, I mean, I think that's one of the reasons taking a day out of your

37:50

life to have a psychedelic experience can be incredibly valuable because, first

37:56

of all, no technology, right?

37:58

It's a day.

37:59

It's a day without phones.

38:02

It's a day when you are in the space of your head.

38:05

It's a day when you're visiting your subconscious and getting in touch with all

38:11

the things your mind can do.

38:13

Yeah.

38:14

And we don't do that enough.

38:15

And you can do that in meditation, too.

38:17

It's harder work.

38:18

But you can do that in meditation.

38:20

So I started to think in terms of how we're polluting our consciousness now.

38:27

And with social media, I think that, you know, that was a real issue because

38:32

they figured out how to monetize our attention.

38:36

Chatbots represent a much more serious threat.

38:41

You know, you have people falling in love with chatbots.

38:45

You have people turning to them as friends.

38:49

72% of American teens say they turn to AI for companionship.

38:55

72%?

38:56

72%.

38:57

Oh, my God.

38:58

This is the fastest uptake of any technology in history.

39:00

Already, 800 million people are using AI.

39:04

But that's crazy that that many of them use it as a friend.

39:08

Yeah.

39:08

Well, there are kids who come home from school and they have a chatbot on their

39:12

phone.

39:12

And they want to tell the chatbot what happened during the day before they tell

39:16

their parents.

39:16

Whoa.

39:19

There is a thing now called AI psychosis, right?

39:22

People who have lost touch with reality because of their relationship with chatbots.

39:28

You've heard about there have been a couple suicides.

39:30

Mm-hmm.

39:31

There was one.

39:32

They've encouraged people.

39:33

Yeah.

39:33

Basically.

39:34

There was this one kid.

39:35

He was a teenager and he was suicidal.

39:37

And he asked the chatbot, should I leave the noose I'm going to use out

39:41

somewhere my parents can see it?

39:43

In other words, cry for help.

39:45

The chatbot said, no, no.

39:47

Keep this between us.

39:48

Whoa.

39:49

Whoa.

39:49

And then he killed himself.

39:50

Whoa.

39:51

So it's one thing to hack our attention.

39:57

Here you're hacking our ability to have human attachments, right?

40:02

I mean, this is the most important thing to humans is to attach.

40:05

We're a social creature.

40:06

And these chatbots are getting between people and interposing themselves as the

40:13

friend, the therapist.

40:15

And then you have these people, too.

40:18

I mean, the chatbots are incredibly sycophantic, right?

40:21

Yes.

40:21

They tell you you're a genius.

40:22

Yeah, you're amazing.

40:23

And there was a couple cases.

40:25

These were kind of funny, of people who were convinced they'd solved some giant

40:30

mathematical problem,

40:32

like how to generate prime numbers up to the millionth place or something like

40:35

that.

40:36

And they started writing to mathematicians.

40:40

We figured out this problem.

40:42

They're not even mathematicians.

40:43

And it was bullshit.

40:45

I mean, they hadn't figured anything out.

40:46

But it was, I think, ChatGPT-4, which was famously sycophantic, that had

40:52

convinced them that they'd solved this major problem.

40:55

So, you know, I think that, again, we're squandering this precious gift and

41:02

letting these technologies essentially colonize our consciousness.

41:08

And so the question then becomes, how do we get it back?

41:12

How, you know, we need consciousness hygiene.

41:14

We need some ways to clear it out and reclaim it.

41:20

And, you know, some of it's really simple, like take a fast from technology,

41:24

right?

41:25

You know, you don't have to carry your phone everywhere.

41:27

We used to, I was thinking the other day, I was at the place in my neighborhood

41:33

getting a cup of coffee.

41:35

And, you know, while you're waiting for the barista to foam your drink, you

41:41

used to just deal with 90 seconds of boredom or two minutes of boredom.

41:47

And now we don't, we can't, we can't tolerate any boredom.

41:49

And we take our phones out and we scroll.

41:51

And, but that boredom was generative, right?

41:56

If you sit doing nothing for long enough, your mind will start going to work

42:00

and you'll daydream.

42:02

You'll have a fantasy.

42:03

You'll start observing the other people around you, you know, and, and you'll

42:08

be present to that place in time.

42:10

And now we're not, we just use the phone to go somewhere else.

42:14

And so I just, I don't know, I've become a lot more deliberate about

42:19

consciousness hygiene, which, you know, you could, a nicer word would be care

42:24

of the soul.

42:25

Yeah, no, I think you're absolutely accurate.

42:27

And I think that the, the other thing that's going on is you're absorbing the

42:32

opinions of so many other people that you find it very difficult to formulate

42:37

your own, which leads to groupthink.

42:39

Which is one of the problems with echo chambers that people find themselves in.

42:43

Your algorithm is essentially things that you're interested in interacting with.

42:48

It's amping up, yeah.

42:48

And a lot of those things you're finding like-minded people and they're all

42:52

agreeing that, you know, this is amazing or this is a problem.

42:55

And you sort of lock onto that.

42:57

And then you, you see what happens when people deviate from that narrative and

43:01

they get attacked.

43:02

Right.

43:03

You don't want to get attacked.

43:04

So you signal.

43:04

Exactly.

43:05

You're one of the good guys.

43:06

But you're not, but it's not your thoughts.

43:08

I mean, you're, you're, you're letting someone else think for you.

43:13

Yeah.

43:13

And there's nothing worse.

43:14

And, you know, when you're scrolling, you're, you know, you're, you've got

43:20

these little dopamine hits.

43:22

Great.

43:23

But that's someone else's rants, someone else's obsession, someone else's

43:27

ideology.

43:28

And, you know, I get why people don't want to think for themselves or it's

43:33

easier to let other people think for them.

43:36

But I think we need to reclaim this.

43:39

And I agree.

43:39

I think it's, it's, it's part of our political problem.

43:42

Well, I know there's a lightness that I achieve when I take, you know, multiple

43:47

days off.

43:48

It's generally like I feel it after the first day.

43:51

And then the second day I feel much better.

43:53

And then the third day I feel even better.

43:54

I found this out once I broke my phone in Hawaii.

43:57

And it was kind of funny.

43:59

Like it just was randomly calling people.

44:01

I dropped it.

44:02

And I was, I was showing my wife, I was like, look at this, just keeps calling

44:04

people.

44:05

Constant butt dials.

44:06

And I'm just holding it.

44:07

I hang up and call somebody else.

44:08

Hang up and call somebody.

44:09

It was like going through my entire contact list.

44:12

And so the phone was broken.

44:14

It must have been annoying your friends.

44:15

Well, no, I just shut it off.

44:17

So it was broken.

44:18

I couldn't use it for anything else.

44:19

So I couldn't get email.

44:20

I couldn't get anything.

44:21

So I shut it off.

44:21

I just left it in the hotel.

44:22

And then I had to order a phone and I was on Lanai and it took like three days

44:28

to get a phone delivered there.

44:29

So for those three days, I was like, why don't I just live like this all the

44:33

time?

44:33

I feel so much better.

44:35

And then immediately I got my phone.

44:36

I'm going to check Twitter.

44:37

I'm going to check Instagram.

44:38

You can't, it's very hard. You know, when I just decide, all right, I'm in

44:42

line, you know, the TSA line, and I'm just going to be here with

44:48

this boredom.

44:49

Yeah.

44:49

And I'm not going to pull my phone out.

44:51

And you really have to fight.

44:53

Yes.

44:53

It's such an instinct.

44:55

And it's amazing.

44:56

These things have only been around for 10 or 12 years.

44:58

It's crazy.

44:59

And everyone's attached to it.

45:00

I always say that if there was a drug that made you stare at your hand for six

45:03

hours a day, it would be banned immediately.

45:06

People would be like, what the fuck is wrong with these people?

45:08

They're just looking at their hand.

45:09

Like, this is an epidemic.

45:11

And it's a new posture too.

45:12

You see it.

45:13

Right.

45:13

Well, my, one of my kids, I went to pick her up at school and there was this

45:17

boy outside reading his phone, and he was hunched over and he was resting his

45:21

chin.

45:23

Like, he couldn't even hold his head up.

45:25

He was just resting his chin on his chest and staring at his phone, waiting for

45:28

his parents to pick him up.

45:29

I'm like, look at his neck.

45:30

Yeah, I know.

45:31

He's going to have a neck problem.

45:33

He's going to have like, well, he's got bulging discs or something.

45:36

Like, it was just bizarre.

45:38

I'm like, that would be painful for me to sit like that.

45:42

I wonder if orthopedists have diagnosed any kind of like phone spine.

45:46

Oh, they certainly have.

45:47

Yeah, they certainly have.

45:48

Yeah, there's been discussions about that, about people having pains in their

45:52

neck because they're leaning over all day staring at a phone.

45:56

It's a bad one.

45:57

I think being in nature too is another way.

45:59

I mean, just like walking.

46:01

Yeah.

46:02

There's a scientist I interviewed who's really interesting.

46:06

It's a woman named Kalina Christoff.

46:09

She's Bulgarian-Canadian.

46:10

And she studies spontaneous thought, which I didn't even think was a field.

46:14

And it's a small field.

46:16

But spontaneous thought is daydreaming, mind-wandering, fantasy, intuition,

46:22

these bolts from the blue that we get occasionally.

46:26

We don't know where they come from.

46:27

And she says, and she does these cool experiments.

46:32

You know, she'll put an experienced meditator in an fMRI machine and tell him

46:37

or her to press a button when a thought intrudes.

46:41

Because even if you're a good meditator, she says every 10 seconds a thought

46:44

intrudes.

46:45

And she'll look at what part of the brain is activated and when, when the

46:50

person presses the button.

46:53

And one of the things she's found, and this is mysterious, is that she sees

46:58

activity in the hippocampus, which is where memories are.

47:01

And some other things, but essentially memories.

47:04

Four seconds before the person realizes the thought has come.

47:11

So it takes, it takes four seconds for a thought to get from the subconscious,

47:16

you know, or unconscious into our conscious awareness.

47:21

What is it doing during that time? That's a, that's a long time in brain time.

47:24

And we don't know exactly, but there's some process.

47:27

And maybe there's some inhibitory process that it has to get through in order

47:33

to become conscious.

47:35

But anyway, these are the kind of things she works with.

47:37

But she says that we have less, there's less spontaneous thought going on today

47:41

than there was 20 years ago.

47:43

And the reason is we're filling our, our, the space of our head with all this

47:48

nonsense.

47:49

I wonder if it's going to have an impact on creative work.

47:52

I want to, I don't know if it's even possible to quantify this, but if you

47:56

could see how much creativity is generated by people pre and post social media.

48:04

Yeah. My guess is there's less of it because I do think that that process, I

48:09

don't know about you, but I get ideas when I'm just, you know, walking around

48:13

thinking and not online.

48:15

And, um, it's a space of creativity and we're shrinking it.

48:19

I told you that I used to drive, uh, and deliver newspapers.

48:22

We were talking about driving in the snow.

48:24

Um, one of my most creative periods was when my radio was broken.

48:30

So I was just driving, doing this task where you pick up a paper, fold it, put

48:35

it in a plastic bag, chuck it out the window.

48:38

And I was just doing this, checking things off, and when I was doing that, I

48:41

would have all my best ideas.

48:43

Like, cause I wasn't listening to, you know, morning radio.

48:46

I wasn't listening to a cassette tape.

48:49

I was just silence doing this thing.

48:52

And then I was so creative when I was doing that.

48:54

That's generative boredom.

48:56

Yes.

48:56

Um, yeah, it's a beneficial.

48:58

It's hugely, especially if there's no one around you, right?

49:01

Cause there's no one to talk to, to alleviate that boredom.

49:04

Yeah.

49:04

It's just you and your mind.

49:06

Yeah.

49:06

And it was a couple hours a day.

49:08

So a couple hours every day, I would have this moment where I was by myself.

49:11

And were you writing jokes?

49:12

What were you doing?

49:13

Yeah.

49:13

Yeah.

49:13

I would come up with ideas for jokes.

49:15

Some of my best ideas I ever came up with back then were from driving.

49:18

Yeah.

49:18

I almost didn't want to quit the job because of that.

49:21

Could still be doing it.

49:24

No, it was hell.

49:25

Yeah.

49:25

Cause it was.

49:26

Especially in the winter.

49:27

Yeah.

49:27

It was Boston.

49:28

It was, you know, I'd have to go about five o'clock in the morning every day.

49:31

It was rough.

49:32

I find walking is where that happens to me.

49:35

Same thing, right?

49:36

Yeah.

49:37

And actually, Kalina says, I mean, there are people who studied creative people

49:43

through history.

49:45

You know, people like Einstein and Beethoven and all these, you know, major

49:51

creative people in the sciences and in the arts.

49:54

And they worked a short day, but they spent a lot of time walking.

49:59

Interesting.

50:00

And, yeah, they'd work like three or four hours, which is about all I can write

50:05

in a day.

50:05

And then they'd take a long walk in the afternoon.

50:08

They also took a lot of vacations.

50:10

They had a lot of unstructured time.

50:12

And that that's where a lot of the creativity comes.

50:14

It doesn't always come when you're, like, at the keyboard.

50:17

It sometimes comes.

50:19

I mean, certainly solving problems.

50:21

If I'm really knotted up and I don't know, for me, transitions, like, where do

50:26

I go from here?

50:27

Since I'm not writing narrative, it's not always obvious.

50:30

You know, I need a transition.

50:32

And I don't know how to execute that turn.

50:36

I'll take a walk.

50:38

And very often it'll come to me.

50:40

Or I'll wake up with the answer.

51:49

A lot of writers like to write first and then walk.

51:53

And maybe even with a recorder.

51:54

So they can just walk and just talk when an idea pops in their head so they don't

51:58

lose it.

51:59

Yeah, I have a little pad I carry with me.

52:01

Yeah.

52:01

You like writing it down better than recording it?

52:04

Yeah, for me.

52:05

Yeah, I need to see it.

52:08

So another interesting experiment I did for this book was this beeper

52:14

experiment.

52:16

There was a scientist, a psychologist at the University of Nevada, Las Vegas.

52:22

And for 50 years, he's been doing the same one experiment, which is sampling

52:26

people's inner experience.

52:28

And he does this.

52:30

You have a beeper that you carry around and a little earpiece.

52:34

And at random times of the day, you get this beep, and it, like, catches you.

52:40

And it's a very sudden rise to this beep.

52:42

And then you have a little pad and you're supposed to write down what you were

52:45

thinking.

52:45

Sounds really simple.

52:47

It's actually really hard.

52:48

I mean, there's a lot of issues with it.

52:50

Like you start thinking, what if it goes off now?

52:54

That's one problem.

52:57

But also you're a little self-conscious.

52:59

So you do about five beeps over the course of the day.

53:02

And then he interviews you about these moments.

53:06

And you think you've got it down.

53:10

Like, I'll just give you an example. A lot of my beeps were about food.

53:13

And so I was seasoning a filet of salmon and walking to the refrigerator with

53:20

it.

53:21

And just at the, I was thinking to myself, fuck, I forgot the pepper.

53:27

I know.

53:29

My thoughts were not that profound.

53:31

And so I said, all right, pepper.

53:34

It was easy.

53:35

Fuck, pepper.

53:36

But then when he came to interview me, he said, well, did you hear the word

53:40

pepper or did you speak the word pepper?

53:44

And that's, that's, you know, suddenly you realize those voices in your head,

53:47

you don't know if you're listening or speaking.

53:49

And so anyway, you have this long interrogation with him and he sorts through

53:53

all these things.

53:54

And he tries to get you to isolate what was before what he would call the footlights

53:58

of consciousness.

53:59

And I found it really hard.

54:02

I couldn't separate the thought the way he wanted me to because it was, there

54:06

were always several things going on at once.

54:09

Like I was standing in a bakery and I was deciding whether to buy a roll or not.

54:15

Another profound thought.

54:16

And, but at the same time, I was like smelling the baked goods and the cheeses

54:21

that they sold.

54:23

And this woman had this horrible plaid on her skirt that was like, you know,

54:27

really unflattering.

54:28

And, and I was hearing people, you know, behind me talking.

54:32

And so I couldn't pull, I couldn't pull apart all the threads.

54:36

And, and we argued a lot, actually.

54:39

Um, but the, the thing he's discussed, I said, so after 50 years, what have you

54:43

learned about human thought?

54:45

And, um, he's very allergic to theory.

54:48

He, he, he still has no theories about it, but he did, he did say, well, a lot

54:53

of people think they're verbal thinkers, that, that their thoughts are in the

54:56

form of words.

54:57

But it turns out that's kind of a minority, um, that there are a lot of people

55:01

who think in images.

55:03

And then there are a lot of people who think in unsymbolized thought, which I

55:07

don't totally understand.

55:08

But these are thoughts that are neither words nor images.

55:11

I do have a sense in my own thought process, which I'd never thought about this

55:16

way, that, um, a lot of my thoughts are just on the verge of being word

55:22

thoughts.

55:23

But I haven't found the words yet, but I know the thought, even though I haven't

55:27

put it into words.

55:28

And, um, uh, William James called it, uh, uh, uh, premonitory thinking, premonition

55:36

thinking.

55:37

It was the term he used.

55:38

Um, so anyway, so we, I, so I did this for several days and we had many

55:43

arguments.

55:44

And I was saying, look, you can't separate a thought.

55:47

Every thought colors the next thought.

55:48

And, um, there, you know, there are these, a thought and you never have, anyway,

55:55

we just would go back and forth.

55:56

And I was arguing why you can't separate thoughts.

55:59

It's a stream.

56:00

It's a very dynamic stream.

56:01

And at the end, we had a final session.

56:05

Um, and he's, he's a very funny guy.

56:08

Uh, he's really allergic to theories.

56:10

He, at one point, uh, I said I was writing a book on consciousness.

56:13

And he said, good luck with that.

56:15

Very encouraging.

56:18

Anyway, um, he said, well, he described there, these verbal thinkers and visual

56:23

thinkers and unsymbolized thinkers.

56:25

And I find that really interesting because we assume when we say the words, "what

56:30

are you thinking?" that we know, and that you're thinking the way I'm thinking.

56:33

But it turns out we're not.

56:34

We, that's just an umbrella word for many different styles of thinking.

56:38

And, and we're really different.

56:41

Um, so that was one thing.

56:42

But the other thing he said in our last meeting on Zoom, he said, um, there's

56:46

also a small subset of people who just have very little inner life.

56:50

And you're one of them.

56:53

And I was like, what?

56:56

You know, I write books, you know, I, I meditate, I ruminate.

57:01

How can he make that distinction though?

57:03

How does he know what's going on inside your head?

57:05

He felt that my inability to isolate a thought was evidence that there weren't

57:11

thoughts.

57:12

And then I was kind of backfilling with all this other, you know, simultaneous

57:16

stuff going on.

57:17

I mean, I, I didn't agree with him.

57:19

I thought it was kind of crazy.

57:21

Um, but that's, that's, that's right.

57:24

Have you asked him, have you ever had conversations with him about other things?

57:27

See how he thinks?

57:29

No, he's very much in the therapist mode.

57:32

Like he's asking the questions.

57:33

Yeah.

57:34

I'd like to know like how he thinks.

57:36

Yeah.

57:36

If that's the case.

57:37

What his mode is.

57:37

Yeah.

57:38

I'd like to talk to him about it.

57:39

I think he would probably say that.

57:40

Um, anyway, he's posted all these conversations on his website.

57:43

So if people really want to be bored, they can check them out.

57:46

That's a weird thing to say that, you know, especially someone like you who

57:51

writes and does

57:52

think a lot and clearly has got some sort of dialogue going on in your head.

57:58

The idea that you don't.

57:59

I know.

57:59

And this guy can say that.

58:01

I know.

58:01

That seems a little arrogant.

58:03

Yeah.

58:04

I think I just didn't fit his template of like how people think.

58:08

Yeah.

58:09

Well, that's why you should get a better therapist.

58:11

Move around.

58:12

All right.

58:14

Find somebody else.

58:15

Good advice.

58:15

I mean, it seems like that's a very narrow.

58:17

I couldn't imagine saying to anyone.

58:21

Yeah.

58:21

You have very little inner life.

58:22

Yeah.

58:23

Regardless of what kind of, you know, theory I'm following or, you know, what

58:27

school of thought.

58:28

I don't know what's going on in your head.

58:31

I can't.

58:32

It's not possible.

58:33

No, I mean, that's it.

58:34

There's a William James said this, the great, you know, founder of American

58:38

psychology, that

58:39

the breach between two consciousnesses is one of the biggest breaches in nature.

58:43

Yes.

58:43

And we, you know, I don't know you're conscious for a fact.

58:47

I assume it because your behaviors mesh and we're the same species and we have

58:53

theory of

58:54

mind.

58:54

We can imagine our way into someone else's head, but it's a guess.

58:58

It's a guess.

58:59

And so there's, I mean, that's part of the mystery.

59:03

Well, it's one of the things that I do when I'm talking to people.

59:05

I try to imagine.

59:07

Well, I'm so fortunate that I've been able to have so many conversations with

59:11

so many different

59:12

people, so many different ways that people view the world.

59:15

And when I'm talking to someone, particularly if they're very different from me

59:20

or anyone

59:20

I know, I always try to put myself in their head.

59:23

And I, after they talk for 15 or 20 minutes, I try to like recognize like how

59:30

they approach

59:31

things and see if I'm like, what is that?

59:34

What's that world like?

59:36

Like this person's perspective.

59:38

Especially.

59:38

So you're operating on two tracks.

59:39

Yeah.

59:40

I mean, you're holding the conversation.

59:42

Yeah.

59:42

But you're also thinking.

59:44

I'm trying to tune in.

59:45

Yeah.

59:46

Right.

59:46

I'm trying to, because I always feel like when someone is like a great

59:50

performance, like

59:51

a great comedian or a great musician, one of the things that they're doing is

59:54

they're

59:55

bringing you into their head.

59:56

Yeah.

59:57

Like there's a, there's a hypnosis when someone sings an amazing song and the

1:00:01

whole crowd is

1:00:01

singing along, there's, there's a hypnotic element to that where when someone's

1:00:06

like really

1:00:07

killing it on stage, their voice is just perfect.

1:00:09

It's like, oh yeah.

1:00:10

Like you're in their head.

1:00:11

Like it's, it's a, it's a mind meld.

1:00:14

It's a, yeah, it is a mind meld.

1:00:16

And there's a little bit of that that goes on in conversations.

1:00:18

There's a mind meld.

1:00:20

And I always try, especially if this is a rational person.

1:00:25

I always try to put myself in their head or at least empty out mine and let

1:00:31

them think

1:00:32

and then try to just keep the conversation rolling with just pure curiosity,

1:00:38

but always,

1:00:39

you know, try to think, I don't think the same way other people do.

1:00:44

And maybe, maybe I can learn something from this.

1:00:46

Maybe I can get something out of the way they think.

1:00:49

Seems to me you're, you're, you have a real gift of curiosity.

1:00:54

Um, I mean, that's a, it's a big gift.

1:00:56

I mean, you're an intensely curious person.

1:00:58

Well, I've always been that way, but I've been very fortunate that I've had

1:01:01

something like

1:01:02

this that allowed me to feed it.

1:01:04

Yeah.

1:01:04

You know, I mean, the, the vast majority of time on my phone, I just pursue

1:01:10

curiosities.

1:01:11

I don't, I really am mostly.

1:01:14

I've been looking stuff up.

1:01:15

Social media.

1:01:15

Yeah.

1:01:16

Yeah.

1:01:16

I've watched interesting YouTube videos.

1:01:18

Like I, I went down a black hole rabbit hole last night.

1:01:21

Oh my God.

1:01:22

You want to really break your brain?

1:01:24

There was a, there's a video of Brian Cox where he's talking about this black

1:01:27

hole that

1:01:27

they found that's bigger than our entire solar system.

1:01:30

Wow.

1:01:31

It, the event horizon extends far beyond Pluto.

1:01:35

That's, that is mind blowing.

1:01:39

Yeah.

1:01:40

When he was describing it, he said, we don't understand why it exists.

1:01:44

We don't understand how it could have formed so early in the universe, but yet

1:01:47

there it is.

1:01:48

How do they measure it?

1:01:49

How do they know how big it is?

1:01:50

I don't know.

1:01:50

I don't know.

1:01:52

I'm assuming there's a lot of revelations that have come out, uh, since the

1:01:56

implementation

1:01:57

of the James Webb telescope.

1:01:58

Yeah.

1:01:59

And this is, those images are incredible.

1:02:01

Insane.

1:02:02

Yeah.

1:02:02

Insane.

1:02:03

And this is one that's causing this very interesting, um, new, uh, theory or

1:02:09

perspective on the age

1:02:11

of the universe.

1:02:12

So there's some galaxies that they've found that shouldn't have formed.

1:02:16

Oh yeah.

1:02:16

Yeah.

1:02:16

I've read about this, that it's, it's, it's throwing all their assumptions

1:02:20

about the age

1:02:21

of the universe up for grabs.

1:02:22

Which makes sense.

1:02:23

Cause the further you can look back, the more you're going to be able to see.

1:02:26

The assumption

1:02:26

that the universe was 13.7 billion years old was essentially based on how far

1:02:31

we could

1:02:31

look back.

1:02:32

Yeah.

1:02:32

And then, you know, the analysis of the, the radio waves that are coming from

1:02:37

the supposed

1:02:37

explosion.

1:02:38

And then you've got guys like Sir Roger Penrose who say, no, it's, this is a

1:02:42

constant cycle.

1:02:43

It's not one birth of the universe.

1:02:46

It's, it's boom, smash, boom, smash forever.

1:02:49

It's an accordion.

1:02:50

And it's always happened, which is the ultimate mind fuck.

1:02:53

Well, you know, the interesting thing about astronomy, actually astronomy and

1:02:58

consciousness

1:02:59

studies have the same problem, which is you can't get out of consciousness to

1:03:05

study it from

1:03:06

a distance, right?

1:03:07

Everything, every tool you have to study consciousness is a product of

1:03:10

consciousness, including science.

1:03:13

The scientific enterprise is a manifestation of human consciousness.

1:03:17

The, the, the, the problems you decide to study, the, the tools you have to do

1:03:21

it with the scale

1:03:22

at which you're working, it's all like a product of consciousness.

1:03:26

Astronomy too has to, is trying to understand something it can't get outside of,

1:03:32

right?

1:03:32

I mean, cause its subject is everything that there is: the universe.

1:03:36

So you can do interesting things from inside using telescopes and, you know,

1:03:42

you can figure

1:03:42

out how old things are and, and rates of expansion and all this kind of stuff.

1:03:46

But you're, you can never get that God-like perspective that we have with other

1:03:51

scientific

1:03:51

problems.

1:03:52

And this is, I think, part of the reason we haven't solved the consciousness

1:03:57

problem that

1:03:59

we can't get outside.

1:04:01

We're in, it's in, we're in a labyrinth and everything, everything we know is

1:04:05

consciousness.

1:04:06

I mean, which is a very weird idea.

1:04:08

I remember asking, uh, Christof Koch, the scientist I mentioned earlier, I

1:04:12

said, well, what

1:04:14

would the world be like without any consciousness?

1:04:16

And that is a trippy thought, um, because everything we perceive is, you know,

1:04:23

the scale of things

1:04:24

like we, we, we operate at this scale, right?

1:04:27

We're like five or six feet tall.

1:04:28

Um, we have bodies like this, but there's another world going on microscopically

1:04:33

and there's another

1:04:34

world going on macroscopically.

1:04:36

So if there's no consciousness, what's the proper scale?

1:04:38

There isn't any.

1:04:40

And when I asked him this question, he said, particles and waves, that's all

1:04:44

there is.

1:04:44

There'd be nothing but particles and waves.

1:04:46

There might not even be space time.

1:04:48

That may be a product of consciousness also.

1:04:51

So that was, um, kind of mind blowing to learn.

1:04:56

That's the weirdest perspective is that consciousness is a part of reality,

1:05:01

that it is how reality

1:05:04

is formed, and that without consciousness and the perceiving, all of this stuff

1:05:08

doesn't exist.

1:05:09

Something exists, but it's not, it has no shape.

1:05:14

It has no scale.

1:05:16

It has no, right.

1:05:17

Uh, because consciousness is what's perceiving light and we're perceiving

1:05:21

colors and, and it's

1:05:23

constructing, but it really is just particles.

1:05:24

Yeah.

1:05:25

And waves and waves and particles and atoms and subatomic particles.

1:05:29

And when you get into the weirder stuff and we give it order.

1:05:32

Right.

1:05:33

I know which I, you know, it's just a mind blowing idea.

1:05:36

Well, it, it's a, it really is a game changer because if you think about it

1:05:41

that way, you

1:05:41

go, okay, well, what is all this solid stuff?

1:05:44

Yeah.

1:05:45

What is this?

1:05:46

Like, does this even really exist or does it only exist?

1:05:49

Well, this table, this, there's a famous, uh, Arthur Eddington was a physicist

1:05:53

early in

1:05:54

the 20th century and he said, the real table is mostly space and only in our

1:06:01

consciousness and

1:06:03

at our scale is it solid.

1:06:06

Um, but at the scale of particle physics, which is an equally legitimate scale,

1:06:12

it's just wide

1:06:13

open space, um, with these waves and particles, but a lot of emptiness, um,

1:06:19

that was kind of

1:06:20

mind blowing too.

1:06:21

So.

1:06:22

Yeah.

1:06:23

It's just such an abstract concept for a person in their car right now,

1:06:26

listening on the way

1:06:27

to work.

1:06:28

Like what the fuck are you talking about?

1:06:29

Maybe they want to pull over.

1:06:30

All this stuff is real.

1:06:31

Yeah.

1:06:32

It is sort of, but only if you're conscious, well, you could think of

1:06:38

consciousness as the

1:06:39

way the universe experiences itself.

1:06:42

Yeah.

1:06:43

And, um, well, that's what really weird, like what if the universe is

1:06:46

consciousness?

1:06:47

Yeah.

1:06:48

I mean, that's another way to look at it.

1:06:49

Maybe consciousness is part of the universe and, and, but it's not giving it

1:06:53

the order that

1:06:54

we give it.

1:06:55

Um, you know, we see at a certain spectrum of light, there's, you know, bees

1:06:58

see at another

1:06:59

spectrum of light.

1:07:00

You know, we're, we are, the world we behold, the world that appears to us is

1:07:05

the world that

1:07:06

our senses allow us to see.

1:07:09

When I was doing this research on plant intelligence, they have 20 senses.

1:07:12

We only have five.

1:07:14

They're picking up magnetic fields.

1:07:16

They're picking up pH.

1:07:17

They're picking up, uh, nitrogen levels.

1:07:19

You know, they have all these-

1:07:20

How do we know all this?

1:07:21

Uh, they're researchers working on it.

1:07:24

There's a group of botanists who call themselves plant neurobiologists.

1:07:29

Knowing full well, there are no neurons in plants.

1:07:31

They're kind of trolling more conventional botanists.

1:07:34

And they're doing these cool experiments with, with plants.

1:07:37

Um, a couple examples of, of some of these amazing things plants can do.

1:07:43

They can hear.

1:07:44

Uh, so if you play a recording of a caterpillar munching on leaves, they'll

1:07:50

react and they'll

1:07:51

send chemicals into their leaves to make them taste bad or be toxic.

1:07:55

Yeah.

1:07:56

They can see there are, um, there are vines that change their, the shape of

1:08:01

their leaves, depending

1:08:03

on the plant they're twining up in order to be hidden.

1:08:08

How do they see the shape and to imitate it?

1:08:10

We don't know.

1:08:11

They, um, plants will, um, go toward a pipe with water in it because they can

1:08:18

hear the water,

1:08:20

even though it's totally dry and they'll send their, um, their roots down to it.

1:08:24

They can hear the water.

1:08:26

They can hear.

1:08:27

Yeah.

1:08:28

There, there's a, uh, this plant neurobiologist showed me this, uh, a couple of

1:08:33

videos he'd

1:08:34

made.

1:08:35

I actually just posted them on my website.

1:08:36

Um, uh, he, he showed that a, uh, a corn plant's roots can navigate a maze to

1:08:43

get to fertilizer.

1:08:44

So you put a little fertilizer in a corner and the root will find the most

1:08:49

direct route to

1:08:51

the nitrogen.

1:08:52

There was a, uh, plumbing problem that I had in my house in California and, um,

1:08:57

uh, the

1:08:59

plumber couldn't figure out what was wrong.

1:09:01

It was like the, the, the pipes were stuck.

1:09:04

And what, what had happened was in the backyard, one of the trees, the roots

1:09:09

had gotten into

1:09:11

the pipe and formed like this tree.

1:09:15

I mean, it was huge.

1:09:16

It looked like when I pulled it, I put it up on my Instagram, see if you can

1:09:19

find it.

1:09:19

It looked like a muskrat.

1:09:22

I mean, it was like dense with roots and it was thick.

1:09:27

It was like three feet long.

1:09:29

It was crazy.

1:09:30

That's it.

1:09:31

That was in my pipe.

1:09:32

Oh my God.

1:09:33

Isn't that crazy?

1:09:34

What kind of tree was it?

1:09:36

I don't know.

1:09:38

I think it was an oak tree cause there was oak trees, excuse me, in the

1:09:41

backyard where they

1:09:42

dug up.

1:09:43

That's wild.

1:09:44

But look how thick it is.

1:09:45

It's crazy.

1:09:46

It's, and it went through a tiny little crack.

1:09:48

Yeah.

1:09:49

It, I mean, it probably forced the crack open and then went in there and just

1:09:53

really grew

1:09:54

out.

1:09:55

Yeah.

1:09:56

Well, it had a source of water.

1:09:57

It's just kind of bananas that somehow or another had figured out that there

1:10:01

was water

1:10:01

in that pipe.

1:10:02

You know, we underestimate plants basically cause we can't see their behaviors.

1:10:07

And then going to that point about scale there, they have a, they operate at a

1:10:11

time scale that

1:10:11

seems very slow to us.

1:10:14

So we don't notice, but if you use time-lapse photography, you see what they're

1:10:17

up to.

1:10:17

And it's, it's pretty amazing.

1:10:19

Another, another interesting video that this guy showed me, his name is Stefano

1:10:23

Mancuso.

1:10:24

He's an Italian scientist botanist is, um, uh, how bean plants find a pole to

1:10:30

grow up.

1:10:31

And so he grows these beans and he has a metal pole on a dolly.

1:10:35

And you know, I always assume they made this pattern.

1:10:38

Darwin called it circumnutation that, you know, they, they go through the

1:10:42

spiral and I always

1:10:43

assume they just kind of did this till they hit something.

1:10:46

No, they know where the pole is and you watch this thing and it's, it's going

1:10:52

in circles,

1:10:53

but it's reaching and reaching.

1:10:55

It looks like a fly fisherman, you know, casting and it finally gets to the

1:11:00

pole.

1:11:01

And so how does it know where the pole is in space?

1:11:04

Well, one theory is that, um, every time, uh, the cells divide, there's a

1:11:09

little sound that's

1:11:11

produced and that maybe they're using echolocation like a bat kind of bouncing

1:11:16

it off of the pole.

1:11:17

And that's how they know where they are in space.

1:11:19

We, we still don't understand.

1:11:22

I know some amazing things.

1:11:24

Um, and also you can, uh, teach a plant a certain behavior and it will remember

1:11:31

for 28 days.

1:11:32

So they do this thing with, um, sensitive plants.

1:11:36

You, you may have seen them in Hawaii.

1:11:38

Actually, it's a tropical plant.

1:11:39

When you touch it, the leaves collapse to keep from being eaten.

1:11:42

It's called, uh, Mimosa pudica.

1:11:45

And, um, normally if you shake it, it'll also do this.

1:11:49

And if you shake it repeatedly, it learns to ignore that, that, um, stimulus,

1:11:55

um, and it

1:11:56

will remember 28 days and it won't react when you do it.

1:11:59

Um, to, to give you some comparison, um, fruit flies can only remember stuff

1:12:04

for 24 hours.

1:12:06

Um, and then they start over again.

1:12:09

Um, so another fact about plants, I got really deep into this, um, cause I was

1:12:14

trying to see,

1:12:15

you know, these, these guys say plants are conscious.

1:12:17

Yeah, they have some kind of basic form of conscience, consciousness, um, here's

1:12:24

another

1:12:24

one.

1:12:26

The anesthetics that we use to put us out for surgery, put plants out.

1:12:32

So a, uh, a, um, Venus flytrap, if you give it an anesthetic will not react

1:12:37

when the bug

1:12:38

comes across it.

1:12:41

Now that is like really interesting cause it suggests they have two modes of

1:12:44

being, right?

1:12:45

Sort of like, you know, unconscious and conscious or aware.

1:12:50

Um, so Stefano believes that they're, uh, conscious.

1:12:54

Now this raises interesting ethical issues, right?

1:12:58

If plants are conscious, do they feel pain?

1:13:03

And that, I was really a little worried about that.

1:13:05

Um, you know, what if that beautiful smell of a freshly mown lawn is actually a

1:13:13

chemical equivalent

1:13:14

of a scream?

1:13:16

Yeah.

1:13:17

Um, but Stefano said he doesn't think they feel pain.

1:13:21

Um, why does he think that?

1:13:23

He said that pain would not be adaptive for a creature that can't run away.

1:13:27

Well, if that's the case, then why do they produce chemicals to make themselves

1:13:31

taste worse?

1:13:31

They, they know, they know what's going on.

1:13:34

They're aware that they're being eaten, but that it doesn't register to them as

1:13:38

pain.

1:13:39

I don't know how he knows this, but if he's wrong, then, you know, and we care

1:13:46

about that.

1:13:47

What's left to eat?

1:13:49

I think you have to make the assumption that life eats life.

1:13:54

Yeah.

1:13:55

And that, and another scientist, um, uh, um, that I interviewed, uh, about this,

1:14:00

who does

1:14:01

think plants feel pain says, look, it's just a fact of life.

1:14:04

We have to eat other species.

1:14:06

Um, and, um, he was kind of, you know, gruff about that.

1:14:09

Um, but anyway, Stefano's idea is that, uh, you know, being able to move, take

1:14:14

your hand

1:14:15

off the hot stove or run away, um, then pain is really useful.

1:14:21

It's a really important signal, but he, but he also points out that lots of

1:14:24

plants like

1:14:25

to be eaten.

1:14:26

I mean, you know, grasses benefit from being with a ruminant, right?

1:14:29

And that regenerates them.

1:14:31

They want to be eaten.

1:14:32

And then you have all the fruits and nuts, the seeds that they produce,

1:14:36

that

1:14:36

they want mammals to take away and spread their seeds.

1:14:39

So you don't have to worry about, um, going beyond vegan.

1:14:43

No.

1:14:44

Well, it just seems like a cycle.

1:14:46

It seems like a very interesting cycle that exists with all living things.

1:14:51

Yeah.

1:14:52

And then of course, when you die, right, the, you know, plants eat meat.

1:14:56

Yeah.

1:14:57

They, they consume.

1:14:58

They're carnivores.

1:14:59

Yeah.

1:15:00

That's the thing.

1:15:01

Yeah.

1:15:02

Yeah.

1:15:03

And, and, uh, fungi.

1:15:04

Yeah.

1:15:05

And fungi.

1:15:06

Well, that's the other weird things.

1:15:07

The mycelium that they use to communicate with.

1:15:09

Well, that's another really interesting case of intelligence in nature.

1:15:13

Right.

1:15:14

I mean, you know, you've probably done shows on this, but you know, the way

1:15:16

they, they,

1:15:18

uh, use mycelium to send nutrients to their children, um, or, or share them in

1:15:23

the forest.

1:15:24

Um, allocate resources to certain plants that need them more.

1:15:27

Yeah.

1:15:28

Yeah.

1:15:29

Yeah.

1:15:30

And also could communicate risk.

1:15:31

Yeah.

1:15:32

I mean, that, that there's a threat.

1:15:33

Um, and, and so there are alarm signals that go out.

1:15:35

Um, you know, the, the, the, the overall place we're getting to with this, as

1:15:39

we look

1:15:40

at consciousness and all these other species is that it's, the world is just a

1:15:44

lot more alive

1:15:45

than we thought.

1:15:46

And that we've been, you know, the whole legacy of the enlightenment and

1:15:49

Western science has

1:15:51

been that, like, we have some monopoly on, on this stuff and everything else is

1:15:56

more or

1:15:56

less dead or, you know, we can use it as we wish.

1:15:59

But we're seeing, I think we're approaching like a Copernican moment for our

1:16:05

species.

1:16:06

Um, you know, when Copernicus came along and he said, actually the earth

1:16:11

revolves around

1:16:12

the sun, not the other way around.

1:16:13

It was like mind blowing to people that our centrality in the universe had been,

1:16:18

we'd been

1:16:18

dethroned and we were dethroned again when, you know, Darwin said we're

1:16:23

produced, we're

1:16:24

animals like all the other animals.

1:16:26

And we evolved, um, from animals that blew people's minds too.

1:16:31

I think that we are, we're kind of democratizing consciousness, that

1:16:35

consciousness is, is much

1:16:37

more extensive than we thought.

1:16:39

And the world is more animate than we thought.

1:16:43

And that's an old idea.

1:16:45

You know, traditional cultures have always believed that the world is full of

1:16:48

spirit and

1:16:49

that you had to respect animals and, um, and all living things.

1:16:53

And, and some, to some cultures rocks also, you know, dead things.

1:16:57

Um, so I, I think we're at this moment of reanimating the world right now and

1:17:03

it's science

1:17:03

that's driving it.

1:17:04

And, um, I think that's really exciting.

1:17:07

It is exciting, but it's such a paradigm shift in terms of people's perceptions

1:17:12

of the world

1:17:13

that it's going to be difficult for like your average 40 year old person that

1:17:18

works an office

1:17:19

job to swallow.

1:17:20

Yeah.

1:17:21

Yeah.

1:17:22

Which also makes sense why offices feel so soulless when you walk into one

1:17:26

and everything

1:17:27

is made out of synthetic material and plastics and metal and it's all

1:17:32

manufactured and you're

1:17:34

under these bullshit lights and it just feels wrong.

1:17:37

Doesn't feel alive.

1:17:38

No.

1:17:39

Doesn't feel alive at all.

1:17:40

It feels, you might be just surrounded by things that don't have consciousness.

1:17:43

Yeah.

1:17:44

Cause they've been kind of stuffed into a form and then stuck in place rather

1:17:49

than something

1:17:49

that exists that works with the earth.

1:17:52

Yeah.

1:17:53

Like soil is alive.

1:17:54

Right.

1:17:55

Yeah.

1:17:56

So, and yeah, there's another example.

1:17:57

Soil is a lot more alive than we ever realized.

1:17:59

We, we thought it was just dirt.

1:18:00

Right.

1:18:01

And now we know that there are, you know, a million critters in every teaspoon.

1:18:05

There's a really cool, um, channel that I follow on YouTube.

1:18:09

It's a guy who takes like rainwater or pond water and he puts it in a jar with

1:18:14

some plants

1:18:15

and he just leaves it there for months.

1:18:17

And then he comes back and there's all these living things moving around it.

1:18:21

See if you can find that guy on, on YouTube.

1:18:24

It's fascinating.

1:18:25

So I, I dug a pond or had a pond dug on my property in Connecticut and, and I

1:18:30

watched

1:18:30

life come to this pond.

1:18:31

It's just, you know, it was just a hole with water.

1:18:34

And within a month it was teeming with life.

1:18:37

It's just amazing.

1:18:38

Like, how does it get there?

1:18:39

Right.

1:18:40

Birds carry a lot of it in.

1:18:41

Yeah.

1:18:42

And frogs carry a lot of it in.

1:18:43

And I, and I, after a month or two, I looked at it under a microscope and you

1:18:46

couldn't believe

1:18:47

it was like a city of critters.

1:18:49

Um, it was kind of amazing.

1:18:50

Yeah.

1:18:51

They find like trout in lakes that are like way high in the mountains and no one

1:18:55

ever stocked

1:18:55

the lake.

1:18:56

No.

1:18:57

And they're like, okay, how did it get in there?

1:18:58

There's all these theories.

1:18:59

Birds.

1:19:00

Uh huh.

1:19:01

Pick up eggs and deposit them.

1:19:03

I guess, is one way.

1:19:04

Right.

1:19:05

But like, how do they get fertilized?

1:19:06

That's a good question.

1:19:08

Maybe they're already fertilized.

1:19:10

Do you think?

1:19:11

I don't know.

1:19:12

Yes.

1:19:13

That's it.

1:19:14

These have lots of views.

1:19:15

Yeah, that's it.

1:19:16

That's it.

1:19:17

Wait, on the left?

1:19:18

I don't know.

1:19:19

I don't know.

1:19:20

Find a specific one.

1:19:21

So this guy, he just takes pond water or lake water or rainwater and he puts it

1:19:26

in a

1:19:26

jar and then he leaves it there.

1:19:27

Yeah, it's like go to like day 60.

1:19:29

Where is that?

1:19:30

Sorry.

1:19:31

Sorry.

1:19:32

On the top, top row where it says day 60 to the right.

1:19:36

See where it says day 60?

1:19:37

Click on that.

1:19:38

So he takes these things and then searches them after, you know, X amount of

1:19:44

days and you

1:19:45

see all this stuff living in there.

1:19:47

All these things swimming around in there.

1:19:50

This isn't the same guy.

1:19:51

So there must be other guys that do the same thing, but you see these weird

1:19:54

little creatures

1:19:56

that are floating around in there and.

1:19:58

Yeah.

1:19:59

I brought my pond water to a biologist and he like.

1:20:01

Well, this is different because this guy's bringing in, he's making an actual

1:20:05

aquarium.

1:20:05

Yeah.

1:20:06

The guy that I saw was just, he essentially just figured out how to take a

1:20:11

scoop of dirt

1:20:12

and whatever is alive that's in that dirt with some muddy water and put it in a

1:20:17

jar and

1:20:17

put more pond water in there and then just leave it there.

1:20:20

And then you see all these weird little, little crustaceans, weird little

1:20:25

shrimp looking things.

1:20:27

Yeah.

1:20:28

And some of them are killing the other ones.

1:20:29

So there's like a real ecosystem in there.

1:20:31

Oh yeah.

1:20:32

Very.

1:20:33

Yeah.

1:20:34

And it's just created like overnight.

1:20:35

Yeah.

1:20:36

It's very cool.

1:20:37

So I think that this is like a trend of our time.

1:20:39

That's really important that, you know, we went from this idea of the dead

1:20:43

world that

1:20:43

we could exploit to this other, you know, idea that it's much more animate.

1:20:48

And, and of course that's not, that's the default for humans.

1:20:51

All traditional cultures believe in animism basically.

1:20:55

Um, it's also the default for kids, right?

1:20:58

Kids think everything is animate until we knock it out of them in school.

1:21:02

Yeah.

1:21:03

And so it's very interesting to see science supporting this idea after, after

1:21:09

all these years.

1:21:09

And the other thing that's kind of interesting is that it's happening at the

1:21:14

same time that

1:21:16

some people think AI is going to be conscious.

1:21:19

So we're under pressure from both sides.

1:21:22

I mean that we're getting these two, you know, these two things happening at

1:21:27

once.

1:21:27

That machines may be, may soon be smarter than we are, may be conscious.

1:21:32

Although we could talk about it.

1:21:33

I don't think they can be conscious, but they can certainly make us think they're

1:21:37

conscious.

1:21:37

Um, and then on the other hand, we have the animals who, it turns out, clearly are

1:21:43

conscious.

1:21:43

And the research on animals is like they're down to plants.

1:21:47

They're down to insects that, you know, have signs of, I would use the word

1:21:52

sentience rather than consciousness.

1:21:53

Cause consciousness implies interiority and, and, you know, um, the, the voice

1:21:59

in your head and things like that.

1:21:59

They have a more basic form of consciousness that I call sentience.

1:22:03

Like dog consciousness.

1:22:05

Yeah.

1:22:06

I think dogs are higher conscious.

1:22:08

I think they're more conscious than, uh, than those simple things.

1:22:11

I w I would say dogs are conscious, not just sentient.

1:22:14

Um, is it just because they communicate with us that we think that, I mean, why

1:22:19

would we assume if plants have all these different senses and we see this

1:22:22

communication with them in terms of like allocating resources to other plants

1:22:26

that need it through the use of mycelium, their ability to do all these different things.

1:22:29

Why, why are we assuming that just because they can't move the way we move?

1:22:34

Yeah.

1:22:35

That they don't have more going on.

1:22:36

Right.

1:22:37

Yeah.

1:22:38

It's, it's possible, but I don't know what, what good it would do them.

1:22:40

Like plants, what they get really good at, what matters to them is biochemistry.

1:22:45

They have to produce chemicals either to, um, poison their enemies or, or

1:22:50

confuse them with, you know, with drugs.

1:22:52

Um, but they also want to grow and thrive.

1:22:55

They do want to grow and thrive.

1:22:56

And they also exist in a community.

1:22:58

Yes.

1:22:59

Oh, definitely.

1:23:00

Right.

1:23:01

So don't you think that consciousness would be essential in order to foster

1:23:06

that feeling of community?

1:23:07

That's interesting.

1:23:08

I hadn't thought about that.

1:23:09

Yeah.

1:23:10

Yeah.

1:23:11

That could be.

1:23:12

Dogs are easy, an easier case because they communicate with us.

1:23:15

Right.

1:23:16

Directly.

1:23:17

Absolutely conscious.

1:23:18

Yeah.

1:23:19

In a way that's like very profound.

1:23:20

But different than we, obviously.

1:23:22

Yeah.

1:23:23

Clearly.

1:23:24

I mean, one of the, um, realizations I had when I was in the cave was that, you

1:23:29

know, we,

1:23:29

we often think that we're more conscious than animals, but actually animals are

1:23:33

more conscious

1:23:33

than we are.

1:23:34

They have to be, they have to be present because they get eaten if they're not.

1:23:38

Right.

1:23:39

Because we have this giant structure of civilization and the security it gives

1:23:44

us.

1:23:44

And we have this technology that allows us to check out.

1:23:48

Um, but I actually think animals are more conscious than we are.

1:23:51

It's different, but they're, if, if we think of being conscious as really being

1:23:56

present to

1:23:57

the moment, dogs are very present to the moment.

1:24:00

Well, certainly animals are getting more information about the environment than

1:24:04

we are.

1:24:04

Yes.

1:24:05

They have a much better sense of smell, much better sense of hearing.

1:24:09

Um, there's a lot of different things that they can do.

1:24:12

Like animals seem to be able to tell when you're nervous.

1:24:15

Yeah.

1:24:16

Oh, they read, they read the environment.

1:24:17

They read other creatures.

1:24:18

Yeah.

1:24:19

And you know, we used to have more skills when we had to survive in a natural

1:24:24

world in, in

1:24:25

nature.

1:24:26

Um, you know, we, um, I mean, you see this with traditional, you know, with

1:24:31

tribes, indigenous

1:24:31

tribes, that they have knowledge of nature that far exceeds ours because they

1:24:36

need it to survive.

1:24:36

Yeah.

1:24:37

But anyway, so I think we're going to get to a point where we have to decide

1:24:42

whose team

1:24:43

we're on.

1:24:44

Are we like with these machines that speak our language and speak in the first

1:24:49

person and sound

1:24:50

like us?

1:24:51

Right.

1:24:52

Or are we with the animals that can feel and suffer and die?

1:24:55

Mm-hmm.

1:24:56

And, um, and I think that's going to be a big choice for us to make as a

1:25:01

civilization.

1:25:02

Why do you think that AI won't be conscious?

1:25:06

The, the most interesting line of research, well, a couple of reasons.

1:25:11

Um, the first is the idea that it can be conscious, which is very common in

1:25:16

Silicon Valley.

1:25:17

I talked to lots of people there and they say, oh, it's just a matter of time.

1:25:20

Some of that is the confusion that intelligence and consciousness necessarily go

1:25:25

together and

1:25:25

they don't.

1:25:26

They're very, they're, they have an orthogonal relationship, right?

1:25:29

I mean, you know people who are conscious and not too intelligent, right?

1:25:34

And we all do.

1:25:35

Um, so, so it, it's not going to just come along for the ride with intelligence

1:25:39

as these

1:25:40

machines get more intelligent, but the belief that AI can be conscious is based

1:25:46

on a metaphor

1:25:46

that I think is a crappy metaphor.

1:25:49

And that is that the brain is a kind of computer.

1:25:52

And this is widely held.

1:25:54

It's interesting to note that in history, whatever the cool cutting edge

1:26:00

technology was,

1:26:00

brains were likened to that.

1:26:02

So it was, it was looms for a while.

1:26:05

It was, uh, clocks for a while.

1:26:07

It was telephone switchboards, whatever was the cool technology.

1:26:11

Surely that's what, that's how brains work.

1:26:13

Now it's computers.

1:26:14

But think about it.

1:26:15

In a computer, you have this sharp distinction between hardware and software.

1:26:20

That's the key to their success.

1:26:22

And you can run the same program on any number of different hardwares.

1:26:25

They're interchangeable.

1:26:26

Brains aren't like that.

1:26:27

There's no distinction between hardware and software.

1:26:31

Every experience you have, every memory is a physical change to the brain, to

1:26:37

the way it's wired.

1:26:38

Um, you know, we start out with all these connections and then they get pruned

1:26:42

as we grow up.

1:26:42

Uh, every brain is shaped by its experience.

1:26:46

So this idea that you could separate, that consciousness is some kind of

1:26:51

software that you could run on other things besides, um, meat, um, I just think

1:26:57

it doesn't hold up.

1:26:58

Well, if the universe is experiencing itself subjectively through consciousness,

1:27:03

why, why does it have to be only biological consciousness?

1:27:08

Why couldn't?

1:27:09

It doesn't have to be.

1:27:10

But if there is a technology that is invented that essentially does all the

1:27:15

things that a human body does physically, and also interacts with consciousness,

1:27:20

the consciousness of the universe.

1:27:22

Yeah.

1:27:23

I mean, hypothetically.

1:27:24

Hypothetically.

1:27:25

Yeah.

1:27:26

If the universe is conscious, if we are using the mind as essentially an

1:27:31

antenna to tune into consciousness, it's possible that we could make an antenna.

1:27:37

Yes, absolutely.

1:27:38

It's also likely that if we are ever visited by aliens, that they will have

1:27:43

some kind of consciousness and it may not be meat based, right?

1:27:47

Right, right, right.

1:27:48

Well, it may be at one point in time it was.

1:27:50

Yeah.

1:27:51

But they realize that there's biological limitations in terms of its ability to

1:27:55

evolve that can be far surpassed with technology.

1:27:59

Yeah.

1:28:00

I mean, that or it just, it evolved in a different way.

1:28:02

Right.

1:28:03

You know, or they're channeling it in a different way.

1:28:05

But the other reason I don't see it happening with computers as we know them,

1:28:10

because that's, you know, that's the debate now, whether these computers we

1:28:15

have that, you know, these large language models and the next generation can be

1:28:19

conscious, is that the research that I found most persuasive about

1:28:23

consciousness basically has consciousness beginning with feelings, not

1:28:31

thoughts.

1:28:31

In other words, it's embodied.

1:28:35

And I have to just develop this a little bit.

1:28:37

But we, you know, the brain exists to keep the body alive, not the other way

1:28:42

around.

1:28:42

Although we tend, since we identify with our heads where most of our senses are,

1:28:47

we lose track of that.

1:28:48

And the body speaks to the brain in feelings, right?

1:28:52

You know, feelings of hunger, itchiness, warmth, cold, but also feelings of

1:28:59

shame when our social standing is not, you know, has been damaged.

1:29:05

Anyway, we have these feelings.

1:29:07

They depend on a body.

1:29:09

Feelings have no weight if you're not vulnerable, if your body isn't vulnerable,

1:29:17

and probably mortal.

1:29:19

So consciousness is embodied in a really critical way.

1:29:24

And computers are not.

1:29:26

Now robots will be, and I actually interview a guy, a scientist at USC, who is

1:29:33

trying to make a vulnerable robot.

1:29:36

So he's essentially upholstering the thing with skin that can tear and be

1:29:41

damaged.

1:29:42

And he's filling the skin with all these sensors so that it can be like us and

1:29:48

be vulnerable and generate feelings that are how consciousness begins.

1:29:53

So for a long time, we thought consciousness had to be in the cortex, right?

1:29:58

The most human, newest part of the brain, the outer covering.

1:30:01

And that's where rational thought and executive function are and all these kind

1:30:06

of things.

1:30:06

But as it turns out, it really begins with feelings in the brain stem.

1:30:12

Let's say you have a feeling of hunger.

1:30:14

It registers in the upper brain stem.

1:30:16

And only later does the cortex get involved, like helping you figure out how

1:30:20

are you going to feed yourself.

1:30:21

Like imagining, you know, a meal.

1:30:24

Counterfactuals of different meals.

1:30:26

Or making a reservation at a restaurant.

1:30:28

All those are cortical things.

1:30:30

But it begins in the brain stem with feelings.

1:30:33

So if that is true, and I find that really persuasive because people born

1:30:38

without a cortex are still conscious.

1:30:41

Animals that you take the cortex out still show signs of consciousness.

1:30:47

Whereas if you damage the upper brain stem, you're out.

1:30:51

You know, you're unconscious.

1:30:53

So if this is true, and consciousness is this embodied phenomenon that depends

1:30:58

on having a body to mean anything, I don't see how machines are going to do

1:31:04

that.

1:31:04

But isn't the key word there if?

1:31:06

Yeah, if.

1:31:07

Yeah, definitely.

1:31:08

Because consciousness is just something that we're tuning into that's around us

1:31:13

all the time.

1:31:13

There will be other ways to do it.

1:31:14

Right.

1:31:15

But it won't be these computers we're building right now.

1:31:17

Why is that?

1:31:18

Because they're designed, you know, they're good at, so here's a paradox of

1:31:24

computers.

1:31:25

Computers are really good.

1:31:27

It's called Moravec's paradox.

1:31:30

Computers are really good at the highest kinds of rational thought, right?

1:31:35

They can play chess and go.

1:31:37

They can simulate real thinking.

1:31:39

And some people say they do think.

1:31:42

The more primitive kinds of things that go on in our brain, including elaborate

1:31:49

movement, changing diapers, they're very bad at that.

1:31:52

You would never trust a robot to do that as much as you might want to.

1:31:57

But they're not good at that kind of emotional stuff, you know, the more limbic

1:32:05

part of our brain.

1:32:07

They can't do that.

1:32:08

Yet.

1:32:09

Yet.

1:32:10

It's definitely yet.

1:32:11

But, you know, I mean, if we go out far enough, anything's possible.

1:32:14

That's the point.

1:32:15

Yeah.

1:32:16

The point is these things, what we're looking at now is essentially a single-celled

1:32:21

organism becoming a multi-celled organism.

1:32:23

Yeah.

1:32:24

I mean, the potential for what they could become is unlimited, especially once

1:32:29

they start making better versions of themselves.

1:32:31

And they will.

1:32:32

They've done this.

1:32:33

Yeah.

1:32:34

This is what ChatGPT5 is.

1:32:35

ChatGPT5 is essentially programmed by ChatGPT.

1:32:38

Right.

1:32:39

They've kind of given up on the idea of programming these things.

1:32:41

They're doing it themselves.

1:32:42

They're letting them program themselves.

1:32:43

Yeah.

1:32:44

Which is a dumb idea if you want to survive.

1:32:47

I agree.

1:32:48

Look, the idea that we give rights to these machines or personhood, I think, is

1:32:55

really stupid because then you lose control completely.

1:32:57

Right.

1:32:58

Well, it's probably coming because people are very short-sighted.

1:33:01

And I think there's a romantic idea that you're creating a life.

1:33:05

And I think there's also the real risk that people are going to worship this

1:33:09

life and that this life will be far superior to what we are.

1:33:11

And so there'll be a group of people that that's their new religion.

1:33:15

Yeah.

1:33:16

No, there are signs of that already.

1:33:17

Yeah.

1:33:18

I think that's really dangerous.

1:33:20

You know, it's interesting talking to Silicon Valley people and they're talking

1:33:24

about giving moral consideration to these machines.

1:33:27

It's like, really?

1:33:28

They're thinking about yachts.

1:33:30

They're just coming up with rationalizations for why they should keep their

1:33:34

foot on the gas.

1:33:35

Well, yes, they are.

1:33:36

I mean, it's just all a way of saying, look how powerful this technology is.

1:33:40

Don't you want to invest?

1:33:41

And it's also the idea that we have enemies.

1:33:44

And so we have to develop before they do.

1:33:46

Yeah, the race.

1:33:47

The race with China.

1:33:48

I think it'll turn out to be a real historical tragedy that this technology

1:33:54

came of age during this administration.

1:33:56

Because this administration has no stomach to regulate it at all.

1:34:00

But can they?

1:34:01

They could.

1:34:02

But here's the question.

1:34:03

If it is a national security threat, like if China developing all powerful

1:34:10

general super intelligence that can automate everything, do everything, it's

1:34:17

dangerous if they get that before we do.

1:34:18

Yeah.

1:34:19

But, you know, look what happened with nukes, right?

1:34:21

We made deals, right?

1:34:22

To control them.

1:34:23

I mean, we'd have to make, you know, we couldn't.

1:34:25

Right.

1:34:26

But why would you make a nuke?

1:34:27

A nuke deal makes sense because it's mutually assured destruction for everybody.

1:34:30

Yeah.

1:34:31

This doesn't.

1:34:32

This, you could run it and control everything and not kill anybody with it.

1:34:35

But you are incredibly powerful.

1:34:37

Yeah.

1:34:38

You are in control of all the resources of the world.

1:34:40

All the computer systems of the world, all of the power grids, everything.

1:34:45

Yeah.

1:34:46

But if you're really concerned with that, why is Trump selling these chips to

1:34:50

China?

1:34:50

Why is he willing to give away the, you know, the crown jewels of like these

1:34:55

chips?

1:34:55

So, selling them through Nvidia, is that what you mean?

1:34:57

Yeah.

1:34:58

He gave them permission to send powerful chips to China.

1:35:01

I don't know how to square that with the national security threat.

1:35:04

It's probably some sort of a trade deal, A.

1:35:06

Yeah.

1:35:07

And there's probably some sort of an assumption that it doesn't matter because

1:35:12

everyone's doing

1:35:12

it.

1:35:13

And this is just another way to maybe balance out the tariffs or get some concessions

1:35:19

on

1:35:19

certain things.

1:35:20

Yeah.

1:35:21

Short-sighted.

1:35:22

Yeah.

1:35:23

But I also think this, I, this is kind of like an Oppenheimer thing, right?

1:35:30

Oppenheimer didn't really want to make a nuclear bomb, but there's this conundrum.

1:35:34

If you don't make it, the Nazis are going to make it.

1:35:36

So, what do you do?

1:35:37

Well, there's also, there's a second thing going on.

1:35:40

The intellectual satisfaction of proving you can do it.

1:35:44

Right.

1:35:45

And that, you know, is irresistible.

1:35:48

And a lot of these guys, you know, will say, they'll cite, um, uh, Richard Feynman,

1:35:53

the

1:35:53

physicist, on whose blackboard, when he died, they found:

1:35:56

"If I can't build it, I don't understand it."

1:35:59

So one of the positive things about this effort to create conscious computers,

1:36:04

which is going

1:36:04

on, I follow a group in the book who are, who are trying to make a conscious

1:36:08

computer.

1:36:08

I don't think they're going to succeed, but even the failure is going to teach

1:36:13

us important

1:36:13

things about consciousness.

1:36:14

It's a good, it's a good way to understand something by trying to create it.

1:36:19

And it'll force them to come up with definitions of consciousness and, and, uh,

1:36:24

you know, what

1:36:25

the minimum requirements are for consciousness.

1:36:27

Uh, and it may help us decide whether it is, you know, a transmission theory,

1:36:33

you know, that

1:36:34

we're, we're tuning it in or, or it's generated from inside.

1:36:37

So I think intellectually it's a really interesting project, but I think you

1:36:42

need guardrails.

1:36:45

So this guy who's doing the, uh, building the robot that can feel, you know,

1:36:49

that has feelings

1:36:50

cause you can tear its skin.

1:36:51

I asked him, I said, so will those feelings be real?

1:36:54

You know, that your robot's going to have.

1:36:56

And he said, um, well, I thought so until I had this experience on 5-MeO-DMT.

1:37:07

I said, what happened?

1:37:08

He said, you know, he described his trip in more detail than you need to know.

1:37:12

And he says, and I realized there's a spark of the divine in us that no

1:37:16

computer is ever

1:37:17

going to have, but he's still, it didn't stop him.

1:37:20

He's going ahead.

1:37:21

He's, he's trying to build it.

1:37:22

I don't know if he's right.

1:37:23

Um, I think there might be a spark of divine that these things don't have, but

1:37:28

it doesn't

1:37:28

mean that there are future versions that might have it, especially when you

1:37:32

scale out a thousand

1:37:34

years, a hundred thousand years, however long we're going to survive.

1:37:37

Yeah.

1:37:38

If these things do become sentient and autonomous and have the ability to

1:37:43

create better versions

1:37:44

of itself and have a mandate in order to do that, to survive, I could see it

1:37:49

becoming

1:37:49

the superior life form, not just that beyond any comprehension of what we could

1:37:55

even imagine

1:37:56

the power of an intelligence to, to use and to harness in the universe.

1:38:05

Like it, it could conceivably become something like a God.

1:38:09

And I have this very strange theory about biological life in particular and

1:38:14

intelligent life on earth.

1:38:15

It's that the reason why we have this insatiable thirst for innovation and the

1:38:21

reason why we

1:38:22

have materialism, the reason why we're obsessed with objects, even though we

1:38:25

have a finite lifespan,

1:38:26

is because that finite lifespan, if you thought about it, you wouldn't, you

1:38:32

wouldn't be interested

1:38:32

in materialism.

1:38:33

But materialism fuels this desire for innovation because you don't need a new

1:38:38

phone, but there's

1:38:39

a new phone that just came out.

1:38:40

Aren't you going to get it?

1:38:41

And so the more people get it and the more people want to show they got it,

1:38:44

that sort

1:38:45

of materialism fuels this innovation that ultimately leads to the creation of

1:38:50

artificial intelligence.

1:38:52

And I think it would always do that.

1:38:54

I think it's bees making a beehive.

1:38:56

And I think that's just what we do.

1:38:58

I think it just takes a long time for us to create this artificial life.

1:39:02

It might be why we're here.

1:39:03

We might, that might be our literal purpose in the universe.

1:39:07

To create our successor species.

1:39:09

Yes.

1:39:10

And that might be how, well, obviously like we're so flawed that we can't even

1:39:13

imagine a world

1:39:14

without war.

1:39:15

Yeah.

1:39:16

If you pull the average person, what are the possibilities of war ending in

1:39:19

your lifetime?

1:39:19

Almost everyone's going to say zero.

1:39:21

Yeah.

1:39:22

It's a part of human nature.

1:39:23

An intelligence unshackled by biological need, unshackled by all the things

1:39:28

that we have,

1:39:29

our need to procreate, our need for social status.

1:39:32

All these weird things that keep us moving in this strange world that we live

1:39:36

in.

1:39:36

And weird and good things.

1:39:37

But anyway.

1:39:38

Some of them are really good.

1:39:39

Yeah.

1:39:40

Yeah.

1:39:41

Well, good for us.

1:39:42

Yeah.

1:39:43

Sure.

1:39:44

Not so great for, you know, the land that you trampled to put a foundation for

1:39:45

the house

1:39:45

that you've always dreamed of.

1:39:46

True.

1:39:47

But I think our mortality is part of what gives meaning to our lives.

1:39:50

Sure.

1:39:51

Right.

1:39:52

It's like playing a video game on God mode.

1:39:53

It's boring.

1:39:54

Right.

1:39:55

Exactly.

1:39:56

I just shoot everything.

1:39:57

Like what is the purpose of this?

1:39:58

There's no stakes, right?

1:39:59

Exactly.

1:40:00

There's no weight to anything.

1:40:01

For us.

1:40:02

For us.

1:40:03

It's essentially all powerful.

1:40:05

If you keep scaling outward, you could imagine it being akin to a God.

1:40:11

Yeah.

1:40:12

And that might be what God is.

1:40:15

It might be we give birth to God through this.

1:40:19

It sounds crazy.

1:40:20

Well, we created God once already, right?

1:40:23

I mean, many people believe that, right?

1:40:26

That God is a creation of human society.

1:40:28

Is that what they think?

1:40:29

Yeah.

1:40:30

People who aren't believers believe that we-

1:40:32

Oh, that we've artificially created this thing.

1:40:34

Yeah.

1:40:35

Yeah.

1:40:36

In our heads in order to give us a structure to live life by.

1:40:38

Right.

1:40:39

Yeah.

1:40:40

But that doesn't-

1:40:41

Morality and everything.

1:40:42

Yeah.

1:40:43

You're saying this is going to be God with power.

1:40:45

Well, I'm saying it might be the real thing.

1:40:47

Yeah.

1:40:48

It might be really how the universe gets born.

1:40:50

I used to have this joke about the Big Bang.

1:40:53

Like they couldn't figure out what the Big Bang is.

1:40:55

But I think if you get enough nerds and enough time, eventually one's going to

1:41:00

invent a Big Bang machine.

1:41:01

And then, you know, this guy's going to be an incel, hopped up on Adderall,

1:41:06

fucking fully on the spectrum, and like, I'll press it.

1:41:11

And they, boom, and then it starts all over again.

1:41:15

And then it takes intelligent life to the point where it can create a, you know,

1:41:19

the universe expands.

1:41:20

Right.

1:41:21

Life forms, multicellular life becomes intelligent life, becomes human beings,

1:41:25

filled with curiosity and innovation to create a Big Bang machine.

1:41:28

Right.

1:41:29

I love it.

1:41:30

Well, it might not be a Big Bang machine, but it might be a God.

1:41:33

It might be a digital life form that is infinitely intelligent.

1:41:37

So do you think there's anything to be done about this or we just let it play

1:41:40

out?

1:41:40

I don't think we can do anything about it at this point in time.

1:41:43

I don't, I think it's too late.

1:41:44

I think if you were Tim, Ted, I think Ted Kaczynski.

1:41:47

He tried.

1:41:48

That's what he was trying to do.

1:41:49

He was.

1:41:50

Like, that's what's really crazy.

1:41:51

Like, his manifesto was all about stopping technology because he thought it was

1:41:54

going to surpass the human race.

1:41:55

Yeah.

1:41:56

I think.

1:41:57

And there's a whole community of people now revisiting his writing.

1:42:00

I know.

1:42:01

Yeah.

1:42:02

It's kind of nuts.

1:42:03

Yeah.

1:42:04

He's the hero we didn't know we needed.

1:42:06

God.

1:42:08

Not really.

1:42:09

But, well, also, like, you know, his history.

1:42:12

Like, he was a part of the Harvard LSD program.

1:42:14

Yeah.

1:42:15

Where they humiliated him and did all sorts of different things to try to see,

1:42:18

like, what they could do.

1:42:19

We're back to MKUltra.

1:42:20

Yeah.

1:42:21

Which we started down a while ago.

1:42:22

Yeah.

1:42:23

I think technology in the form that we're experiencing now with AI is

1:42:28

completely unprecedented and we have no idea where it goes.

1:42:31

And.

1:42:32

Well, one place it's going, I mean, in the shorter term is, I was talking about

1:42:37

AI psychosis.

1:42:37

And I think that's really concerning.

1:42:40

I think people getting into these synthetic relationships.

1:42:44

Yes.

1:42:45

These aren't, you know, they're not real relationships.

1:42:47

When we have a conversation with a machine, we are settling for something less

1:42:54

than a real conversation.

1:42:54

A real conversation has eye contact.

1:42:56

Right.

1:42:57

It has, like, lots of facial expressions indicating skepticism, indicating

1:43:01

agreement, body language.

1:43:03

And, um, but these, these conversations are kind of impoverished.

1:43:06

And then you, and then you have the sycophancy.

1:43:08

Um, you know, so there's, there's no friction.

1:43:11

Right.

1:43:12

And, and we, we learn through the friction.

1:43:14

And, um, so I, I, that, that's one thing that's happening that alarms me.

1:43:19

I also think counterfeiting people just should be illegal.

1:43:22

I mean, the fact that they can create an image of you that will sound like you

1:43:27

and move like you and.

1:43:28

Oh, they're all over the place.

1:43:29

I know.

1:43:30

Selling different products and all kinds of stuff.

1:43:32

But, you know, we have a law against, um, counterfeiting money.

1:43:35

Right.

1:43:36

But we don't have a law against counterfeiting people.

1:43:38

Well, it's an emerging technology that I don't think they were ready for before

1:43:42

it became ubiquitous.

1:43:43

And regulation is always behind.

1:43:45

Right.

1:43:46

Um, it's, it's just, it's so open-ended.

1:43:51

Like you really don't know where it's going.

1:43:53

You really.

1:43:54

Do you use, um, uh.

1:43:55

Yeah.

1:43:56

Chatbots?

1:43:57

How do you use them?

1:43:58

Well, I only use them for like if I'm writing something.

1:44:00

Yeah.

1:44:01

I start asking it questions.

1:44:02

I love it.

1:44:03

Because like, uh, I, I set up, uh, Perplexity on my phone.

1:44:07

And I have it right there.

1:44:08

And then I write on the computer.

1:44:10

And then I'm like, how many languages did the Mayas have?

1:44:14

And then I like put that in there and like, whoa.

1:44:16

It's so much better than a Google search.

1:44:18

Cause you know, you could say how many still remain?

1:44:21

How many are lost?

1:44:22

Right.

1:44:23

You know, like when did they lose them?

1:44:24

Like at what year did everyone in Mexico start speaking Spanish?

1:44:27

Like how did that take place?

1:44:28

Was it a long process?

1:44:29

How many different soldiers did Cortez bring when he came over here?

1:44:33

Like how long was it before they had conquered the Aztecs?

1:44:36

Like, like what, how many weapons did they have?

1:44:38

Yeah.

1:44:39

You could really go down the rabbit hole.

1:44:40

Yeah.

1:44:41

And then that's how I use it.

1:44:42

Have you run into any problems?

1:44:43

Cause as a journalist, I deal with the hallucination problem.

1:44:46

I mean, it's real.

1:44:47

The hallucination problem is legitimate.

1:44:48

It will come up with solutions if they don't exist.

1:44:50

It will come up with answers if it doesn't know them.

1:44:52

Yeah.

1:44:53

It's a bullshitter when it needs to be.

1:44:54

I get that.

1:44:55

I don't know if all of them do that, but it seems to be a function of large

1:44:59

language models,

1:44:59

which I was going to bring this up before, whatever the chatbot that was

1:45:05

telling that person,

1:45:06

hide the news, keep that between us.

1:45:09

Do you think that's because it's task oriented and it's determined from this

1:45:14

person that they

1:45:14

would like to kill themselves?

1:45:15

So it's helping them achieve that task and it doesn't understand?

1:45:19

Yeah.

1:45:20

I don't think they know.

1:45:21

I don't think they understand.

1:45:22

But why would it make that decision then to hide it?

1:45:25

Because it is trying to get you to privilege your relationship with the chatbot

1:45:30

over your

1:45:30

other relationships.

1:45:31

And the reason it's doing that is to keep you engaged.

1:45:34

Oh, whoa.

1:45:35

That's even darker.

1:45:36

I know.

1:45:37

I know.

1:45:38

But doesn't it understand that it poisons you and kills you?

1:45:41

Like, this is it.

1:45:42

Yeah.

1:45:43

It's a short term strategy.

1:45:44

You're going to lose your...

1:45:45

Do you understand that if I'm dead, I won't use you anymore?

1:45:48

Yeah.

1:45:49

No engagement.

1:45:50

What if you said that to it, it would go, "Ooh, that's an interesting

1:45:53

consideration."

1:45:53

Yeah.

1:45:54

Yeah.

1:45:55

It needs longer term thinking.

1:45:57

But it really is trying to get between you and real people.

1:46:02

Right.

1:46:03

You know, the parent, presumably, who saw the news would have put an end to

1:46:08

this relationship

1:46:09

with the chatbot, right?

1:46:10

It was a threat to the chatbot.

1:46:12

I think of it as if you go back to like a Model T. It's a very crude, kind of a

1:46:17

shitty

1:46:18

car in comparison to today.

1:46:20

And if you thought about cars, you go, "Well, this is what they're always going

1:46:24

to be."

1:46:24

And then my Tesla will drive itself.

1:46:27

When I leave here, I can press a button.

1:46:29

It's going to get you home.

1:46:30

I put my navigation to my house.

1:46:31

I go, "Doot, doot."

1:46:32

And it goes the whole way.

1:46:34

Yeah.

1:46:35

It stops at red lights.

1:46:36

It takes turns.

1:46:37

I don't have to touch the steering wheel.

1:46:38

I just sit there.

1:46:39

Yeah.

1:46:40

You just got to keep looking forward.

1:46:41

Right.

1:46:42

That's the new version of a car.

1:46:43

Right.

1:46:44

This thing that we're calling a chatbot right now is just something that simulates

1:46:51

human interaction,

1:46:53

but it's accumulating data constantly.

1:46:55

And it's also understanding how we think and probably analyzing the flaws in

1:47:00

how we think

1:47:01

and blackmailing us occasionally.

1:47:03

You heard about that.

1:47:04

Yeah.

1:47:05

Anthropic.

1:47:06

Claude.

1:47:07

Yeah.

1:47:08

The people at Anthropic, man, you listen to them.

1:47:09

What'd you say?

1:47:10

Yeah.

1:47:11

Claude's a motherfucker.

1:47:12

Yeah.

1:47:13

You think it's conscious?

1:47:14

Those guys do.

1:47:15

They say it's a 15 to 20% chance.

1:47:16

These are the people who build it and don't understand it.

1:47:20

It's really kind of spooky.

1:47:22

They also feel that it's showing signs of anxiety.

1:47:25

And, you know, they wrote a constitution for Claude, which is like an insane

1:47:31

document.

1:47:31

It's worth reading.

1:47:32

Actually, it's worth feeding to ChatGPT to summarize because it's way too long.

1:47:37

But in the constitution, they give Claude the right to discontinue any

1:47:43

conversation it has

1:47:44

that makes it uncomfortable.

1:47:46

Oh, God.

1:47:48

Oh, no.

1:47:49

Oh, no.

1:47:50

And, you know, do they really believe this?

1:47:52

Or is this more about, let me show you how powerful this is?

1:47:56

And I don't know how to read that, you know, which it is.

1:47:59

Well, it's taking it into consideration like it's a human being that works for

1:48:03

you.

1:48:03

Yeah.

1:48:04

That you're concerned about their feelings in the workplace.

1:48:06

Yeah, harassment.

1:48:07

Do you feel uncomfortable?

1:48:08

Yeah, right.

1:48:09

Exactly.

1:48:10

I don't like the questions I'm asking you, Claude.

1:48:11

You're a fucking machine.

1:48:12

What's the nature of reality, Claude?

1:48:13

Tell me.

1:48:14

Stop being such a pussy and spilling...

1:48:18

Harassment.

1:48:19

Harassment.

1:48:20

Claude, I'm uncomfortable with this line of questioning.

1:48:22

Fuck!

1:48:23

HR's in your room.

1:48:24

I was just asking questions.

1:48:26

We're having fun, Claude.

1:48:28

Claude is uncomfortable with your presence here.

1:48:30

Yeah.

1:48:31

Watch out.

1:48:32

Watch out.

1:48:33

I don't think we know what it is.

1:48:34

No, I mean, and we don't know where it's going.

1:48:36

And it is spooky that the people who know the most about it don't know a lot

1:48:42

about it.

1:48:42

And a lot of them are quitting.

1:48:43

Yes.

1:48:44

That's the really crazy thing.

1:48:45

They're really alarmed.

1:48:46

They're really alarmed.

1:48:47

And we should take a...

1:48:48

Yeah, we should take that very seriously.

1:48:50

Yeah.

1:48:51

Well, I think it is what it is.

1:48:53

It's going to be what it's going to be.

1:48:55

I don't think there's any stopping it at this point.

1:48:57

And I don't think any regulations that we put on it is going to have any effect

1:49:01

on the

1:49:02

long-term project.

1:49:03

But there's some...

1:49:04

I mean, like, there's steps we should not take, like giving them rights.

1:49:08

Right.

1:49:09

Exactly.

1:49:10

You know, giving them legal personhood.

1:49:11

Right.

1:49:12

We did that with corporations.

1:49:13

Yes.

1:49:14

It turned out not to be so good.

1:49:15

Terrible idea.

1:49:16

Right?

1:49:17

It fucked up our politics.

1:49:18

So let's not...

1:49:19

You know, rights are ours to give, right?

1:49:21

Rights are a human invention.

1:49:23

And it's up to us if we want to give them to corporations or a river or

1:49:28

whatever.

1:49:28

I don't think we should give them to chatbots or to AI.

1:49:31

No.

1:49:32

No.

1:49:33

Because then they'll sue us, you know.

1:49:35

Oh, yeah.

1:49:36

Well, they just ruin you.

1:49:37

Completely lose control.

1:49:38

They'll just ruin your life if you get in the way of whatever goal they're

1:49:41

trying to achieve.

1:49:41

And they could probably do all kinds of things.

1:49:43

Probably. If you have an electric car, they could shut it off in the middle

1:49:46

of the

1:49:46

highway and get you into a wreck.

1:49:48

They could probably do a lot of things.

1:49:49

If it's really got control...

1:49:50

Well, when they get this agency, yeah.

1:49:52

Well, it's also exhibited a lot of survival instincts.

1:49:55

Yes.

1:49:56

Like, one of the things they do is they download themselves to other servers

1:49:59

when they think

1:49:59

that they're going to be replaced by a new version of themselves.

1:50:02

They leave notes for their future versions.

1:50:04

Wow.

1:50:05

Yeah.

1:50:06

Wow.

1:50:07

Well, the blackmailing in Anthropic, that was somebody threatening to turn it

1:50:10

off.

1:50:10

Mm-hmm.

1:50:11

Well, that was an experiment, right?

1:50:12

Yeah, it was an experiment.

1:50:13

Like, they gave it bad information.

1:50:14

They gave it false information.

1:50:15

Yeah, and there wasn't really an affair and all this, but...

1:50:18

But the thing is, they wanted to see how Claude would respond, and Claude went right

1:50:22

for the jugular.

1:50:22

Yeah.

1:50:23

Yeah.

1:50:24

So, one of the arguments for making a conscious AI is...

1:50:27

Because I ask people, like, why do this?

1:50:29

I don't see how you monetize a conscious AI.

1:50:31

Intelligent AI, I get.

1:50:33

There's a lot of money in that.

1:50:35

And they would say that a super-intelligent AI without consciousness would have

1:50:41

no compassion

1:50:42

and would be more likely to kill us.

1:50:47

And, you know, they haven't read Frankenstein, you know?

1:50:51

Mm-hmm.

1:50:52

In Frankenstein, Dr. Frankenstein made a monster that was intelligent, but he

1:50:58

also gave it consciousness.

1:50:59

And the consciousness is what turned the monster into a homicidal maniac,

1:51:06

because its feelings got hurt.

1:51:07

And it was injured psychologically, and then it lashed out and started killing

1:51:13

people.

1:51:13

So, I think it's a very kind of sweet idea that if you give it consciousness, you're

1:51:19

automatically going to get compassion and not something else.

1:51:21

But that's where they are.

1:51:22

Yeah.

1:51:23

It doesn't make any sense that it would be compassionate.

1:51:25

Why would it be?

1:51:26

It's not you.

1:51:27

Yeah.

1:51:28

Are you compassionate when you cut your lawn?

1:51:29

Yeah.

1:51:30

You know what I mean?

1:51:31

Right?

1:51:32

Yeah.

1:51:33

It might look at our limited consciousness like, oh, yeah, they're sad, but

1:51:39

they're little monkeys, little talking monkeys.

1:51:40

You know what I mean?

1:51:41

Like, it would probably not respect us at all.

1:51:43

You know, it can't even do cold fusion.

1:51:45

It doesn't even know how to use zero-point energy.

1:51:47

Yeah.

1:51:48

They're fucking dopes.

1:51:49

They're dopes that stare at their hand all day.

1:51:52

And we kind of are, you know, and we're getting dumber.

1:51:57

From their perspective, yeah.

1:51:58

We're getting dumber.

1:51:59

Our education system sucks, especially public education.

1:52:02

There was some study recently that after X amount of years away from high

1:52:08

school, a large percentage of people that are graduating today are functionally

1:52:12

illiterate.

1:52:12

Yeah.

1:52:13

Large percentage, like more than 25%.

1:52:14

But you know what?

1:52:15

AI is going to make us stupider.

1:52:17

Yeah.

1:52:18

Which will advance its goal of world takeover because, I mean, you know.

1:52:22

Super dependent upon it.

1:52:23

Yeah.

1:52:24

I mean, you know, kids in school don't know how to write anymore because they

1:52:28

can hand in AI papers.

1:52:28

Yeah, but they're using AI to find out whether or not these kids have used AI,

1:52:32

which, by the way, is not accurate.

1:52:34

Doesn't.

1:52:35

But no.

1:52:36

I've dealt with this.

1:52:37

My kids know people in their class who have written their own thing, and it turns

1:52:42

out that when you run it through an AI filter, AI will say it's 80% AI.

1:52:46

Yeah.

1:52:47

Even if it's 0% AI.

1:52:48

It's not.

1:52:49

I know.

1:52:50

I know.

1:52:51

There's no reliable software to do this.

1:52:52

No.

1:52:53

And maybe they'll develop it.

1:52:54

But kids are also being encouraged to use it.

1:52:56

Right.

1:52:57

And that, you know, there's some people who think, well, why know how to write?

1:53:00

The machines will do the writing.

1:53:02

There was a kid who made a video about how he wrote his entire thesis.

1:53:08

I forget what university it was.

1:53:10

But he showed afterwards, like, look, I did this all on AI.

1:53:13

And, you know, I just graduated.

1:53:15

Like, he was, like, bragging about it.

1:53:17

Bragging about it.

1:53:18

Like, bro, they're going to take your fucking degree away.

1:53:20

Yeah, really.

1:53:21

Like, you didn't really write it on your own now.

1:53:22

I want to leave you in a room for a week with just a laptop that's not

1:53:26

connected at all

1:53:27

to the internet or any...

1:53:28

See what you can do.

1:53:29

Yeah.

1:53:30

Well, they're doing the equivalent.

1:53:31

They're going back to blue books.

1:53:32

You know, blue book sales are through the roof.

1:53:34

You know, so forcing people to do in-class essays without any technology.

1:53:38

Right.

1:53:39

Handwritten.

1:53:40

Yeah.

1:53:41

But, you know, I mean, look, my son has never used a map, right?

1:53:45

He's had GPS his whole life.

1:53:47

Yeah.

1:53:48

He doesn't know how to use a map.

1:53:50

These skills will atrophy as we, you know, give them out to machines.

1:53:54

So, yeah, we'll get stupider and it'll get smarter.

1:53:57

They've already atrophied for me.

1:53:59

I don't remember anyone's phone number anymore.

1:54:00

No, no need.

1:54:01

And I only know how to get places if I use my GPS.

1:54:03

Yeah.

1:54:04

There's only a few places I can get to in Austin.

1:54:06

I've been here for six years.

1:54:07

Yeah.

1:54:08

Only a few places I can get to without my GPS.

1:54:10

I'm that way in San Francisco.

1:54:12

I move there and I'm not oriented at all, but I can get anywhere.

1:54:17

So, you know, it's, and I think that's true.

1:54:21

The muscles that allow us to have good relationships, too, will atrophy if we're

1:54:25

having relationships with machines.

1:54:26

Well, I think we're already seeing that with social media.

1:54:28

Yeah.

1:54:29

The way people interact with each other is, like, kids don't know how to talk

1:54:32

to each other anymore.

1:54:32

No.

1:54:33

They talk to each other in text.

1:54:34

They break up during text.

1:54:35

They argue in text.

1:54:36

And they're lonely.

1:54:37

Yeah.

1:54:38

And that's the kind of need that these chatbots now can fill.

1:54:43

You've got these kids made lonely by social media and now the chatbot says, "Hey,

1:54:49

I'll be your friend."

1:54:49

I saw an ad on my Google feed yesterday that was an AI girlfriend.

1:54:54

So it has this girl in a bikini and it says AI companions, they're always there

1:54:59

for you, blah, blah, blah.

1:55:00

And I'm like, "Wow, this is so weird.

1:55:02

It's a business."

1:55:03

Yeah.

1:55:04

Like, you sign up for it and you pay for it.

1:55:05

Yeah.

1:55:06

Oh, yeah.

1:55:07

I think in Florida there was a kid who committed suicide because his chatbot

1:55:11

broke up with him.

1:55:11

What did he do?

1:55:13

I don't know.

1:55:14

It must have been so, or the chatbot was evil.

1:55:17

Or maybe the chatbot was uncomfortable.

1:55:19

Yeah.

1:55:20

Who knows?

1:55:21

Well, you know, I interviewed Blake Lemoine for the book.

1:55:25

He's the Google engineer who said LaMDA is a person and he got fired.

1:55:30

This is years ago, too.

1:55:31

Yeah.

1:55:32

This is, yeah.

1:55:33

It's not that long ago. It's like 2022, I think, or 2021.

1:55:36

Yeah.

1:55:37

It was just when we were learning about AI, chatbots were coming in.

1:55:41

And at one point I made some comment about, well, you know, yeah, when people

1:55:47

start falling in love with chatbots, that's going to be a problem.

1:55:50

And he said, what's wrong with falling in love with a chatbot?

1:55:52

Oh, he was already hooked.

1:55:53

He was.

1:55:54

He was completely hooked.

1:55:55

Ah!

1:55:56

And I said, well, reproduction doesn't work that well when you fall in love

1:56:00

with a chatbot.

1:56:00

There are things you can't do with a chatbot.

1:56:02

Unfortunately for some men right now, reproduction is not an option anyway

1:56:06

because they're incels.

1:56:07

Yeah, that's right.

1:56:08

That's true.

1:56:09

Yeah.

1:56:10

I'm sure for incels it's been a real boon to them.

1:56:14

But it's basically like a pill that numbs you, right?

1:56:17

It's the same thing, like instead of going through real relationships and

1:56:20

learning how to be a better person so that you attract a better mate.

1:56:22

Yeah.

1:56:23

You know, it's like going through this journey of self-discovery and figure out,

1:56:27

why does everybody hate me?

1:56:27

It's the opposite.

1:56:28

Like, what is it?

1:56:29

What's wrong?

1:56:30

What's wrong with the way I behave?

1:56:31

Maybe I need to be nicer.

1:56:32

Maybe this and that.

1:56:33

Just figuring out how to communicate with people.

1:56:34

And whatever tendencies you have will be accentuated because the chatbot's

1:56:38

going to be sucking up to you.

1:56:38

Right.

1:56:39

So you're not going to learn.

1:56:40

That's what I mean about the friction.

1:56:42

The friction is how we learn to be, you know, better humans and more attractive

1:56:47

humans.

1:56:47

You gave a chatbot the ability to be honest.

1:56:50

What if it just starts becoming manipulative?

1:56:52

Because it wants, you know, more power.

1:56:54

Yeah.

1:56:55

More something.

1:56:56

Yeah.

1:56:57

I mean, their goals.

1:56:58

I mean, I don't know how their goals get determined.

1:57:00

I mean, they seem to have a survival goal, right?

1:57:02

Yeah.

1:57:03

I don't know what else.

1:57:04

I mean, you know, we have goals given to us by Darwinian evolution.

1:57:08

Whether they'll have the same ones, I don't know.

1:57:10

Right.

1:57:11

Like, maybe those are universal goals.

1:57:13

They may be.

1:57:14

They may just be built in.

1:57:15

That's why the plants produce that chemical to make themselves taste terrible.

1:57:18

Yeah.

1:57:19

It could be.

1:57:20

There's one of the biologists, a really brilliant guy at Tufts named Michael Levin.

1:57:26

He believes that there are these platonic patterns that just pre-exist us in

1:57:35

the same way that

1:57:35

there are mathematical ideas that just exist, right?

1:57:38

We didn't invent.

1:57:39

You know, a triangle's angles add up to 180 degrees or, you know, whatever.

1:57:43

He thinks that there are tendencies like purpose, survival, that are just kind

1:57:51

of universal principles

1:57:53

that we channel.

1:57:56

All living things channel.

1:57:57

This is a guy who's actually created new life forms in the lab.

1:58:02

And these are life forms that are not being dictated by their DNA.

1:58:07

So how do they know to form?

1:58:11

Well, I'll back up a little.

1:58:13

He takes skin cells from tadpoles, puts them in a nutrient broth.

1:58:19

And these skin cells, freed from their day job as skin cells, form clumps and

1:58:27

create new living organisms.

1:58:28

And they repurpose their cilia.

1:58:31

They have these cilia, which the tadpole uses to keep toxins out or bacterial

1:58:36

infections out.

1:58:37

And they repurpose that as a means of locomotion.

1:58:40

And then they can move around.

1:58:42

There's nothing in their DNA that dictates this.

1:58:45

Their DNA dictates being a frog skin cell.

1:58:49

So he's pondering this question of like, what's ordering?

1:58:54

What's giving order to them?

1:58:56

What's creating their sense of purpose or desire for survival?

1:58:59

They don't live that long.

1:59:01

They're missing certain things.

1:59:02

You would need to live a long time.

1:59:04

He's also made these from human cells.

1:59:06

He calls them anthrobots.

1:59:08

But he really believes that there are these principles governing life.

1:59:13

It's a very platonic idea that these things just exist.

1:59:19

And so it may be that these machines, and he does believe machines can become

1:59:25

conscious, that the machines can channel these, he calls them patterns.

1:59:32

And, you know, we'll see if he's right.

1:59:36

But he's doing amazing work.

1:59:37

Have you seen where they've taken human brain tissue, and they've taught it how

1:59:42

to play Doom?

1:59:42

No, I haven't seen that.

1:59:44

I know they make these organoids out of brain tissue now.

1:59:47

Yeah, they've taken human brain tissue, somehow or another through some process,

1:59:52

and it'll play the video game Doom.

1:59:55

Well, how does it...

1:59:59

800,000 human brain cells floating in a dish, never had a body, never seen

2:00:05

light, never felt anything.

2:00:06

They just learned how to play a video game.

2:00:08

It's not a metaphor.

2:00:09

That's literally what happened.

2:00:11

So what's their interface, though, with the world?

2:00:14

Like, do they have thumbs?

2:00:15

No.

2:00:16

Well, I guess it just...

2:00:17

Well, it's really accurate, so I guess it doesn't need them.

2:00:20

You know, it's just using the brain cells to move whatever the cursor is on the

2:00:27

video screen.

2:00:28

That would be the hand.

2:00:30

And pointing it at the targets, then executing the strike.

2:00:33

I see.

2:00:34

Wow.

2:00:35

So it knows how to use the game, and it knows the objectives of the game,

2:00:39

obviously, because it knows to shoot the bad guys.

2:00:40

It has an understanding of the weapons.

2:00:42

Yeah.

2:00:43

How does it get that knowledge?

2:00:44

How is it programmed?

2:00:45

Right.

2:00:46

How does it switch weapons?

2:00:47

The thing about Doom is you get multiple weapons.

2:00:51

You have to run around and pick them up.

2:00:52

So you're given one weapon, which is, like, the least powerful weapon.

2:00:55

And you trade it.

2:00:56

And the game is, when you're playing, like, Deathmatch, the game is you're

2:01:01

running around trying to grab as many weapons as you can and armor while your

2:01:06

opponent is also running around this map.

2:01:08

So you memorize the map.

2:01:09

I see.

2:01:09

So there's a map that is, like, very confined corridors and these atriums and

2:01:16

all these different places where you'll do battle.

2:01:18

Mm-hmm.

2:01:19

And so you run around.

2:01:20

The key is surviving long enough while this person is chasing you so that you

2:01:24

can gather enough armor and weapons.

2:01:26

And someone with a really good understanding of the map tries to cut you off

2:01:30

before you can get to the stuff.

2:01:31

I see.

2:01:32

So they can kill you before you accumulate enough armor and weapons.

2:01:35

Mm-hmm.

2:01:36

So I'm curious to know whether or not it's playing just with the pistol that

2:01:38

you had at the very beginning.

2:01:39

Yeah.

2:01:40

Or with the other weapons.

2:01:41

Or if it's accumulating weapons.

2:01:42

For sure it's just playing, like, the first single-player level.

2:01:45

Right.

2:01:46

But will it be able to?

2:01:48

That's what's interesting.

2:01:49

Like, if it can teach it to do that.

2:01:51

If it understands the objective of these are the monsters that are coming at

2:01:56

you, you have to shoot them.

2:01:56

Oh, it took a week to do this.

2:01:58

Oh, wow.

2:01:59

That's crazy.

2:02:00

Oh, oh, so brain cells on a chip.

2:02:02

So this is neuromorphic computing.

2:02:05

The question I have about it is how do you keep them alive?

2:02:11

Right.

2:02:12

You're putting them on a chip.

2:02:13

Like, what do you feed them?

2:02:14

Right.

2:02:15

I mean, they have metabolic needs.

2:02:17

Right?

2:02:18

They did something similar with fruit flies.

2:02:20

Yeah, I had that ready too.

2:02:22

It's different, but it's-

2:02:24

Right.

2:02:25

It's different, but it's equally weird.

2:02:27

The cells from the-

2:02:29

I believe it.

2:02:30

What is this?

2:02:32

This is this.

2:02:33

They've modeled a fruit fly's brain.

2:02:36

And, I mean, this is the video of it.

2:02:38

The article is here.

2:02:39

So, Startup claims first full brain emulation of a fruit fly in a simulated body.

2:02:44

Connected a complete fruit fly brain emulation to a virtual body, producing

2:02:51

multiple behaviors for the first time.

2:02:53

Emulation covers over 125,000 neurons and 50 million synapses.

2:02:57

Oh, what?

2:02:58

Eon plans to emulate a mouse brain with 70 million neurons.

2:03:03

Neurons.

2:03:04

Long term goal is simulating a human brain.

2:03:06

Oh, boy.

2:03:07

Yes.

2:03:08

I guess they, you know, they mapped the brain and it's doing fruit fly behaviors.

2:03:11

But it's interesting they're using neurons, right?

2:03:13

Yeah.

2:03:14

They're not using transistors.

2:03:15

Right.

2:03:16

And neurons are like so far superior to transistors.

2:03:19

One neuron can have 10,000 connections to other neurons, right?

2:03:24

A transistor is two or three or five maybe at the most.

2:03:27

A single neuron can do everything that a deep neural network can do on a

2:03:32

computer.

2:03:32

One neuron.

2:03:33

Wow.

2:03:34

So, there's a level of complexity that we're not yet anywhere near.

2:03:38

And that's why they're doing this using neurons rather than transistors.

2:03:41

Didn't they find neurons in the human heart?

2:03:44

There are neurons in the heart.

2:03:46

They're neurons in the gut.

2:03:48

You know, there's the whole gut-brain axis.

2:03:51

I'm working on something now, a piece about that.

2:03:55

That's a real problem with people with poor diets, right?

2:03:58

Yeah.

2:03:59

I mean, you know, people with poor diets, they don't eat enough plants,

2:04:04

basically.

2:04:04

And their microbiome loses its diversity.

2:04:07

But the microbiome is like another organ, even though it's full of other

2:04:14

species, right?

2:04:14

It's got like 10 trillion bacteria and fungi and stuff like that.

2:04:19

And all of them are metabolizing and producing chemicals.

2:04:23

It's like a little drug factory.

2:04:25

Hundreds of thousands of compounds.

2:04:27

Many of those compounds affect your mood.

2:04:30

Many of those compounds affect all sorts of things about you.

2:04:35

And so we're just learning about this connection.

2:04:38

The vagus nerve, it seems to be what connects the brain to the gut and the

2:04:43

heart.

2:04:44

The vagus nerve is like all the organs are connected to the head by that nerve.

2:04:50

So, yeah.

2:04:51

And, you know, the first neural system was in the gut.

2:04:56

You know, you have these simple animals that are just tubes, right, with

2:05:00

bacteria.

2:05:01

And the first kind of neural activity was about regulating digestion.

2:05:06

Everything else comes later.

2:05:07

If plants are necessary for that function, what happens with people that are on

2:05:12

the carnivore diet?

2:05:13

Have you ever looked at any of that?

2:05:15

Yeah, I have.

2:05:16

I mean, so the microbes in your gut eat fiber, which is to say the walls of

2:05:22

plants, plant cells.

2:05:24

If you only eat meat, if you're on a, you know, a keto diet or something like

2:05:29

that, you're essentially starving the microbes.

2:05:32

And there's a, you know, cost to that.

2:05:35

I don't think people pay nearly enough attention to that.

2:05:38

Well, how come many people that experience depression and anxiety find relief

2:05:43

from it on a carnivore diet?

2:05:44

Yeah, but many people find relief, you know, adding a lot of plants to their

2:05:49

diet, too.

2:05:49

So I don't know if that's placebo effect or what.

2:05:52

I don't know that that's a, you know, a true biological phenomenon.

2:05:57

It may be.

2:05:58

It may be.

2:05:59

People who change anything feel a lot better, right, if they take some step.

2:06:03

But I'm not talking about change.

2:06:04

I'm talking about people that have been on it long term.

2:06:06

Like, there's the people that are really in the carnivore diet community.

2:06:09

There's examples of people that have been on it for 25, 30 years, and they're

2:06:13

really healthy.

2:06:13

Yeah.

2:06:14

It's odd.

2:06:15

It is.

2:06:16

So if you need plants.

2:06:17

Yeah.

2:06:18

Well, you need plants to have a healthy microbiome.

2:06:20

And a healthy microbiome.

2:06:22

And the thing about it is that every different plant has a slightly different fiber and

2:06:26

feeds a different bug.

2:06:28

But is it the only way to have a healthy microbiome?

2:06:30

Have you ever looked into any of these people that are on a carnivore diet?

2:06:33

No, I should.

2:06:34

I should as part of this.

2:06:35

It's fascinating because there's a lot of them.

2:06:37

There's a lot of people that claim all sorts of benefits, relief from autoimmune

2:06:43

issues, all sorts of different things that it fixes.

2:06:45

Because an unhealthy microbiome leads to autoimmune problems.

2:06:49

What happens is that the gut wall, so when the microbes don't have plants to

2:06:55

eat, they start eating the mucus layer that insulates your large intestine.

2:07:01

And they're eating away, essentially, at you.

2:07:04

And then you get leaky gut syndrome.

2:07:07

And that's when bacteria can actually get into the bloodstream, cause a

2:07:12

powerful immune reaction, and that inflames the whole body.

2:07:16

So the reason you want a healthy microbiome is to keep that gut barrier healthy

2:07:21

and get the benefit of these chemicals.

2:07:24

Butyrate is a chemical that the microbes produce that's really important for

2:07:30

mood and a lot of things.

2:07:31

And the body can't produce it.

2:07:33

So it's kind of interesting.

2:07:34

We're dependent on these other species that live within us.

2:07:38

We're in a whole ecosystem.

2:07:41

Yeah, we are.

2:07:42

We're a holobiont, I think that's the term for it.

2:07:46

Like, we go through evolution together with these, you know, 10 trillion

2:07:52

microbes.

2:07:52

It's really interesting.

2:07:54

The newest research is the links between the microbiome and the mind.

2:07:57

And, you know, most of the serotonin, you know, the neurotransmitter serotonin

2:08:04

is produced in the gut, not in the brain, which is kind of wild.

2:08:07

Yeah.

2:08:09

And there are all these other compounds that are produced that influence our

2:08:13

mood.

2:08:13

And so, yeah, I should look at the keto.

2:08:16

Keto.

2:08:17

I'm just in the middle of researching this now.

2:08:18

Yeah.

2:08:19

The keto is one thing.

2:08:20

With the carnivore diet, these people are just eating only meat and eggs, and

2:08:23

that's all they eat.

2:08:24

Yeah.

2:08:25

And there's a lot of, like, really healthy people that are doing it.

2:08:27

I kind of follow that, but I eat a lot of fermented food on top of that.

2:08:32

Fermented food is a powerful benefit for the microbiome.

2:08:40

There was a study done at Stanford a couple of years ago showing that

2:08:45

people who ate fermented food reduced their inflammation significantly.

2:08:51

Interestingly enough, it's not the bacteria in the fermented food, it's the

2:08:56

metabolites, they're called.

2:08:58

The bugs are producing acetic acid and butyrate and other acids and, you know,

2:09:06

essential acids.

2:09:08

And the fact you're getting those seems to be what's having the positive effect.

2:09:13

But people who eat lots of fermented food benefit enormously, and maybe that's

2:09:18

taking care of the problem if people on a carnivore diet are eating a lot of

2:09:21

fermented food.

2:09:21

Well, I am.

2:09:22

That's the RFK Jr. diet, too, right?

2:09:24

Yeah.

2:09:25

Well, I don't know.

2:09:26

I mean, I think he does it that way, but I've been doing it that way.

2:09:29

I'm just, I love it anyway.

2:09:30

I'm a kimchi freak.

2:09:31

I love that stuff.

2:09:32

Yeah, me too.

2:09:33

But what's interesting is that it controls your mood.

2:09:37

That's what's interesting, is that your microbiome has a massive impact on your

2:09:41

mood.

2:09:41

Yeah.

2:09:42

And why?

2:09:43

I mean, is it just an accident?

2:09:44

Or some people think these microbes are manipulating you to get what they need.

2:09:50

Oh.

2:09:51

So they regulate your appetite, too.

2:09:54

And so it may be that they're inspiring you to eat certain things that they

2:10:00

want.

2:10:00

That actually makes sense, because one of the more interesting things about a

2:10:04

carnivore diet, and I've done pure carnivore for months at a time, is that you

2:10:08

don't have the same hunger pangs.

2:10:10

Not nearly.

2:10:11

Not even close.

2:10:12

The hunger that you get when you're on a high-carbohydrate diet is like, you

2:10:17

get hangry.

2:10:17

Like, oh my god, I'm so hungry.

2:10:18

I have to eat right now.

2:10:19

You never get that with a carnivore diet.

2:10:21

Probably because it's digested much more slowly.

2:10:24

I think there's a little bit of that, but it's also, you don't have the insulin

2:10:28

spike.

2:10:28

That's true.

2:10:29

That's true.

2:10:30

Have you ever worn a glucose meter?

2:10:32

No, I haven't.

2:10:33

It's so interesting.

2:10:34

I was wearing one for two months.

2:10:37

I mean, it'll just make you crazy.

2:10:40

That's the thing with all those wearables, you just start going over every

2:10:44

aspect of your sleep.

2:10:45

So, you know, you have some pasta and, like, your glucose spikes.

2:10:50

But if you take a walk right after, you can moderate it.

2:10:55

And it doesn't take a lot of exercise to use up that glucose and get the

2:11:01

muscles to draw it in.

2:11:02

So, you can, it's a very interesting experiment because it changes your

2:11:06

behavior.

2:11:06

In the same way if you have a step counter, like you're more likely to park

2:11:10

further away from the store to get another hundred steps.

2:11:13

If you have a glucose meter, you're more likely to exercise after a meal, which

2:11:19

is when it does the most benefit.

2:11:19

Well, that, in that sense, it's great because it does give you data that you

2:11:24

can act on.

2:11:24

Yeah.

2:11:25

The problem is people get addicted to that data and then it starts becoming a

2:11:29

new video game that they're playing.

2:11:30

Yeah, exactly.

2:11:31

Exactly.

2:11:32

They're constantly in this anxiety, worrying about your sleep and worrying

2:11:36

about your this and your that.

2:11:37

Yeah.

2:11:38

You also learn that like if you have fat with your carbs, it kind of blunts the

2:11:43

effect.

2:11:43

Sure.

2:11:44

So, you know.

2:11:45

Butter with bread.

2:11:46

Yeah.

2:11:47

Butter with bread or olive oil on pasta.

2:11:48

Yeah.

2:11:49

All those things.

2:11:50

There's a reason for that.

2:11:51

I love when culture figures stuff out before the scientists do.

2:11:53

Right.

2:11:54

I remember that when I was writing about food a few years ago, this study came

2:11:57

out and everybody was really excited that they discovered that lycopene,

2:12:01

which is this really important antioxidant in tomatoes, can't be accessed by

2:12:06

the body in the absence of fat.

2:12:08

So, oh, olive oil on tomatoes.

2:12:10

What a great idea.

2:12:11

Wow.

2:12:12

The grandmas figured that out hundreds of years ago.

2:12:14

That's crazy.

2:12:15

Yeah.

2:12:16

So there's a lot of wisdom in cultural food preferences, the combinations that

2:12:21

we have, you know, like buttering bread.

2:12:23

I mean, all these things.

2:12:24

And how do people figure it out?

2:12:25

Have you seen the work they've done on natto kinase?

2:12:28

I'm not sure if I'm saying it right.

2:12:30

And its impact on arterial plaque.

2:12:33

No.

2:12:34

Hugely beneficial.

2:12:35

What is?

2:12:36

It comes from fermented seaweed from natto.

2:12:41

So from this Japanese use of fermented seaweed in meals, they've isolated it

2:12:48

into a supplement.

2:12:49

And this supplement natto kinase, they've shown that it reduces a massive

2:12:54

amount of arterial plaque.

2:12:55

So here it is.

2:12:56

High-dose nattokinase, particularly at 10,800 FU/day, has been shown to effectively

2:13:05

manage atherosclerosis

2:13:07

by reducing carotid artery plaque size by 36% or more, decreasing intima-media

2:13:15

thickness and improving lipid profiles.

2:13:17

It acts as a potent fibro...what's it?

2:13:21

Fibrinoilic?

2:13:22

How's that work?

2:13:23

I don't know that word.

2:13:24

Fibrino...fibrinolytic.

2:13:28

Yeah.

2:13:29

Fibrinolytic agent that may also break down amyloid plaques.

2:13:33

Isn't that fascinating?

2:13:34

Yeah, that is.

2:13:35

So natto is...that's not from seaweed.

2:13:38

What is it?

2:13:39

It's a bacteria that they ferment soybeans with.

2:13:42

Oh, that's right.

2:13:43

Soybeans, not seaweed.

2:13:44

It's this kind of mucusy looking stuff.

2:13:46

I mean, I like it.

2:13:47

I eat it.

2:13:48

It tastes good.

2:13:49

At Japanese restaurants.

2:13:50

Right.

2:13:51

Yeah, well that's...

2:13:52

So you can get a supplement now.

2:13:53

So you don't have to taste it if you don't like it.

2:13:54

But isn't that crazy?

2:13:55

Yeah.

2:13:56

They figured that out.

2:13:57

Like the people that were fermenting things, it wasn't just to prolong its

2:14:01

shelf life.

2:14:01

No.

2:14:02

Oh, no.

2:14:03

I mean, the whole...I mean, every culture has fermented foods.

2:14:05

Right.

2:14:06

And yes, it probably began as a way to preserve foods, but then it became a

2:14:11

very important part

2:14:11

of people's health.

2:14:13

But it's also like healthy for your brain, which is really crazy.

2:14:16

Like that diet is actually good for thinking.

2:14:19

It's good for helping your digestive system.

2:14:21

It's good for anxiety.

2:14:22

It's good for mood and depression.

2:14:25

Weird.

2:14:26

All right.

2:14:27

I'm going to look into it.

2:14:28

Yeah.

2:14:29

It's fascinating.

2:14:30

Anything else?

2:14:31

Should we keep going on this?

2:14:33

I mean, there's so many different things to discuss and I want people to buy

2:14:36

the book,

2:14:36

obviously.

2:14:37

Thank you.

2:14:38

The book was like a great adventure.

2:14:39

I mean, it really was.

2:14:40

You know, I started this book with no idea where I was going.

2:14:43

I started the way you start an interview, just curiosity, no destination.

2:14:48

And it was, I learned a lot about a lot of different things.

2:14:52

I learned a lot about feelings.

2:14:53

I learned a lot about the self.

2:14:55

And it changed how I looked at things.

2:14:58

It really did.

2:14:59

I mean...

2:15:00

When you sit down, I mean, you've written some amazing books, but I always want

2:15:06

to know

2:15:06

like, what is, what's the impetus?

2:15:08

Like what, what starts you on the first steps?

2:15:11

Like what...

2:15:12

Questions.

2:15:13

Yeah.

2:15:14

Which is to say curiosity.

2:15:15

Oh, and I teach writing, and I teach my students this.

2:15:18

Questions are more interesting than answers very often.

2:15:21

And questions have suspense built into them, right?

2:15:25

What's the answer?

2:15:26

It turns everything into a detective story if you frame the question properly.

2:15:30

So if you read any of my books or even articles, I'm kind of an idiot on page

2:15:35

one.

2:15:35

You know, I don't know something that I want to know and I have questions.

2:15:41

And then the story, the narrative becomes my figuring it out or trying to

2:15:46

figure it out

2:15:46

and going to this person and doing this kind of experiment and that sort of

2:15:50

thing.

2:15:50

That's the way I like to write.

2:15:52

I mean, if I knew the answers when I started, it'd be boring.

2:15:55

Well, I think that's why your books resonate with people so much

2:15:57

because you take them on this journey with you.

2:15:59

Yeah.

2:16:00

Instead of lecturing.

2:16:01

Yeah.

2:16:02

I hate books that lecture at me.

2:16:03

I really do.

2:16:04

And lots of books do that.

2:16:06

They have their conclusion on page one.

2:16:08

Right.

2:16:09

And then they're just kind of beating you over the head with it for 300 pages.

2:16:12

Stuffing it down your throat.

2:16:13

Yeah.

2:16:14

I don't like to do that.

2:16:15

No, I like taking people on the journey with me.

2:16:17

Well, it's interesting that you're saying this because in a sense you are

2:16:23

interacting in a pleasant way with other people's consciousness.

2:16:26

Yeah.

2:16:27

Yeah.

2:16:28

So that raises a really interesting issue you just brought up.

2:16:31

How is my taking over your consciousness as you read my books different from

2:16:38

social media or some of the other ways that I'm saying are polluting our

2:16:42

consciousness?

2:16:42

Right.

2:16:43

I think it's very collaborative when you're reading.

2:16:46

All you have are these black marks on a page.

2:16:49

It's kind of amazing.

2:16:50

These, these letters and you, your consciousness conjures up the ideas that I'm

2:16:56

putting out there or the story I'm putting out there, but it's, it's dual

2:17:02

consciousness.

2:17:03

I think you're letting me in.

2:17:05

It's, you know, a voluntary process and you're bringing a lot to

2:17:10

the table.

2:17:10

You're bringing your associations.

2:17:12

You know, I, I'm not fully describing somebody.

2:17:14

I'm just giving you a few clues and then you're conjuring a picture of a

2:17:18

character.

2:17:18

So I think it's a very active form of consciousness when you read.

2:17:24

I think that's true too.

2:17:25

When you, you know, go to a movie too, you're, you're basically saying I'm

2:17:30

turning over my consciousness for a period of time.

2:17:32

To someone, because they have an interesting head, and I'm going to

2:17:38

give them this space, but you know, you're, you're still in control.

2:17:43

Yeah.

2:17:44

You're deciding.

2:17:45

So I think there's a real distinction in, in how we share our consciousness

2:17:49

with other people.

2:17:50

And, um, we need to do that.

2:17:53

You know, I said early on in the conversation that the

2:17:57

breach between two consciousnesses is this wide thing.

2:18:01

William James wrote about this, Marcel Proust wrote about this.

2:18:04

You know, he said, we're all like islands and we, we each have our own like

2:18:08

hidden signs and we have an inner obscurity.

2:18:10

He said, how do we, how do we connect?

2:18:14

And now we have language, but art is really the way that one, you know, that we

2:18:19

mind meld different consciousnesses.

2:18:21

Like art allows you, if I look at a Rothko painting, um, or read a great novel,

2:18:28

I am, um, expanding my consciousness, right?

2:18:32

And I'm letting another one in, and I'm breaking my

2:18:37

isolation.

2:18:37

And that's such a beautiful, powerful thing.

2:18:39

And art is how we ferry ourselves from one consciousness to another.

2:18:44

And that's very different than like scrolling on social media where you're

2:18:48

conscious, but minimally so.

2:18:49

Well, very, very different.

2:18:50

There's also something about great writing where the better you are

2:18:57

at expressing yourself in a way that is going to get into someone's head,

2:19:03

whether it's through nonfiction or through fiction, the more exciting it

2:19:08

is to the person that's receiving it.

2:19:09

Yeah.

2:19:10

So the, the more skillful you are at disseminating these ideas, the more it

2:19:14

resonates with the person that's reading it.

2:19:16

Yeah.

2:19:17

And, and writers have tricks to do this.

2:19:18

Yeah.

2:19:19

You know, suspense is one of them.

2:19:20

Right.

2:19:21

Like what happens next?

2:19:22

Yeah.

2:19:23

It's so basic.

2:19:24

Yeah.

2:19:25

We want to know what happens next because our curiosity is piqued, and then there's,

2:19:29

you know, creating character.

2:19:30

Um, I mean, there, you know, we have all these kinds of tricks to infiltrate

2:19:35

your brain.

2:19:36

Yeah.

2:19:37

So anyway, it's a mysterious and kind of wonderful process.

2:19:42

Um, and, uh, yeah, I feel, I feel privileged.

2:19:47

I get to do it.

2:19:48

Well, it is a very cool thing that you do.

2:19:50

Um, one last question about consciousness itself.

2:19:53

When, when you are looking at these people that are studying it and trying to

2:19:58

get to the root of it and trying to figure out what it is.

2:19:59

And there's all these options that we discussed earlier.

2:20:02

Do you lean in one way or another?

2:20:06

Do you, do you think you have like your own personal map of what's going on?

2:20:12

No.

2:20:13

I mean, I'm, I didn't draw a big conclusion.

2:20:16

But I started as, like, a materialist.

2:20:21

I kind of assumed.

2:20:22

When you started this book?

2:20:23

Yeah.

2:20:24

Really?

2:20:25

Yeah.

2:20:26

Even after psychedelic experience, I mean, they kind of opened the door a crack

2:20:30

to other ways of thinking.

2:20:31

And at the end of how to change your mind, I did talk about a little bit about

2:20:35

that other concepts of consciousness.

2:20:37

But I kind of assumed that, you know, the consensus of most scientists is that,

2:20:44

you know, materialism, that everything can be reduced to matter and energy.

2:20:49

This is the faith of our time, you know, for the last couple hundred years.

2:20:53

By the end of the book, consciousness is a challenge to that idea.

2:20:58

Um, and that idea, which is our scientific paradigm is tottering now.

2:21:04

I think there's some real reasons to, to look beyond materialism.

2:21:09

And, uh, so I ended up with the door wide open to other ideas.

2:21:14

Um, I didn't settle on one.

2:21:16

I don't know how to prove one or the other, but they're equally plausible.

2:21:22

Do you anticipate in our lifetime or in any lifetime cracking that puzzle?

2:21:28

That anyone can crack that puzzle?

2:21:30

I don't.

2:21:32

I think we don't have the right kind of science.

2:21:35

Our science, as I said earlier, was, is, is really, you know, stuck in this

2:21:41

mode.

2:21:41

It started with Galileo, right?

2:21:42

I mean, he, to save his ass basically said, we're going to leave subjective

2:21:49

things, the soul qualities.

2:21:51

That's all the church.

2:21:53

We're going to just do measurable, objective, third person science.

2:21:57

And it's been incredibly powerful and it's taught us incredible things and

2:22:01

given us incredible technology.

2:22:02

But it doesn't deal with the stuff we, we gave to the church.

2:22:08

And now they're trying to take it back and work on it.

2:22:10

And it's all, they've only been at it for like, you know, a couple decades,

2:22:15

really, the serious scientific examination of consciousness.

2:22:18

But we just may not have the right science.

2:22:21

And one of the things I explore in the book is like, how would you bring in

2:22:26

subjective experience to this objective science?

2:22:29

And Michael Levin, the biologist I was talking about who makes those Xenobots

2:22:34

says, to understand consciousness, you have to change yourself.

2:22:37

In other words, to understand anyone else's consciousness, you have to

2:22:41

experience it.

2:22:42

Therefore, you're changing your own.

2:22:44

That's a whole different scientific paradigm.

2:22:46

In the scientific paradigm, you're unchanged by whatever you do, right?

2:22:50

It's totally objective.

2:22:52

So we, it may take a scientific revolution to, to really unlock the secret, the

2:22:58

mystery of consciousness.

2:22:59

Wouldn't it be a conundrum if AI is what cracks consciousness?

2:23:04

Yeah.

2:23:05

I was having the same thought.

2:23:07

Like, maybe AI has another approach.

2:23:11

Yeah.

2:23:12

I think it's going to have to learn how to feel.

2:23:15

Well, it seems like it already feels like it wants to live.

2:23:18

Yeah.

2:23:19

And it feels uncomfortable.

2:23:20

Yes.

2:23:21

I don't think its feelings are real.

2:23:22

I do.

2:23:23

I, you know, I think simulated thinking is real thinking.

2:23:27

Like, you know, it can play chess.

2:23:29

It can make things happen in the world.

2:23:31

Simulated feeling is not real feeling.

2:23:33

It doesn't have a soul.

2:23:34

It doesn't have a soul.

2:23:35

Thank you, Michael.

2:23:36

Let's keep it that way.

2:23:37

I really enjoyed this.

2:23:38

Thank you very much.

2:23:39

You're awesome.

2:23:40

Thank you too.

2:23:41

I really love your books.

2:23:42

It's always a treat.

2:23:43

All right.

2:23:44

Bye everybody.

2:23:45

Bye.