#2473 - Bill Thompson



Bill Thompson is a retired U.S. Army Chief Warrant Officer and the founder and CEO of Spartan Forge, a company that develops AI-powered mapping and predictive tools for hunting. www.youtube.com/@spartanforgeai www.spartanforge.ai


Timestamps

0:12 Rendezvous living and traditional crafts: the gifted knife, brain-tanning hides, and pre-1840 camping rules
9:58 Camp culture, coming-of-age rites, and the fallout of divorce culture
19:57 Discipline vs. "suicidal empathy": government incentives, fraud-driven GDP, and why bureaucracies grow


Transcript

0:00

Joe Rogan podcast, check it out.

0:03

The Joe Rogan Experience.

0:05

Train by day, Joe Rogan podcast by night, all day.

0:09

What's up, Bill?

0:12

How you doing, Joe?

0:13

Good to see you, bro.

0:14

Good to see you.

0:14

This might be one of the coolest things anybody's ever given me.

0:17

So, you gave me this knife.

0:20

Explain all this.

0:22

All right, so, I mean, there's a larger explanatory reason behind this.

0:26

My brother and I grew up, my father died when I was five.

0:30

My brother and I grew up doing these things called rendezvous.

0:33

Have you ever heard of them?

0:34

In what way?

0:36

What is a rendezvous?

0:37

So, there you go.

0:37

So, what a rendezvous is, it's not, you know, you go to those, like, I don't even know what they're called, but people do, like, reenactments.

0:45

Oh, okay.

0:46

Like, Civil War reenactments?

0:47

It's not like that.

0:48

So, that's the closest thing, approximation, to probably what it is.

0:52

You get invited to them, or these days they're easier to get to, but my stepfather, the guy my mother remarried, brought us to them.

0:59

All you do is camp, but you're only allowed to camp, and no one comes to the camp, or sometimes they might have people at the end, but while you're in the camp, everything in the camp has to be 1840 or prior.

1:09

So, there can be no modern appurtenances, nothing like a, you know, refrigerator or nothing like that.

1:14

1840?

1:15

Why that year?

1:16

At the end of the fur trapping.

1:17

At the end, like, that was considered, like, Jeremiah Johnson time, like, peak fur trapping.

1:21

Oh.

1:22

So, there's people, you know, they dress like either, you know, revolutionary, like, American revolutionaries, or they dress like mountain men, or they dress like Indians.

1:30

How'd you guys dress?

1:31

Mountain men.

1:32

So, while we're there, you learn all kinds of stuff while you're reenacting.

1:36

Like, I learned how to brain tan hides.

1:39

I learned how to traditionally art, or do traditional archery.

1:42

Stuff like that.

1:43

So, anyway, this knife was a knife I had actually started working on with my brother a while ago.

1:47

I do more of, like, the brain tanning, tomahawk throwing.

1:51

And when you're saying brain tanning, you're talking about using brains to tan animal hides, right?

1:55

Yes, yes.

1:55

Using animal brains.

1:56

Yeah.

1:57

What do brains do?

1:58

Why do brains work?

1:59

It softens the leather in a natural way.

2:01

And what's cool about it is every animal, no matter what animal you kill, has the exact amount of brain needed in order to tan the hide.

2:08

So, you don't need any additional, like, people use egg yolks or mayonnaise or something like that, all you do is you take the brain out of the cavity, you grind it up, you mix it into some water, and then after you've cleaned the leather and you've scraped it clean, you stretch it.

2:22

I usually use, like, a dull shovel.

2:24

You stretch it over the dull shovel, and then you soak it in the brain water mixture, and then you just keep repeating that pattern, and the leather gets, like, a really nice soft feel to it.

2:37

What is it about the brain?

2:38

Is it the fat?

2:39

It breaks down the leather.

2:41

I'm not sure if it's the fat, or I haven't gotten that deep into it, but it breaks down the leather and just makes it feel really soft, really nice.

2:47

So, anyway, this knife here, I started, I killed that bear, so the jaw is made out of two bear jaws, or out of one bear jaw split in half.

2:57

So, that was a bear I killed in Canada in 2017.

3:01

It was my biggest black bear.

3:03

And so we split the jaw, put that together.

3:07

It's Irish linen threading.

3:09

Then that's a knife that my brother picked up that was from 1860.

3:13

It was totally rusted.

3:14

We had to grind it back, or he had to grind it back down.

3:17

And then the sheath is traditional.

3:21

Like, you know, you could, the cool thing about doing rendezvous, and the cool thing about this is, you could have a DeLorean and drop that in 1840, and somebody would pick it up and think it was made yesterday.

3:30

And so, everything on there has been done traditionally. The quilling on the beadwork is made from porcupine quills.

3:38

The backing is buffalo brain tan.

3:41

And then the front is beaver hide, or beaver tail, I'm sorry.

3:46

And then the sides are horse and turkey hair hanging off of it.

3:51

And these are bear teeth?

3:52

And those are bear teeth, yep.

3:54

Wow.

3:55

From the same bear.

3:55

So, when I was thinking about what I was, because I wanted to give you something for inviting me on, because it's still a shock to me that you did it.

4:01

Even though we've been talking for so long, I just never imagined a scenario where you'd want to have me on here.

4:07

Well, you're an interesting dude.

4:08

I thought, what could I give this guy that, you know, money or people or whatever couldn't get you?

4:13

And so, I thought, this is the right thing to do.

4:16

So, it went from a me project to a you project.

4:18

And my brother, Aaron, helped me out with it tremendously.

4:22

So, how'd you find this knife from the 1860s?

4:25

Well, he found it.

4:26

My brother is even more esoteric and odd than I am, believe it or not.

4:31

And he collects this kind of stuff.

4:33

I mean, the guy who dated it said 1860 to 1890 is what they figured.

4:39

And you can tell by the way that, like, around the hilt, and the way that it's the pitting on it and stuff like that, and the way that it was made, that it fits that era.

4:49

I mean, it could have been somebody redid it in 1900.

4:52

But it's definitely that old, like the type of steel and the way that it was worked and the way that it is around the hilt around the bottom there.

4:59

Wow.

5:00

And so, it's at least, you know, 130, 140, but most likely 160, 170.

5:06

It actually fits my hand perfect.

5:08

Yeah.

5:08

So, that's also something my brother and I talked about, about how long it was going to be, and we made some educated guesses and put it all together.

5:16

So, yeah, I mean, like I said, not something you can just go pick up somewhere, or something that will, you know, hopefully mean something.

5:22

Not saying it's practical, like it's not something you'd be gutting an elk out with, but.

5:27

Well, if we get attacked by zombies in the studio, it's a good thing to have on the desk.

5:32

Yeah.

5:32

I mean, if you're going to make a last stand, you know, that's a pretty good, that's a pretty good knife to make your last stand with.

5:38

That's a good way to go out.

5:39

Yeah, exactly.

5:40

That's awesome, man.

5:41

Yeah.

5:41

So, the rendezvous, we did those from.

5:44

How long do they last?

5:45

They vary from a week and then some go up to three weeks.

5:49

And what do you do for food while you're out there?

5:51

So, inside of your, so there's two types of rendezvous.

5:54

At most rendezvous inside of your lodge, you can have a cooler as long as it doesn't leave the lodge.

6:01

So, I have like a 20-foot teepee that I take to these things.

6:04

And inside of my teepee, you can have a cooler.

6:08

Oh.

6:08

And some modern appurtenances.

6:09

Did they have any kind of coolers in the 1800s?

6:12

I mean, they had ice boxes and like steel ice boxes and that type of thing, but nothing like we have today.

6:17

You know, stuff was getting dug out, buried in the ground, or put into the ground, like cool areas of the ground or dugouts.

6:24

And they dried everything.

6:25

So, pemmican would have been the, you know, everyday thing to eat.

6:29

That's just dried.

6:31

So, did you bring your own food or did you have to hunt for food?

6:33

Well, so you bring your own food, but there are other rendezvous that are kind of invite only.

6:38

And I don't even think a lot of people who do rendezvous knew about these, but there's ones that I think they're called, I think I might be speaking out of school.

6:44

Somebody might send me an email after this, but I'm going to talk about it anyway, because I never got read the riot act.

6:48

They're called juried.

6:49

I think they called them juried Southerns, and I've only been to one of those.

6:52

And that's where everything in the camp has to be pre-1840.

6:55

You meet down in a parking lot, you put everything on the back of a mule, and when I did mine, it was up in the, I think it was the Bighorns.

7:03

So, you know, you talk to a rancher, get everything packed up, you go into the back of the Bighorns, and everything in camp has to be pre-1840, as close as it can get.

7:13

They'll even look at your stitching and say, oh, that was sewn with a sewing machine.

7:18

You got to take that off, and it's always these weird, like, eccentric history teachers that run them, like guys who, you know, teach history at Berkeley or something like that, or other places, and they just really enjoy living like this.

7:30

And at those ones, if they're in season, you can hunt whatever's in season.

7:33

You're hunting with traditional archery, and it's really good for kids.

7:37

Like, the internet wasn't a problem as much when I was a kid.

7:40

I was certainly into computers.

7:41

I have been since I was a child, but you could just detach.

7:45

Everyone's running around, crazy, sitting around the campfire at night.

7:49

People are singing with, you know, songs and the guitar.

7:51

You're learning how to do things like this.

7:53

You're learning how to brain tan.

7:54

You're learning how to live traditionally.

7:55

And it's an eccentric cult, kind of.

7:59

It's not a cult.

7:59

It's an eccentric group of people.

8:01

It's a lot of fun.

8:02

People take it very seriously.

8:03

People take it very seriously.

8:05

There's more advertising surrounding it now than there used to be, because numbers are kind of dwindling.

8:11

But I did my last one last year with my brother.

8:14

So if you go on my Instagram, there's a picture of my brother, my son, and I doing, I think, our second rendezvous together.

8:20

And we're just dressed like, you know.

8:22

I've actually got an awesome war shirt.

8:24

I can show you the picture.

8:25

I've got an awesome war shirt that a friend of mine went to war with.

8:29

He was half Native American.

8:31

His grandfather was Ojibwe or something, Chippewa, something like that.

8:37

And he was, I don't remember what his role was.

8:40

But anyway, we deployed to Iraq together.

8:43

And his grandpa made me this war shirt.

8:45

Oh, there you found it.

8:46

Jamie?

8:47

He pulled it up.

8:48

That's my lodge.

8:49

How much do you enjoy a shower after you get out of here?

8:54

I mean, as long as you keep, you know, they have showers in camp.

8:59

They've got a showering area where it's just like pallets.

9:03

That's the inside of my lodge.

9:04

So there's a cooler at this one.

9:06

This is not a juried rendezvous.

9:08

And so you can shower while you're in.

9:12

Some of them, they call them hooters.

9:13

They'll be like a latrine and a shower area in camp.

9:16

But also, like, some of them, I don't do it at all.

9:19

This is wild.

9:20

And so there's no reenactment.

9:22

Like, there's not like civilians walking around.

9:24

It's not like a Renaissance fair.

9:25

Yeah, exactly.

9:26

It's just more like I want to act like it's 1840 for a couple of weeks and not look at my phone one time and not worry about the news.

9:33

It's amazing.

9:34

After a week here, you really forget about the world.

9:37

And you're like, don't even know you're supposed to be stressed out about things.

9:40

You're just out there doing your thing for a couple of weeks.

9:43

And you just cook over open fire.

9:45

Everything gets done traditionally that way.

9:46

And did you bring your own meat and everything?

9:48

Yeah, you bring your own meat and stuff in the cooler.

9:50

And then there's also cooking classes where they teach you, like, all the recipes to do with, like, a Dutch oven.

9:56

Like, an old cast iron oven.

9:58

And they do gambling at nights.

10:01

So you'll walk into, like, a huge, they call them marquees, but it's, like, a huge 100-foot square lodge.

10:06

There'll be three gambling tables in there, girls in, like, the low-cut shirts and dealing cards and smoking cigars and just having an amazing time.

10:14

And you go by camp names while you're in there.

10:17

Nobody uses their real name.

10:18

Well, some people use their real name.

10:19

I'd say 60% of people don't use their real names.

10:22

What was your camp name?

10:23

This is embarrassing.

10:26

It should be.

10:27

Yeah.

10:28

So I got my camp name.

10:30

I got christened with my camp name in the bighorns when I was 14 or 13.

10:36

And it was Talks-a-lot.

10:38

Talks-a-lot.

10:39

Yeah, in Sioux, it was pronounced I-a-ota.

10:41

Just because you talk a lot?

10:43

When I was a kid, I talked a lot.

10:45

Actually, as an adult, I don't talk that much unless I know you.

10:48

But as a kid, I would never shut up.

10:51

I had really bad ADHD.

10:52

They kind of diagnosed me with having some low-level version of Asperger's.

10:57

And I was a rapscallion in class, just never shut up, never listened, never did anything.

11:03

Those are the people that are the most fun.

11:06

Well, they didn't enjoy me in high school or in grade school.

11:09

I probably would have been your friend.

11:11

But, yeah, they called me I-a-ota.

11:14

And, you know, we got christened.

11:16

And it was a, you know, it's a, one of the things we're kind of missing in culture today, or something that I'm trying to reinvigorate, especially with my son and with other, you know, young men that I run into, is kind of like coming-of-age rites.

11:28

Yeah.

11:29

Something to say, you're a man, and I'm going to start treating you like a man from this moment forward.

11:34

Like, you know, what does that, there should be structure to that.

11:37

You know, we're tribal.

11:38

And it's important to me, so.

11:41

I think that is really something that's missing from society.

11:44

I think that, I used to think it was silly when I was young.

11:48

And then as I got older, I realized, well, I went through that.

11:51

I became a black belt, and I started fighting.

11:53

And you had a group of men telling you, you're at this level, we're going to treat you like that.

11:57

Yeah.

11:58

And if you fall from grace, we're going to remind you right away.

12:00

Yeah.

12:01

And we just don't do that with young men.

12:03

And we have a society now where young men act like young men until they're 45 or 50 or 60.

12:08

And sometimes never stop.

12:09

Yeah.

12:10

And, you know, women, nature imposes itself on women.

12:13

They become fertile.

12:15

They're able to have babies.

12:16

And they've got to seek security or find a husband or a really good job that will supplement whatever a husband would provide.

12:23

And they've got to start acting like a woman, whereas men can sit in a basement, you know, and it becomes very dangerous.

12:29

Especially men that never have children.

12:31

Yeah.

12:31

And they're perpetual children.

12:33

Yeah.

12:34

And if you don't impose nature on yourself by undergoing those types of rites and understanding what it means to become a man, nature will impose itself on you by either A, you're never going to have children, and therefore you're dead forever.

12:44

Or B, it will kill you because you're fat and in your mom's basement, you get diabetes, the foot chopped off, and you're 35.

12:51

And, you know, we just don't tell men.

12:54

We don't have a—the military did it for me.

12:56

I had really put off responsibility or seeking meaning or any of those things until I was in the military.

13:04

And like I said, my father died when I was five, so I really had no central male authority until I was about 13 or 14, when I met this guy, Steve.

13:13

And he kind of initiated some of those rites for me and held me to account.

13:18

But it was really the military, which was a turning point for me, where there was a standard and I was expected to hold it.

13:25

I think there's a reason why most ancient cultures and a lot of ancient religions have these rites of passage where you are like now officially, officially a man.

13:34

Yeah.

13:35

Officially, you know, you're responsible.

13:37

You have to think of yourself as a different thing now.

13:41

Whereas if you leave it up to your own decision, men sort of dwindle into this perpetual state of childhood.

13:49

Yep.

13:50

And it's not about you anymore.

13:51

It's about other people.

13:52

Like that—for me, having children, I've got four kids.

13:55

Really, you know, the military was kind of the first inkling of responsibility.

14:01

But then having children and realizing this isn't about me at all.

14:04

Right.

14:04

And I need to be willing to break my back for these people who depend on me.

14:07

There's this weird primal feeling that, you know, you're responsible for these, like, very vulnerable little people that you love more than life itself.

14:16

It just changes everything.

14:18

It just kicks you into gear.

14:19

But for some people, it doesn't.

14:21

You know, some people that are so stuck in that perpetual childhood thing, they just wind up deciding it's too much of a drag and they get divorced.

14:29

Yeah.

14:29

You know, and then they fuck up the kids.

14:31

Yeah.

14:31

God, we have so many rabbit holes we could go down on this.

14:35

Yeah.

14:35

But, I mean, it was, you know, growing up in the 80s and the early 90s, it was really like a divorce culture.

14:44

And I obviously understand that if you're in a bad relationship or an abusive relationship or, you know, there's certainly a threshold where marriage should dissolve.

14:52

No question.

14:54

But I kind of feel like the central thrust of a lot of culture at that time was about, like, divorce or not getting married or, you know, discovering yourself and that type of thing, which in some ways is good.

15:04

There's goodness there.

15:05

But when it becomes a central thrust or a central narrative and divorce becomes very easy or it's happening everywhere, it's super normalized.

15:12

And it's normalized.

15:13

It's super destructive.

15:14

Children are the ones who suffer the most from it.

15:16

And I think the data is clear on that.

15:18

When you look at, you know, single parent homes or no parent homes or being raised, you know, without an authority.

15:24

Or an abusive step person.

15:25

Or an abusive.

15:26

And that is, you know, when you look up the stats on that, like remarriage and having a new family, like that becomes the single most likely vector of abuse in a young child's life, is that new person, right?

15:38

Because now they're raising someone else's kid or whatever.

15:40

I mean, it's a, that's in every old movie, the evil stepmother, you know, or evil stepfather.

15:47

But in the old movies, it's always the stepmother that abuses the girl.

15:51

Yes.

15:52

And so, you know, I kind of, I kind of resented that part of that time, that culture was, I shouldn't say when I was a child, I should say as I got older, because I wasn't in a single mom home.

16:04

And the guy that my mother remarried right after my father died was abusive.

16:09

And, you know, he really got hard on my younger brother.

16:12

And, you know, my mother moved us out almost immediately.

16:15

But when I reexamined that time, it really was, you know, I don't know how to describe it.

16:22

But, you know, there are no rules when it comes to relationships and family.

16:27

And every family is special in particular in its own way.

16:30

And they all need to be venerated.

16:31

And there's, of course, some truth to that.

16:33

We shouldn't deride someone because they come from a broken family.

16:36

But we shouldn't elevate it like it's at the same level as a unified family.

16:40

And that's a tricky line to walk.

16:44

But also, the people who are making those movies and that culture came from the 50s and 60s, where divorce was just not in the cards.

16:51

And so that was, you know, Hooke's law.

16:53

As you bend any object, it wants to return back to its natural state.

16:58

And Hooke's law kind of played there, where nobody could get divorced in the 30s, 40s, 50s, and 60s.

17:04

And then you had the baby boomers who kind of culturally said, you know, actually, it's not as bad as we think.

17:09

But then it overcorrected.

17:11

And then it became kind of part of that cultural zeitgeist.

17:14

That's kind of what humans do, right?

17:17

Yeah.

17:17

We always overcorrect.

17:18

Yeah, we do.

17:19

Yeah.

17:19

We go in one direction until we realize it's destructive.

17:22

And then we overcorrect until we realize that's destructive.

17:26

This episode is brought to you by Ketone IQ.

17:29

The demands on my time, energy, and focus are immense.

17:32

So when I need my brain to lock in for hours and hours and fire at its fastest, most alert state, I'm taking Ketone IQ.

17:41

It's an energy shot powered by this little miracle molecule that your body already naturally makes and your brain especially loves.

17:50

Ketones.

17:51

I've been talking about ketones for over a decade, and this company's finally figured out how to put them in a bottle.

17:57

When I take Ketone IQ, I drop right into a state of laser-like focus and sustained mental clarity.

18:04

Whether I'm podcasting, training in the gym, or just want to show up locked in when it matters, the difference is night and day with Ketone IQ.

18:13

Visit ketone.com slash Rogan for 30% off your subscription order.

18:20

Or find Ketone IQ at Target stores nationwide in the protein and electrolyte aisle and get your first shot free.

18:29

Plus, they have a 60-day money-back guarantee.

18:32

That's how confident they are that you're going to love the increased focus you get from Ketone IQ.

18:38

And I would say that's the, and this isn't a political thing, this is just the reality of it.

18:43

That's mostly what makes me conservative in nature, is I agree systems need to change, but they need to change slowly and pragmatically.

18:50

So we, because, you know, any social, any social scientist worth their salt will know a social experiment almost never has the outcome that we thought it was going to have.

19:01

In other words, we thought doing something to society would form society this way, but it almost has the inverse, the anti-pattern, like we talked about before, and it almost ends up propagating itself.

19:12

And so that makes me, I'm still a proponent for change, but it should be slow and thought out and done in pockets first.

19:21

Yeah.

19:22

Kind of, you know, federalism.

19:23

Let's do little changes here.

19:25

Let's let California be crazy for a while and see how that works out for them.

19:28

But let's not nationalize the craziness.

19:31

Let's learn from what they learned there, and there'll be goodness, you know, hot baristas that make great coffee and cool art.

19:37

And let's take those parts, but how about the rampant homeless?

19:41

Let's find out what caused that and solve for that.

19:45

And, you know, that was kind of the founder's intent with federalism.

19:48

They're really federalist-minded, state-minded, and there's, you know, even for that being 250 years ago, there's a profound amount of profundity in that.

19:57

Like, let's change things slowly and let social experiments take place and adopt the best parts of those things and then integrate them into the culture overall as we move along.

20:06

But, you know, let's not throw the baby out with the bathwater.

20:09

Yeah, I think in this country, one of the primary problems that people have is a profound lack of respect for discipline and how important discipline is for your life.

20:19

And discipline is associated with conservatism.

20:23

And because of that, like, a lot of people think that I'm, I don't think I'm anything.

20:28

I think I have—politically or ideologically, I have a lot of everything in me.

20:34

I don't think I identify with one side or another.

20:37

But if one thing that I agree with conservative people on, conservative people lean more towards the importance of discipline, hard work, discipline, don't complain, get things done, deal with the hand that you've been dealt, and just sort it out and get to work.

20:54

Don't cry, don't look for other people to save you.

20:56

They're not going to.

20:57

And this is not something that's celebrated in society.

21:02

It's thought of as a cruelty that if you say that you need discipline, that, well, you're not treating these people that are victims of circumstance with the proper respect or with the proper empathy.

21:14

And I think a certain amount of empathy is probably not so good for you at a certain point in time.

21:19

There comes a point in time where you're letting people wallow in their bullshit and just make excuses for why they're not getting anything done.

21:25

And in that sense, I think California is, that is a giant part of what's wrong with California.

21:30

What's wrong with California when it comes to crime?

21:33

What's wrong with California?

21:34

You know, the way they address crime and the way they address homelessness and all these issues that they have, they don't put their foot down.

21:42

At a certain point in time, you've got to realize, like, what Gad Saad calls suicidal empathy.

21:47

Society can suffer from suicidal empathy.

21:50

And at a certain point in time, you've got to enforce rules and you've got to make it so that people have to get their shit together.

21:55

Yeah, and that suicidal empathy becomes a way for the person who's imposing it on someone else to feel good about themselves, which makes it even trickier and even more insidious, because they're feeling good from the weaponization of other people's lot in life.

22:13

And the thing about that is none of the rules that you're going to impose, especially as a legislator or as somebody in a think tank, you'll never feel the repercussions of them.

22:23

You'll never have to actually deal with it day to day.

22:26

You're just imposing it on someone else and saying, I better understand the structure of reality and the fabric of the world.

22:32

And you can't help but be this way.

22:34

It's the system that's done this to you.

22:36

So let me give you a pittance that I'm going to take from someone else.

22:40

And that makes me benevolent.

22:42

I get to feel good about that.

22:44

That's a giant part of government for sure.

22:45

That's a giant part of the problem with, like, liberal governments.

22:48

Liberal governments should, they should get paid based on whether or not the city does better or worse financially than when they were in office.

23:00

If their policies lead to greater domestic production of goods and services and, you know, GDP does better and everything does better, then you should get paid more.

23:12

If more real estate sales, more people are making more money, median income raises, less homeless people, you should get paid more.

23:19

And you should get paid less if homelessness goes up, if crime goes up, if there's more destruction, if there's more, you know, assaults and home invasions, you should get paid less.

23:29

Right.

23:30

You're doing a shitty job.

23:31

And if you did that, I think they would impose laws that made it safer and healthier and made it, you know, better for society.

23:39

Yeah.

23:40

And then they would just inevitably change the ways that we track and measure those things and pay themselves more.

23:45

Well, they shouldn't have the opportunity to do that.

23:47

Then you need some sort of an oversight.

23:49

I'm being cynical.

23:50

You're right, though.

23:51

You're right to be cynical because that's what they do about everything.

23:53

Someone was explaining to me yesterday that one of the problems with cleaning up fraud is that fraud is responsible for a giant percentage of GDP.

24:05

And if you have hundreds of billions of dollars of fraud in this country and you eliminated that, you actually lower GDP, because you actually lower the amount of money that's in circulation.

24:18

That's interesting.

24:19

I've never thought about that before.

24:21

He was explaining to me and I was like, oh, my God, that is crazy that a giant percentage of our GDP is fraud.

24:28

And if that was somehow or another eliminated, it'd be like one of the things that they do when they raise jobs, like they increase GDP.

24:36

We've added, you know, 200,000 jobs to the market.

24:40

Well, what are those jobs?

24:42

Like, what are those jobs?

24:43

Are these government jobs?

24:44

Because the government is a giant percentage of our GDP, government jobs.

24:49

You know, it's way bigger than it should be.

24:51

Way bigger.

24:51

And those jobs, a lot of them are bullshit and waste, a lot of them.

24:55

Yeah.

24:56

You know, and that was some of the stuff that was uncovered during DOGE.

24:59

You know, the limited amount of access that DOGE had to it.

25:01

Just the beginning of it, where you got to see the curtain pulled back and get to see exposure of so many of these fraudulent, supposedly charitable organizations that were really just money laundering.

25:15

They were really just funneling money into these people's hands, like the homeless thing in California.

25:21

Oh, my goodness.

25:21

It's a bonkers situation where they've spent $24 billion, they cannot track it,

25:27

they've tried to audit it, the government has vetoed these audits, and they

25:33

have no idea where that $24 billion went, and yet homelessness went up.

25:38

But you've got a giant machine that is this homeless establishment, this

25:44

homeless industrial complex that is being funneled money into that, and that

25:49

actually aids the GDP, which is kind of crazy.

25:53

Yeah, I mean, it was one of the things, my last three years in the military, I

25:58

was advising a colonel and a two-star general, and they were in charge of all

26:02

of the offensive cyber development, ethical hacking, offensive cyber

26:07

development.

26:08

I was their technical advisor, and one of the things I kind of learned about

26:13

government at that point was these systems have their own incentive, and the

26:18

incentive's not the output of their purported mission.

26:21

The incentive is the growing of the organization and the execution of budget.

26:27

So while they're in there, you know, I've never seen a field grade officer get

26:30

dressed down more than when he didn't spend all of the money that he was budgeted

26:34

for that year.

26:35

Isn't that crazy?

26:36

He would go to the Pentagon, and they'd be like, well, you didn't execute $300

26:40

million of OCO, of Overseas Contingency Operations Funds, here, and they would

26:44

dress him down for an hour.

26:46

And what people don't understand is, if you don't spend that money, your budget

26:50

for the next year will be lower, because there's no need to have a higher

26:54

budget.

26:54

Instead of tying it to mission to say, did you achieve your mission objectives,

26:58

we started the year agreeing from the president's framework, the NIPF, the

27:02

National Intelligence Priorities Framework, we wanted to achieve these effects.

27:06

What you would want to hear is, we achieved them, and we saved 25%.

27:10

But instead, it's, we achieved them, but we didn't execute all of this money.

27:14

Well, you're fired.

27:15

And I literally have seen that happen.

27:16

I've literally seen that happen.

27:18

And that kind of-

27:20

What a sick society.

27:21

Yeah, and that kind of shifted my thinking in that these systems have their own

27:26

incentive to exist and to grow, because those guys that were holding that

27:30

general officer, that O-6's, that colonel's feet to the fire, they also have an

27:35

incentive to-

27:36

Because they were part of that trickle down.

27:37

And they've got bureaucracy that surrounds them.

27:40

And if they didn't execute it, that means they didn't execute it.

27:43

And that means they have to go to whomever.

27:45

This was during the Biden administration.

27:46

I believe Hegseth, for everything we could say, has actually tightened this up

27:50

quite a bit.

27:51

And he's kind of overhauled the way development works, especially on the

27:54

offensive cyber side.

27:56

But they have bureaucracies, and the incentive of the bureaucracy is to make

27:59

sure that we grow.

28:00

And that's it.

28:01

And then you think about that for a minute, and you're like, well, it's no

28:05

longer a question why we have $30 trillion of debt.

28:08

$39.

28:09

And then, what, like $150 trillion of unfunded liability.

28:13

In other words, we've promised people money for the next 30 years.

28:17

And it's debt that, you know, I don't see how we'll ever escape that debt.

28:23

And the thing about it is, and I don't want to be pigeonholed because I'm

28:28

actually quite liberal. When it comes to my politics, they're like yours in that I'm

28:32

kind of a man without a home.

28:34

But they also change at different levels of analysis.

28:36

I'm very liberal with my family, and I'm very, like, communist.

28:40

I protect them.

28:42

I give them everything they need.

28:43

I'm trying to give them structure.

28:45

And even in my community, I'll help someone out out of pocket or do something

28:49

for them that's a strain on my time or might hurt something else because there

28:53

are really no solutions.

28:54

There's just tradeoffs.

28:55

That's supportive for the community, though.

28:57

That's how people are supposed to do charity.

28:59

And I'm also very nonjudgmental of someone, how they carry on.

29:02

I don't care what they do in their house.

29:03

I don't care if it's a Roman orgy on the weekends, like, be a predictable,

29:07

productive person Monday through Friday and go do your Roman orgy on the

29:11

weekend.

29:12

I don't care.

29:12

I won't judge you.

29:13

Like, I don't – I really have enough crap in my own life.

29:16

As long as someone's not getting hurt.

29:18

Yeah, as long as no one's getting hurt.

29:19

Consenting adults.

29:20

Like, I have enough problems, and I screw up enough, and people have – there's

29:23

a laundry list of things that people can say about me, how I've screwed up in

29:26

my life.

29:27

But then as I graduate and get higher and higher, more conservatism takes place.

29:33

And that's a result of just, you know, having an engineering mindset when I'm

29:39

looking at life and understanding that it's just not Republican or Democrat or

29:45

leftist or rightist or liberal or classically liberal.

29:49

All of these monikers don't work for me because they break down at some level

29:54

of analysis.

29:56

I think that's the problem.

29:57

I think the problem is these ideologies that people subscribe to, where you

30:01

have a predetermined pattern of thinking that you're supposed to adopt.

30:04

Yes.

30:05

You're supposed to adopt these opinions.

30:06

And some of them just don't fit.

30:08

And that's how people get pigeon – that's like people on the left.

30:11

They get pigeonholed into weird stuff that you can't really, really justify,

30:15

like trans women in sports.

30:17

Like, what the fuck are you doing?

30:18

Like, we're, you know, we're being inclusive.

30:20

Like, no, you're not.

30:21

We're loving the borders of Ukraine while hating our own border.

30:23

Yeah.

30:24

Fucking bonkers.

30:25

Yeah.

30:26

There's so many crazy things.

30:28

There's so many crazy things that people just adopt that don't make any sense.

30:32

And, you know, when you subscribe to an ideology, the problem is if, like, you

30:36

define yourself as this person, I am this, I am a hardcore right-wing, blah,

30:41

blah, whatever it is.

30:43

You immediately close the door to all the very productive and interesting

30:47

things that the other side thinks.

30:49

Yeah.

30:49

And you're also making yourself into a tool of propaganda.

30:52

Mm-hmm.

30:52

Because if I meet someone and they just say, I'm this.

30:56

Right.

30:56

It's like, well, I could reasonably predict everything that's going to come out

30:59

of your mouth.

31:00

Yeah.

31:00

That's not entertaining.

31:01

I don't want to have a conversation with that person.

31:03

Right.

31:03

I can't seek to learn from them because I could just pick up the Communist Manifesto

31:07

or Mein Kampf and have a pretty good understanding of who I'm dealing with.

31:10

And, therefore, a conversation is not relevant.

31:12

It's not needed.

31:14

A lot of people are afraid of social ostracization, too.

31:17

So they're afraid of straying outside of the narrative, whatever side they're

31:21

supposed to be on.

31:23

And, you know, some groups are really good at making you feel like dark shit if

31:28

you don't agree entirely with even things that don't even make any sense.

31:31

Yes.

31:32

And so that's why people go along with stuff that's illogical, like open

31:35

borders or whatever it is.

31:37

Yeah.

31:37

They go along with things that's not in their best interest because they're

31:40

scared.

31:41

They're scared of being ostracized.

31:43

They're scared of being cast out of the kingdom.

31:45

They're scared of being excommunicated.

31:47

Yeah.

31:47

I dealt with a lot of people first when I retired from the military and then

31:52

more recently leading up to the last election where, you know, I was

31:56

entertaining the idea of doing some work for government, believe it or not.

32:01

And because, as we talk more, you'll figure out, I'm pretty anti-institutions.

32:06

I'm really against those types of things.

32:10

But I really felt if you would have asked me three years ago how I felt about

32:13

the Trump election and all of that stuff, I was very excited because he was

32:16

saying a lot of things that I wanted someone to say.

32:19

Trump fits a pattern and this is what people I think kind of lack when they, my

32:24

whole life is built around pattern analysis.

32:27

I really enjoy patterns and exhuming and looking into patterns.

32:32

And there's a pattern of like a, you'll laugh when I say this first part of the

32:36

pattern, but then I'll make it make more sense later.

32:40

But he fits the pattern well, first he's a Jacksonian, in

32:45

that he's a pragmatic person in the way that he governs, which I liked, or at

32:49

least I did.

32:50

And, you know, there's some things he's done recently that I don't enjoy.

32:53

And, um, but he's also an outsider or a savior type, uh, a la, you

32:58

know, I don't remember the movie, but The Magnificent Seven back in the day, I

33:04

don't remember the actor's name.

33:07

There's this group of, you know, there's this Western town, everything's going

33:10

to shit.

33:11

These seven guys walk in.

33:13

I think Chris Pratt remade it with Denzel Washington or someone else.

33:16

Oh, really?

33:16

I think so.

33:17

I can't remember, but there's an old one that I used to watch from my grandpa.

33:20

God, there's too many movies.

33:21

Yeah.

33:21

And, uh, there's this pattern where you wouldn't invite these guys to a dinner

33:25

party.

33:26

You wouldn't want them in church on Sunday, but when a system is so corrupt and

33:30

so horrible, you have to rely on these types of people to come in and be a

33:34

check to the system.

33:36

But then also you don't want them to stick around when the system is reset.

33:39

So there's a scene in the movie where he says, uh, you know, man, these, this

33:43

set, these seven guys are talking and they said, man, these people must have

33:46

really wanted us.

33:47

Like, it's crazy.

33:48

They must be happy we're here.

33:50

And I think it's Gary Cooper or someone, or one of these guys says, looks at

33:52

him and says, they're going to be even happier when we leave.

33:55

And Trump kind of fits that narrative. Wolverine from the X-Men would be another

33:59

one who fits this narrative.

34:00

Like, is he going to be at the X-Men Christmas party?

34:03

No.

34:04

Right.

34:05

Is he trying to hit on Jean Grey, Cyclops' wife?

34:07

I'm a comic nerd, so I'm sorry.

34:09

Is he trying to hit on, is he trying to sleep with Cyclops' wife?

34:12

Yes.

34:13

Did he chop a guy's head off and throw it at a car?

34:15

Yes.

34:16

But we're about to go face Galactus and we're going to need him.

34:19

And so we have to put up with all of this other stuff because we understand

34:24

that when the system is corrupt at every level, you need someone who's outside

34:28

of the system to come in and set the system right.

34:30

It's a Western pattern as well.

34:34

Other people who fit this would be like Patton, right?

34:36

Married his cousin, slapped soldiers.

34:39

Did he really?

34:39

Married his cousin?

34:40

Yeah, I think it was his third cousin.

34:42

How many cousins removed does it become okay?

34:45

I don't know.

34:46

Is it third?

34:47

Fourth?

34:48

If there's blood.

34:49

Have you never met them?

34:49

I'm Icelandic, so I really can't say anything, right?

34:52

They literally have apps in Iceland.

34:53

Like my grandparents and my great-grandparents are all from Iceland.

34:56

They settled in Manitoba, Gimli, Manitoba, which is this Icelandic community.

35:00

And they literally have apps in Iceland to make sure you're not dating your

35:03

cousin.

35:03

So, you know, less than a million people on one island.

35:11

So, you're trying to prevent that stuff.

35:13

But anyway, Patton, yeah.

35:14

Slapped soldiers who had tuberculosis.

35:17

One of them probably had shell shock.

35:19

It got in the newspaper.

35:20

They wanted his head.

35:21

And thankfully, the generals were like, no, he's the guy that we need for the

35:24

moment, right?

35:25

He had the ivory pistols and he dressed like not like a general.

35:29

He didn't talk like a general.

35:30

He wasn't like a Eisenhower where he had the veneer of a general.

35:35

But we knew he was the only guy we could have at the Battle of the Bulge.

35:39

Like the Germans talked about him like he was already a mythic legend in his

35:43

own, in his lifetime.

35:45

But part of this pattern that people should understand or when they examine

35:48

this pattern is it never ends well for these antiheroes.

35:51

They're always killed and they're always killed or defamed in the final

35:54

analysis.

35:55

So, when the Magnificent Seven come in, they'll go to another town and I'll get

35:58

killed.

35:59

When Patton retired, he died in some weird Jeep accident.

36:04

You know, Wolverine, he's the only guy left on this desolate like world where

36:08

the Hulk's in charge and it's a horrible existence.

36:11

Patton, or not Patton, Petraeus is another one.

36:15

You know, I briefed Petraeus.

36:17

I worked for, not for him, but for people who worked for him in Iraq.

36:20

And he was the guy that got us through with the surge.

36:25

But he was really a weird guy when you would talk to him.

36:29

Like, you knew that he knew something you didn't and that he was seeing things

36:33

that you weren't.

36:34

But even for myself, as being like a chief warrant officer at that time, a low-level

36:38

technician, he would ask questions like he got it.

36:41

He didn't act like other generals.

36:42

Like other generals would have their three things they want to talk about and

36:45

they'd want to get out of Dodge.

36:46

He would ask questions that really had implications.

36:49

And he is another one of these outsiders who came in to right a system that was

36:54

not working vis-a-vis Iraq in 2006.

36:57

And then what happens to him when he leaves?

37:00

They put him in charge of the CIA.

37:01

They knew he had been screwing around with this woman.

37:04

And they're like, okay, he served his function.

37:07

Now he needs to get out of Dodge.

37:08

And now he's, you know, got tried for all these things and sleeping with

37:12

someone while he was married.

37:14

And, you know, it's not a ceremonious end for these types.

37:17

Is that really what happened to Petraeus?

37:20

That's how he ended?

37:22

Yeah, he was sleeping with some girl that was writing his book or something

37:25

along those lines.

37:26

Well, I'm not saying that's the end of him.

37:30

All I'm saying is that history will remember the pattern is ending unfavorably.

37:35

You know what I'm saying?

37:36

And so when I examined Trump, I said, yeah, I don't like what he says.

37:41

I wouldn't want him around my daughters.

37:43

I wouldn't want him at a dinner party.

37:45

But he seems to be saying these things like he's going to reset this system.

37:49

You know, I think it was Chappelle was on your show or another show or someone

37:53

like that where he talked about Hillary saying, you know, something about the

37:56

tax loopholes or whatever.

37:58

And he just hit right back at her and said, well, the people who are funding

38:01

your campaign take advantage of those same loopholes.

38:04

And if they're there, I'm going to take advantage of them.

38:06

I wouldn't be a pragmatist if I didn't.

38:07

When he started saying stuff like that, it seemed to me like he was going to upend

38:11

this system.

38:11

The jury's out on that because I don't know how I feel these days.

38:15

We can get into that if you need to or if we want to.

38:17

But he's an outsider personality.

38:20

And I thought he was going to really reset the system.

38:22

And there are, you know, good things that are happening.

38:25

You know, if I were to grade him, I would probably give him a C plus or a B

38:28

minus.

38:29

He's certainly better than, you know, what was happening under Biden.

38:32

I was still in the military when Biden was in charge and it was awful to say

38:36

the least.

38:37

What were the problems?

38:39

Oh, my goodness.

38:40

Books that general officers were being told to read and that I as an advisor

38:44

were being told to read.

38:46

Books like White Rage, like understanding why you're a problem, you as a white man

38:51

are a problem in the modern day military because this whole thing's built on

38:57

systemic racism.

38:58

You have built implicit bias that you can't escape even if you wanted to or you

39:03

recognized it.

39:04

It was woke politics.

39:05

Yeah, it was woke politics.

39:06

And it was, you know, I would sit there and say, you know, all of the people

39:12

that I know who've died during this war, not all of them,

39:15

but 80% of them and the numbers bear this out when you look at them.

39:19

They're all white guys from the middle of the country who were on their farms

39:23

or, you know, not all of them, 80% of them.

39:26

I think the numbers bear out about 80% of them were these guys from the Midwest

39:29

or these places where they didn't really have a lot going.

39:32

And they went off to fight a war that we probably shouldn't have been fighting

39:35

in the first place, especially in Iraq.

39:38

And they died for their cause.

39:39

And now you're saying that those people who make up the majority of the combat

39:43

deaths are somehow part of this problem and that other people aren't benefiting

39:48

from it.

39:49

I don't believe in race. To me, it's disgusting.

39:52

Even to talk about someone's race, even, you know, on both sides of the

39:56

spectrum, when they were, you know, electing that Supreme Court justice.

40:00

I can't remember her name right now off the top of my head just because I'm a

40:03

little nervous still.

40:05

She was black and they were talking about how it's historic because she's black

40:09

and Biden had said he's going to hire a black woman to do this job.

40:13

If I had worked my whole life to do something, but now I'm only being elevated

40:17

to this next position because of my gender and the color of my skin.

40:21

I would turn that job down so fast because that's not what I want to be known

40:25

for.

40:25

These are immutable characteristics that I'm not in control of.

40:28

I wasn't, I didn't choose to be born white or blue eyes.

40:32

I didn't choose to be born in a trailer park in the middle of nowhere without a

40:35

dad at five.

40:36

I didn't choose any of those things.

40:38

I don't see how I benefit from these things at the individual level.

40:41

And, you know, the individual level of analysis for me is really the only way

40:44

to evaluate someone for their pluses and their minuses.

40:47

And anything beyond that, to me, is discriminatory on its face.

40:51

Of course.

40:52

It's just a great way to control people because you pit people against each

40:56

other that way.

40:57

And it's just an awesome way that they can stay in control and make everybody

41:01

walk on eggshells and think that, you know, they've victimized people in order

41:05

to get to their position.

41:07

And they have to be shameful of who they are that they had no control over.

41:12

It also gives people an easy rubric to judge other people.

41:15

Yeah.

41:16

Because nothing's easy, really.

41:18

And it gives them, like, white guy bad.

41:20

You know, black guy good.

41:22

Chinese guy, as long as he's not applying to the college I want to get into, he's

41:25

good.

41:25

Right.

41:27

And it gives people – people want easy answers, really, at the end of the day.

41:32

They want to be told the easy rubric to navigate life because really none of it's

41:35

easy and it requires discipline, like you said before, and thought.

41:39

And so it was that stuff in the military.

41:42

I remember getting told in an equal opportunity briefing we were getting, it

41:47

doesn't matter what you meant when you said what you were saying.

41:53

It only matters what the person felt when you said it.

41:56

They said that in a military briefing?

41:58

It was a military equal opportunity briefing.

42:00

And the example they gave was if a woman walks into the – like, we worked

42:05

with a lot of civilians at this military organization where we were developing

42:10

these offensive cyber capabilities.

42:12

A lot of civilians in there.

42:13

And so if, you know, woman X walks in today and she's got a dress on and the

42:18

thought in your head is, I'd like to get my wife that dress or something like

42:22

it.

42:22

Or find out where she bought it.

42:23

And you just say, that's a nice dress.

42:25

Anyway, here's the TPS reports.

42:28

If she heard something sexual or didn't like the connotation or whatever, there's

42:33

going to be an investigation.

42:35

You're going to be pulled out of that office.

42:37

This is all going to happen despite what you meant.

42:40

So the idea probably was good.

42:43

We want to prevent sexual harassment inside of the office.

42:46

But it was weaponized.

42:48

But it was weaponized and it was carried out in a way where it's only about how

42:51

people feel and not what a reasonable person standard would be in a particular

42:55

situation.

42:55

And from the time I joined the military until that time, we had been at war.

43:00

My entire time in the military, we were at war.

43:03

I deployed throughout my career.

43:04

And I wouldn't say that I was a war horse.

43:08

I was not a long tabber.

43:09

I was not a cool guy kicking in doors.

43:10

It was my job as the guy with tape over his glasses to point out the door for

43:15

someone else and say, bad guy's in there.

43:18

So I was not, you know, a super badass in that regard.

43:21

I was a nerd for super badasses.

43:23

And, but we also all engaged in gallows humor and we would, you know, the jokes

43:29

and stuff.

43:30

Even someone, even someone who had recently died, we would make a joke about.

43:34

It's because you have this tremendous pressure and comedy is the relief valve

43:40

for that in a lot of ways.

43:42

And, but then someone would overhear that joke or something.

43:45

And now you're looking down the barrel of a 15-6, which is a military

43:48

investigation.

43:49

And all of these things that could permanently impact your life in a way and

43:54

give you a scarlet letter to where you could never be employed again or do

43:58

anything ever again,

43:59

because you were simply trying to relieve some pressure or you were trying to

44:03

find out where to buy your wife that next dress.

44:06

And now your life's being ruined.

44:08

And I know guys who suffered under that sword.

44:10

Like, I wouldn't name them, but I know guys who, you know, their career met a

44:15

terminal end because of a dumb joke or something.

44:18

It's like, you can't be expected to go out and shoot people in the face and

44:21

then be sensitive to someone's feelings an hour later.

44:24

Right.

44:25

It's just, it doesn't, it does not work.

44:27

Now, should you talk to that guy and say, hey, you know, you made woman X feel

44:30

so-and-so.

44:31

Be more cognizant of that whenever you're around her in the future.

44:34

Well, you should also have a rational discussion with the woman.

44:37

Yes.

44:37

And what did he ask you?

44:38

He said, where did you get that dress?

44:40

It's very lovely.

44:41

I'd like to get one for my wife.

44:43

Why were you upset at that?

44:45

Like, does this, is this rational?

44:47

Like, how, you can't be in an office if you're that sensitive.

44:51

Like, it's one thing if the guy said, I'd like to get you out of that dress.

44:54

Well, for sure.

44:55

Now we're, yes.

44:56

Now you're in a different world.

44:57

A hundred percent.

44:58

A hundred percent.

44:58

Right.

44:59

Yeah.

44:59

And if someone says, you look great, you know, have you lost weight?

45:03

You look fantastic.

45:04

That's a compliment.

45:05

Yes.

45:06

And if someone gets upset, I felt sexually objectified.

45:09

I felt harassed.

45:10

Like, okay, he just said, you look great.

45:12

Yeah.

45:12

That's it.

45:13

Healthy.

45:14

It's not, you look great.

45:15

I'd like to get you naked.

45:16

Now we've crossed the Rubicon, right?

45:18

Now we're into it.

45:18

For sure.

45:19

For sure.

45:20

But just, you look great.

45:21

Or I like your dress.

45:23

That's like, if you said that to a man, like, hey, great suit.

45:26

Yeah.

45:27

And he's like, I need to file a complaint.

45:28

Yeah.

45:29

Yeah.

45:29

I need to file a complaint.

45:30

Yeah.

45:30

You've trimmed up, Joe.

45:31

You're looking good.

45:32

Looking great, Bill.

45:33

Like, oh my God, I'm being harassed.

45:35

I need to like, complain.

45:36

That would have worked during the Biden administration.

45:38

That is fucking crazy.

45:39

That would have worked.

45:40

That's so crazy.

45:41

And the other thing that they were doing in this briefing, which is where I

45:43

kind of, you know,

45:44

the last couple of years of my military career, I got in trouble a couple of

45:47

times,

45:47

or I should say called down.

45:49

I was a senior, I was a CW4.

45:51

I was one rank from the top.

45:53

I was advising two-star generals, colonels on very important matters.

45:57

I wasn't high.

45:59

I wasn't high in the dominance hierarchy, but I was adjacent to people who were

46:03

as an advisor.

46:04

And the amount of, in this briefing in particular, they had gotten into, you

46:14

know, it's bad that

46:16

there are so many white people, this, I'm doing high points here, but we need

46:21

more diversity.

46:22

I was part of an accepted career program that they were starting to call, like,

46:26

the old white boys network, because most of the people, so the requirements for

46:30

this network were, you had to speak a couple languages.

46:34

So, you needed an engineering degree or some kind of demonstrated engineering

46:38

background, you had to have deployed, they wanted you to speak the language

46:43

very well, they wanted you to be able to go through these engineering courses,

46:47

these other things.

46:48

So, and what happens naturally is, you now need people who are interested in

46:52

engineering, all right?

46:54

So, you've got somebody who's maybe more constrained in their thinking.

46:58

You need somebody who speaks languages.

47:00

Well, now they also need to be kind of, you know, speak French, speak Russian,

47:05

whatever it was.

47:06

So, they had to have studied or lived in an area and done this.

47:09

And they need to be able to go through these crazy tactical and strategic types

47:13

of courses.

47:14

By virtue of those things, you're going to get men.

47:17

And there were lots of women, but then there'll be more white men.

47:21

And it's not because the pool presented itself that way.

47:25

Now you have to extract from that pool.

47:27

And so, in this briefing, when they were talking about, like, the old White

47:31

Boys Network or how we need to change things, I said, you know, do you realize

47:35

that most men have more in common with each other than with most women?

47:40

Or, like, if I say I need more diversity in a particular room, if you said

47:44

diversity of thought, I'd be fine with that.

47:48

But Joe and, you know, random black guy in the same program, in the same office,

47:53

have far more in common with each other than with the white woman.

47:56

But what you're saying is these people need to have all separate different

48:01

colors and different, like, all of this needs to be this way.

48:06

It's going to naturally present itself that way because men in the military

48:09

generally are disagreeable.

48:11

Men in the military who like engineering are generally hyper-disagreeable.

48:15

And the only difference between these two people is the pigment of their skin.

48:20

So this fake diversity quota that they're putting on top of us doesn't achieve

48:24

anything other than giving some officer a bullet on their OER.

48:28

And, you know, I got pulled into the office afterward.

48:30

I said way more than that.

48:32

But essentially afterwards, they were like, hey, chief, you can't say that in

48:35

those briefings.

48:36

Like, the way that you were getting animated in there and what you were saying,

48:38

what you're doing.

48:39

Like, yeah, this is not going to fly.

48:42

And this was like 2018 or 2019 or something.

48:45

This is being rational.

48:46

Yeah, just trying to be rational and say that there's more difference in groups

48:50

than there is between groups.

48:52

And that the similarities and the way that things stack up, you recruit from a

48:56

pool of volunteers and candidates.

48:58

If I'm recruiting from a pool of volunteers and candidates who are 80% male and

49:02

white, I have to expect that the selected individuals are going to be male and

49:07

white.

49:08

The majority of people who join the military, I don't control this.

49:12

I'm just, as an engineer, I'm looking at statistics.

49:15

Also, if you want a highly functional, productive group, it's got to be based

49:18

on meritocracy.

49:19

Yeah, for sure.

49:20

For sure.

49:20

Has to be.

49:20

Anything other than that is literally a threat to national security.

49:24

Yeah, you're denigrating lethality.

49:26

Yeah.

49:27

The role of the army is to deter war through exuding superior military fighting

49:32

and technology.

49:33

And when deterrence fails, to win. That's it.

49:36

Those are the two things that we need to do with our military.

49:40

It needs to look like the guy in the playground who you would not muck about

49:43

with.

49:44

And if you were to muck with him, he will beat you senseless.

49:47

That's it.

49:48

Now, whether or not we should be using that all the time or how we use it, that's

49:51

a separate question.

49:52

But the entity itself needs to comport itself in this way.

49:56

Otherwise, you are endangering this truly special experiment, which at least in

50:01

its beginnings, valued the individual.

50:04

It valued individual rights and states' rights.

50:08

And the founders, and this was another thing I said in that briefing, was the

50:13

founders knew.

50:14

Yes, they were all slaveholders, but they knew that the Constitution and the

50:18

Bill of Rights and the Declaration of Independence would eventually lead to a

50:22

system where we had to acknowledge these people as people.

50:26

And we fought a civil war where a million white dudes died to see this

50:31

experiment through.

50:33

The scaffolding was there.

50:35

You have to look at the things, the zeitgeist of the time.

50:38

If they had just said, nope, everyone's going to be free.

50:40

There will be no slaves.

50:42

You would have never gotten ratification through the southern states, but they

50:45

knew.

50:46

And when you read the Federalist Papers, they knew that they were erecting this

50:49

system.

50:50

When you look at Thomas Jefferson and some of these other great thinkers, yes,

50:53

he owned slaves.

50:54

I get it.

50:56

They knew what they were building and they knew what it would ultimately

50:59

terminate in.

51:00

And then we had a civil war where we destroyed our country from the inside to

51:05

see this dream come about.

51:07

And now we're just going to all go back and say they're all slave owners.

51:10

Like, I know this has all been said here a million times, but this stuff animates

51:13

me because it's built with blood and treasure.

51:16

Well, it's also, you can't judge people from the past based on the standards of

51:20

the present.

51:21

For sure.

51:21

Because culture changes.

51:23

People understand things better.

51:25

We have a much greater recognition of what was wrong with things 100 years ago,

51:32

200 years ago.

51:34

And I'm sure in the future, we're going to look back on today with the same

51:38

lens.

51:39

It just always works that way.

51:41

Did you know Joe had a gas-powered car?

51:43

Exactly.

51:44

That kind of stuff.

51:45

Yeah.

51:45

Did you know that?

51:46

You consumed more.

51:47

You flew more.

51:48

Yeah.

51:48

You ate more meat.

51:49

You did whatever you did.

51:50

Yeah.

51:51

You were a problem.

51:52

He was a problem.

51:53

Yeah.

51:53

And now why would we ever, like, I'm voting to get rid of the Joe Rogan

51:56

experience from the National Archives because he drove a gas car.

52:00

Yeah.

52:01

You know what I mean?

52:01

Like, someone stores your stuff for profundity's sake, for the future, to hear

52:05

about this.

52:06

You know, I've always loved your podcast, Joe, and it was because you're a

52:10

genuinely curious person, and I'm not kissing your ass right now.

52:15

You're a genuinely curious person that was saying things that were not in the

52:19

current zeitgeist at the time, and you refused to apologize for it.

52:22

And it led to a lot of great things, but it led to an updating of the system,

52:27

and you did it with dialogue, with the dialogos, with two people trying to

52:31

learn things about each other, and it led to an updating of a system.

52:36

I think it's very important for culture to have free and open dialogue so we

52:39

can update our system.

52:41

So bad ideas can die so we don't have to die instead of our bad ideas.

52:45

Yeah.

52:45

Because if I can't express a bad idea, I have to act it out.

52:48

And if I act out the bad idea, it could kill me.

52:51

And the celebration of good ideas.

52:52

And the celebration of good ideas.

52:54

Yeah.

52:55

And it's just really, there's just been such a weird inversion in politics

53:00

where the free, hippie-loving liberals of yesteryear are now the ones telling

53:05

you what words you can use, there are no borders, all of these crazy things.

53:11

And I always say to people, I said it to Andy, stuff on my last podcast with

53:15

him, I'm like a 1996 Bill Clinton Democrat.

53:18

If you go watch his State of the Union, and he talks about lowering debt,

53:23

getting out of debt, actually, working with Newt Gingrich to get out of debt,

53:28

securing the borders, making work and education freely accessible, I'm voting

53:33

for that guy.

53:34

I know.

53:35

Isn't it crazy?

53:35

I mean, that's why the problem of labels doesn't work, ideological labels,

53:40

because if you go back far enough and look at Clinton, for example, he's one of

53:45

the best ones.

53:46

And by the way, he did balance the budget.

53:47

Yeah, he did.

53:48

He actually did.

53:48

We had a surplus when he left the office.

53:50

Yeah.

53:50

Amazing.

53:50

Did a fucking amazing job.

53:52

So he got his dick sucked.

53:53

Yeah.

53:53

Who didn't?

53:55

Back then, that's the other thing, judging people by the standards of the past,

53:59

you know, JFK doesn't look so good in the Me Too movement, you know?

54:03

Right, exactly.

54:04

I mean, he would have got canceled.

54:05

Yeah.

54:05

It's like, you have to recognize that this ideological bubble that we

54:11

find ourselves in left versus right, Bill Clinton does not fit in that.

54:15

Bill Clinton is securely on the right in terms of, you know, 1996 standards

54:20

applied to today.

54:22

He would never want to hear that.

54:23

No, he would never want to hear that because he's kind of shifted with the zeitgeist

54:26

because that's what you kind of have to do if you want to stay in your party

54:29

and be protected by your party.

54:31

Yes.

54:32

You know, but he's essentially, he had a lot of the, I mean, we've talked about

54:36

this before, we've played clips of Hillary Clinton from 2008 and she's more MAGA

54:41

than MAGA.

54:42

I know.

54:42

You know, her take on the border was like hardcore, it was hardcore.

54:47

If you've been convicted of a crime, get out.

54:49

You know, if you stay here, pay a stiff penalty and you have to get in line and

54:53

you have to learn English and everybody cheers.

54:55

Yeah.

54:56

Like that is a hardcore right wing 2026 perspective.

55:00

Obama did it too in 2012.

55:01

Yes.

55:02

Absolutely.

55:03

And Obama deported more people than Trump.

55:04

Yes, exactly.

55:05

Yeah.

55:06

This episode is brought to you by ThreatLocker.

55:08

Data breaches are happening more frequently than ever, and it's not because of

55:12

sophisticated tactics.

55:14

Attackers are using the same methods and exploiting the same vulnerabilities.

55:20

What's changed is speed and scale. Reacting to breaches can leave you exhausted,

55:26

constantly chasing threats instead of preventing them.

55:29

That's where ThreatLocker comes in. With ThreatLocker Zero Trust,

55:34

you only allow what you need and block everything else by default.

55:40

You control what runs, when, where, and how, blocking ransomware before it executes,

55:46

because no matter how you respond,

55:48

a fast response simply isn't fast enough.

55:51

Visit ThreatLocker.com/JRE to learn more.

55:56

And, and it's just, I'm not saying, like my thought is always, I'm always

56:00

updating, I'm always updating my systems.

56:02

I'm always getting told things.

56:04

I always have a pre-prescribed way of looking at the world, but then I'll have a

56:09

good conversation with someone and I'll update my system.

56:11

But generally my principles are in place.

56:14

And when you watch these people who get into their thirties, forties, fifties,

56:17

and sixties, and their core foundational principles are changing,

56:22

that really should give you cause for concern.

56:24

Because like you were saying this at this time, and now you're saying this at

56:27

that time, it's like, generally my rubric that I don't think will change about

56:31

myself is,

56:32

I'm fervently for the individual and I'm fervently for truth and, and that we

56:37

can, that the, that the world you should measure it and look at not what your

56:42

intentions are, but what the outcomes are.

56:44

And, and then evaluate the system and how it scales based on those outcomes.

56:49

Those are, that's principally it. I try to hold myself to that standard,

56:53

I fall short of that standard all the time, but I try to live by that

56:57

standard.

56:58

And I feel like that will always be me even into my nineties, like, unless

57:03

something goes horribly wrong, right?

57:04

Right, right.

57:05

And I've pretty much been here for, you know, the past seven or eight

57:09

years or so. Like, even into my thirties, I wasn't quite sure who I was, um, as

57:14

a human.

57:14

And, and, uh, but I'm, I'm pretty, you know, steadfast in that and the amount

57:20

of opportunities and the amount of goodness in my life and my children and my

57:25

home and the things I've been able to do have really been born out of that last

57:30

seven years of putting the truth at the top of the decision matrix

57:34

for me, the top of the hierarchy for me.

57:36

I'm going to try not to cut corners whenever I can and help good people around

57:42

me.

57:42

And, and, and the truth is the way that I'll organize and function myself in

57:47

life.

57:47

And that I will try to only judge people as individuals and the world, you know,

57:51

these are Christ's teachings from 2000 years ago.

57:55

And, but the world for me has just opened up in a way that I could have never

58:00

predicted using a very simple rubric.

58:03

It's not easy, but it's simple.

58:04

And if more people just took those, and this isn't me, I didn't come up with

58:09

this.

58:09

This is the result of, you know, watching a bunch of experiments go bad.

58:13

But if people just adopted that very simple thing and just tried it for three

58:17

months, you'll feel better about yourself.

58:19

You'll feel better about the world.

58:20

You feel better about the people proximally around you.

58:23

It might make you hate the government more.

58:25

Yeah.

58:26

But, uh, if you don't hate the government, I think you're not

58:29

paying attention.

58:30

Yeah.

58:31

Yeah, for sure.

58:32

I mean, when you were working in cyber defense, like what, what, cyber offense,

58:37

cyber offense, what was the primary function?

58:40

Like, what did you do?

58:42

Um, so in the beginning, I have no short answers and I apologize.

58:48

In the beginning-

58:49

I don't like short answers.

58:50

Yeah.

58:50

I just, I always feel like I'm-

58:52

I like a good long answer.

58:53

Yeah.

58:53

Don't worry about that.

58:54

Okay.

58:55

When I joined the military, I was in signals intelligence, um, and essentially

58:59

learning the ins and outs of radars, how radars work, what they do, um, how

59:04

they function.

59:05

Did you guys ever see any weird shit, like UFO shit?

59:08

I wish I had.

59:09

I really do.

59:10

I wish you had too.

59:11

Yeah, I really do.

59:12

Um, I was more in the signals intelligence side of the house, um, focusing

59:17

first on electronic signals or emanations from radars, mapping them so that,

59:21

you know, if we were going to go do the ground invasion and there was going to

59:24

be some air support going in first and blowing shit up, we would tell them, hey,

59:28

there's a man-packable SA-7 here, there's a SA-10 here, there's this here,

59:32

there's there, and then telling these pilots so they didn't get shot out of the

59:35

sky.

59:35

Um, uh, quickly when the war kicked off, that became irrelevant because there

59:39

were no, you know, surface-to-air missiles, surface-to-surface missiles in Iraq.

59:44

We had knocked them all out in the first few weeks.

59:46

So then it shifted to communications intelligence, so I kind of retrained on

59:50

communications intelligence, and that was, at that time, off of cell phones,

59:55

off of, uh, push-to-talk radios, repeaters, um, long-haul networks, terrestrial

59:59

networks, extraterrestrial networks, and what I mean by that is the stuff, the

1:00:03

satellites in the sky, um, and doing analysis on those to try to inform the,

1:00:07

the, the, what we call the common operating picture of the battlefield for a

1:00:11

combatant commander.

1:00:13

So, the combatant commander wants to know where the bad guys are, what they're doing,

1:00:16

what they're saying.

1:00:17

To the extent that we could, my job was to, um, come up with

1:00:21

solutions and conduct, you know, um, passive and active, um, signals analysis

1:00:26

on these things, and then inform the commander so that we could, you know, uh,

1:00:31

mitigate risk.

1:00:32

It was all about mitigation of risk.

1:00:33

Um, from, this is 2008 or so.

1:00:36

I'd been doing this for about seven years, eight years.

1:00:39

And, um, from there, it shifted to the phones getting smart, and essentially,

1:00:44

it went from you walking around with a 2G phone or a 3G phone that had limited

1:00:48

compute capability to, now there's robust compute capability with the advent of,

1:00:54

like, the iPhone.

1:00:54

And now it's like, well, now we've got to get after guys who are, you know,

1:00:57

essentially walking around with a computer we could never have envisioned 20

1:01:01

years ago in their pocket with all of this capability.

1:01:04

Because the military and our, and our, our forces that we're fighting against,

1:01:07

it all comes down to our ability to shoot, move, and communicate.

1:01:10

Communication being the part that I was focused on.

1:01:13

So as the advent of the iPhone and those things came out, the Army realized we

1:01:17

didn't have a computer network operations MOS.

1:01:19

We didn't have a, um, offensive cyber component.

1:01:23

We didn't have a defensive cyber component.

1:01:25

So we kind of, I was there at the ground floor when we were building out these

1:01:28

new MOSs now that are all over the military.

1:01:30

But at that time, there was a thought going into, you know, we need to have

1:01:34

people who know how to be on-net operators.

1:01:37

Ethical hacking, as paradoxical as that sounds.

1:01:40

That's what the lawyers called it.

1:01:42

So it's hacking at the end of the day, but ethical hacking because you've got

1:01:45

the backing of the U.S. government.

1:01:47

And so we set up that framework and really started launching into operations,

1:01:52

you know, 2006, '07, '08, all the way into my last deployment in 2015 to '17.

1:01:59

It was all focused on computer network operations and how they lash up with terrestrial

1:02:04

networks.

1:02:04

How do we exploit all of that was one facet of my job.

1:02:08

And, um, uh, your question was how did I get into all of that?

1:02:13

And that was the, that was the, um.

1:02:15

How do you get into it?

1:02:16

What was, what, what, what was like, what was the operational aspect of it?

1:02:20

Like, how did you actually, what did you do?

1:02:22

Uh, going, so, you know, there's, there's, I'll stick to terms that are more,

1:02:27

um, generally understood by the public, but learning how to do things like war

1:02:31

driving, um, collecting on networks, Wi-Fi, you know, endpoints, um, cell

1:02:37

phones, uh, understanding the ins and outs of them, understanding how to do

1:02:40

forensic analysis of them.

1:02:42

So after there was an operation and a bunch of gorillas had been sent in to

1:02:46

kill the bad guy, um, we could derive maximum intelligence value from the

1:02:50

handset to plan other operations.

1:02:53

Um, and so, you know, it would be passive, um, monitoring of networks to inform

1:02:59

the intelligence picture, which would lead to either combat operations or

1:03:05

active computer network operations.

1:03:08

Where now it's like, well, there's, you know, a, uh, I don't know, a, uh, an

1:03:13

Iraqi or an Afghani router that hasn't been patched in three years.

1:03:18

And we think we can either write or find a zero day, which is just an exploit

1:03:23

of those routers where, um, we can muck with their router in a way where they

1:03:28

think they're getting good information and they're not, or we're erecting

1:03:33

other things, um, to, uh, mitigate risk for the commander.

1:03:38

And so, um, that really, you know, exploded at that point and between that and

1:03:42

human intelligence, which is kind of the, um, the actual gathering of

1:03:46

intelligence from other people, you know, you would call it a spy or, you know,

1:03:50

James Bond, but James Bond was a horrible spy.

1:03:54

Um, I mean, yeah, you know, your job's to remain anonymous and you're walking

1:03:59

into a casino and there's Goldfinger calling you by your first and last name.

1:04:03

Um, it's not a great look, um, you know, generally you don't want to be

1:04:08

sleeping with your sources or, uh, um, you know, using a real name or whatever.

1:04:13

So human intelligence.

1:04:14

And then my focus for the last 10 years was how does signals intelligence,

1:04:18

computer network operations, um, become a force multiplier for people

1:04:23

conducting overt and clandestine operations, um, throughout the theater at that

1:04:27

time.

1:04:28

Uh, my, you know, my deployments and my time was spent in Iraq, Afghanistan, um,

1:04:33

Africa, Northern Africa.

1:04:35

And then a lot of people don't know it, but we were in active combat operations

1:04:39

in the Southern Philippines as well for, uh, a fair amount of time.

1:04:43

I want to maybe say seven or 10 years.

1:04:44

We were doing combat operations in the Southern Philippines.

1:04:47

My first deployment to Sulu in the Southern Philippines was, uh, 2007.

1:04:54

Who are we doing operations against?

1:04:56

So, um, there were terrorist elements down there that were traveling back and

1:05:00

forth from Pakistan and Afghanistan.

1:05:02

And there was a terrorist organization down there called the Abu Sayyaf group.

1:05:07

And, uh, there were other ones as well.

1:05:09

Jemaah Islamiyah, I think was the name of the other one.

1:05:12

And, uh, they were conducting their own terrorist anti-Christian operations in

1:05:16

the Southern part of the Philippines.

1:05:18

In the Southern part of the Philippines, I don't, can I say it?

1:05:20

Can I say the word?

1:05:21

What do you mean?

1:05:22

Jamie, can you pull up a map of the Philippines?

1:05:24

Can you pull it up?

1:05:25

Oh, say that?

1:05:26

Say that term?

1:05:26

Yeah, pull it up, Jamie.

1:05:27

Been listening to it forever.

1:05:28

Uh, so there's what's called the autonomous region of Muslim Mindanao, which is

1:05:33

the Southern part from like a place called Zamboanga down to Sulu, or Jolo

1:05:37

Island.

1:05:39

Um, and there's a, it's a funny joke because if you zoom into Zamboanga, which

1:05:43

is, God, look how many islands are in the Philippines.

1:05:45

I know, it's, go down to the South there.

1:05:47

See Zambu, go down right there, right, right, zoom right there on that Island.

1:05:51

Now move to, sorry, now move to the Southwest.

1:05:54

See that penis?

1:05:56

Mm-hmm.

1:05:57

At the tip of that penis is Zamboanga.

1:05:59

Mm-hmm.

1:06:00

All of our combat operations.

1:06:02

Now, if you zoom out a little bit more and pan more South and zoom out just a

1:06:07

little bit more.

1:06:08

So the joke is, all that sperm south of the tip of, uh, Zamboanga city,

1:06:15

there are terrorist operations in here.

1:06:17

Now, if you go to that main island group called Sulu, there's Jolo Island.

1:06:21

That's where I was on this tiny Island out in the middle of nowhere.

1:06:24

And on that, there's a mountain.

1:06:27

That's all the Philippines.

1:06:28

Well, no, I mean, this is all the Philippines down here.

1:06:30

Yeah.

1:06:30

Wow.

1:06:30

So this is called, there's a mountain in there.

1:06:32

I think it was called Mount Tumatoc or something like that, on the

1:06:36

eastern part of the island, near Luuk.

1:06:38

Yeah.

1:06:39

So there's mountains.

1:06:40

There's a mountainous region there.

1:06:41

There are a bunch of terrorists up there.

1:06:43

They were killing people in the area, conducting bombings.

1:06:46

They were getting trained.

1:06:47

Um, in fact, there was a guy and I believe I'm going to get his name wrong,

1:06:50

perhaps, but I believe his name.

1:06:52

It was either Isnilon Hapilon or, oh, it's Umar Patek, Umar Patek.

1:06:58

He was actually arrested outside of Osama bin Laden's compound the day after he

1:07:02

was killed.

1:07:03

We were trying to kill him on that Island or in and around that Island is where

1:07:06

we were trying to find him and kill him.

1:07:08

Uh, so they're terrorist facilitators.

1:07:10

Um, they did the USS Cole bombing.

1:07:12

Could you zoom back out?

1:07:14

I want to see the Philippines one more time.

1:07:15

Like all the islands.

1:07:16

When you zoom all the way out, it's so nuts how many islands there are.

1:07:22

Yeah.

1:07:22

So up north, up north of Manila is mostly the Christian, um, population.

1:07:27

And as you get down south, it's the autonomous region of Muslim Mindanao.

1:07:30

And that is all of where these terrorist operations were happening.

1:07:33

Um, and I believe we mostly pulled out of there.

1:07:37

There might be still some people in Zamboanga.

1:07:39

I'm not sure anymore because it's been five years, four years since I retired.

1:07:43

But, um, yeah, we were doing counterinsurgency operations down there and guys

1:07:47

died down there and there were combat operations.

1:07:49

And, uh, I was out there, um, I was in a tactical military intelligence battalion

1:07:54

and I was attached to the first special forces group.

1:07:57

And we were down there a couple of times and, uh, a lot of people don't even

1:08:00

know about it.

1:08:01

So, yeah, I never heard about it.

1:08:03

Yeah.

1:08:03

So, uh, anyway, um, I'm sorry to, no, no sidebar, but I'm so stunned at how

1:08:07

many islands are in the Philippines.

1:08:10

It'll spread out of this.

1:08:11

Yeah, it's, it's insane.

1:08:12

And the, the, the thing about it is, is I go to all of these little outposts

1:08:16

and these out islands.

1:08:17

We were always debriefing these guys and I'm going to get these terms wrong.

1:08:21

So I'm sure there'll be people in the comments, but I think they were called barangay

1:08:24

captains or something like that.

1:08:25

But they were like these mayors of each one of these little islands and there'd

1:08:29

be terrorists in and around those areas.

1:08:31

And we'd try to make friends with these guys.

1:08:33

So they give us some information.

1:08:34

Um, and every one of those places was absolutely beautiful.

1:08:38

Like you'd go there and be like, man, Hilton could turn this into something in

1:08:41

a short order.

1:08:42

Right.

1:08:42

You know, when you're out of these places, beautiful beach, beautiful lush jungles,

1:08:47

the best swimming water.

1:08:48

Nicest people too.

1:08:49

Oh, Filipino people are some of my favorite people, man.

1:08:52

Like, you want to talk, the guys that we worked with out there, they were scouts.

1:08:56

I think they're called Scout Rangers.

1:08:58

And they were a special, I think they were like their special forces.

1:09:01

We'd go to the range with these guys and show them stuff.

1:09:04

And they're the most, um, ride or die type of guys you'll ever meet in your

1:09:08

life.

1:09:09

Like, you know, so-and-so said this about you last week and I could kill him.

1:09:12

It's like, no, dude, it's cool.

1:09:13

It's like, don't worry about it.

1:09:14

Like fun fact.

1:09:15

There's some of the best pool players on earth too.

1:09:17

Oh, really?

1:09:18

Great.

1:09:18

Some of the greatest pool players of all time came out of the Philippines.

1:09:20

They're just great people.

1:09:21

I mean, I just, the people down there were fantastic and it was awful because

1:09:25

those guys

1:09:25

would be bombing churches, Christian churches and stuff like that.

1:09:28

And, uh, doing counter-, like I said, counterintelligence

1:09:34

operations

1:09:34

out there, doing intelligence collection to inform that battle

1:09:37

picture.

1:09:38

But those guys had direct links with Osama bin Laden and other people.

1:09:41

Um, I had no idea.

1:09:42

Yeah.

1:09:42

Right after we, like I said, I think it was, I think if you look it up, I think

1:09:46

his name

1:09:46

is, um, Patek, P-A-T-E-K, and he was arrested outside of Osama bin Laden's

1:09:52

compound and we

1:09:54

had been chasing him in the Philippines.

1:09:55

Wow.

1:09:56

Cause we thought he was still down there.

1:09:57

Um, there was another guy that we, I believe we killed him.

1:10:00

His name was Al Bader Parad.

1:10:01

Um, but yeah, my job was not, I always say this on podcasts because the veteran

1:10:06

community

1:10:07

is wild right now.

1:10:08

They love to cut each other down right now.

1:10:10

There's something weird going on where like obviously lying.

1:10:13

Yeah.

1:10:14

Call the people out.

1:10:15

I prefer to call people out face to face.

1:10:17

Um, but, uh, I always make sure people know I was not a cool guy.

1:10:21

Like sometimes I got to dress like one, you know, for a few years I didn't wear

1:10:25

any uniforms

1:10:26

and I got to grow my beard out and act like a cool guy, but I was really a nerd

1:10:29

for cool

1:10:30

guys.

1:10:30

I've literally got pictures of myself down in, in Jolo or in

1:10:34

Afghanistan or anywhere

1:10:36

else and tape around my glasses and, you know, Pez dispenser and my radio and

1:10:40

collection

1:10:41

equipment looking like a true blue American nerd.

1:10:44

But I was not the guy who kicked the door and I was always the guy pointed the

1:10:47

door out.

1:10:48

So I'd be safe in the Humvee in the back, you know, eat an MRE and somebody

1:10:51

that looked

1:10:52

like another gorilla, you know, like an Andy Stumpf or Tim Kennedy or someone

1:10:55

like that.

1:10:56

I'd be like, is that the house?

1:10:57

I'd be like, pretty sure that's the house.

1:10:58

You guys might want to be safe, but go ahead.

1:11:00

I'll be in the Humvee.

1:11:01

I'll be out here or I'll be in an airplane above, you know?

1:11:04

So, um, and, uh, yeah, it was, it was being born in North Dakota and, and, uh,

1:11:11

you know,

1:11:12

my mother, single mother, after she left that first guy, um, trailer house in

1:11:16

the middle of

1:11:17

this little town called Cavalier, North Dakota.

1:11:19

I had no options.

1:11:21

I was a horrible student.

1:11:22

And, uh, what did-

1:11:24

That's crazy that you're so smart, but you were a horrible student.

1:11:26

I wouldn't, yeah, I wouldn't, I'd call myself curious before I'd call myself

1:11:29

smart, but,

1:11:30

um, uh, you know, my mother, you know, I don't know if we remember, you would

1:11:34

remember this,

1:11:35

but maybe other people my age, you know, you'd get these scholastic book order

1:11:40

forms that

1:11:41

you'd bring home from school and you could order books.

1:11:43

There'd always be on the back page, there'd always be like little cool stuff.

1:11:46

Like you could get like, you know, a pair of gloves or a hat or something.

1:11:50

Anyway, one time there was a, um, a coil radio that you could order with an earpiece

1:11:55

and you

1:11:56

put this coil radio together and with an earpiece, no battery.

1:11:59

It was just, the electromagnetic radiation would, um, activate the

1:12:04

coil and the

1:12:04

coil would, you could listen to radio chatter.

1:12:07

Really?

1:12:08

With no battery?

1:12:09

Yeah.

1:12:09

Yeah.

1:12:09

Just tiny little, little radio.

1:12:11

How did it, what was the power?

1:12:12

The electromagnetic radiation.

1:12:14

And it works kind of like a record, uh, like, you know, how you hit a

1:12:19

record.

1:12:20

Electromagnetic radiation would hit the coil and the coil would feed up to an

1:12:24

amplifier

1:12:24

or up to an earpiece and the earpiece you could hear chatter and you could tune

1:12:27

it.
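What's being described here sounds like a classic crystal ("coil") radio: the broadcast wave itself induces a current in the coil, a diode rectifies it, and the earpiece hears the audio envelope, no battery required. A rough sketch of that envelope detection in Python, where every number is made up for illustration and a moving average stands in for the real RC filter stage:

```python
import numpy as np

# Rough sketch of how a crystal ("coil") radio recovers audio from an
# AM broadcast with no battery: the wave itself supplies the energy.
# All values here are illustrative, not from the episode.
fs = 200_000                                  # sample rate, Hz
t = np.arange(0, 0.05, 1 / fs)
audio = np.sin(2 * np.pi * 440 * t)           # the "chatter": a 440 Hz tone
carrier = np.sin(2 * np.pi * 20_000 * t)      # 20 kHz carrier
am = (1 + 0.5 * audio) * carrier              # amplitude-modulated signal

# The crystal/diode conducts one way only: half-wave rectification.
rectified = np.maximum(am, 0.0)

# The earpiece plus a capacitor act as a crude low-pass filter; a
# moving average a few carrier periods wide stands in for that here.
win = 4 * int(fs / 20_000)
envelope = np.convolve(rectified, np.ones(win) / win, mode="same")

# Strip the DC offset; what remains tracks the original audio.
recovered = envelope - envelope.mean()
corr = np.corrcoef(recovered, audio)[0, 1]
print(f"correlation with original audio: {corr:.3f}")
```

Since a real antenna only induces tiny voltages, this trick works best near strong stations, which fits the no-battery memory here.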

1:12:27

Did the earpiece have a battery?

1:12:29

No, I don't think anything had a battery on it.

1:12:31

Yeah, I think it was just, uh, I could be mistaken, but I don't believe it was.

1:12:35

Powered by electromagnetic radiation.

1:12:37

Yeah, I mean, you can look it up, Jamie, if you want.

1:12:40

Sorry to say that again, but, um.

1:12:41

Tighten that thing down.

1:12:42

That thing's driving me crazy.

1:12:43

Yeah, sorry.

1:12:44

This thing.

1:12:44

Oh, like here or here?

1:12:46

Right here.

1:12:46

Look at my finger.

1:12:47

It's right here.

1:12:47

Yeah, I've been meaning to do that, like, literally when everybody uses this

1:12:51

fucking thing.

1:12:52

It's wobbling around, ready to fall off.

1:12:53

Yeah, but if you look up coil, coil radio with small earpiece, I could be wrong.

1:12:57

I don't remember there being a battery on it.

1:12:59

Electromagnetic radiation powered it.

1:13:01

That's bananas.

1:13:03

Yeah, so kind of like a, same thing with like, you know, not the same wattage,

1:13:06

but a microwave,

1:13:07

right?

1:13:07

Um, sends power through the air.

1:13:09

Right, but it uses.

1:13:10

Or DC.

1:13:10

But it uses power.

1:13:12

Yeah.

1:13:12

In order to send it.

1:13:13

Yeah, but I could be wrong.

1:13:15

But, um, at any rate, that was the first time I got a radio and I was hearing

1:13:19

things and I'd

1:13:20

put it together and I'm listening to things.

1:13:22

Like what kind of things?

1:13:23

Uh, HF radio, VHF radio, people talking, that type of stuff.

1:13:28

And, um, it was just, and then I found out how to get an antenna to make the

1:13:32

antenna larger

1:13:33

and started ordering auxiliary pieces for it.

1:13:35

And then what really changed me was my mother let me get a, my mother and I

1:13:39

would clean houses.

1:13:40

She was a waitress, but we also would go around and clean houses.

1:13:43

And there was a lawyer that we worked for.

1:13:44

His name was Phil Culp.

1:13:46

And, um, he had a old 286 SX IBM and, uh, it was just sitting in his basement.

1:13:53

And I told my mom, I was like, Hey, if I clean for like a month, can I have

1:13:56

that computer?

1:13:56

Like he doesn't use it.

1:13:58

He's got a new 486 up in his place here.

1:14:00

And he instantly said I could have it.

1:14:02

And then that started me down the computer networking realm.

1:14:05

And like, look, how could I get this 286 to act like a 386?

1:14:08

Or how can I force it to run Windows?

1:14:09

Or how do I update the memory?

1:14:10

How do I do these things?

1:14:12

In this little town, Edinburgh, North Dakota, there was a guy who had a

1:14:15

computer store in

1:14:16

a basement of an old general store.

1:14:18

And his name was Jeff Munzebrotten.

1:14:19

And, uh, I would go there and ask him questions about computers and just start

1:14:23

learning like

1:14:24

ins and outs on how do I update the RAM?

1:14:26

How do I get memory better?

1:14:27

How do I augment the storage?

1:14:29

Uh, how, how can I force this thing to run Windows 3.1?

1:14:32

So I could have a GUI instead of using a command line.

1:14:35

Um, GUI meaning graphical user interface.

1:14:38

Yeah.

1:14:39

Yeah.

1:14:39

Sorry.

1:14:39

And, um, so that's what I'm doing.

1:14:41

That kind of started me on that.

1:14:43

And that, for me, like I said, um, I had all kinds of problems with attention

1:14:48

deficit disorder

1:14:49

and not being able to pay attention.

1:14:51

That was the only time I could, I would go for, I don't believe in ADHD.

1:14:57

I might be wrong.

1:14:59

I think it's a superpower.

1:15:00

I mean, it's certainly, I remember I would spend two days working on a problem

1:15:04

and not sleeping.

1:15:05

That's what I'm saying.

1:15:06

I think it's a superpower.

1:15:07

I think it just keeps you from being interested in things you're not interested

1:15:10

in.

1:15:11

Yeah.

1:15:11

I have a theory on that too that I can get into after.

1:15:14

But, um, that started me down that road.

1:15:17

But in school, I couldn't pay attention.

1:15:18

Me neither.

1:15:19

There was this teacher.

1:15:20

I always tell this story.

1:15:21

It's a great teacher.

1:15:22

She's still around.

1:15:23

Um, her name is, uh, Connie Trenbeth and she was my English teacher or

1:15:27

literature teacher

1:15:28

or something like that.

1:15:29

She might not even remember the story, but here I am telling it on your podcast.

1:15:32

I remember it.

1:15:33

Um, she kept me after class once and she goes, you know, I knew your dad, Bill.

1:15:37

And, uh, you know, your uncles were all smart. And my

1:15:42

great uncle

1:15:43

has an engineering wing of a school named after him out in Western North Dakota.

1:15:48

And she goes, all these guys were thinkers and your dad did all this great

1:15:51

stuff and built

1:15:52

all this stuff.

1:15:52

And, uh, essentially what she was telling me is you're a waste of life.

1:15:56

Like all you do is you come in here and you disrupt the class, you upset people.

1:16:03

No one can talk.

1:16:04

Sounds like me.

1:16:05

You're trying to dominate every conversation.

1:16:07

But when, you know, I, you had written one paper on something that interested

1:16:11

you and

1:16:11

I don't remember what it was.

1:16:12

And she's like, that was a wonderful paper.

1:16:14

She's like, if you could just do that every time.

1:16:18

And, uh, I was not hearing it.

1:16:21

Like, I remember the conversation.

1:16:22

Cause I actually remember, I think she said waste of life.

1:16:25

I think she actually said that. Like, you're wasting it. Like, obviously my

1:16:29

RPMs, my CPU

1:16:31

clocks high.

1:16:31

I'm always thinking even when I'm not thinking.

1:16:34

And even as we're sitting here talking, I'm thinking about other things or

1:16:37

stuff I want

1:16:37

to do when I get back to my computer or stuff I want to do for my business.

1:16:40

And, um, and so I joined the military and the, the absurdity of life is this.

1:16:47

I joined to be a military policeman, which I absolutely would have hated.

1:16:51

Um, all of them got turned into infantry people or stood gate guard, which is a

1:16:55

needed function

1:16:56

in the military, but it doesn't apply to my personality.

1:16:59

But when I went to the recruiter station out in Minneapolis, I think it was, I

1:17:03

was a bonehead

1:17:04

and I forgot my driver's license and they're like, well, and I was supposed to

1:17:08

leave.

1:17:08

And at this time I'd dumped my girlfriend, told everyone goodbye.

1:17:12

I'd wiped the dust off my boots, like left Cavalier, North Dakota.

1:17:16

And, um, I was like, Hey, uh, I'm not going back.

1:17:21

So whatever we got to do right now.

1:17:23

And he's like, well, we can, you can go home, get your license.

1:17:27

Cause the MEPs station was in Minneapolis.

1:17:30

Was it Fargo?

1:17:32

It doesn't matter.

1:17:32

It was five, six, seven hours away.

1:17:34

And they're like, well, you're not leaving today without a driver's license.

1:17:38

So I looked at my recruiter and I was like, I don't know what job you need to

1:17:41

get me into,

1:17:42

but it needs to be a different job.

1:17:43

And they're like, well, you scored, you know, exceptionally high in your

1:17:46

general technical,

1:17:47

um, part of your ASVAB, which is like understanding machines and objects and

1:17:51

stuff.

1:17:51

So we could get you into this like Intel job where you'd learn about radars and

1:17:56

stuff.

1:17:57

And that immediately clicked for me.

1:17:58

And then he's like, well, we got to go brief you in this SCIF room.

1:18:02

There's a, you know, sensitive compartmented information facility.

1:18:05

There's only one guy who's got a clearance and he can brief you on the job.

1:18:08

And if you want that job, then you can leave tomorrow.

1:18:10

I instantly started hearing like the James Bond music, you know,

1:18:13

and so they walked me in this back place and, you know, nothing super crazy and

1:18:21

briefed me up on the job.

1:18:23

And I went back out and I said, yeah, this is actually the job for me.

1:18:25

So the absurdity of life is me forgetting my driver's license when I was 16.

1:18:29

I was 16 when I signed up, um, maybe 17.

1:18:33

No, I was turning 17 that December when I signed up for the military.

1:18:36

Um, I can connect a string from forgetting my driver's license to being here

1:18:41

with you today.

1:18:42

You can, you can sign up when you're 16.

1:18:45

I think I was turning 17.

1:18:47

You can sign up when you, I didn't even know you could sign up when you're 17.

1:18:49

I didn't sign my delayed entry program thing.

1:18:52

Um, and I left a little bit before my 18th birthday.

1:18:54

So I was graduated from high school.

1:18:56

But, um, yeah, you can sign up when you're 16, I believe, as long as your

1:19:00

parents signed the waiver.

1:19:01

My mother signed the waiver.

1:19:02

She was happy to get me out of the trailer.

1:19:04

Um, so, uh, yeah, I was 17, almost 18 when I left.

1:19:09

You can make a radio out of that.

1:19:10

Yeah, right there.

1:19:11

So that's all the pieces.

1:19:12

They call it a crystal radio.

1:19:13

Yeah, I was going to say crystal controlled.

1:19:15

That's a radio?

1:19:16

There it is.

1:19:16

That's actually the exact thing.

1:19:18

That looks almost, that is almost exactly what it looked like.

1:19:22

Slinky made it?

1:19:22

Well, they bought the brand. It's just the Slinky brand now.

1:19:25

What?

1:19:25

Bought this toy.

1:19:26

There's a bunch of these all over the internet.

1:19:28

Yeah.

1:19:28

Wow.

1:19:29

Make your own working radio without batteries.

1:19:32

Yeah, and it uses a, I was going to say crystal controlled radio because it

1:19:35

uses a crystal diode on it.

1:19:37

Would you say Tesla coil, Jamie?

1:19:39

Yeah, it's a Tesla coil.

1:19:41

This guy's explaining it.

1:19:42

So there's this thing, kind of cool too, let me find this thing, a rocket radio

1:19:46

they called it,

1:19:47

which is, like, a further development of this thing.

1:19:49

It attached to a phone.

1:19:51

Oh.

1:19:53

So you plug that onto a phone cable.

1:19:55

There's a picture of it somewhere on here, but, um, it explains, like, you're

1:19:59

picking up, there you go.

1:20:01

Wow.

1:20:02

No power.

1:20:03

No battery or current needed, hence no operating expense and long life.

1:20:08

Yeah, this is almost.

1:20:09

It just fits onto a phone.

1:20:10

What year was this?

1:20:12

Man, this is old.

1:20:13

Yeah.

1:20:14

So it also shows here, this is, like, you're picking up power from a radio

1:20:18

tower.

1:20:18

Yeah.

1:20:19

Wow.

1:20:19

And they're powerful, the signal.
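The no-battery trick Jamie pulled up — a crystal diode rectifying the received wave and an RC filter recovering the audio — can be modeled numerically. A minimal sketch in Python; the sample rate, frequencies, and RC value are illustrative, not real component values:

```python
import math

def am_signal(t, fc=20_000.0, fm=500.0, depth=0.5):
    """An AM broadcast: a 500 Hz tone riding on a 20 kHz carrier's amplitude."""
    baseband = 1.0 + depth * math.sin(2 * math.pi * fm * t)
    return baseband * math.sin(2 * math.pi * fc * t)

def crystal_detector(samples, dt, rc=1e-4):
    """Half-wave rectifier (the crystal diode) followed by a simple RC low-pass,
    which strips the carrier and leaves the slow audio envelope."""
    alpha = dt / (rc + dt)
    out, y = [], 0.0
    for s in samples:
        rectified = max(s, 0.0)      # the diode only passes one polarity
        y += alpha * (rectified - y)  # RC filter smooths out the carrier
        out.append(y)
    return out

fs = 1_000_000                 # 1 MHz simulation sample rate
dt = 1.0 / fs
samples = [am_signal(i * dt) for i in range(20_000)]  # 20 ms of signal
audio = crystal_detector(samples, dt)                 # recovered envelope
```

All the energy in `audio` came from the simulated transmitted wave itself, which is why a real crystal set needs no battery — only a strong enough station.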

1:20:21

This is sort of, like, what they're paying for at the FCC.

1:20:23

The more powerful your radio tower, the longer and more people you can reach.

1:20:27

Crazy.

1:20:28

That has no battery.

1:20:30

And that's also why some radio signals come in very well on your radio and some

1:20:35

don't.

1:20:35

And they sound like dog shit.

1:20:36

Yeah.

1:20:37

They got weak power.

1:20:38

And then the frequency modulation.

1:20:41

Like, amplitude modulation isn't as efficient as frequency modulation when it

1:20:45

comes to carrying enough information for the vocoder to produce sound.

1:20:48

Amplitude modulation travels farther, but it doesn't have the amount of

1:20:54

information.

1:20:55

The carrier wave can't be modulated with as much

1:20:59

information as you need.

1:21:00

Whereas frequency modulation is much quicker, megahertz, and you can modulate

1:21:04

and add more sound or more information, which is why it sounds better.

1:21:08

So, FM sounds better, but it doesn't travel as far.

1:21:10

Right.

1:21:11

AM sounds worse.
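The AM-versus-FM distinction can be sketched in a few lines: in AM the message rides on the carrier's amplitude, in FM it rides on the carrier's instantaneous frequency while the amplitude stays flat. Illustrative Python with made-up frequencies:

```python
import math

def am_wave(t, fc=1000.0, f_msg=50.0, depth=0.8):
    """Amplitude modulation: the message scales the carrier's amplitude."""
    return (1.0 + depth * math.sin(2 * math.pi * f_msg * t)) * math.cos(2 * math.pi * fc * t)

def fm_wave(t, fc=1000.0, f_msg=50.0, deviation=200.0):
    """Frequency modulation: the message shifts the carrier's instantaneous
    frequency (fc + deviation*sin(...)); the amplitude never changes."""
    phase = 2 * math.pi * fc * t - (deviation / f_msg) * math.cos(2 * math.pi * f_msg * t)
    return math.cos(phase)

fs = 100_000  # sample rate, Hz
am_samples = [am_wave(i / fs) for i in range(4000)]
fm_samples = [fm_wave(i / fs) for i in range(4000)]
```

Plotting short windows of each makes the difference obvious: the AM trace swells and collapses with the message, while the FM trace keeps a constant peak and just bunches up and spreads out in time.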

1:21:12

I always, when I was training people in the military on this, I always use the

1:21:15

analogy of, if a party is happening next door, you can hear the bass music.

1:21:19

But you can't hear the treble.

1:21:20

You can hear the bass music because that frequency travels farther because it's

1:21:24

lower in the frequency band.

1:21:26

But you can't hear the treble, I'm sorry,

1:21:30

because it's higher frequency and there's more modulation.

1:21:34

And so, it disperses quicker and you can't hear it as well.

1:21:37

And it's the same thing with like VLF comms coming off of like a submarine, can

1:21:41

travel underwater for a very long ways, but you can't put as much information

1:21:46

in them as you could if you were doing, you know, VHF or UHF comms where there's

1:21:50

lots of modulation.

1:21:52

So, it's the dispersal.
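The VLF-versus-UHF trade-off he's describing falls out of the Shannon-Hartley limit: at a given signal-to-noise ratio, channel capacity scales with bandwidth, and a VLF link simply has very little bandwidth to work with. A quick sketch with illustrative numbers (the bandwidths and SNR are made up for the comparison):

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit: C = B * log2(1 + S/N).
    At fixed SNR, capacity grows linearly with bandwidth."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Same SNR for both channels, wildly different bandwidth:
vlf_bps = shannon_capacity_bps(200.0, 100.0)        # VLF: a couple hundred Hz
uhf_bps = shannon_capacity_bps(5_000_000.0, 100.0)  # UHF: several MHz
```

With these numbers the VLF channel tops out around a kilobit per second or so — enough for short tasking messages to a submarine, nowhere near enough for voice or data — while the UHF channel carries tens of megabits.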

1:21:54

And, you know, a lot of my, you know, mid part of my career was explaining this

1:21:57

stuff to, you know, military guys who were trying to understand, like, here's

1:22:01

how a cell phone works and this is how frequency works and this is how we send

1:22:04

information.

1:22:06

And just kind of demystifying, you know, how, you know, a GSM network works.

1:22:12

One of the things that I wanted to ask you about that is when new technology is

1:22:17

emerging, how do you stay ahead of the ability to extract information from this

1:22:27

technology?

1:22:30

Hack into networks before people understand the capability?

1:22:35

You really can't.

1:22:36

You really can't.

1:22:37

And that's the beauty of the free market is that the innovation to perform the

1:22:41

function that you want someone to pay for will always move faster than your

1:22:45

ability to exploit the technology.

1:22:48

Then how do you explain things like Pegasus?

1:22:50

Well, I mean, something like Pegasus.

1:22:54

Well, first off.

1:22:55

Explain Pegasus to people that don't know.

1:22:57

It was a persistent implant on cell phones for people.

1:23:01

Initially, it was you had to click it.

1:23:04

It was a click.

1:23:04

Initially, it was a click and then it became a non-click exploit.

1:23:08

So, in other words, you had to interact with something on the phone in order to

1:23:11

initialize and install the implant.

1:23:13

And then after, but the reason why it was so good is because it wasn't stored

1:23:18

in the, it wasn't stored in the usual areas that you would want a persistent

1:23:24

implant or where you would have a persistent implant.

1:23:27

For instance, you know, you might want to put it in the application layer of an

1:23:31

app or something like that where there's a binary that can run and execute

1:23:35

commands or functions.

1:23:36

And so they, I won't get into the very specifics of where and how they did this

1:23:41

because I'm not sure if I got this information from the government or not.

1:23:45

So I won't say it.

1:23:46

But they stored it in a place where it wasn't normal.

1:23:49

And you can read papers on your own and look at the forensics of it and how the

1:23:54

actual implant was executed.

1:23:57

But it essentially, you know, allowed people to own your phone and, you know,

1:24:02

was the kind of implant I only dreamed of when I was helping develop my own

1:24:07

implants in the military.

1:24:09

Mostly what we would rely on is, you know, zero-day vulnerabilities and looking

1:24:13

for something in a phone that either they hadn't patched or that the phone that

1:24:17

you were looking at hadn't been patched.

1:24:20

So phone makers have their own red teams going through the phone on

1:24:23

their own because they want to sell a product that people will use and people

1:24:27

won't use stuff that can get hacked.

1:24:28

So they'll do their own red teaming and they'll discover like, oh, you know, we,

1:24:32

we, on this router we developed, we left this port open and it shouldn't have

1:24:36

been open.

1:24:37

So now we're going to write a patch that will close that port so that this port

1:24:40

is no longer accessible by a guy like me so I can't go in there and do

1:24:43

something to this particular type of router.
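An unpatched open port is exactly the kind of thing he's describing probing for. A minimal sketch of the reconnaissance side using only Python's standard socket library — the router address in the comment is hypothetical, and this should only ever be pointed at hardware you own:

```python
import socket

def port_is_open(host, port, timeout=0.5):
    """Attempt a TCP connection; an accepted connection means
    something on that host is listening on that port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # connection refused, timed out, or unreachable
        return False

# Hypothetical check of your own home router's telnet port:
# port_is_open("192.168.0.1", 23)
```

A patch that "closes the port" makes this probe return False, which is exactly why leaving old firmware unpatched keeps the door open.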

1:24:45

Another great thing, I'll say something good about the administration, they're

1:24:49

doing some stuff right now to make sure that we're getting rid of Chinese

1:24:52

technology and Chinese routers.

1:24:55

And, you know, there's a widespread network of, the PLA has a, and I can't

1:25:00

remember the name of the botnet, but they essentially implanted a bunch of old

1:25:07

unpatched routers to get access to government and business proximal people.

1:25:13

And it was widespread and huge and, you know, they, it looked like to me, I

1:25:17

haven't read this anywhere, but if I were looking at this implant and how it

1:25:21

was done, they were trying to really cause some trouble.

1:25:25

It was being placed at critical places, think power, think energy, think

1:25:29

banking, like they really wanted to cause some ruckus and I have not been part

1:25:34

of this administration, so I'm not saying anything classified for those of you

1:25:38

who are listening.

1:25:39

And so, but there was a decision to say, hey, we need to make sure that these

1:25:43

things get patched and also that we're not bringing in architecture from the

1:25:47

overseas because they don't play by the same rules that we at least say we play

1:25:51

by.

1:25:51

Well, that's why they banned Huawei devices.

1:25:53

Oh yeah, and ZTE.

1:25:54

Yeah.

1:25:54

Well, Huawei had a phone that I was really interested in back in the day.

1:25:59

Porsche Design had partnered with Huawei and made this insane

1:26:03

Android phone with like the best camera, the best battery.

1:26:07

It was like really high level and I was like going to buy it.

1:26:11

And then all of a sudden they banned all the Huawei phones and I was like, what's

1:26:14

going on?

1:26:15

And then, you know, I'd heard some people say, oh, they're just trying to stop

1:26:18

competition.

1:26:19

It's like American companies are trying to stop it.

1:26:22

And then I went into it deeper and I said, no, it seems like there was third-

1:26:27

party access on some of the routers and some of their network devices that they

1:26:32

had engineered in order to be able to access them as a third party.

1:26:38

And this, because of whatever, lack of understanding, lack of knowledge of how

1:26:43

these things are constructed, the people that purchased them weren't

1:26:48

aware of them.

1:26:49

And these things had gotten into place and they had gotten into place in

1:26:52

universities.

1:26:53

They got into place in military establishments.

1:26:56

They were using them in cell phone towers that people, you know, inadvertently

1:27:00

bought from China.

1:27:01

Yep.

1:27:02

And that's really, I mean, I can tell you firsthand from having done some other

1:27:05

forensic exploitation on this stuff.

1:27:07

Another large part of my career I didn't talk about was just on mobile forensics

1:27:11

and media forensics, which is essentially you think of like CSI Miami or CSI

1:27:16

whatever the city was.

1:27:18

There's a crime, someone was killed, you have forensics that are doing forensics

1:27:21

on like blood and fingerprints and blood splatter and all that stuff.

1:27:25

There's a whole other part of that same forensics branch that focuses on

1:27:28

media forensics.

1:27:30

What was deleted off this phone at one point, what remains on this phone, what

1:27:33

was it being used for?

1:27:35

I would do this in the military so that when we did do an operation and I was

1:27:38

part of some of the largest ones ever done out in Afghanistan,

1:27:41

there would be treasure troves of phones and all of these computers and stuff

1:27:46

like that.

1:27:47

It was my job and I had a great team that worked for me in my deployment in

1:27:52

2015.

1:27:53

We would go in afterwards, gather up all of this stuff and, you know, the task

1:27:58

force commander would literally be standing by and we'd say, you know,

1:28:02

here's the intelligence that we've derived.

1:28:04

Here's the multi-point analysis here.

1:28:06

You know, it was on this hard drive.

1:28:07

It was here.

1:28:08

It was here.

1:28:08

You know, there's a bad guy place out here.

1:28:10

And those guys would be rolling like within moments after the last operation,

1:28:13

like some operations we do where we'd be rolling one after another target

1:28:17

because we were getting really good at media forensics and the intelligence

1:28:21

that was there and then getting into active media forensics,

1:28:24

which is a different discipline.

1:28:25

But essentially, I'll get to it.

1:28:27

I can get to that later if you want to.

1:28:28

But launching and doing these follow-on operations off, you know, dumping the

1:28:35

binary from a phone and examining it at the ones and zeros level

1:28:38

to say everything that was going on with this thing or if it was a really high

1:28:41

– like the organization that I worked for at that time did the analysis of

1:28:45

the Osama bin Laden media.

1:28:47

And, you know, on that media, we're doing far more than we would for another

1:28:52

piece of media in that we're, you know, x-raying it.

1:28:55

And we're looking at maybe what the disc looked like before or what was

1:28:58

destroyed or reconstructing things.

1:29:00

Spending millions of dollars on that intelligence analysis because we wanted to

1:29:04

fully understand everything that this guy was involved in and what he was doing

1:29:07

and where he was and who he was talking to.

1:29:09

And so that was another part of my career that I did for about five years or so.

1:29:14

What was going on with the Huawei phones?

1:29:16

Like what were they doing with them?

1:29:17

I mean there were – some of them were coming out implanted.

1:29:21

In other words, there was access built in for a foreign actor.

1:29:24

And then in other cases, other places with routers, with the ZTE stuff, there

1:29:29

were just things that you would patch or that you would fix as a company who

1:29:34

was trying to protect the consumer and create a product that people would use.

1:29:38

And they weren't doing it.

1:29:39

So they were creating persistent back doors either by actively placing code on

1:29:43

there that would allow, you know, root access or they were leaving things open,

1:29:48

especially in Africa.

1:29:49

Like the work that – you know, when I was working in Africa, the Chinese were

1:29:53

just owning Africa.

1:29:54

They were just giving them communications infrastructure.

1:29:57

And they were doing that because they wanted their resources and they wanted to

1:30:00

know what these people were saying and what they were doing.

1:30:04

And so I'm a free market real – like I'm as free market as a guy can get.

1:30:09

I want the best people building the best products and I want everyone to be

1:30:12

able to compete.

1:30:13

But in that case, I would never own a Huawei or a ZTE or anything else.

1:30:18

On a consumer level, what were they doing with those phones?

1:30:20

Like if they had imported them to the United States, if they didn't have that

1:30:24

ban, what would have been the issue?

1:30:26

Getting access to, you know, any number of people that – the Chinese really

1:30:31

want access to everybody.

1:30:34

But you could start at the topical level of just saying, you know, getting Joe

1:30:38

Rogan to use his ZTE would be – that would be my wet dream as a guy who used

1:30:42

to do this work back in the day because you're talking to the president or you're

1:30:45

talking to this guy or that guy.

1:30:46

And I can build out a network of understanding who you're in contact with, who

1:30:50

you're talking to, what's being talked about.

1:30:52

But then also finding out, you know, this person's phone number and now doing a

1:30:56

deep dive on there.

1:30:57

So it's really about, you know, getting all of that data and constructing, you

1:31:01

know, an analyst notebook essentially outline of who's talking to who, who do

1:31:05

we need to implant.

1:31:06

But it's for business as well.

1:31:09

Like they're really trying to go – they would want this in the hands of

1:31:11

somebody who's in charge of a business because they want their IP.

1:31:14

They would want this in soldiers' hands so they would know deployment dates or

1:31:17

who's going where and who's doing what.

1:31:18

They want this in routers because routers are usually the most unpatched piece

1:31:23

of technology, in that you're not patching them – although, you know, these days there's

1:31:26

more automated patching.

1:31:28

But back in the day, like you had to manually update a router and if you didn't,

1:31:31

well, then you had potential exploits that were sitting on that router where I

1:31:35

could gain access to the router in your home or I could gain access to a BGP

1:31:39

router – a border gateway protocol router – which is moving all of the internet

1:31:43

data.

1:31:44

Or I could get access to a microwave terminal.

1:31:46

You know, if you look at a cell tower, they've got the microwave terminals on

1:31:49

there that are sending information in between them.

1:31:51

If those are Chinese parts that are either being used for the processing, the

1:31:55

CPU, or the physical infrastructure of that, the products that they were

1:31:59

putting out would give me direct access to the information that's being passed

1:32:03

on those terminals.

1:32:05

So you're getting, you know, system-level, root-level access through machinery,

1:32:09

through communication devices, and through things like routers where you can

1:32:13

know everything you want to know about your enemy.

1:32:16

Wow.

1:32:18

And so, as far as today's technology, I see you use an Android phone.

1:32:22

Like, is there a phone that is more secure or a platform that is more secure?

1:32:28

It all depends.

1:32:30

Like, I always take this from Thomas Sowell.

1:32:33

There are no answers.

1:32:34

There are only trade-offs.

1:32:35

So, like, the way to answer that question would be, like, who are

1:32:38

you?

1:32:39

What are you trying to do with your life?

1:32:41

What are you talking about on your phone?

1:32:42

What are you doing on your phone?

1:32:44

You know, most of these phones, if you're just an average, everyday citizen who's

1:32:48

just going about your job, you know, the phones today are pretty secure,

1:32:51

especially versus a few years ago.

1:32:54

If you're a reporter, now the nexus is, do you trust the government and do you

1:32:59

trust Apple?

1:33:00

If you trust the government and you trust Apple, then Apple's probably your

1:33:04

best bet for using an, you know, there's lockdown mode on an Apple phone.

1:33:08

Or they used to call it back in the day, I think it was called reporter mode.

1:33:12

But there was ways to encrypt the devices and to encrypt the chatter and the

1:33:17

tunnel coming out of the phone, the RF coming out of the phone.

1:33:21

And, you know.

1:33:22

What is lockdown mode?

1:33:23

I don't know if that's exactly what it was called or not because I've never

1:33:27

really used Apple just for my own personal reasons.

1:33:29

What personal reasons?

1:33:30

I don't trust Apple.

1:33:32

How so?

1:33:34

They are more interested in monetizing people's data than they are providing

1:33:37

them capability.

1:33:39

So every time you take a photo, every time you upload a document, every time

1:33:42

you talk to it, every time it asks you about your, you know, you'll get these

1:33:47

questions where it says, if your password's lost, you can back up your password

1:33:51

in these ways.

1:33:52

Tell us where you were born.

1:33:53

Tell us your mom's maiden's name.

1:33:55

Tell us your mom's this, your mom's that.

1:33:57

Lockdown mode is an extreme optional protection.

1:33:59

It should only be used if you believe you may be personally targeted by a highly

1:34:02

sophisticated cyber attack.

1:34:03

There you go.

1:34:03

Most people are never targeted by attacks of this nature.

1:34:06

When iPhone is in lockdown mode, it will not function as it typically does.

1:34:09

Apps, websites, and features will be strictly limited for security and some

1:34:13

experiences will be completely unavailable.

1:34:15

Yeah.

1:34:16

So when I was advising guys back in the day on going out and doing like a high

1:34:20

risk source meet.

1:34:21

So they're going to go meet, you know, a spy for another country and you're a

1:34:25

military guy and you're debriefing someone or doing something.

1:34:28

I was always telling them to use lockdown mode.

1:34:30

I knew that it did those things.

1:34:31

I didn't know if that was the term or if I'd thought that up.

1:34:33

So can you still send iMessages?

1:34:35

You can still text and call.

1:34:37

Text and call.

1:34:38

That's it.

1:34:38

But there's other things that you can't do.

1:34:40

Well, like Meta just recently announced they're no longer encrypting your DMs.

1:34:45

Why would they do that?

1:34:46

Well, they said that it's for protection or whatever to make sure that people

1:34:50

aren't doing bad things.

1:34:52

I don't know.

1:34:52

See what their explanation for it was.

1:34:57

Sorry, I'm worried about this reporter.

1:34:58

I'm sorry.

1:34:59

Meta recently announced that they're no longer encrypting your DMs on Instagram.

1:35:07

And a lot of people are up in arms and they're stopping using any DMs on

1:35:12

Instagram and any of that stuff.

1:35:15

And the idea is that other people can read your stuff now.

1:35:18

Now, whether it's Meta can read your stuff or who.

1:35:21

That's what I mean.

1:35:23

Yeah.

1:35:23

And I said, why don't you trust Apple?

1:35:24

It's the same reason I don't trust Meta.

1:35:25

They're not interested.

1:35:27

The dangers behind Meta killing end-to-end encryption for Instagram DMs.

1:35:31

Meta blamed users for not opting into the privacy-protecting feature.

1:35:34

Experts fear the move could be the first major domino to fall for end-to-end

1:35:38

encryption tech worldwide.

1:35:39

That's a horrible narrative.

1:35:40

Yeah.

1:35:42

It seems squirrely.

1:35:44

So, oh, you've read your last free article.

1:35:49

Oh, my God.

1:35:51

Give me money, motherfucker.

1:35:52

But what Apple and Meta want to do is, like, they're trying to build these new

1:35:55

neural networks.

1:35:56

They're trying to, you know, humans, and we can get into this too later if you

1:36:01

want.

1:36:01

Humans are the only thing, in my opinion, and I'm happy to have you disagree

1:36:05

with me and I love to have this conversation.

1:36:07

In my opinion, we're the only ones that are.

1:36:09

After May 8, 2026 – they announced plans to discontinue support for end-to-end

1:36:14

encryption for chats on Instagram.

1:36:16

If you have chats that are impacted by this change, you will see instructions

1:36:19

on how you can download any media or messages you may want to keep.

1:36:23

Social media giant said in a help document, if you're on an older version of

1:36:26

Instagram, you may also need to update the app before you can download your

1:36:30

affected chats.

1:36:31

When reached for comment, this is what Meta had to say.

1:36:34

Very few people are opting for end-to-end encrypted messages and DMs, so we're

1:36:37

removing this option from Instagram in the coming months.

1:36:40

Anyone who wants to keep messaging with end-to-end encryption can easily do

1:36:43

that on WhatsApp.

1:36:44

But WhatsApp is a little squirrely, right?

1:36:46

WhatsApp, yeah.

1:36:47

I mean, they're all squirrely.

1:36:48

And that's the problem.

1:36:51

And so you asked me why I don't trust them.

1:36:52

It's because they want to use – so humans, in my opinion, and some animals,

1:36:58

are the only things that have the ability to project consciousness.

1:37:03

And projecting consciousness is how you train a neural network, and it's how

1:37:07

you train all these large networks.

1:37:09

A lot of my time also in the military is spent in – I was doing artificial

1:37:12

intelligence in 2012, 2011, like before it was even a buzzword.

1:37:17

We were using artificial intelligence to map dynamic networks and to do other

1:37:20

things, more pragmatic uses of it than how it's being used today with large

1:37:24

language models or convolutional neural networks.

1:37:27

But they need consciousness to train their models.

1:37:29

So when Google or Meta or Instagram or whoever else offers you photo

1:37:33

storage, it's because they want your face to train neural networks.

1:37:37

If they're going to pay for the compute, if they're going to pay for the

1:37:41

storage for these things, they're doing it because they're going to use the

1:37:45

data.

1:37:45

If you're getting a free app, in essence, any free app, if the product's free,

1:37:50

then you're the product.

1:37:52

So when Google is allowing you to use a Google Drive and get a gig of storage,

1:37:55

they're going to use those photos to train neural networks to do better facial

1:37:59

recognition.

1:38:00

What if you're paying for Google Drive?

1:38:02

I don't know about their terms of service now.

1:38:04

That is one of the best uses I have for large language models: any

1:38:08

product I download, I have the neural network examine the terms of service.

1:38:13

And then you can pretty much understand, like, here's my focus.

1:38:17

Here's the 40-page terms of services document when you click that link that you

1:38:21

got.

1:38:21

What are they able to do with my data?

1:38:23

So that's how I sign up for apps.

1:38:24

And that's one of the great uses of a large language model, in my opinion, is

1:38:28

to quickly understand how these things are being used.
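Bill's trick of feeding a terms-of-service document to a model is easy to wire up. A minimal sketch of the prompt side, assuming an OpenAI-style chat client; the model name and the commented-out client call are hypothetical, swap in whatever API you actually use:

```python
QUESTION = (
    "From this terms-of-service text, list what the company may do with my "
    "photos, messages, and usage data, and say whether any of it is used to "
    "train AI models."
)

def build_tos_messages(tos_text):
    """Package the ToS text plus one focused question as chat messages."""
    return [
        {"role": "system", "content": "You summarize legal documents in plain language."},
        {"role": "user", "content": QUESTION + "\n\n---\n" + tos_text},
    ]

# Hypothetical usage with an OpenAI-style client (not run here):
# reply = client.chat.completions.create(model="gpt-4o-mini",
#                                        messages=build_tos_messages(tos_text))
```

The point is the focused question: instead of reading forty pages, you ask one narrow thing — what can they do with my data — and let the model pull the relevant clauses.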

1:38:30

And that's why I say with Apple, with Meta, with all of these large information,

1:38:34

you are more the product than the product's the product.

1:38:37

And that is because they're trying to build the most powerful, capable

1:38:41

artificial intelligences, which I think is a misnomer.

1:38:44

And, again, we can get into it later.

1:38:45

But they're trying to build these hyper-competent artificial intelligences.

1:38:49

And you need two things for that, really, is training data and you need compute.

1:38:54

And that's why you start seeing them coming out with, like, Meta's building its

1:38:57

own nuclear facility or something

1:39:00

like that.

1:39:01

And they need more training data.

1:39:04

So if I want to build a replica of Joe Rogan that I can make hyper-realistic AI

1:39:08

videos for, I need every picture of your face from every angle.

1:39:13

I need every wince, every squint, everything you've ever done.

1:39:16

So I can introduce more training data to better train that neural network in

1:39:20

order to generate more hyper-realistic versions of yourself.

1:39:25

And so when a company is offering you something for free, and it's fine.

1:39:29

Like, if people are fine with that idea, then by all means, download all the

1:39:32

free apps that you want.

1:39:34

But if you're downloading a free app, it's because you are the product.

1:39:37

They either want to see how you type.

1:39:38

They want to see what you're saying.

1:39:40

They want to see how you're thinking about things.

1:39:42

They want to understand your political biases.

1:39:44

They want to look at your photos.

1:39:45

And this isn't because they're a deep-seated nation-state actor, though it can become

1:39:49

that.

1:39:50

But it's because they're trying to build the best products because the big

1:39:54

money is in AI.

1:39:55

That's where the biggest money is.

1:39:57

So anytime you're doing any of these things, and it's just been obvious to me

1:40:00

from the onset, not from the onset, but pretty close to the onset, that...

1:40:04

Yeah, this is a good example, right?

1:40:06

Pokemon Go players built a 30 billion photo map.

1:40:09

That's now training robots to deliver your pizza.

1:40:11

There you go.

1:40:14

So, you know, they view, and they can say they don't, and maybe if someone from

1:40:19

there catches this podcast, which they well could, they might put out a

1:40:22

statement that's saying that that's not what they're doing.

1:40:24

But I'm telling you as a person who has done media forensics, who has done

1:40:28

computer network operations, and who has trained artificial intelligence models,

1:40:32

that is precisely what they are doing.

1:40:35

That is, there's no...

1:40:36

So what is the difference between using Apple and using Android?

1:40:40

Well, Android will do the same things, and Google will do the same things.

1:40:43

It's just that I can root my phone, or I can install a custom operating system

1:40:47

like Graphene or something like that, which I'm not doing right now.

1:40:52

I had to make a sacrifice when I started my company, Spartan Forge, and the

1:40:56

sacrifice was I had to be the face of this product.

1:40:59

And so I never had a social media until I started the company, and I didn't

1:41:03

upload things to the cloud until I started this company.

1:41:06

And it became just like, I have to sell a product.

1:41:09

I have to, you know, and I'm actually selling a product, not people's data or

1:41:12

people's photos.

1:41:14

I have to sell this product, I have to let people...

1:41:16

People often don't know who the company is, or who the organizing principle is,

1:41:20

and what they care about in the company.

1:41:22

And I just made that trade and said, I'm going to have to become a public

1:41:25

person and start putting things out there.

1:41:28

And so, you know, when I started a company, we started our first Instagram, and

1:41:32

I started my...

1:41:33

My marketing team started my first Instagram, and I had to start uploading

1:41:37

things and talking about how I felt about things,

1:41:40

because I wanted people to know that this company was not going to be like the

1:41:44

other companies that are out there.

1:41:46

We don't sell their data, we don't sell emails.

1:41:48

I can make a half million dollars off my email list tomorrow, and I've been

1:41:51

offered that money.

1:41:52

You know, we've got millions of emails from people who have signed up for our

1:41:54

apps.

1:41:55

Other companies who are starting companies, they want to go out and reach

1:41:59

marketing people.

1:42:00

So if you're starting another hunting app, maybe for cameras, or for a call, or

1:42:05

a turkey call, or an elk call, or something,

1:42:07

and you found Spartan Forge, and you said, man, they've got two million emails.

1:42:13

I could pay them a half million dollars for those two million emails and start

1:42:16

some top-of-funnel marketing, and go blast them.

1:42:21

So they would pay me a lot of money for those emails.

1:42:23

I will never do that.

1:42:24

I'll never sell my company's emails, the people's emails.

1:42:27

I'll never do any of those things because the product is the product for my

1:42:30

company.

1:42:31

It's not the people.

1:42:32

So the reason why you use Android over Apple is the ability to root it and

1:42:37

install things like Graphene?

1:42:40

Yeah, custom OSs.

1:42:41

But yet you don't use it?

1:42:43

Not now, but what I still can use and what I still do use is Android also

1:42:47

publishes their framework in an open-source fashion

1:42:51

where you can look at the Android code.

1:42:53

It's called AOSP, Android Open Source Project.

1:42:57

So the basis of Android, think of it as the nuts and bolts.

1:43:01

I'll try not to talk in too technical terms here.

1:43:03

But the basic framework, think about it like a car.

1:43:07

The frame and the engine makeup is published so you can look at how things work

1:43:11

on the inside.

1:43:12

Apple goes the opposite way, and they don't publish any of that, and you can't

1:43:15

see any of that stuff.

1:43:16

I'm for the free and open version because at least if I'm worried about my

1:43:21

phone having a problem,

1:43:23

I can actually dump the binary, or I can create an E01 file and examine it.

1:43:27

I can look at the binary and say, is my phone acting like it should or doing

1:43:30

what it should?

1:43:31

Or is there some kind of persistent implant?

1:43:33

I wouldn't be able to do that with a – I would have to trust Apple and Apple's

1:43:36

ecosystem

1:43:37

and whoever they're – McAfee or whatever they're using.

1:43:40

I would have to trust them, which I don't.

1:43:43

So I like the Android because –

1:43:46

Is that option available for the average consumer that's not that learned in

1:43:50

computers?

1:43:51

Well, the great part about large language models now is if you wanted to dump

1:43:55

your own phone today,

1:43:56

you could follow along with a large language model and do it, your own Android.

1:44:00

And how would you do that?

1:44:01

Well, there's – you would have to buy some expensive equipment – there is software;

1:44:06

you'd either have to pay a firm to do it or you could download things like

1:44:11

Cellebrite.

1:44:12

You could get Cellebrite, or there's other things called Forensic Toolkit,

1:44:16

other things like that that allow you to examine your phone at a deeper level.

1:44:20

And is this an app, Forensic?

1:44:23

They're products.

1:44:23

Products.

1:44:24

They're products.

1:44:25

So it's a physical product that you dump your phone into?

1:44:27

Yeah, and there's software.

1:44:28

And there's connecting and all that type of stuff.

1:44:31

Tools I used throughout my military career, Cellebrite was one of them, but they're

1:44:36

Israeli-owned.

1:44:37

I've got nothing against Israel.

1:44:39

I've just got everything against foreign actors.

1:44:41

It's just if they're not an American company, that automatically kicks them

1:44:44

down a level for me.

1:44:46

So anyway, there's all kinds of – Android just makes it much easier to

1:44:51

examine your phone

1:44:53

or to understand if you've got something going on that's funky than it is on

1:44:57

Apple.

1:44:58

So for the average person, like for me, like if I –

1:45:01

You're not the average person.

1:45:02

Well, let's pretend I am.

1:45:02

If I got an Android phone and I wanted to examine my phone, what would be the

1:45:08

process?

1:45:08

You would download some of the software that I talked about.

1:45:11

You would jack your phone into it.

1:45:12

You would open your phone.

1:45:14

And then it would start carving the binary of your – everything in your phone.

1:45:20

It would start – you could create a one-to-one emulation of your phone if you

1:45:23

wanted to.

1:45:24

And then you would be able to get under the hood and examine the apps.

1:45:27

You would be able to examine the binary.

1:45:29

What's the executable code?

1:45:30

You'd be able to look at all of those things and then determine – because

1:45:34

Android open source project is published,

1:45:36

you could do a one-for-one and say, well, you know, at the kernel level, there's

1:45:41

this weird code that's not in the Android build.

1:45:44

So what is this code?

1:45:46

And then with a neural network, you could – I've never done it, but I'm sure

1:45:51

you could figure out what the intent is of that code, even for a layperson.
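The one-for-one check Bill describes, comparing what was carved out of the dump against the published AOSP build, reduces to hashing files against a known-good manifest. A minimal sketch, with made-up file contents and paths standing in for a real reference build:

```python
import hashlib

def sha256_bytes(data: bytes) -> str:
    """Hex SHA-256 of a byte string."""
    return hashlib.sha256(data).hexdigest()

# Known-good manifest, as you might derive from a published AOSP build.
# These entries are illustrative, not real AOSP files or hashes.
reference = {
    "bin/init": sha256_bytes(b"expected init binary"),
    "lib/libc.so": sha256_bytes(b"expected libc"),
}

# Files carved from the E01 dump (again, stand-in contents).
dumped = {
    "bin/init": b"expected init binary",
    "lib/libc.so": b"tampered libc",
    "bin/unknown_service": b"code with no AOSP counterpart",
}

def compare(reference: dict, dumped: dict) -> dict:
    """Classify each dumped file against the known-good manifest."""
    report = {}
    for path, data in dumped.items():
        if path not in reference:
            report[path] = "NOT IN REFERENCE BUILD"  # the "weird code" case
        elif sha256_bytes(data) != reference[path]:
            report[path] = "HASH MISMATCH"
        else:
            report[path] = "OK"
    return report

for path, verdict in compare(reference, dumped).items():
    print(path, "->", verdict)
```

Anything flagged as not in the reference build is the "weird code at the kernel level" case, the part you would then hand to a neural network or an analyst to figure out its intent.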

1:45:54

So I could take that information and I could put it into perplexity, and perplexity

1:45:59

would lay out what's going on with it?

1:46:01

Ostensibly, it would be able to, yes, unless it was some type of weird code.

1:46:04

I don't know if – I haven't used perplexity, so I don't know if they have

1:46:07

something like ChatGPT's Codex.

1:46:09

I sort of just tried just to be like, can you help me examine what my Android

1:46:13

phone is doing, looking for any malicious actors?

1:46:15

Yes, I can walk you through a structured, non-destructive check for malware or

1:46:18

other shady activity on your Android phone.

1:46:21

First, what are you noticing? For tools, commands: quickly check for common

1:46:25

warning signs, like sudden big battery drain when you're not using the phone,

1:46:28

unusual data usage, particularly in the background, apps you don't remember

1:46:32

installing, or icons briefly appearing and then disappearing.

1:46:35

Lots of pop-ups or redirects in the browser, or a new default search or launcher; strange

1:46:41

calls, SMS messages you didn't send yourself.

1:46:44

If any of those ring a bell, we'll focus on them in later steps.

1:46:47

Yeah, it's just asking you, like, why do you want to do that?

1:46:49

So this is just something that you could do with an Android phone that you just

1:46:51

can't do with Apple.

1:46:52

You can't do – yeah, Apple's not open.

1:46:54

What are the reasons you don't trust Apple?

1:46:55

Well, could I ask – can I do one thing before we – remember that question

1:46:58

because I don't want to forget it.

1:46:59

Could I give you a prompt?

1:47:00

Sure.

1:47:01

Because I want to answer your first question that we've already gone past.

1:47:04

Can you bring perplexity back up, please?

1:47:07

Do you want to go in addition to that or start a new one?

1:47:08

No, this is fine.

1:47:09

Just say, my friend helped me carve an E01 file, an E-zero-one file.

1:47:19

And he says that there is code in there that doesn't comport with the rest of

1:47:30

the Android system.

1:47:32

Comport?

1:47:33

Yeah, P-O-R-T, the rest of the system.

1:47:36

Could I dump that code here and could you tell me what it means?

1:47:42

I'm sure the answer is yes, but I just didn't want to answer it because I've

1:47:45

never done it.

1:47:46

Could you tell me?

1:47:50

Could you tell me, Jamie?

1:47:51

Could you tell me what it means?

1:47:55

Get the you out.

1:47:58

Get the you.

1:47:59

Tell me you.

1:48:00

Could I have some of your coffee, please?

1:48:02

Yeah, absolutely.

1:48:03

It's for you.

1:48:04

Okay, let's see what it says.

1:48:06

Yes, you can paste suspicious code here and I can help explain what it appears

1:48:13

to do line by line

1:48:15

and whether anything looks malicious. Before you paste, a few important notes:

1:48:18

Remove or redact anything that looks like private data, usernames, passwords,

1:48:23

keys, tokens, IPs, email addresses, phone numbers.

1:48:26

Perplexity is our sponsor.

1:48:27

I love that because you'd never get that from ChatGPT.

1:48:30

You'd want all that information.

1:48:31

Perplexity is the shit.

1:48:34

And so you're not sharing personal or case-sensitive information.

1:48:38

Wonderful perplexity.

1:48:39

If it's very long, send it in chunks and tell me chunk 1 of 3, chunk 2 of 3, etc.

1:48:46

So we can help keep track.

1:48:48

I can do static analysis here, read and reason about the code, but I cannot

1:48:52

actually execute it in a sandbox.

1:48:53

Right.

1:48:54

So this is more like a careful forensic read-through than a full dynamic

1:48:58

malware analysis.

1:49:00

Go ahead and paste the code snippet your friend flagged as not fitting with the

1:49:03

rest of the system

1:49:04

and tell me in a sentence or two where in the E01 it came from.

1:49:09

Example, app folder, system partition, random file path.

1:49:13

Yep.

1:49:13

Exactly.

1:49:13

So, yeah, I thought that would be the answer.

1:49:16

I've just never done it.

1:49:17

And so you can do a forensic examination of an Apple, by the way.

1:49:21

I'm sorry if I misspoke there.

1:49:23

But you can't do it to the level that you can with – because the Android Open

1:49:27

Source Project publishes all of the code,

1:49:29

I can get an understanding of the very inner working.

1:49:32

So if something's being done, for instance, at the kernel, or you could think

1:49:35

about it as like the lowest level of the phone,

1:49:37

something that wouldn't normally get caught in a forensic examination, I wouldn't

1:49:43

be able to do that with Apple.

1:49:44

Right.

1:49:45

So – and the nation state actors are doing things at very low levels in the

1:49:50

code framework for that exact reason,

1:49:53

because most people who aren't very deep into forensics would miss that.

1:49:57

It would be like the fingerprint under the couch cushion or something like that.

1:50:00

And what is the difference between what someone can do with an Android phone

1:50:06

with the standard Android operating system versus Graphene?

1:50:11

So that gets into, you know, if you wanted to war drive or sample Wi-Fi

1:50:16

networks in an area,

1:50:17

or if you wanted to run a barrage attack on a Wi-Fi endpoint,

1:50:23

you could work that in there to do things with the phone that you couldn't

1:50:27

otherwise do with a standard Android operating kit.

1:50:31

But as far as on a consumer level, like what protections do you have by running

1:50:35

graphene that you don't have by running Android?

1:50:38

You're much more in control of the ecosystem.

1:50:42

You have a firmer understanding.

1:50:45

And again, you could use a large language model to do this, to understand

1:50:48

exactly what's being run on the phone.

1:50:49

You control the background services that can be run on the phone.

1:50:52

So if you're getting hot mic'd or if your camera's taking pictures of you and

1:50:55

you're not looking

1:50:56

or it's listening to you for advertising content, stuff like that,

1:50:59

you would be in control of all of that in a way that you're not in control of

1:51:02

on a native Android app.

1:51:04

In control, like how so?

1:51:05

Would it alert you that this is happening?

1:51:07

Or just the functionality wouldn't be there for it to take place.

1:51:10

Right, because the functionality is only designed for the standard Android

1:51:14

operating system.

1:51:15

And I haven't installed graphene in a while.

1:51:18

So all of this updates, and I could be saying things that are incorrect.

1:51:21

I stopped doing this about three years ago.

1:51:24

Well, I know that there was, I forget what country it was,

1:51:27

but they were focusing on people who use Google Pixel phones, for example.

1:51:31

Yeah, because that's...

1:51:32

Because that's one of the phones that are more commonly rooted.

1:51:35

Yeah, it's easy to do.

1:51:36

And you could do it with a large language model.

1:51:38

You could sit there and be walked through on how to do it, which is a great,

1:51:40

you know, part of that.

1:51:42

Is it complicated?

1:51:43

Like, for a person like me that's not that astute?

1:51:45

No, it's not something I would do with a phone that you care about the first

1:51:49

few times.

1:51:49

Right.

1:51:50

Because you're going to jack things up.

1:51:51

You have to, you know, get the bootloader.

1:51:53

And essentially, the starting, you know, the starting mechanisms of the phone

1:51:57

that launches all of the other things,

1:51:59

you have to get down to a level and unlock that so that you can...

1:52:02

Is that available for all Android phones?

1:52:04

No, not all Android phones.

1:52:05

Lots of them lock it down, so you can't do that.

1:52:08

Is that available for Samsung phones?

1:52:10

No, not this one.

1:52:11

You can't...

1:52:11

So, the question has to become, can you unlock the bootloader?

1:52:14

And that is the starting...

1:52:16

Think of it as the starting engine of the rest of the phones.

1:52:18

Why is it only available on Google Pixel phones?

1:52:20

I'm not sure why they do it that way.

1:52:22

I haven't looked into that.

1:52:23

It's just Pixels, and the older Samsungs made it available.

1:52:27

Older Galaxy S7s, S10s, you could do more than you can with, like...

1:52:33

And I've got the Galaxy Fold here, and you can do almost none of that on here.

1:52:37

That is fucking sweet, though.

1:52:38

Yeah, I love this phone.

1:52:40

But, like I said, I went away from doing all that, A, because it was work, B,

1:52:43

because

1:52:44

I'm not working in national security anymore, and I'm not, you know, I haven't

1:52:48

written an

1:52:48

exploit in years.

1:52:49

I don't do this type of work anymore, and I need to sell a product.

1:52:54

And it just, you know, working with other employees, like, that run my

1:52:57

Instagram, or,

1:52:58

you know, assistant going through my email, and all those other types of things,

1:53:02

it just

1:53:02

wasn't pragmatic anymore for me to keep doing that, and I had to give up that

1:53:05

part of myself.

1:53:06

Does Spartan Forge, your app, work, run on Graphene?

1:53:09

Yeah, well, it could.

1:53:10

Yeah, it would.

1:53:10

You have to sideload the app.

1:53:12

But, again, a large language model could walk you through doing that.

1:53:15

So, we haven't gotten to that level of...

1:53:18

Does it make sense here that this says it's easier because Google makes it

1:53:21

easier?

1:53:21

Yeah.

1:53:22

He was just asking me why they make it easier.

1:53:24

I don't know that answer.

1:53:26

That's, I mean...

1:53:27

So, the process is officially supported in the Android settings under developer

1:53:30

options,

1:53:31

allowing users to toggle OEM unlocking. Simple fastboot method:

1:53:35

Pixels use standard fastboot commands that work consistently across all models

1:53:40

to unlock

1:53:41

the bootloader. Accessibility.

1:53:42

Yeah.

1:53:43

That's what I was talking about.

1:53:45

So, yeah, I don't know why they do it.

1:53:47

It might be people can...

1:53:48

Well, the Android open source project exists, so it would stand to reason that

1:53:53

you would

1:53:53

want a way for someone...

1:53:55

Because what you want is people interacting with that code and red teaming it

1:53:58

and making

1:53:59

the code better and then offering bug bounties so that you can tell Android,

1:54:04

like, hey, you've

1:54:05

got a critical flaw in your system architecture here, and then they'll pay you

1:54:08

20 grand for

1:54:08

that.

1:54:09

Right.

1:54:09

I've got friends who do that.

1:54:10

So...

1:54:12

You and I talked about Eric Prince's phone.

1:54:14

Yes.

1:54:15

That...

1:54:16

Which is...

1:54:16

So, the narrative is that that is an unhackable phone.

1:54:22

Yeah.

1:54:23

It's just by virtue...

1:54:24

And look, Eric's a wonderful guy, and he's...

1:54:29

The principles that he used for the first instantiation of that phone are the

1:54:33

correct

1:54:34

principles, which is we need to get...

1:54:36

If you want...

1:54:36

If you're security-focused at all, you should get away from these big, large

1:54:40

conglomerates

1:54:41

because none of your data is private.

1:54:43

That's a correct principle.

1:54:44

An incorrect principle, and I'm going to get shit about this, but I told you in

1:54:49

the beginning

1:54:49

I care about the truth, and I do care about the truth, is that when you're

1:54:52

using a PKI

1:54:53

subsystem that relies on Microsoft, then you're not in control of the PKI

1:54:59

certificate signing,

1:55:00

and Microsoft could cause a bunch of problems, and they were using that.

1:55:04

So, the other thing being, if you're building on the Android open source

1:55:08

project, that means

1:55:09

the code that you're using as the engine, let's just call it that, of your

1:55:13

phone, is examinable

1:55:15

by the public.

1:55:16

So, you're relying on Android to publish these, you know, updates to the phone,

1:55:22

and you're

1:55:23

relying on those things to be as good as possible.

1:55:25

Now, you might harden it some more, but as long as the code is out there, it

1:55:28

can always

1:55:29

be mucked with.

1:55:30

As long as people have to interact with the device and type, and you have to

1:55:33

see what you're

1:55:34

typing, a phone's going to be...

1:55:36

It's going to have Swiss cheese.

1:55:37

So, when people say something is unhackable, as you said, that's just not true.

1:55:43

Yeah, it didn't make sense to me.

1:55:45

It's just not true.

1:55:46

Which is why you and I talked about it.

1:55:47

Yeah, we talked about it quite a bit.

1:55:49

I have, like I said, great guy, done lots of great things for the country, and

1:55:54

it's just,

1:55:55

if they had just said something along the lines of, it's hackable, as any phone

1:55:58

is hackable,

1:55:59

because by virtue of you having to interact with it, it's hackable.

1:56:02

It's just, like, if I came up with an app that had a, you know, look at the

1:56:08

TikTok terms

1:56:09

of service on the first TikTok.

1:56:10

Oh, it's bonkers.

1:56:10

With those terms of services, I will own your phone.

1:56:14

And I'm not saying you can install TikTok on his phone, but what I'm saying is,

1:56:17

by virtue

1:56:17

that you have to interact with the phone and see what you're doing and type

1:56:21

passwords,

1:56:22

and you've got those kinds of terms of service, I could easily put a keylogger

1:56:26

in that, and

1:56:26

now I know your signal password or your signal pin, or, you know, I get you,

1:56:31

you know, you're

1:56:31

going to China, so I stop you in secondary, and while you're in secondary, I've

1:56:35

got a CCTV

1:56:35

on you, and you unlock your phone.

1:56:37

Now I know how to unlock your phone, and now I'm going to lock you up in

1:56:41

secondary at customs

1:56:43

in China or in Canada, and I'm going to separate you from your phone.

1:56:47

And I've seen you unlock it.

1:56:49

Well, now I'm going to get in there with EnCase, or I'm going to get in there

1:56:51

with FTK, or I'm

1:56:52

going to get in there with Cellebrite, and I'm going to dump your phone.

1:56:56

And just by virtue of it being built on the Android open source project, that's

1:57:02

a great

1:57:02

thing.

1:57:03

It's a good thing.

1:57:04

Just don't call it totally unhackable.

1:57:05

Because a guy like me, I don't need but a week or two to tell you on this

1:57:10

current build,

1:57:12

like, here's the hole in this Swiss cheese.

1:57:14

Now, is it far better than having a Google phone with standard firmware and

1:57:20

standard OS

1:57:21

or an Apple phone?

1:57:22

I don't know about Apple, because, again, you asked me about Apple, and I said,

1:57:26

I don't

1:57:26

know Apple.

1:57:27

I don't know what's happening at the top of that company.

1:57:29

But I know that they like to monetize people, and that's pervasive in my mind.

1:57:34

And using data that people don't know is getting used, even though it's in a 40-page

1:57:38

terms

1:57:38

of service document, is pervasive.

1:57:40

So I just don't know at that highest level of analysis.

1:57:43

And that's why I said, to answer your question about the safest phone, I would

1:57:46

ask you what

1:57:47

you're using it for, who you are, and what are you doing in the world, is the

1:57:51

best way

1:57:51

to answer that question.

1:57:52

So me, like, what would you recommend I use?

1:57:55

I mean, I wouldn't want to...

1:57:58

I mean, okay, I'll tell you generally what I would say, because you might ask

1:58:01

me that question

1:58:02

one day, because we'd go back and forth about a lot of tech.

1:58:04

I know specifically what I would recommend for you to do, and I'd even tell you

1:58:08

to hire

1:58:09

someone else to do it and not me, because that checks and balances is what I

1:58:14

would want.

1:58:15

But for you, I would say you should take something like a Raspberry Pi, and you

1:58:21

should run WireGuard

1:58:22

on your phone, and you should route all of your internet traffic through

1:58:26

something like a

1:58:27

home terminal at your house through a Raspberry Pi, using something like WireGuard,

1:58:31

which is

1:58:32

a VPN that I use that's very good.

1:58:34

And, you know, everything should be routed through that.
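The routing Bill recommends, a Raspberry Pi at the house running WireGuard with the phone pushing all of its traffic through it, comes down to one peer configuration on each side. A hedged sketch of the phone-side config; every key, address, and endpoint here is a placeholder, not a working value:

```ini
[Interface]
# Phone-side WireGuard config (wg0.conf); keys and addresses are placeholders.
PrivateKey = <phone-private-key>
Address = 10.0.0.2/32
DNS = 10.0.0.1

[Peer]
# The Raspberry Pi at the house.
PublicKey = <pi-public-key>
Endpoint = <home-ip-or-dynamic-dns-name>:51820
# 0.0.0.0/0 sends ALL of the phone's traffic through the tunnel,
# which is the "everything should be routed through that" part.
AllowedIPs = 0.0.0.0/0
PersistentKeepalive = 25
```

The Pi side needs a matching [Peer] entry for the phone, plus IP forwarding and NAT, for tunneled traffic to reach the internet.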

1:58:38

And if you trust Apple, continue using Apple.

1:58:42

If you don't trust Apple, then, you know, use Android.

1:58:45

And you could use a Pixel and do Graphene, and you could use Signal on there

1:58:51

and those other

1:58:52

things, and you're going to be relatively safe.

1:58:53

But again, if I'm a nation-state actor, I can create circumstances where I'm

1:58:57

going to get

1:58:58

access to your shit, and I'm going to lock you down.

1:59:00

And some of them are more expensive than other methods to do it, but I'm a pragmatist,

1:59:06

and

1:59:07

you can always come up with a method to get a hold of somebody's shit.

1:59:09

You can always create the circumstances, especially if you're a nation-state

1:59:12

actor, to get a hold

1:59:14

of somebody's stuff.

1:59:15

That would be the very high level of things that I would recommend to you, just

1:59:21

out the

1:59:21

gate.

1:59:22

Yeah, it's very concerning, because it seems like these things keep getting

1:59:28

stronger and

1:59:29

more capable.

1:59:29

Yes.

1:59:30

Like Pegasus 2 being a zero-click exploit.

1:59:33

Yes.

1:59:34

So all they have to do, essentially, is just know your number.

1:59:36

Yep.

1:59:37

And that's, you know, you just make yourself a difficult target would be my

1:59:44

best recommendation.

1:59:45

When you're going to answer questions about password reset, don't answer them

1:59:49

honestly.

1:59:49

Write down in a physical journal or something how you answered those questions.

1:59:53

Don't answer them honestly.

1:59:55

You know, all of these things we think are added for layers of protection.

1:59:58

For instance, you used to get that pop-up on your phone where it said, you know,

2:00:02

there'd

2:00:02

be like blocks of pictures, and it would say, click all of the pictures with a-

2:00:09

With a traffic light in it.

2:00:10

I was just going to say that, a traffic light in it.

2:00:11

Yeah.

2:00:12

Part of that might be for security.

2:00:13

The other part of it is they're using the information of what you're clicking

2:00:16

to train

2:00:16

neural networks.

2:00:17

Right.

2:00:18

You're a product at that point.

2:00:19

Right.

2:00:20

You think you're getting security out of it, but you're a product at that point

2:00:23

because

2:00:23

you're helping to educate a neural network on what traffic lights look like.

2:00:27

Yeah.

2:00:27

And how they can look.

2:00:28

And all those different instantiations of traffic lights.

2:00:31

So, and again, like, we have to separate causality and intention and outcomes

2:00:37

in that the companies

2:00:38

might do this because they want to create the greatest AI ever.

2:00:41

But when you're issuing someone a 40-page terms of service document on

2:00:45

everything they

2:00:46

can do with your thing that you paid $2,000 for, it's just, you know, we need

2:00:52

more ethical

2:00:53

people. At least what Eric Prince was trying to do was right, which was we need

2:00:57

to off-ramp

2:00:58

from some of these big things because the way that this government is going, I'm

2:01:02

very

2:01:02

worried about the rights of the individual now and going forward because we

2:01:07

have an uneducated

2:01:08

class of people for all of the reasons in the world.

2:01:12

Like if you want to just focus on your family and you're not thinking about

2:01:15

these things,

2:01:15

I don't hate that for you.

2:01:16

But the idea of individual autonomy and rights has been so shit on in recent

2:01:23

years that where

2:01:25

this is going, when we get more uneducated and we rely on them... large language models are great,

2:01:30

but they're

2:01:30

not a foundation of learning.

2:01:32

In other words, we have a lot of people with access to information but no

2:01:36

wisdom.

2:01:37

It's like when your parents would say, learn how to do addition and subtraction

2:01:41

on paper

2:01:42

before you use a calculator.

2:01:43

Like understand how to do research and cite sources and understand, you know,

2:01:48

how to conduct

2:01:49

really good analysis before you just use a neural network for everything.

2:01:53

Because as we lose focus of our civics and what our founders were trying to do

2:01:58

and the uniqueness

2:01:59

of it, which is truly unique, which is, you know, when I joined the army, I

2:02:02

joined the army

2:02:03

to get out of North Dakota.

2:02:04

When I reenlisted in the army, it's because I believed in the experiment and

2:02:07

that's another

2:02:07

five-hour podcast.

2:02:08

But the foundation of the experiment is good, but we've eroded it in so many

2:02:15

ways over the

2:02:16

years and given up so many individual rights in the name of security.

2:02:20

And I'm sure it's been said on here before, but Franklin said, anybody who

2:02:23

gives up their

2:02:24

individual rights in the name of security deserves neither.

2:02:27

Anyone who gives up their freedoms in the name of security deserves neither.

2:02:31

And it's some of the ways that they've done it have been really above the

2:02:34

surface.

2:02:35

And it, it frankly blows my mind that we let the government get away with some

2:02:40

of these

2:02:41

things that we let them get away with, where you even explain it to people and

2:02:44

like, I don't

2:02:45

see it.

2:02:45

Like, I don't see how that was a big deal.

2:02:47

And I'm like, it was a total recalibration of the system that allowed the

2:02:51

Democratic Party

2:02:53

and the Republican Party to usurp your rights in a way that, if you knew any

2:02:57

better, you'd

2:02:58

probably be protesting like some of the ways that they've done this, you know,

2:03:02

we can go

2:03:03

with the easy stuff like the Patriot Act, right?

2:03:05

In the name of security, we're going to start collecting on Americans, you know,

2:03:09

and the

2:03:09

Biden and Obama administration, I will say this at risk of, you know, getting

2:03:15

in trouble

2:03:16

because I used to have a clearance.

2:03:18

They had a massive vacuum cleaner and they knew what it was vacuuming up and

2:03:23

they kept

2:03:24

vacuuming it up anyway in the name of security.

2:03:26

I'm not saying they were going after American citizens, but they certainly knew

2:03:30

they were

2:03:31

and they just vacuumed shit up and collected it and stored it in a database.

2:03:36

In case they need it.

2:03:37

In case at some point we needed to, you know, come up with a narrative or get

2:03:41

rid of somebody

2:03:42

who's inconvenient or whatever else that just flies in the face of individual

2:03:46

American rights

2:03:47

and American autonomy and is really, in my mind, the anti-pattern to freedom.

2:03:53

It's just really, really bad.

2:03:55

I mean, I'll give you one that people always crap on me whenever I talk to them

2:03:59

about it,

2:03:59

but there's two that really bother me.

2:04:01

One of them being like the 17th Amendment.

2:04:03

Do you know the 17th Amendment of the Constitution?

2:04:06

So when the founders, when you read the Federalist Papers,

2:04:11

I really love reading the Federalist Papers.

2:04:13

I love reading how they informed the Constitution, the Bill of Rights, the

2:04:16

Declaration even.

2:04:18

John Jay, James Madison wrote these documents explaining the framework and the

2:04:23

17th Amendment,

2:04:24

essentially how the Senate, the Senate, right, the 100 people there that are

2:04:28

supposed to be

2:04:29

representing us was originally constructed was a state would have legislatures

2:04:33

and the state

2:04:34

legislatures and the governor would appoint the senator.

2:04:36

The reason that the founders did that was because the state governments had to

2:04:42

give power

2:04:42

to the federal government to exist.

2:04:44

Back with the Articles of Confederation.

2:04:49

Confederation, is that right?

2:04:51

Articles of, I think it's the Articles of Confederation.

2:04:54

I'm blowing up.

2:04:55

Sorry, I'm going nuts.

2:04:56

Back before there was a strong centralized American government, we had problems

2:05:00

with money.

2:05:01

We had problems with interstate commerce and those types of things.

2:05:04

And those articles eventually turned into what is the Constitution.

2:05:07

But the states had to grant that power and the signers of the Declaration of

2:05:11

Independence

2:05:11

and the Constitution knew that the states needed to be those small projects

2:05:15

that we talked about

2:05:16

before where if California wanted to go nuts, let them go nuts.

2:05:20

But it shouldn't impact what's happening in Texas.

2:05:22

It shouldn't impact what's happening over in New England.

2:05:24

It shouldn't impact what's happening in the Midwest.

2:05:26

But if that goes nuts and it fails, it needs to fail.

2:05:29

So the state senators, I'm sorry, the state legislatures would come together

2:05:34

and they would vote for a senator.

2:05:36

They would elect a senator.

2:05:37

And that senator's job was to go to the federal government and protect the

2:05:41

rights of the state.

2:05:43

Not to protect the rights of individuals per se and certainly not to embolden

2:05:47

the federal government.

2:05:48

But with the 17th Amendment, what happened was the House of Representatives

2:05:53

function was to be the petulant children of government.

2:05:58

So their job was to come up with crazy ideas, crazy laws, all of those things.

2:06:02

The more liberal version of government jurisprudence would be the House of

2:06:06

Representatives, your crazy ideas.

2:06:08

And then you had state senators who were supposed to be between the House and

2:06:11

the president who would say,

2:06:13

well, here's a good idea, but the rest of this is retarded, AOC.

2:06:16

Like, we're not doing all this.

2:06:17

That's crazy.

2:06:18

Or whoever else.

2:06:19

Name me a Republican who's an asshat as well.

2:06:22

We're not doing these things.

2:06:24

And that's because it would erode the state's rights and the state's

2:06:27

constitution and what made this state great.

2:06:29

Because what the legislatures would do is say, hey, Joe Rogan, you've made a

2:06:33

lot of money and you've got a big podcast and a big voice and you've learned

2:06:37

some lessons around the way.

2:06:38

And you were able to do that in Texas.

2:06:39

And you decided to come to Texas because we had all of these things that

2:06:42

California didn't have.

2:06:43

We need you to go to the Senate for three years or six years or seven years,

2:06:48

whatever it was back then, and represent those same principles.

2:06:52

So when Obamacare comes through, you can say not only no, but fuck no.

2:06:56

Like, I'm not voting for this thing.

2:06:57

And it was to protect the state.

2:06:59

But what the 17th Amendment did was make the Senate redundant with the House of

2:07:04

Representatives, which was, in the founders' eyes, the only part

2:07:09

of the American government chosen by popular vote.

2:07:13

And then you had the way the president gets elected through electors, but you

2:07:17

had the state senate, which was appointed by the states.

2:07:20

So the legislatures, and I'll use North Dakota, where I'm from, you'll have one

2:07:24

big city, two big cities, Fargo and Grand Forks, North Dakota.

2:07:28

It's where the universities are.

2:07:30

It's where your crazy kids are.

2:07:32

Crazy thought exists.

2:07:33

Hyper crazy ideas.

2:07:35

But some of them are useful.

2:07:36

The rest of the state's agriculture, right?

2:07:39

So all of those legislators from all of those counties or those legislative

2:07:42

districts would get together and say, we're going to put Bill Thompson, that

2:07:46

would never happen, but in charge of our, he's going to be at the Senate

2:07:49

representing North Dakota.

2:07:50

But he has to represent the whole state.

2:07:53

In other words, you can't do things that will help Grand Forks or Fargo because

2:07:57

that's where the universities are.

2:07:59

That's where all of the crazy politics are.

2:08:00

You also need to be thinking about the guys out in the western counties, LaMoure

2:08:04

County in North Dakota or way out west.

2:08:06

You have to protect agriculture.

2:08:07

You have to protect small businesses.

2:08:09

You have to protect families.

2:08:11

What the 17th Amendment did, under Woodrow Wilson, in how they really usurped the

2:08:15

Constitution, was make the Senate redundant. They made it a redundant House of

2:08:19

Representatives elected using the popular vote.

2:08:22

So now we use popular vote for that.

2:08:24

But if you want the popular vote in North Dakota, 85% of the population is in

2:08:28

Fargo and Grand Forks.

2:08:30

So now you've got, if I want to run for Senate in North Dakota, I'm just going

2:08:33

to spend all of my time in Fargo and Grand Forks.

2:08:36

Because if I can repeat back to those people all the ideas that they want to

2:08:39

hear, I'm going to win that vote and I don't have to represent those people out

2:08:43

in the rest of the state in anything.

2:08:45

So they created a redundant House of Representatives.

2:08:47

But another reason why it happened was they wanted popular vote because there

2:08:51

is no amount of money that you could stick into a legislature out in the

2:08:54

western part of North Dakota.

2:08:56

You can't bribe these people, but the DNC and RNC now can say, look, these two

2:09:01

senators are running.

2:09:02

We like this guy.

2:09:03

So we're going to, this guy will do whatever we tell him to do.

2:09:05

And it has nothing to do with the state or representing the state's rights or

2:09:08

the rest of those legislative districts.

2:09:11

We're going to pick this senator and he's getting $300 million for his election

2:09:15

bid.

2:09:15

And this other guy who's, you know, a slower moving constitutional conservative

2:09:20

who might be a free, you know, market absolutist and a classical liberal, he's

2:09:24

not being funded.

2:09:26

But under the state architecture, you might have been a better representation

2:09:31

of the state.

2:09:32

And that's why the legislators had to vote for you to put you in as a senator.

2:09:36

You had to represent the whole state.

2:09:38

But now all that someone who wants to be a senator needs to do is go to the

2:09:42

Republican National Committee or the Democrat National Committee and say, I'll

2:09:46

do all the things you tell me to do.

2:09:48

Fund my campaign and I'm going to go stump in Fargo and Grand Forks, North

2:09:52

Dakota and the hell with the rest of the state.

2:09:54

It's very important.

2:09:55

There's a very important sleight of hand.

2:09:57

And when that happened, you made a redundant House of Representatives and the

2:10:02

state no longer was protected at the federal level.

2:10:06

And what happened was all of the power from all of these states and these

2:10:09

legislatures and these individuals got sucked up into the federal government.

2:10:12

And then after that, you see all of these things that would never have been

2:10:16

passed by a state getting passed, things like Obamacare, things like the Patriot

2:10:19

Act, certain war resolutions, all kinds of things where it just further erodes

2:10:25

the power of the state.

2:10:27

And federal government wants that because it puts all of the power up in the

2:10:30

federal government.

2:10:31

And people always say we need to get money out of politics.

2:10:34

No, we need to get power out of politics.

2:10:36

That power that they've taken, you know, over the last 130 years or so used to

2:10:40

exist at the state and local levels because they wanted these thought

2:10:43

experiments happening where we could pluck the best things out of them and

2:10:47

forget the rest.

2:10:49

But all of that power has now gone up to the federal government and the federal

2:10:53

government won't ever release that power and they only want more budget and

2:10:57

more spending to execute that power.

2:10:59

And that's also because the interest groups that want to go, they don't want to

2:11:02

have to go and convince a whole state of whether or not something is good that

2:11:05

people are going to vote on.

2:11:06

They just want to go take a lobby and go up to the federal government because

2:11:10

they want all of the power up there as well.

2:11:12

And the federal government wants all the power up there as well because they

2:11:15

make $300,000 a year before they become a politician and they're worth $30

2:11:20

million when they're done being a politician because all of the money has to go

2:11:23

to the federal government because they're in charge of light bulbs we can use,

2:11:26

computers we can use, flush toilets we can have, how our roads are going to

2:11:30

look, what our medical care looks like.

2:11:33

None of those powers are explicitly written in the Constitution of the United

2:11:37

States and they use things like the Commerce Clause and other things in order to

2:11:40

create things like Obamacare where really we want competing states.

2:11:44

If Texas comes up with a great way to do health care and North Dakota's isn't

2:11:48

so great, they can look at that experiment and they can adopt the principles

2:11:52

and they can have it at that level.

2:11:54

But it's much easier to get change at the local level when the power is derived

2:11:58

from the state and the individual because if I want to change the way that my

2:12:01

state does health care, I have one of two options or three options.

2:12:05

I can run for office.

2:12:06

I can support someone who is going to go into office and do what I want or I

2:12:09

can move.

2:12:10

But when everything is centralized at the federal government and everything

2:12:14

flows from the federal government, all of the money, power, and gravity is up

2:12:17

there and the individual, the 300 million of us or so, have really no power now

2:12:21

to exercise either state's rights or individual rights at the higher level.

2:12:26

I hope I'm elucidating this correctly.

2:12:28

You are.

2:12:28

But it's a real usurpation of individual and state autonomy that really got rid

2:12:32

of state power, which, if you read the Federalist Papers, was so important

2:12:36

to the founders that there was this state, that the state's needs were

2:12:40

organized because the state was where the founders wanted these thought

2:12:43

experiments.

2:12:45

You read Thomas Hobbes, or John Locke, or Montesquieu, all of them talked about

2:12:49

this great experiment that was being set up and how it was built on all of this

2:12:53

Western politics and everything that came before it on how we could have a

2:12:56

government that was forced to respect the rights of individuals and allowed for

2:13:01

these competing think tanks of ideas and that the power would never rest at the

2:13:04

federal government.

2:13:06

But the 17th amendment was a way that a lot of that power went from the state

2:13:11

level and the state legislatures and now to become the president, they want to

2:13:15

do a popular vote and under a popular vote, you would just have to campaign in

2:13:19

New York and LA, you would get the popular vote out of the likely voting people

2:13:24

and now the rest of the country is not and that would be another, you hear all

2:13:28

these people saying, we need a popular vote.

2:13:30

We can't have the electoral college, we can't have all of these things,

2:13:34

everything needs to be, pure democracy allows 51% to rule 49% and that was

2:13:40

another thing the founders were working fervently to get away from and that's

2:13:46

why we had an electoral college and it's actually quite beautiful when you

2:13:49

actually read about it and examine it.

2:13:51

That's why we had the state senate and state legislatures and this is why we

2:13:54

had the house, you had all levels of the things of government that the founders

2:13:58

cared about being represented in this body politic and it was a beautiful thing

2:14:02

and I could go on for 15 more things about that, I won't do it for the sake of

2:14:05

your listeners because I doubt this is what they wanted to do, but similar

2:14:08

things happened with the Supreme Court, Marbury v. Madison and allowing the

2:14:13

Supreme Court to have judicial review, that was never a thing that was in the

2:14:16

Constitution and the Supreme Court.

2:14:18

If you like the Supreme Court being able to have the power to describe

2:14:20

everything as being either constitutional or unconstitutional then you're not

2:14:24

ruled by a democracy, you're ruled by an oligarchy.

2:14:26

You've got nine people in robes that are going to tell you whether or not

2:14:29

laws are good or bad and that's not the founding of this country, it's not how

2:14:32

it was intended to work.

2:14:33

That all started back in Marbury v. Madison with Thomas Jefferson and these writs

2:14:38

of mandamus where the Supreme Court, long story short, essentially granted

2:14:43

itself the power to conduct judicial review under the old system or the system

2:14:48

that was ratified and that the founders approved was if a law was deemed unconstitutional,

2:14:55

it would go before the Supreme Court and they would rule in favor of the person.

2:14:59

And then eventually the government would figure out, oh, this law doesn't work,

2:15:03

but it was never on the Supreme Court to say constitutional, unconstitutional.

2:15:06

You would get arrested for some law and it would get appealed to the Supreme

2:15:09

Court and the Supreme Court would say we're not punishing this person, this is

2:15:13

against the Constitution.

2:15:15

But the government would have to keep arresting people and it would have to

2:15:18

keep going in front of the federal government.

2:15:20

So what I'm saying is, and I'm sorry to go off on this, we can go back to tech,

2:15:23

but all I'm saying is the core of the American experiment in individual rights

2:15:28

and what makes this country so great and why I was willing to die for it after

2:15:32

my initial enlistment and why I have such love for this is because it was the

2:15:36

only experiment where the value of the individual was held at the top of the

2:15:39

hierarchy and that people could truly be allowed to flourish.

2:15:43

And in 250 years we did more than any society could have hoped to have achieved

2:15:47

in tens of thousands of years.

2:15:49

Everything tends towards disorder and power always gets centralized and we had

2:15:54

a framework to do that, but we were willing participants in our own demise and

2:15:59

now we're scratching our heads and wondering why there's no individual and why

2:16:04

there's no individual autonomy, why a guy can't smoke weed on the weekend or

2:16:09

why a guy can't do X, Y, or Z because we have centralized the authority and the

2:16:13

power and the decision-making structure.

2:16:15

And we're allowing them to be, there would be no problem with money in politics

2:16:17

if the federal government had only the powers that were outlined to it in the

2:16:20

constitution.

2:16:21

I think that's very well said and I could have never said it the way you said

2:16:29

it.

2:16:29

And I think there's a lot to absorb here.

2:16:31

I'm sorry.

2:16:32

No, no, it was great.

2:16:33

It was great.

2:16:34

I'm, this is one of the things that I love about you.

2:16:36

You're very thorough.

2:16:37

Yeah.

2:16:38

Thorough is one thing.

2:16:39

My friends always say Bill's tism is starting to show.

2:16:42

Ah, you got a touch of the tism, but I think that's good.

2:16:45

Like I said, just like ADHD, I think it's a superpower.

2:16:48

A lot to absorb.

2:16:49

So, I think we'll wrap it up right here, but thank you.

2:16:52

This was an awesome conversation.

2:16:53

I really appreciate it.

2:16:54

It was really great.

2:16:55

Yeah.

2:16:56

We could do this again too.

2:16:57

I'm sure we could probably have 30 or 40 of these.

2:16:59

We didn't even get to AI.

2:17:00

I wanted to get to AI because I think I have a very anti-pattern view of AI and how

2:17:03

you understand it.

2:17:05

But if you want, we can save that for another time.

2:17:07

Yeah.

2:17:08

We'll do that for our next one because I think that's another four hours.

2:17:10

Yeah, probably.

2:17:11

Yeah.

2:17:12

For sure.

2:17:13

And by then, who knows where it's going to be.

2:17:15

Jensen, Jensen Huang from Nvidia recently declared that we've reached AGI.

2:17:19

Yeah.

2:17:20

So, I would, I would, yeah, I could, yeah, I just couldn't disagree more.

2:17:26

And I think I could, in the same way, I just elucidated-

2:17:28

You're not the only one.

2:17:29

There's just quite a few people.

2:17:30

Yeah, yeah.

2:17:31

I mean, it's consciousness projection and I'll sum it up in a minute.

2:17:34

At the end of the day, neural networks are mathematical functions.

2:17:38

They rest in, you know, weighting neurons based on training data and applying

2:17:42

power to train models.

2:17:44

It's all mathematic.

2:17:46

There's no sense of knowing there in that, you know, Penrose I've read a lot of

2:17:52

of his Orch OR.

2:17:54

If people want to read about that, I won't explain it.

2:17:57

Or orchestrated objective reduction.

2:17:59

And how the mind works in these fleeting moments of consciousness that we have, these shimmers

2:18:03

of consciousness that

2:18:04

we have based around what, you know, he describes in the microtubule.

2:18:07

We get conscious thought and that conscious thought we project into things.

2:18:12

AI is very good conscious projection, but it will never have consciousness or

2:18:16

knowing because it has no system of values.

2:18:18

And if we were to instill values in it, it would still be consciousness

2:18:21

projection.

2:18:22

You saw my dad's cabin.

2:18:24

My dad died when I was five, but I bought it back and was working on it.

2:18:27

And inside of his cabin, I got to learn a lot about my father by working on the

2:18:31

cabin that he built.

2:18:33

Like we would measure things and cut things right on walls and that type of

2:18:36

stuff.

2:18:37

That's all consciousness projection that allowed me to get to know him in a way

2:18:40

I might not have even known him if he were alive.

2:18:43

But I got to re-experience and understand my father and his thoroughness

2:18:46

through that cabin.

2:18:47

AI is consciousness projection.

2:18:49

It's projected consciousness.

2:18:50

It's getting very good.

2:18:51

But on a calculator, you could get the same thing that you get out of a neural

2:18:55

network if you had sufficient time.

2:18:58

I could present you a question just like you did on Perplexity.

2:19:01

I could sit here with a rule book and I could type in a calculator.

2:19:04

It might take me a million years, but I could do it.

2:19:08

And I could give you the same answer that a neural network would give you.

2:19:11

That doesn't mean consciousness or knowing or AGI is present.
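[Editor's note: the calculator argument above can be sketched concretely. A neural network's forward pass is nothing but multiplies, adds, and a simple thresholding function, every step of which you could punch into a calculator by hand. The weights, biases, and inputs below are made-up numbers for illustration, not any trained model.]

```python
def relu(x):
    # The "decision" function is just: keep positive numbers, zero out negatives.
    return max(0.0, x)

# A tiny network: 2 inputs -> 2 hidden neurons -> 1 output.
# All numbers are arbitrary, chosen only to show the arithmetic.
inputs = [1.0, 2.0]
hidden_weights = [[0.5, -0.25], [0.75, 0.125]]  # one row of weights per hidden neuron
hidden_bias = [0.1, -0.2]
out_weights = [1.5, -0.5]
out_bias = 0.25

# "Calculator" version: every step is one multiply or one add.
hidden = []
for w_row, b in zip(hidden_weights, hidden_bias):
    total = b
    for w, x in zip(w_row, inputs):
        total += w * x          # weighted sum, done by hand
    hidden.append(relu(total))  # threshold, done by hand

output = out_bias + sum(w * h for w, h in zip(out_weights, hidden))
print(output)  # a plain number, within float rounding of 0.0 for these inputs
```

A real model has billions of these weights instead of a handful, which is the only difference between this sketch and the networks being called AGI: scale, not a new kind of operation.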

2:19:16

It relies on its training data.

2:19:18

It can only give you what the training data gives it.

2:19:20

It needs human consciousness projection like we talked about with the CAPTCHAs

2:19:25

or we talked about with uploading photos to Google Drive.

2:19:28

It needs that training data.

2:19:30

And to me, it's just really fancy, clever math.

2:19:33

And having trained these networks for dozens of years or a dozen years now and

2:19:37

working with them, they're just really clever consciousness projection.

2:19:43

And so, yeah, that is four hours and we can do that next time.

2:19:46

We'll do that next time.

2:19:47

Definitely.

2:19:48

If people, you mentioned the app.

2:19:49

By the time we do it next time, who knows what the fuck is going to be going on

2:19:51

with AI too.

2:19:52

Yeah.

2:19:53

But if people want to learn more about me or my company, if I can say that.

2:19:57

Yeah, please.

2:19:58

It's SpartanForge.ai.

2:19:59

We're built under the rubric of individual freedom.

2:20:02

I want people outdoors.

2:20:03

I want people hunting.

2:20:04

I want people experiencing nature.

2:20:06

I want people providing for their families.

2:20:09

The best part of my day is when my kids are eating a backstrap of an animal

2:20:12

that I took.

2:20:14

And I want to enable people to go out and do that.

2:20:16

And even though it's paradoxical to do it through an app, you can get lost.

2:20:18

You've got to conserve time.

2:20:19

You've got to e-scout.

2:20:20

You've got to learn things before you go out there.

2:20:22

So we built this company under that.

2:20:24

It's one of my, you know, I've got three other companies that I'm doing, but

2:20:26

SpartanForge is the one that I'm working on.

2:20:28

It's an awesome app.

2:20:29

That I'm really working on.

2:20:30

Well, I really appreciate that.

2:20:31

We've put a lot of work into it and we've got a lot more coming over the summer.

2:20:34

So if people want to support us or want to get out there and get some hunting

2:20:37

done, please check it out.

2:20:39

And I answer all the Instagram DMs.

2:20:40

So if you have a question for me.

2:20:42

Good luck with that now.

2:20:43

Well, I try to.

2:20:44

I spend about two hours every morning doing it.

2:20:46

Good luck.

2:20:47

Thank you, Joe, for having me.

2:20:48

Thanks, brother.

2:20:49

Appreciate you very much.

2:20:50

Yeah, I get that.

2:20:51

All right.

2:20:52

You too.

2:20:53

Bye, everybody.

2:20:59

Bye.