Can Humanity Be Trusted With the Means to Destroy Itself?

Boyan Slat

Boyan Slat is an inventor, entrepreneur and former aerospace engineering student. He is the founder of The Ocean Cleanup organization: https://www.theoceancleanup.com/

Transcript

Instagram is kind of dabbling with this idea of taking away the likes, right? Like, what if we just didn't show anybody the likes? You don't know how many likes you get. You put up a picture, it's just a fucking picture. Move on. No. You put up a picture. He got 70,000 likes for that picture? What the fuck, man? You check it, check it an hour later. 74,000. Ooh, it's going viral. That's weird. That likes thing is one of the weirdest drugs. Nobody saw it coming, and people get addicted to saying things that get likes. Putting things up that are socially conscious to let everybody know how virtuous you are. And exactly. Get some likes. Yeah. And it's all making use of, I suppose, the flaws of our human nature.

What I'm worried about is that one day those likes will actually be a physical feeling. Oh yeah, you get like a... A little jolt. A little love jolt. And they'll engineer the system to get you to seek those constant love jolts. Yes. Why not? I mean, if they're going to give you augmented reality. We are, how many generations, I don't know, away from something being embedded in your body. Right. People have already decided to do that. There's some, was it a guy or a girl, who embedded a fucking Tesla Model 3 key in their arm so that they didn't ever have to have their key in their pocket. They could just walk up to their Tesla and the fucking door unlocks, and they just climb in. Software updates, key doesn't work anymore. Or it runs out of batteries. They've got to cut you open like a fish. It's just, I mean, what the fuck are people doing? Those are people at the fringes. They are the fringes, but there's more of them than you think. And if they make it simple, like you just need, like, a flu shot. Yeah. They just engineer away any sort of innate fear or aversion towards, you know, crossing that interior/exterior boundary with technology.

Fair is a good way of looking at it, right? Like, what is fair? Is it fair if you agree to do it? Like, look, is it fair if you decide to get a face tattoo? Right? If it's up to you, man, it's fair. If it's like, hey man, my credit card company told me they'll give me 10% off if I stick this, you know, this credit card chip under my skin somewhere. Yes. I suppose if you, again, incentivize it with, you know, with selfish interests. Yeah. Maybe it will take off.

There's that. And there's also the big concern: what if these... I mean, we're talking about income inequality in this world. A big one would be, what if there's a jump that you can make in enlightenment and intelligence, access to information, number crunching, the ability to assess risk versus reward? This is all done computer-wise. And it's done through some sort of additional piece of hardware that they give you or put in your body, but it costs a lot of money. So the people that can afford it initially are the people that are valuable. They have money in the first place. So the wealthy people are ready because it's very valuable, but then the people that really need it, they can't afford it. So by the time it becomes something, all the money's gone. Everybody's chewed it all up. Everybody's figured out how to hack the system. You should become a writer for Black Mirror. That seems like a Black Mirror episode. It seems like it would work, right? Yeah.

Well, that's what people are worried about when it comes to longevity too, right?
They're worried about technological innovations that are going to allow people to... you know, nanobots and all sorts of different weird things are going to repair cells and allow people to live for extended periods of time. But then who are these people going to be? Are they going to be the king class? You know, are they going to be these super duper wealthy people of the future that are going to, you know, hold this over the poor folks who can't afford the technology? Yeah. So it truly seems like, with the technologies that we're developing, or that are at least not too far away, our institutions aren't ready yet to really cope with them. No. Because that would probably increase inequality quite a lot. Yes.

That is one of the major concerns when it comes to this sort of rapid change that we're facing right now. You know, another one, of course, is artificial intelligence. There's people that I respect very, very much that have a very negative view of what the future of artificial intelligence is going to mean to the human race. Sam Harris. Elon. Elon. Yeah. Both of them scare the shit out of me every time I talk about it. Yeah. Sam and I did an episode, and he talked about artificial intelligence and the rise of it, and the fact that once it's uncorked, it's really not going to be able to be put back in the bottle. And we talked about it for like an hour and a half. And after it was over, like, the rest of the day I was bummed out. I was like, this is inevitable. Yes.

So, as a person, I have, I suppose, a very optimistic and pessimistic view of technology at the same time. I think on one hand it allows us to improve the world, and that's what we've seen, and it's gradual, and it continues probably because people want to solve their own problems. And with that, inadvertently, solve other people's problems. That's kind of how progress happens, I believe. But then at the same time, while the world is getting a lot better, it's also getting riskier. I mean, 2,000 years ago, or maybe even 200 years ago, there was no way to wipe out humanity. There simply wasn't. Even if you wanted it to happen very badly, you could scream, it wouldn't happen. Now, though, there are actually people who have the power to do that. Rapidly. The whole of humanity could be wiped out in a day. Yeah. And right now it's fortunately just a few people. But imagine if that goes from a few people to quite a few corporations to maybe even everyone. I think there's this sort of brain teaser or thought experiment that Nick Bostrom came up with that says, well, what if you could have kind of this atomic bomb that you could just make yourself in your microwave? It's like, well, maybe at some point in time, it would just not be economically feasible anymore to rebuild cities, because it just would... Too much nuclear moles, huh? Oh my God.

So I don't know. I think that's kind of the scary, risky aspect of it. But at the same time, when you think of it, I would much rather trust or entrust an average person today with the button for a nuclear detonation device than somebody a thousand years ago. Oh, for sure. Yeah. One of the Mongols or someone. Some savage.