Tristan Harris is a co-founder of the Center for Humane Technology and co-host of its podcast, "Your Undivided Attention." Watch the Center's new film "The A.I. Dilemma" on YouTube.
Center for Humane Technology: https://www.humanetech.com
"The A.I. Dilemma": https://www.youtube.com/watch?v=xoVJKj8lcNQ
Artificial intelligence is something that people are terrified of as an existential threat. They think of it as: one day you're going to turn something on, it's going to be sentient, and it's going to be able to create other forms of artificial intelligence that are exponentially more powerful than the one that we created, and we'll have unleashed this beast that we cannot control. What my concern is with all of this... We've already gotten there. Yeah, that's my concern. My concern is that this is a slow acceptance of drowning. It's like a slow, we're okay, I'm only up to my knees, it's fine, it's just waist high. It's very much like a frog in boiling water, right? Exactly. Exactly. It seems like... This is like humans have to fight back to reclaim our autonomy and free will from the machines. I mean, one clear... Okay, Neo. It's very much the Matrix. It's Neo. One of my favorite lines is actually when the Oracle says to Neo, don't worry about the vase, and he says, what vase? And he knocks it over, and she says, that vase. So it's like, she's the AI who sees so many moves ahead on the chessboard that she can say something which will cause him to do the thing that verifies the thing she predicted would happen. That's what AI is doing now, except it's pointed at our nervous system and figuring out the perfect thing to dangle in front of our dopamine system to get the thing to happen, which, instead of knocking over the vase, is to be outraged at the other political side and be fully certain that you're right, even though it's just a machine that's calculating shit that's going to make you do the thing.

When you're concerned about this, how much time do you spend thinking about simulation theory? The simulation? Yeah, the idea that if not currently, one day there will be a simulation that's indiscernible from regular reality. And it seems we're on that path. I don't know if you mess around with VR at all, but... Well, this is the point about the virtual chatbots out-competing our real friends and the technology. I mean, that's what's happening, is that reality is getting more and more virtual, because we interact with a virtual news system, all this sort of clickbait economy, outrage machine, that's already a virtual political environment that then translates into real-world action and then becomes real. And that's the weird feedback loop.

Go back to 1990, whatever it was when the internet became mainstream, or at least started becoming mainstream, and then the small amount of time that it took, the 20-plus years, to get to where we are now. And then think, what about the virtual world? Once this becomes something that has the same sort of rate of growth that the internet has experienced, or that we've experienced through the internet, I mean, we're looking at 20 years from now being unrecognizable. It almost seems like that is what life does. The same way bees create beehives, a caterpillar doesn't know what the fuck's going on when it gets into that cocoon, but it's becoming a butterfly. We seem to be a thing that creates newer and better objects. Correct?

But I think... But more effective. But we have to realize AI is not conscious and won't be conscious the way we are. And so many people think that... But is consciousness essential? I think so. To us. I don't know. Essential in the sense of we're the only ones who have it? No, I don't know that. In fact, that's the theory. Well, no. But there might be more things that have consciousness. But... Is it essential?
I mean, to the extent that choice exists, it would exist through some kind of consciousness, and the machines... Is choice essential? It's essential to us as we know it, to life as we know it. But my worry is that we're inessential. We're thinking now like single-celled organisms, being like, hey, I don't want to gang up with a bunch of others and become an object that can walk. I like being a single-celled organism. This is a lot of fun. I mean, I hear you saying, are we a bootloader for the AI that then runs the world? That's Elon's perspective. Yeah. I mean, I think this is a really dangerous way to think. I mean, we have to... Yeah, so are we then the species... Dangerous for us. Yeah. I mean, are... But what if the next version of what life is is better? But the next version being run by machines that have no values, that don't care, that don't have choice, and are just maximizing for things that were programmed in by our little miniature brains anyway. But they don't cry. They don't commit suicide. But then consciousness and life dies. That could be the future. I think this is the last chance to try to snap out of that.

Is it important in the eyes of the universe that we do that? I don't know. It feels important. How does it feel to you? It feels important, but I'm a monkey. You know, the monkey's like, I'm staying in this tree, man. You guys are out of your fucking minds. I mean, this is the weird paradox of being human: again, we have these lower-level emotions. We care about social approval. We can't not care. At the same time, like I said, there's this weird proposition here. We're the only species that, if this were to happen to us, would have the self-awareness to even know that it was happening. We can conceptualize, like in this two-hour interview, that this thing has happened to us. That we have built this matrix, this external object, which has AI and supercomputers and voodoo-doll versions of each of us, and it has perfectly figured out how to predictably move each of us in this matrix.

Let me propose this to you. We are what we are now, human beings, Homo sapiens in 2020. We are this thing that, if you believe in evolution, and I'm pretty sure you do, has evolved over the course of millions of years to become who we are right now. Should we stop right here? Are we done? No, right? We should keep evolving. What does that look like? What does it look like if we go ahead, just forget about social media? What would you like us to be in a thousand years, or a hundred thousand years, or 500,000 years? You certainly wouldn't want us to be what we are right now, right? No one would.

No, I mean, I think this is what visions of Star Trek and things like that were trying to ask, right? Like, hey, let's imagine humans do make it and we become the most enlightened we can be, and we actually somehow make peace with these other alien tribes, and we figure out space travel and all of that. I mean, actually, a good heuristic that I think people can ask is: on an enlightened planet where we did figure this out, what would that have looked like? Isn't it always weird that in those movies people are just people, but they're in some weird future, and they haven't really changed that much? Right. I mean, which is to say that the fundamental way that we work is just unchanging, but there are such things as wiser societies, more sustainable societies, more peaceful or harmonious societies. But ultimately, biologically, we have to evolve as well.
But our version of the best version, that's probably the gray aliens, right? Maybe so. That's the ultimate future. I mean, we're going to get into gene editing and becoming more perfect, perfect in the sense that we are going to start optimizing for the outcomes that we value. I think the question is, how do we actually come up with brand-new values that are wiser than anything we've thought of before, values that can transcend the win-lose games that lead to omni-lose-lose, where everyone loses if we keep playing the win-lose game at greater and greater scales?

I, like you, have a vested interest in the biological existence of human beings. I think people are pretty cool. I love being around them. I enjoyed talking to you today. Me too. My fear is that we are a Model T. There's no sense in making those fucking things anymore. The brakes are terrible. They smell like shit when you drive them. They don't go very fast. We need a better version.

The funny thing is, there's some quote by someone, I wish I could remember it, something about how much would be solved if we were at peace with ourselves. If we were able to just be okay with nothing, like just being okay with living and breathing. I don't mean to be playing the woo-woo new-age card. I just genuinely mean: how much of our lives is just running away from anxiety and discomfort and aversion?

Episodes of the Joe Rogan Experience are now free on Spotify. That's right, they're free from September 1st to December 1st. They're going to be available everywhere, but after December 1st they will only be available on Spotify. But they will be free. That includes the video. The video will also be there. It'll also be free. That's all we're asking: just go download Spotify. Much love. Bye bye.