Daniel Schmachtenberger is a founding member of The Consilience Project, aimed at improving public sensemaking and dialogue.
Tristan Harris is a co-founder of the Center for Humane Technology (https://www.humanetech.com) and co-host of its podcast, "Your Undivided Attention." Watch the Center's new film "The A.I. Dilemma" on YouTube: https://www.youtube.com/watch?v=xoVJKj8lcNQ
I do have an opinion on algorithms, and I do have an opinion on what they do to young girls' self-esteem.

Well, you have teenage daughters.

Yes. I just think it means... And young girls are a point of focus. Why they're a point of focus more than young boys, I'm not entirely sure. I guess it has to do with their emotional makeup, and there's a higher risk of self-harm due to social media. Jonathan Haidt talked about that in his book, The Coddling of the American Mind. It's very clear. It's very damaging. My 13-year-old does have interactions with her friends, and I do see how they bully each other and talk shit about each other, and they get so angry and mad at each other. It is a factor. But it's an algorithm issue, right?

There's multiple things here. So the first thing, just to set the stage a little bit: I always use E.O. Wilson, the sociobiologist, who sort of defined the problem statement for humanity. He said the fundamental problem of humanity is that we have paleolithic emotions and brains, brains that are easy for magicians to hack. We have medieval institutions, government that's not really good at seeing the latest tech, whether it was railroads or now social media or AI or deepfakes or whatever's coming next. And then we have godlike technology. Paleolithic emotions, medieval institutions, godlike technology. You combine those, and that's the fundamental problem statement: how do we wield the power of gods without the love, prudence, and wisdom of gods? That's actually something Daniel taught me. And then you add to that the race to the bottom of the brainstem for attention. What is their business model? Just to review the basics, everybody knows this now, but it's engagement. It's, how do I get that attention at all costs? So algorithms are one piece of that. When you're on a newsfeed, I don't want to just show you any news.
I want to show you the most viral, engaging, longest argumentative comment-thread news. So that's like pointing a trillion-dollar-market-cap AI at your brain, saying, I'm going to show you the next perfect boogeyman for your nervous system, the thing that's going to make you upset and angry, whether it's masks, vaccines, Frances Haugen, whatever the thing is, and it will just drive that over and over again and then repeat that thing. So the algorithms are one of the tools in the arsenal to get attention. Another one is technology making design decisions, like beautification filters that inflate people's sense of beauty. In fact, just recently, since we talked last time, I think there was an MIT Tech Review article showing that they're all competing, first of all, to inflate your sense of beauty. So they're doing the filters. People know this stuff, it's very obvious, but they're competing for who can give you a nicer filter. And then now, instead of waiting for you to actually add one, TikTok was actually found to apply a bare 2% beautification filter even on the no-filter mode. Because the thing is, once they do that, the other guys have to do it too. So I just want to name that all of this is taking place in this race to capture human attention, because if I don't do it, the other guy will. And it's happening with design decisions, like the beautification filters, and the follow-you, and if you follow me, I'll follow you back, and the like button, and pull-to-refresh, the dopamine stuff. That's all design. Then there's the algorithms, which is: I'm pointing a thing at your brain to figure out how I can show you an infinite feed that just maximally enrages you. And we should talk about that, because that thing drives polarization, which breaks democracy. But we can get into that.

Daniel, let's bring you in here. So how did you guys meet, and how did this sort of dynamic duo come about?

Yeah, I was working on studying catastrophic risks at large.
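The engagement-driven feed Tristan describes can be sketched as a toy ranker. This is not any platform's actual code; the post fields and weights here are invented purely to illustrate the incentive: when posts are ordered solely by predicted engagement, the most provocative content rises to the top by construction.

```python
# Toy sketch of an engagement-maximizing feed (illustrative only;
# field names and weights are invented, not any platform's real model).

def engagement_score(post):
    # Hypothetical weighting: comments and shares count heavily,
    # and outrage-inducing posts tend to earn the most of both.
    return (post["predicted_watch_seconds"]
            + 5.0 * post["predicted_comments"]
            + 3.0 * post["predicted_shares"])

def rank_feed(posts):
    # Show the highest-engagement items first, regardless of their
    # effect on the viewer -- polarization is a side effect, not a goal.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "calm_news",    "predicted_watch_seconds": 20,
     "predicted_comments": 1,  "predicted_shares": 1},
    {"id": "outrage_bait", "predicted_watch_seconds": 45,
     "predicted_comments": 30, "predicted_shares": 12},
]
print([p["id"] for p in rank_feed(posts)])  # outrage_bait ranks first
```

Nothing in the objective penalizes anger or falsehood, so optimizing it faithfully is exactly the "race to the bottom of the brainstem" described above.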
You've had people on the show talking about risks associated with AI, and with CRISPR and genetic engineering, and with climate change and environmental issues.

Pull up to the microphone there.

And escalation pathways to war, and all these kinds of things. Basically, how can shit hit the fan?

Right.

And I think it's a pretty common question: how long do we have on each of these, and are we doing a good job of tending to them so that we get to solve the rest of them? And for me, there were so many of them. What was in common driving them? Are there any kind of societal generator functions of all of the catastrophic risks that we could address to make a more resilient civilization writ large? Tristan was working on the social media issues. And when you had Eric on, he talked about the twin nuclei problem, atomic energy and genetic engineering, basically saying these are extremely powerful technologies that we don't have the wisdom to steward well. In addition to that, there's everything computation does, right? There's a few other major categories. And computation has the ability, as you mentioned with Facebook, to get to billions of people in a very, very short period of time compared to how quickly the railroads expanded or any other type of tech. Like, how fast can TikTok get to a billion users? Which they did in a few years, versus before that it took software companies like Microsoft even longer, and before that it took the railroads even longer. So the power of this tech is that you can compress the timelines. You're reaching a scale of a billion people, you're impacting a billion people in deeper ways much faster, which means that if you're blind to something, if you don't know what you might be doing, the consequences show up faster than you can actually remediate them. When we say exponential tech, we mean a number of things.
We mean tech that makes more powerful versions of itself: I can use computer chips to model how to make better computer chips, and then those better computer chips can recursively do that. We also mean exponential speed of impact, exponential scale of impact, exponentially more capital returns, and exponentially smaller numbers of people capable of achieving that scale of impact. So when he mentions godlike powers and medieval institutions: the speed at which our tech is having influences in the world, and not just the first-order influences, the obvious stuff, but the second- and third-order ones. Facebook isn't trying to polarize the population. It's an externality, a side effect of the thing they are trying to do, which is to optimize ad revenue. But the speed at which new technologies are having effects on the world, and the total amount of consequences, is growing way faster than regulation can keep up with. By that alone, we should be skeptical of any government's ability to regulate something that's moving faster than it can even appraise what the hell is happening in the first place.

Well, not only that, you need someone who really understands the technology, and you're not going to get that from elected officials. You're going to need someone who's working on it and has a comprehensive understanding of how this stuff works, how it's engineered, where it goes. I'm skeptical of the government being able to regulate almost anything.
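The "compressed timelines" point above is just the arithmetic of compounding. As a back-of-the-envelope sketch (the doubling-per-period growth model is an assumption chosen purely for illustration, not a measured adoption rate for any real platform), exponential growth reaches a billion users in a strikingly small number of periods:

```python
# Toy model of exponential adoption: starting from a single user and
# doubling every period (an illustrative assumption, not real data),
# count how many periods it takes to pass a target user count.

def periods_to_reach(target, start=1, growth=2.0):
    users, periods = start, 0
    while users < target:
        users *= growth
        periods += 1
    return periods

print(periods_to_reach(1_000_000_000))  # 30 doublings from one user
```

Thirty doublings span one user to over a billion, which is why a platform's consequences can arrive at global scale before institutions built for railroad-era timescales have even appraised the technology.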