The Effects of the YouTube Recommended Algorithm w/The Social Dilemma's Tristan Harris


Tristan Harris

Tristan Harris is a co-founder of the Center for Humane Technology and co-host of its podcast, "Your Undivided Attention." Watch the Center's new film "The A.I. Dilemma" on YouTube.

https://www.humanetech.com
"The A.I. Dilemma": https://www.youtube.com/watch?v=xoVJKj8lcNQ


Transcript

The thing we have to home in on is the asymmetry of power. As I say in the film, we're bringing this ancient brain hardware, the prefrontal cortex, which is what you use for goal-directed action, self-control, willpower, holding back. You know, the marshmallow test: don't take the marshmallow now, wait for the two marshmallows later. All of that runs through the prefrontal cortex. And when you're sitting there and you think, okay, I'm going to look at this one thing on Facebook because my friend invited me to this event, or there's this one post I have to look at, the next thing you know you find yourself scrolling through the thing for an hour. And you say, man, that was on me, I should have had more self-control. But behind that glass slab is a supercomputer pointed at your brain, predicting the perfect thing to show you next.

And this is really important. When you flick your finger, you think Facebook is just going to show you the next thing your friend said. But it's not doing that. When you flick your finger, it literally wakes up a sort of supercomputer avatar, a voodoo doll version of Joe. Every click you've ever made on Facebook adds a little hair to the voodoo doll. Every like you've ever made adds little clothing to it. All the watch time you've ever racked up on videos adds little shoes to it. So the voodoo doll gets more and more accurate the more things you click on. This is in the film, The Social Dilemma: if you notice, as the character uses the thing, it builds a more and more accurate model that the three AIs behind the screen are manipulating. And the idea is that the system can predict, and prick, the voodoo doll with this video or that post from your friends, and figure out the thing it knows will keep you there, because it's already seen how that same video or post kept 200 million other voodoo dolls there. You just look like another voodoo doll.

So here's an example, and this works the same on all the platforms. Say you were a teen girl and you opened a dieting video on YouTube. 70% of YouTube's watch time comes from the recommendations on the right-hand side, the panel of recommended videos. And what did it show the girls who watched a teen dieting video? Anorexia videos, because those were better at keeping a teen girl's attention. Not because anyone decided these are good for them, these are helpful for them. The system just learns that these tend to work at keeping attention if you are already watching diet videos. So if you're a 13-year-old girl and you watch a diet video, YouTube wakes up its voodoo doll version of that girl and says, hey, I've got like 100 million other voodoo dolls of 13-year-old girls, and they all tend to watch these other videos. There's a word for it: thinspo. Thinspiration is the name for it, being inspired toward anorexia. It's a real thing. YouTube addressed this particular problem a couple of years ago, but when you let the machine run blind, all it's doing is picking stuff that's engaging.
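To make that mechanism concrete, here is a minimal sketch of an engagement-only recommender: a per-user profile (the "voodoo doll") summarizing clicks, likes, and watch time, and a ranking step that picks whichever video similar users watched longest. This is a cartoon of the idea being described, not YouTube's or Facebook's actual system; every name, dimension, and number below is hypothetical.

```python
# A minimal sketch of the "voodoo doll" idea described above. This is NOT any
# platform's real code; all names and numbers are hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(0)
N_USERS, N_VIDEOS, DIM = 1_000, 50, 8

# Each user's "voodoo doll": a vector summarizing their clicks, likes, and
# watch time. Random here; in a real system it would be learned from behavior.
user_profiles = rng.normal(size=(N_USERS, DIM))
video_embeddings = rng.normal(size=(N_VIDEOS, DIM))

# Observed watch time of existing users on each video: the "200 million other
# voodoo dolls" the system has already watched react to the same content.
watch_time = np.maximum(user_profiles @ video_embeddings.T, 0)

def recommend(profile: np.ndarray, k: int = 20) -> int:
    """Rank candidates by the average watch time of the k most similar users."""
    similarity = user_profiles @ profile        # how alike each voodoo doll is
    neighbors = np.argsort(similarity)[-k:]     # the k closest voodoo dolls
    predicted = watch_time[neighbors].mean(axis=0)
    # Note the objective: predicted engagement, nothing else. Whether the
    # winning video is helpful or harmful never enters the calculation.
    return int(np.argmax(predicted))

new_user = rng.normal(size=DIM)
print("Recommended video:", recommend(new_user))
```

The point of the sketch is the objective function: nothing in recommend() asks whether the top-ranked video is good for the viewer, only whether it is predicted to hold attention.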
Why did they choose not to let the machine run blind with one thing like anorexia?

Well, now we're getting into the Twitter censorship conversation and the moderation conversation. This is why I don't focus on censorship and moderation: the real issue is, if you blur your eyes and zoom way out and ask how the whole machine tends to operate, then no matter what I start with, what is it going to recommend next? If you started with a World War II video, YouTube would recommend a bunch of Holocaust-denial videos. If you started teen girls with a dieting video, it would recommend these anorexia videos. In Facebook's case there are so many examples, because Facebook recommends groups to people based on what it thinks is most engaging for you. You had Renée DiResta, my friend, on this podcast; we've done a bunch of work together, and she has this great example. As a new mom, she joined one Facebook group for mothers who make do-it-yourself organic baby food. And Facebook has this sidebar: here are some other groups you might want to join. What do you think was the most engaging of those? Because Facebook, again, is picking whichever group would cause you to spend the most time there if it got you to join. So for a do-it-yourself baby food group, which group do you think it selected?

Probably something about vaccines.

Exactly: anti-vaccine groups for moms. OK, so then if you join that group, it runs the process again. Facebook says, hey, I've got like 100 million voodoo dolls who just joined this anti-vaccine moms group; what do they tend to engage with for a very long time if I get them to join these other groups? Which of those other groups would show up? Chemtrails. Pizzagate. Flat Earth. Absolutely. And I'm going back and forth between YouTube and Facebook because it's the same dynamic; they're competing for attention. YouTube recommended Flat Earth conspiracy theories hundreds of millions of times. So when you're a parent during COVID and you sit your kids in front of YouTube, the digital pacifier, because you've got to get work done, and then you come back to the dinner table and your kid says, you know, the Holocaust didn't happen and the Earth is flat, and people wonder why: it's because of this.

And to your point about moderation: we can take the whack-a-mole stick after the public yells, after Renée and I, and a large community of people, by the way, make a bunch of noise about this. And they'll say, OK, shoot, you're right, Flat Earth, we've got to deal with that, and they'll tweak the algorithm. Then people make a bunch of noise about the thinspiration videos for kids, and they'll deal with that problem. But it's all reactive. If you zoom out, the machine is still recommending stuff from the crazy-town section of YouTube.

Is the problem the recommendation? I don't mind that people have ridiculous ideas about Hollow Earth, because I think it's humorous. But I'm also a 53-year-old man. I'm not a 12-year-old boy with a limited education who goes, oh my God, the government's lying to us, there's lizard people that live under the earth.
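That group-recommendation cascade can be written down in a few lines. Again a toy under loud assumptions: the groups and the co-engagement scores below are invented, and a real system would learn them from membership data rather than hard-code them.

```python
# A toy version of the group-recommendation cascade described above.
# Hypothetical engagement graph: group -> {candidate next group: predicted
# extra hours on the platform if the user joins it}. All values are invented.
engagement = {
    "DIY baby food":     {"organic recipes": 1.0, "anti-vaccine moms": 4.0},
    "anti-vaccine moms": {"gardening": 0.5, "chemtrails": 5.0},
    "chemtrails":        {"astronomy": 0.3, "flat earth": 6.0},
    "flat earth":        {},
}

def rabbit_hole(start: str) -> list[str]:
    """Repeatedly join whichever suggestion maximizes predicted time-on-site."""
    path, current = [start], start
    while engagement.get(current):
        current = max(engagement[current], key=engagement[current].get)
        path.append(current)
    return path

print(" -> ".join(rabbit_hole("DIY baby food")))
# DIY baby food -> anti-vaccine moms -> chemtrails -> flat earth
```

Each hop looks locally innocent, just "the most engaging next group," but iterating the step produces exactly the trajectory described above.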
Right. But that's the real argument about these conspiracy theories: that they can influence young people, the easily impressionable, or people who don't have a sophisticated sense of vetting out bullshit.

Right, and the algorithms aren't making a distinction between who is just laughing at it and who is deeply vulnerable to it. In practice, they just find whoever is vulnerable to it. The way I think about it: everybody driving down the highway looks at the car crash, so according to Facebook and Google, the whole world wants car crashes, and they feed you car crashes after car crashes after car crashes. And what the algorithms do, as Guillaume Chaslot, the whistleblower from YouTube's recommendation system, says in the film, is find the perfect little rabbit hole for you that the system knows will keep you there for five hours. And the conspiracy-theory dark corners of YouTube were the corners that tended to keep people there for five hours.

So you have to realize that we're now something like ten years into this vast psychology experiment, running in hundreds of countries and hundreds of languages, and it's been steering people towards crazy town. When I say crazy town: imagine a spectrum on YouTube. On one side you have the calm Walter Cronkite, Carl Sagan, slow, kind of boring but educational material. On the other side you have the craziest stuff you can find: crazy town. No matter where you start, Walter Cronkite or crazy town, if I'm YouTube and I want you to watch more, am I going to steer you towards the calm stuff or towards crazy town? Always more towards crazy town. So imagine tilting the floor of humanity by just three degrees, and then you step back and let society run its course. As Jaron Lanier says in the film, if you just tilt society by one degree, two degrees, that's the whole world; that's what everyone ends up thinking and believing.

And so if you look at how deep people are into rabbit-hole conspiracy thinking right now, and again, I want to acknowledge COINTELPRO, Operation Mockingbird, there's a lot of real stuff, so I'm not categorically dismissing it, but we should ask: on what basis do we believe the things we believe about the world? Increasingly, that basis is technology. Take what's going on in Portland. The only way I know about it is through my social media feed, and according to that, the entire city is on fire and it's a war zone. But I called a friend there the other day and he said, it's a beautiful day, there's actually no violence anywhere near where I am, it's just these two blocks or something like that. This thing is warping our view of reality, and I think that's what The Social Dilemma was really trying to accomplish as a film, and what its director, Jeff Orlowski, was trying to get across: how did society seemingly go crazy everywhere, all at once? This didn't happen by accident; it happened by design of this business model.
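The "tilt the floor by a few degrees" point lends itself to a tiny simulation. This is a cartoon, not a model of any real platform: the 0-to-1 "calm to crazy town" spectrum, the step sizes, and the tilt value are all made up.

```python
# A toy simulation of a small, systematic ranking bias compounding over a
# session. All parameters are invented for illustration.
import random

random.seed(1)

def session(steps: int = 50, tilt: float = 0.03) -> float:
    """One user's drift on a 0 (calm) .. 1 (crazy town) content spectrum.

    Each recommendation nudges the user randomly, plus a small constant bias
    toward the extreme end, because that end 'tends to work' on attention.
    """
    position = 0.1  # start near the calm, Walter Cronkite end
    for _ in range(steps):
        position += random.uniform(-0.05, 0.05) + tilt
        position = min(1.0, max(0.0, position))
    return position

untilted = sum(session(tilt=0.0) for _ in range(1_000)) / 1_000
tilted = sum(session(tilt=0.03) for _ in range(1_000)) / 1_000
print(f"mean end position, no tilt: {untilted:.2f}")
print(f"mean end position, 3% tilt: {tilted:.2f}")
```

Without the bias, users wander but stay mostly near where they started; with a 3% per-step tilt, nearly every session finishes pinned at the extreme end.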
When did the business model get implemented? When did they start using these algorithms to recommend things? Because initially YouTube was just a series of videos; it didn't have that recommended section, correct? When was that?

It's a good question. Originally YouTube was just: post a video, get people to go to that URL, and send it around. Once the competition for attention got more intense, they needed to figure out how to keep you there. Recommending videos on the right-hand side, I think, was there pretty early, if I remember, because that was sort of the innovation: keeping people within the YouTube wormhole. And once people were inside the wormhole constantly seeing videos, that was the promise they could offer a new uploader: hey, if you post it here, you're going to get way more views than if you post it on Vimeo.

And that's the thing. I open up TikTok right now on my phone. You have TikTok on your phone? Well, I'm not supposed to, obviously, but it's for research purposes. Research. Do you know TikTok at all? No, but my 12-year-old is obsessed. Oh, really? Oh, yeah. She can't even sit around; if she's standing still for five minutes, she just starts doing TikToks.

Here it is: 2012. So the Mayans were right. In 2012 the platform announced an update to the discovery system designed to identify the videos people actually want to watch, by prioritizing videos that hold attention throughout, as well as increasing the amount of time a user spends on the platform overall. YouTube could assure advertisers that it was providing a valuable, high-quality experience for people.

Yeah, so that's the beginning of the end. That's 2012 on YouTube's timeline; the Twitter and Facebook world, I think, introduces the retweet and reshare buttons in the 2009-to-2010 period. So you end up with a world where the things we pay the most attention to are chosen for us by algorithms. And the deeper argument in the film, which I'm not sure everyone picks up on, is that these technology systems have taken control of human choice; they've taken control of humanity, because they're controlling the information all of us are getting. Think about every election. I think of Facebook as a kind of voting machine, but it's an indirect voting machine, because it controls the information your entire society gets for four years, and then everyone votes based on that information. Now you could say, hold on, radio and television were there, and were partisan, before that. But radio and TV are often getting their news stories from Twitter, and Twitter is recommending things based on these algorithms. When you control the information an entire population is getting, you control their choices. Literally, in military theory, if I want to screw up your military, I want to control the information it's getting; I want to confuse the enemy. And that information funnel, which is like the Flint water supply for our minds, is the very thing that's been corrupted.
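The 2012 objective change read out above, from clicks to attention held, is easy to sketch: the same candidate pool ranked two ways. The videos and numbers below are invented for illustration.

```python
# A hedged sketch of ranking by clicks versus by expected watch time.
# Hypothetical candidates: title -> (click_through_rate, minutes watched if clicked).
candidates = {
    "10-min documentary clip": (0.02, 9.0),
    "clickbait thumbnail":     (0.12, 0.5),
    "conspiracy deep-dive":    (0.05, 45.0),
}

def by_clicks(item):
    ctr, _ = item[1]
    return ctr

def by_watch_time(item):
    ctr, minutes = item[1]
    return ctr * minutes  # expected minutes of attention per impression

print("ranked by clicks:    ", max(candidates.items(), key=by_clicks)[0])
print("ranked by watch time:", max(candidates.items(), key=by_watch_time)[0])
# ranked by clicks:     clickbait thumbnail
# ranked by watch time: conspiracy deep-dive
```

Under the old objective the flashy thumbnail wins; under the new one, the long rabbit-hole video does, which is exactly the shift in incentives being described.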
Episodes of The Joe Rogan Experience are now free on Spotify. That's right, they're free from September 1st to December 1st. They're going to be available everywhere, but after December 1st they will only be available on Spotify, and they will still be free. That includes the video; the video will also be there, and it will also be free. That's all we're asking: just go download Spotify. Much love. Bye bye.