Mark Zuckerberg is the chief executive of Meta Platforms Inc., the company behind Facebook, Instagram, WhatsApp, and other digital platforms and services.
Mark Zuckerberg: When we take down something that we're not supposed to, that's the worst.

Interviewer: How do you discern? Say, like these Christian Facebook pages. I don't know how they found out that 19 of 20 were fake, but if someone just says, "I am Bob Smith," and they post as Bob Smith, and they have a photograph, but really what they're doing is trying to talk shit about Joe Biden and get people to vote Republican in the midterms, how do you know whether someone's real or not? This is the big argument with Elon and Twitter, because Elon asked Twitter what percentage of your website is filled with bots, and they say 5%, and he says, "I don't believe you, I think it's higher, let's find out how you've come to this conclusion." I believe they said they just took 100 random Twitter accounts, looked at the interaction, and applied some sort of algorithm to it. But how do you discern?

Mark Zuckerberg: Yeah, so I think estimating the overall prevalence is one thing, but the question of looking at a page and asking whether it's authentic, there are a bunch of signals around that. One of the things we try to do for large pages is make sure we know who the admin of that page is. You should be able to run an anonymous page; you don't necessarily need to out yourself and say who's running it. But we want to make sure we have an identity for that person on file, so that at least behind the scenes we know that person is real. For certain political things, it helps to have a sense of what country they're originating from. Some of that you can do just by looking at where their server traffic comes from: is the IP address coming from Romania?
Because if it's an ad in some other country's election, then you probably want to make sure that the ad, especially in countries that have laws around this, is coming from someone who's a valid citizen, or at least someone in that place. One theme in my worldview around this stuff, and it gets to some of the things we talked about before, is that I don't think any of this is black and white, or that you're ever going to have a perfect AI system. It's trade-offs all the way down. You can either be overly aggressive and capture a higher percentage of the bad guys, but then by accident also take out some number of good guys, or you can be a little more lenient and say, okay, the cost of taking out any number of good guys is too high, so we're going to tolerate having a few more bad guys on the system. These are values questions about what you value more, and they're super tricky.

Part of what I've struggled with is that I didn't get into this to judge those things. I got into this to design technology that helps people connect. You can probably tell: we spent the first hour talking about the metaverse and building this whole technology roadmap to give people a realistic sense of presence. That's what I'm here to do. This whole business of arbitrating what is okay and what is not, I obviously have to be involved in, because I run the company and I can't just abdicate that. But I also don't think that, as a matter of governance, you want all of that decision making vested in one individual. One of the things our country and our government get right is the separation of powers. So one of the things I tried to create is this oversight board.
It's an independent board. We appointed people whose paramount value is free expression, but who also balance that against things like real harm to others in terms of safety, privacy, or other human rights issues. People in our community can appeal cases to that board when they think we got it wrong, and the board gets to make the final binding decision, not us. In a way, I think that's a more legitimate form of governance than having a team internally make these decisions, or having some of them come up to me, although I don't spend a ton of my time on this day to day. It's generally good to have some separation of powers, where you architect the governance so that different stakeholders and different people can make these decisions. It's not just one private company making the calls, even about what happens on our own platform.

Interviewer: How do you guys handle things when there's a big news item that's controversial? There was a lot of attention on Twitter during the election because of the Hunter Biden laptop story.

Mark Zuckerberg: The New York Post. Yeah, we had that too.

Interviewer: So you guys censored that as well?

Mark Zuckerberg: We took a different path than Twitter. The background here is that the FBI basically came to us, to some folks on our team, and said, hey, just so you know, you should be on high alert. We thought there was a lot of Russian propaganda in the 2016 election; we have notice that there's about to be some kind of dump similar to that, so just be vigilant. Our protocol is different from Twitter's. What Twitter did is say you can't share this at all. We didn't do that. What we do is...
If something is reported to us as potential misinformation, important misinformation, we have this third-party fact-checking program, because we don't want to be deciding what's true and false. For the five or seven days while it was being determined whether it was false, the distribution on Facebook was decreased, but people were still allowed to share it. You could still share it, you could still consume it.

Interviewer: So you'd say the distribution was decreased?

Mark Zuckerberg: It got shared.

Interviewer: How does that work?

Mark Zuckerberg: Basically, the ranking in News Feed was a little bit lower, so fewer people saw it than would have otherwise.

Interviewer: By what percentage?

Mark Zuckerberg: I don't know off the top of my head, but it was meaningful. A lot of people were still able to share it, and we got a lot of complaints that that was the case. Obviously this is a hyper-political issue, so depending on where you sit on the political spectrum, you either think we didn't censor it enough or censored it way too much. But we weren't as black and white about it as Twitter. We just thought, look, if the FBI, which I still view as a legitimate institution in this country, a very professional law enforcement organization, comes to us and tells us we need to be on guard about something, then I want to take that seriously.

Interviewer: Did they specifically say you needed to be on guard about that story?

Mark Zuckerberg: No. I don't remember if it was that specifically, but it basically fit the pattern.

Interviewer: When something like that turns out to be real, is there regret for throttling the distribution of the story, for not having it evenly distributed?

Mark Zuckerberg: What do you mean, evenly distributed?

Interviewer: I mean evenly in that it's not suppressed. It's not-

Mark Zuckerberg: Yeah, yeah. I mean, it sucks.
Because it turned out, after the fact, the fact checkers looked into it and no one was able to say it was false. So it had this period where it was getting less distribution. So yeah, it sucks, in the same way that having to go through a criminal trial but being proven innocent in the end sucks. It still sucks that you had to go through the trial, even though at the end you're free. But I don't know that the answer would have been to do nothing or have no process. I think the process was pretty reasonable, and we still let people share it. Obviously, though, you don't want situations like that.

Interviewer: It was certainly much more reasonable than Twitter's stance. And it's probably also a case of armchair quarterbacking, or Monday morning quarterbacking, I should say, because in the moment you had reason to believe, based on the FBI talking to you, that it wasn't real and that there was going to be some propaganda. So what do you do?

Mark Zuckerberg: Yeah.

Interviewer: And if you just let it get out there, and what if it changes the election and turns out to be bullshit? That's a real problem. I would imagine those kinds of decisions, what is allowed and what is not allowed, are the most difficult.

Mark Zuckerberg: Yeah. I mean, what would you do in that situation?

Interviewer: I don't know what I would do. I would have to really think it through. First of all, you're dealing with the New York Post, which is one of the oldest newspapers in the country. So I would want to talk to someone from the New York Post and ask, how did you come up with this data? Where are you getting the information from? How do you know whether or not this is correct? And then you have to make a decision, because they might have gotten duped. It's hard, because everybody wants to look at it after the fact.
Now that we know the laptop was real, that it was a legitimate story, and that there is potential corruption involved with him, we think, oh, that should not have been restricted, that should not have been banned from sharing on Twitter. Everybody agrees with that; even Twitter agrees with that. But the thing is, at the time they didn't think that. In the beginning they thought it was fake. So what do they do? If something comes along and the Republicans cook up some scheme to make it look like Joe Biden is a terrible person, and they only do it so they can win the election, and it's really just propaganda, what are you supposed to do with that? You're supposed to not allow that to be distributed. So if they think that's the case, it makes sense to me that they would try to stop it. I just don't think they looked at it hard enough. When the New York Post is talking about it, you know they're pretty smart about what they release and what they don't release. If they're going over data from a laptop, you could talk to a person. But again, this is just one individual story. How many of these pop up every day? Especially in regard to polarizing issues like climate change, or COVID, or foreign policy, or Ukraine. Any time there's a really controversial issue, where some people think it's imperative that you take one very specific stance and you can't have the other stance, those moments on social media trouble a lot of people, because they don't know why certain things get censored or certain things get promoted.

Mark Zuckerberg: Yeah, I agree.

Interviewer: And that's one of the things I really wanted to talk to you about, because to be in your spot must be insanely difficult. No matter what decision you make, you're going to have a giant chunk of people that are upset at you. And there might be a right way to handle it, but I don't know what the fuck the right way is.
Mark Zuckerberg: Well, I think the right way is to establish principles for governance that try to be balanced, and not have the decision making too centralized. It's hard for people to accept that some team at Meta, or that I personally, am making all these decisions, and I think people should be skeptical about that much concentration of power. That's why a lot of the innovation I've tried to push for in governance is around things like establishing this oversight board, so you have people who are luminaries around expression from all over the world, including the US. Folks like Michael McConnell, a Stanford professor who was appointed to the bench by a Republican president, and who I think was considered for the Supreme Court at some point. He's a very prominent and celebrated free expression advocate, and he helped me set the thing up. Setting up forms of governance that are independent of us and that get the final say on a bunch of these decisions is a step in the right direction. In the Hunter Biden case you talked about before, I don't want our company to decide what's misinformation and what's not, so we work with third parties and basically let different organizations make that determination. Now, you have the question of whether those organizations are biased or not, and that's a very difficult question, but at least we're not the ones sitting here deciding. We're not the ministry of truth for the world, deciding whether everything is true or not. So I'd say this is not a solved problem, and the controversies aren't going away. It is interesting that the US is actually more polarized than most other countries. Sitting in the US, it's easy to extrapolate and say, hey, it probably feels this way around the whole world.
And from the social science research that I've seen, that's not actually the case. There are a bunch of countries where social media is just as prominent, but polarization is either flat or has declined slightly. So there's something different happening in the US. But for better or worse, the next several years do seem set up to be quite polarized, so I tend to agree with you: there are going to be a bunch of decisions like this that come up. Because of the scale of what we do, almost every major world event has some angle, the Facebook or Instagram or WhatsApp angle, about how the services are used in it. So yeah, I think the answer is establishing as much independent governance as possible.

Thank you.