Renée DiResta is the Director of Research at New Knowledge and a Mozilla Fellow in Media, Misinformation, and Trust.
Thanks for having me.

I listened to you on Sam Harris's podcast and I was utterly stunned. I had to listen to it twice because I just couldn't... Let's get into this from the beginning. How did this start out? How did you start researching these online Russian trolls and bots and all this jazz?

Yeah. So a couple of years back, around 2015... I had had my first baby in 2013 and I was getting on these preschool lists. And what I decided to do was start looking at anti-vaccine activity in California, because I had a kid and I wanted to put them on preschool lists where I was going to fit with the parents, basically, as someone who vaccinates. And I started looking at the way that small groups were able to disproportionately amplify messages on social channels. Some of this was through very legitimate activity, and some of it was through really coordinated, deliberate attempts to game the ways that algorithms were amplifying content, amplifying particular types of narratives. I thought it was interesting and I started writing about it, and I wound up writing about hashtag gaming, ways in which people were using automation to just be in a hashtag all the time. It was a way to really gain control of share of voice, and what that meant when very small groups of people could achieve this kind of phenomenal amplification, and what the pros and cons of that were.

And then, this was 2015, the way that this awareness of social media challenges came about was that while I was working on this, other people were looking at the same tactics, but how they were being used by ISIS, by the terrorist organization. There, too, you had this very small group of people that managed to use bots and amplification to really own a narrative, to push this brand, this digital caliphate, and to build it on all social platforms almost simultaneously, and the ways in which information was hopping from one platform to another through deliberate coordination, and also just the ways in which information flows, contagion-style. And I wound up working on thinking about how the government was going to respond to the challenge of terrorist organizations using American social platforms to spread propaganda. What we came to realize was that there was just this information ecosystem, that it had evolved in a certain way over a period of about eight years or so, and that there were unintended consequences of that.

The way that Russia came into the conversation was around October 2015, when we were thinking about what to do about ISIS, what to do about terrorism and terrorist proliferation on social platforms. This was right around when Adrian Chen had written the article "The Agency" for The New York Times. That was one of the first big exposés of the Internet Research Agency, the first time an American journalist had gone over there and actually met the trolls, been in St. Petersburg, and begun to write about what was happening over there and the ways that they had pages targeting certain facets of American culture. So while we were in DC talking about what to do about terrorists using these platforms to spread propaganda, there were beginning to be rumblings that Russian intelligence and Russian entities were doing the same thing.
And so the question became: can we think about ways in which the Internet is vulnerable to this type of manipulation by anyone, and then come up with ways to stop it? So that was how the Russia investigation began. Around 2015, a handful of people started looking for evidence of Russian bots and trolls on social platforms.

So 2015... if we think about social media and the birth of social media, essentially it had only been alive for, I mean, what was Twitter, 2007, I believe? Something like that. So eight years of social media, and then all of a sudden they figured out how to game this system, and they figured out how to use this to make people argue against each other.

Yeah. So, if you go back to, remember GeoCities? And, okay, AOL, you used that?

Yeah, of course.

So we're probably about the same age. The thing that was great about the Internet, Internet 1.0 we can call it, right, was this idea that everybody was given a platform, and you could use your platform, you could put up your blog, you could say whatever you wanted. You didn't necessarily get attention, but you could say whatever you wanted. And then there was this kind of consolidation as social platforms came into existence. Content creators were really excited about the fact that now they not only had this access to write their own stuff, but they also had access to this audience, because as the network effects got more and more pronounced, more and more people came to be on social platforms. And it originally wasn't even Facebook, if you remember; there was Friendster and MySpace, and social networks kind of evolved. When I was in college, Facebook was still limited to a handful of Ivy League schools, so I wasn't even eligible. And as you watch this consolidation happen, you start to have this information ecosystem really dominated by a handful of companies that grow very large because they're providing a service that people really want. But there's a mass consolidation of audiences onto this handful of platforms.

So this becomes really interesting for regular people who just want to find their friends, reach people, spread their message, grow an audience. It also becomes really interesting for propagandists and trolls, and in this case terrorist organizations and state intelligence services, because instead of reaching the entire internet, they really just have to concentrate their efforts on a handful of platforms. So that consolidation is one of the reasons that we have these problems today.

Right, so the fact that there's only a Facebook, a Twitter, an Instagram, and a couple of other minor platforms other than YouTube. I mean, anything where you can tell it's an actual person... like, YouTube is a problem, right, because you can see it's an actual person. If you're narrating something, if you're in front of the camera and explaining things, people are going to know that you're an actual human being.
Whereas there are so many of these accounts... I'll watch people get involved in these little online beefs with each other, and then I'll go to some of these accounts and think, this doesn't seem like a real person. I'll go and it's hashtag MAGA, there's an American eagle in front of a flag, and then you read their stuff and it's like, wow, this is probably a Russian troll account. And it's strange; you feel like you're not supposed to be seeing this, like you're seeing the wiring under the board or something. And then you'll go through the timeline, and all they're doing is engaging people and arguing, for Trump and against whatever the fuck they're angry about, whatever it is that's being discussed. They're basically just some weird little argument mechanism.

Yeah, so in 2016 there was a lot of that during the presidential campaign, right? And there was so much that was written... we can go back to the free speech thing we were chatting about before. There was so much written about harassment and trolling and negativity and these hordes of accounts that would brigade people and harass them. Of course, a lot of that is just real Americans, right? There are plenty of people who are just assholes on the internet.

Sure.

But there were actually a fair number of these, as we began to do the investigation into the Russian operation. And it started on Twitter in about 2014, actually. So 2013, 2014, the Internet Research Agency is targeting Russian people. They're tweeting in Russian at Russian and Ukrainian folks, people in their sphere of influence. So they're already on there, they're already trying this out. And what they're doing is they're creating these accounts. It's kind of wrong to call them bots, because they are real people; they're just not what they appear to be. So the unfortunate term for it has become "cyborg," semi-automated: sometimes it's automated, sometimes it's a real person. But "sock puppet" is the other way we can refer to it, a person pretending to be somebody else. So you have these sock puppets, and they're out there, and they're tweeting in 2014 about the Russian annexation of Crimea, or about MH17, that plane that went down, which Russia, of course, had no idea what happened to and it wasn't their fault at all. And gradually, as they begin to experience what I imagine they thought of as success, that's when you see some of these accounts pivot to targeting Americans. And so in late 2014, early 2015, you start to see this strategy, which for a long time had been very inwardly focused, making their own people think a certain way or feel a certain way or have a certain experience on the internet, begin to look outwards. You start to see these accounts communicating with Americans. And as we were going through the data sets, and the Twitter data set is public, anyone can go and look at it at this point, you do see some of the accounts that were somewhat notorious for being really virulent, nasty trolls, anti-Semitic trolls going after journalists, being revealed as actually being Russian trolls. Now, it doesn't exculpate the actual American trolls that were very much real and active and part of this and expressing their opinion, but you do see that they're mimicking this. They're using that same style of tactic, that harassment, to get at real people. And if they do get banned, if their account gets banned, they just simply make another account.
They use some sort of a... what is it, a virtual server, what is that called?

VPNs.

VPNs, that's it.

Yeah. So if they do that, they can keep it up as long as they want; they can continue to make new accounts.

And it probably also emboldens the actual American trolls, because they're going to go out a little bit further than everybody else, a little bit crazier. And it kind of changes the tone of discourse within these communities that are arguing about a certain subject. Things get nastier, and they're getting nastier because of the interference of these trolls. It seems like they've actually managed to not just cause a lot of discord but to change the way people are interacting with each other, and to make it more vicious.

Yeah. So what they're doing is they're operating in communities. One of the really common criticisms is that a lot of people think this didn't have a huge impact. Did it swing the election? We have no idea. But what it does do, in the communities that it targets, is it can change that tone. And that's where you see... I mean, I think everybody's probably had this experience: you're part of a group, and then a new person gets added to the group, and the dynamic changes. It's very much the same kind of thing, just that these are not real people who are joining the group. And so there's this opportunity to expand the bounds of tolerance just that little bit more, or to try to normalize particular ways of communicating that maybe a group wouldn't naturally gravitate to, but then it does. So there are definitely ways in which any type of troll doing this, it doesn't have to be a Russian troll, has this ability to shift the language, shift the community, shift the culture, just a little bit.

Now, when did... why did the agency do this? And do we know? Do we have someone who's ever left there or become a whistleblower who can give us some information about what the mandate was and how it was carried out?

There have been a couple of whistleblowers, and actually some investigative journalism in Russia that's covered this. They describe the employees of the Internet Research Agency... so it's a little bit like a social media marketing agency, plus tactics that we would not expect a social media marketing agency to use, things that are a little more like what you would expect to see from an intelligence agency. So besides just making your pages and your blogs and your social posts, they're also in there connecting with real people and real activists and pretending to be something that they're not, to develop a one-on-one relationship. But most of the whistleblowers who have come out... there's a woman named Lyudmila Savchuk, she wrote an exposé on this, I believe, and it's described as being much like what you would expect if you were doing social media grunt work: you have a certain number of posts per day, you're trying to get a certain amount of engagement, you've got to hit your quotas. Most of the people that work there are young millennials. They're well versed in trolling culture, they're well versed in internet culture, they're up to speed on popular memes and things like that. And then the other thing that they do...
In the Mueller indictment you see some really interesting descriptions of the stand-ups that they have. A stand-up is a thing you do at a tech company where everybody kind of stands up and you talk about your goals and responsibilities and blockers and things. And in these stand-ups they would be sitting there saying things like, if you're targeting black LGBT people, make sure you don't use white people in your image and your meme, because that's going to trigger them. So they're trying to get at the very niche rules for communicating authentically in an American community, because online there are very specific ways in which a community expects a member of that community to communicate. And so they are in there, and you can read, in these filings by Mueller's team and by the Eastern District of Virginia, the degree of granularity that they have: recognizing that if you are running a black LGBT page and your meme is of white people, you're going to cause some tension and consternation, and assuming that's not what you want to be doing, you should go find the meme of black LGBT people to put up as your meme for the day. So there's a lot of sophistication, there's a lot of understanding of American culture, and then there's a lot of understanding of trolling culture, and these things combine to make a very effective social media agency.

And is there an overwhelming sort of narrative that they're trying to pursue, that they're trying to push?

So, what we saw... I did some of the research for the Senate, and the Senate data came from the platforms, so the attribution was made by the platforms. It wasn't Renée deciding this was IRA; it was the platforms giving it to our government. And what the information in there showed was that across all platforms, across Twitter, across Facebook, Instagram, YouTube, they were building up tribes. They were really working to create distinct communities of distinct types of Americans. For example, there's an LGBT page that is very much about LGBT pride. They created it and they curate it. There's a persona: a lot of the posts on the LGBT page were written in a voice that sounded kind of like a millennial lesbian. So it was a lot of memes of LGBT actresses, and they would brand it with a specific brand mark, a rainbow heart. LGBT United was the name of the page. It had a matching Instagram account, which you would also expect to see from a media property, right? You would expect to see them in both places. And what were they pushing? It read like a young woman talking about crushes on actresses and things. It was really, besides the sometimes wonky English, virtually indistinguishable from what you would read on any kind of young-millennial-focused social page. None of it was radical or divisive. The way that they got the division across was they built these tribes where they're reinforcing in-group dynamics. So you have the LGBT page, you have numerous pages targeting the black community, which was where they spent most of their energy, and a lot of pages targeting the far right.
There was both the old far right, meaning people who are very concerned about what the future of America looks like, and the young far right, which was much more angry, much more about trolling culture. They recognized that there's a divide there, that the kinds of memes you're going to use to target younger right-wing audiences are not the same kinds of memes you're going to use to target older right-wing audiences. So there's a tribe for the older right wing and the younger right wing. In the black community there's a Baptist tribe, there's a black liberation tribe, there's a black women tribe, there's one for people who have incarcerated spouses. There's a Brown Power page, I believe that was the name of it, that was very much about Mexican and Chicano culture. There was Native Americans United.

And all of these are fake?

All these are fake.

All these are fake. And what are they trying to do with all these?

So you build up this in-group dynamic, and they did this over years, so this was not a short-term thing. They started most of these pages in the 2014, 2015 time frame; they started some other ones that were much more political later, and we can talk about the election if you want to. But with this tribal thing, you're building up tribes. So you're saying, as black women in America, here are posts about things that we care about: here are posts about black hair, here are posts about child rearing, here are posts about fashion and culture. And then every now and then there would be a post that would reinforce, as black people we don't do this, or as LGBT people we don't like this. So you're building this rapport; like, me and you, we're having a conversation, we're developing a relationship on this page over time, and then I say, as this kind of person, we don't believe this. So it's a way to subtly influence by appealing to an in-group dynamic, appealing to the members of the tribe: as LGBT people, of course we hate Mike Pence; as black people, of course we're not going to vote, because we hate Hillary Clinton because we hate her husband; as people who are concerned about the future of America, as Texas secessionists. So everything is presented as, members of this tribe, we think this; members of this tribe, we don't think this.

Sorry, but a lot of the posts were not even political. They were just sort of affirming the standards of the tribe.

Yes.

So they were kind of setting up this whole long game.

Yeah.

And then once they got everybody on board... how many followers do these pages have?

So there was kind of a long tail. There were, I think, 88 pages on Facebook and 133 Instagram accounts, and I would say maybe 30 of the Facebook pages had over a thousand followers, which is not very many, and then maybe the top 10 had upwards of 500,000 followers. So it's the same way you run any social campaign: sometimes you have hits, sometimes you have flops, right? And what was interesting with the flops is you would see them repurposed. They would decide, the same way you would if you're running a social media agency, well, we've got this audience, this page isn't doing so well, let's rebrand it a little bit, change it up, try to make it appeal to somebody else. So you see this. I got this data set and I was going through these Instagram memes, 133,000 of them, and there was a cluster of images of Kermit the Frog, and I was like, what the hell is Kermit the Frog doing in here?
So, the way the platforms provide the data is I got a CSV of the posts, and then I got a folder of the images, and in order to connect the dots I had to have the image up on one screen and the CSV up on the other screen.

CSV?

It's like a spreadsheet.

Okay.

Yeah. And we turned it into a database so that we could track things a little bit more easily across the platforms. But so I have this cluster of Kermit the Frog memes, and I go and look, and I realize they're attributed to an account called Army of Jesus. And I thought, well, that's interesting, because some of them are really raunchy; it was Kermit and Miss Piggy, just stupid, crappy memes attached to Army of Jesus. And I'm like, what the hell is going on here? I keep going through it, hundreds of Kermit memes, and then I get to a post where they say this page is owned by Homer Simpson now, Kermit went to jail, they made some joke, it was stupid. And all of a sudden the data set turns into Homer Simpson memes, again this kind of raunchy Homer Simpson culture, and again it's attributed to Army of Jesus. And then I go through all this and realize that they didn't get around to actually making Army of Jesus a Jesus-focused page until about 900 posts in. So they just renamed the account at some point. It used to be called Nuts News; Nuts News was what they called it when it was the Kermit the Frog meme page. And then it gets repurposed when they realize Kermit's not doing it, it's not getting the audience they want, Homer Simpson's not getting the audience or engagement they want, and they pivot over to Jesus, and then all of a sudden the likes and things start pouring in.

So what they're doing is, either deliberately or not, they're creating placeholders. It's kind of a red flag when a brand-new account that was created yesterday suddenly starts talking about some highly politically divisive thing. But if you lay the groundwork and you do it over a period of two years, then somebody who goes and checks to see what the account was, where it came from, how old it is, is going to see something that was two years old. So it's an opportunity to create almost like sleeper accounts, where you create them now and then you activate them, you politicize them, you actually put them to use a couple of years in the future. And we saw this over and over again. There was a Black Guns Matter account that turned into an Anonymous account at one point.

They were pretending to be Anonymous? You know, the hacktivists?

Yeah. So they repurposed this Black Guns Matter page, which was advocating that black people buy weapons and carry; it's like a pro Second Amendment page, but for the black community. And they took that page when it wasn't getting, I guess, a ton of engagement, and it became... it was called, oh gosh, I don't remember the exact name of the Anonymous page, and I don't want to say one that's legit by mistake, but they pivoted into an Anonymous page.

And when they do that, did they go back and repurpose the content of the earlier posts? Do they change that?

That was not clear. We didn't get that information from the platforms. There was a lot of stuff that I would have loved to have more insight into. You'd think, again, that if you started following an Army of Jesus page and saw all this raunchy Kermit stuff from a year ago, that would raise some flags.
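(Editor's aside on the data-wrangling workflow described above, a CSV of posts delivered alongside a separate folder of images, later loaded into a database for cross-platform tracking: the sketch below shows one plausible way to do that join. It is only an illustration under assumed inputs, not the Senate research tooling; the file names and column names such as posts.csv, images/, post_id, and account_name are hypothetical stand-ins.)

```
# Minimal sketch: join a CSV of posts to a folder of image files and load the
# result into SQLite, so posts, accounts, and media can be queried together
# instead of eyeballed across two screens. Column/file names are assumptions.
import csv
import sqlite3
from pathlib import Path

posts_csv = Path("posts.csv")   # one row per post (hypothetical export)
image_dir = Path("images")      # media files, assumed to be named by post id

conn = sqlite3.connect("ira_posts.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS posts (
           post_id TEXT PRIMARY KEY,
           account_name TEXT,
           posted_at TEXT,
           text TEXT,
           image_path TEXT
       )"""
)

with posts_csv.open(newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Match an image whose filename starts with the post id (assumption).
        matches = list(image_dir.glob(f"{row['post_id']}*"))
        image_path = str(matches[0]) if matches else None
        conn.execute(
            "INSERT OR REPLACE INTO posts VALUES (?, ?, ?, ?, ?)",
            (row["post_id"], row["account_name"], row["posted_at"],
             row.get("text", ""), image_path),
        )
conn.commit()

# Once everything is in one table, questions like "which accounts posted the
# most, and how did their content shift over time?" become simple queries.
for account, n in conn.execute(
    "SELECT account_name, COUNT(*) FROM posts "
    "GROUP BY account_name ORDER BY COUNT(*) DESC LIMIT 10"
):
    print(account, n)
```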
I would assume that they scrubbed it and restarted, but I don't know.

Your podcast with Sam changed how I look at a lot of the pages that I actually follow, because I follow some pages that have classic cars or something like that, and most of it is just photographs of beautiful old cars, and they'll have a giant following, and then all of a sudden something will get political, and I'll look at it and go, oh wow, this is probably one of those weird accounts. They're getting people to engage with it because it represents something they're interested in, like classic muscle cars, and then they use it for activism, they use it to get this narrative across.

I mean, I've seen it happen with some of mine too. I think one of the challenges is that you want people to be aware that this stuff exists, but you don't want them to be paranoid that it's everywhere.

I am paranoid.

I know, that's a problem; everybody's paranoid now. I look at this all day long, and sometimes I see things and I'm like, what are the odds? And I try not to feel like I'm in some Tom Clancy novel. But it's this balance: when you make people aware of it, and I think people deserve to be aware, they deserve to understand how this plays out. The flip side of that is you do wind up in these weird situations. You see it happen on social media now: click into a Trump tweet and you'll see, "you're a Russian bot," "no, you're a Russian bot." They're probably not Russian bots.

Everybody you don't like on the internet is not a Russian bot.

Yes, exactly. And so that's where you get to the interesting conversations about, in some ways, getting caught. This is one of the challenges with running disinformation campaigns, right? It makes it really hard for people to know what's real after the fact. It leaves you a little bit off balance; you feel like you can't quite tell what's real, and that's part of the goal, right? It's to make you not feel entirely balanced in your information environment: is this real, is this not? And so in some ways there's not much downside to doing this, because either you knock it out of the park and you influence the election, you influence people, you have this secret covert operation going on for years, or you get caught, and then, until there's some confidence in the ability of platforms to detect this stuff, there's real concern among everybody that you're encountering something fake.