I talk with Dave Karpf (Twitter: @davekarpf), a political scientist and associate professor of Media and Public Affairs at George Washington University.
There’s a good chance you’ve heard about how Cambridge Analytica used access to the Facebook data of millions of U.S. citizens, plus advanced digital advertising wizardry, to essentially “hack” Americans’ minds and deliver a surprise presidential victory to Donald Trump. Cambridge Analytica has been portrayed as nefarious data geniuses in many news stories, and probably most prominently in the popular documentary The Great Hack (Netflix link).
But what if this perception is largely untrue? What if Cambridge Analytica was exaggerating its behavior-influencing abilities, as many companies do? And what if our perception of CA as geniuses of digital influence is based on people accepting their exaggerated claims uncritically?
That is the stance of political scientist Dr. Dave Karpf, and in this episode he explains why. Links to this episode:
Topics discussed include:
- What the documentary ‘The Great Hack’ got wrong
- Why public perception of the power of highly focused “micro-targeting” of digital advertising is largely overblown
- Why political ads don’t have much effect on election results
- The cost of being distracted by the wrong problems
- The myth of the “attentive public” and how many of our political problems stem from politicians coming to realize that not many people are actually paying attention
- Newt Gingrich’s role in “Trump”-style politics
- The role of social media and the internet in politics
- An incident where Dr. Karpf compared NY Times columnist Bret Stephens to a bedbug on Twitter, and how Stephens overreacted, was mocked by both liberals and conservatives, and eventually deleted his Twitter account
Related content or stuff we mention:
- On digital disinformation and democratic myths, by David Karpf
- A world without wizards: On Facebook and Cambridge Analytica, by David Karpf
- Story in The Atlantic about Newt Gingrich: The man who broke politics
- What Netflix’s The Great Hack gets wrong about Cambridge Analytica, from The Nation
- Film review of The Great Hack, from Variety, criticizing the movie in similar ways
TRANSCRIPT
[Note: this transcript will contain some errors.]
Hi and welcome to the People Who Read People podcast. I’m Zach Elwood. You can read summaries of past episodes at www.readingpokertells.video/blog, and contact me there, too.
In this interview, I talk to political scientist Dr. Dave Karpf. This interview was recorded on August 19th 2020.
There’s a good chance you’ve heard about how, in the 2016 US election, a British firm called Cambridge Analytica got access to the personal Facebook data of millions of American citizens and used advanced targeted marketing techniques to sway the election in favor of Donald Trump. Many people’s perceptions on this topic come from a documentary on Netflix called The Great Hack, which painted a picture of Cambridge Analytica essentially hacking the minds of Americans using advanced digital wizardry, with the result being Donald Trump as president. And there have been many articles written on this subject that take a similar point of view.
But what if that perception weren’t true? What if Cambridge Analytica was bullshitting, like many companies do? What if the claims they made about their ability to change people’s opinions using advanced techniques were mostly smoke and mirrors? Based on everything I’ve learned recently, including what I learned doing the interview you’ll hear in this episode, there’s a good chance that Cambridge Analytica’s claims were mostly bullshit. It seems likely that the Facebook marketing work they did for the Trump campaign was no more advanced than what most marketing teams or political campaigns do with Facebook ads, and basically no more effective than political advertising in general. In short, it seems likely that perceptions about the power of micro-targeting ads based on individual traits and psychology are largely overblown.
I got interested in this subject when I watched the documentary The Great Hack. I was immediately skeptical of the claims made in that movie. The movie implies that Cambridge Analytica used psychological quizzes on Facebook to gather psych data on millions of Americans. They allegedly used these detailed psych profiles to create highly targeted, highly specialized ads, differentiated and aimed at the specific types of voters they thought might be susceptible to influence. It did sound scary. But many of these claims sounded to me like deceptive marketing claims; indeed, many of the graphics and verbiage used in the documentary come from Cambridge Analytica’s own marketing presentations. And the movie presents no evidence, or even a detailed description, of how the actual ad micro-targeting worked in practice; it was very short on details, very hand-wavy.
As I researched, I found some thoughts on the subject by Dr. Dave Karpf, a political scientist and an associate Professor of Media and Public Affairs at George Washington University. His last name is spelled KARPF. He’s written several books on the use of political data: one book is called “The MoveOn Effect – The Unexpected Transformation of American Political Advocacy”, and another one is titled “Analytic Activism: Digital Listening and the New Political Strategy.”
There’s a chance you might remember Dr. Karpf from an incident in 2019 where, in a Twitter post, he compared NY Times columnist Bret Stephens to a bedbug, which led to Stephens sending an angry email to Karpf and Karpf’s boss at the university. One thing led to another, and this conflict led to Bret Stephens being mocked by both liberals and conservatives, and to Stephens deleting his Twitter account.
In this episode’s interview, I talk to Dr. Karpf about Cambridge Analytica: how much of their claims about advanced marketing were hype versus reality, how effective their political ads were, and the efficacy of political ads in general. We also talk about the outsized political influence of social media platforms like Facebook and Twitter, a bit about his Bret Stephens social media scuffle, and about social media’s role in the political sphere.
To get the best sense of Dr. Karpf’s thoughts on Cambridge Analytica and the modern state of political advertising and persuasion, I highly recommend reading a couple pieces he wrote, which you can find by searching for the titles of these pieces:
One is titled: On Digital Disinformation and Democratic Myths.
Another is titled: A World Without Wizards: On Facebook and Cambridge Analytica.
These two pieces give a very clear presentation of Dr. Karpf’s thoughts on the subject, and I recommend reading them. I learned a lot reading Dr. Karpf’s work and listening to him, and I think you will, too. If you want links to his articles, you can find summaries of my podcast episodes at readingpokertells.video/blog.
Okay, here’s the interview from August 19th 2020 with Dr. Dave Karpf.
Zach: Thanks for coming on Dr. Karpf.
Dave: Sure thing. Thanks for having me.
Zach: Okay, let’s start with the movie, the Great Hack. How misleading do you think it was?
Dave: I mean, scale of one to 10? I went into that movie thinking it was gonna be a nine, and it was probably like a six or a seven.
It could have been worse, but it trafficked in a lot of the tropes and mistakes that I have, over the years, come to associate both with Cambridge Analytica and with the broader conversation about political microtargeting, which has been going on since well before 2016. People have been worrying about microtargeting since the 1990s, really.
Zach: People bring up [00:05:00] that other people have used microtargeting before, and it seemed like it was because it was the Trump campaign that there was so much concern about the microtargeting this time. Is that accurate to say?
Dave: It’s accurate. The thing I would add onto it: it’s not just that Republicans were doing it this time, so people were concerned, so much as that Trump’s victory was such a surprise to everyone.
I mean, both to Democrats, but to everyone. People didn’t expect in October that he was gonna win. James Comey didn’t expect he was gonna win. The media didn’t expect he was gonna win. Other Republicans didn’t expect he was gonna win in, like, mid-October. And when you have something that is that high-stakes and that jarring, people then wanna find some simple answer, some simple explanation. And the explanation became: the reason he got elected is because some secretive organization with all of your data hacked your mind, or hacked other people’s minds. Like, not yours. You [00:06:00] of course were unaffected, but the mass public out there, they all got their minds hacked, and that’s how we got this surprising result. That’s a nice, simple explanation.
So I think it’s not just that this time the Republicans were using it (they were doing microtargeting during the Bush years as well); it was specifically that this was such a surprise that people needed an answer, and Cambridge Analytica became a nice stand-in for one.
Zach: It’s comforting in a way to have that as an explanation.
When I was watching The Great Hack, it stood out to me; I got kind of suspicious as I was watching it, because it didn’t really offer much in the way of countering viewpoints to these claims of extreme persuasion and control. And when I watch a documentary and it’s not offering me other points of view, or being skeptical of the first point of view, I get a little skeptical.
And that’s why I started looking into it, and that’s how I found the things that you had written. The interesting thing about The Great Hack documentary, too, is one of the things it didn’t mention: that Ted [00:07:00] Cruz’s campaign had actually dropped Cambridge Analytica. Even though they’d used them earlier, leading up to the 2016 campaign they had actually dropped them, because I guess they weren’t happy with their work. That would’ve been a balanced thing to include, because in the documentary they emphasized that Ted Cruz had used them. That was another strange thing, where I was like, why wouldn’t they have included that?
Dave: Right, yeah. And there’s always that question of: if Cruz was using Cambridge Analytica first, and it was that effective, then, A, why didn’t Cruz get the nomination?
And, B, why did Cruz drop them? And what Cruz staffers would tell you is: we dropped ’em like a bad habit, because when we actually looked at how they were performing, we found out that they weren’t actually doing any of the things that they claimed they were doing. All of these promises Cambridge Analytica was making about how they were gonna do psychographic targeting of the entire electorate: when the Cruz campaign looked at it, they were basically like, oh, you’re not actually doing any of that. We’re just handing money over to you, and [00:08:00] the results are terrible.
Zach: I mean, that’s what struck me. The documentary basically wasn’t questioning what, to me, sounded just like marketing claims: like, we’re really digging into these details and finding all of these connections and really targeting these things.
It was just this unquestioning acceptance of all of these claims, when I saw no evidence that those things were being done.
Dave: Right, yeah. And the difficulty is, when you find their internal marketing materials, when you find their PowerPoint slides, or the presentations that Alexander Nix gave when he was trying to sell the product: do you treat that as proof of how effective it is, or do you treat that as marketing, where they are trying to get clients? And if it’s the latter, then we’re left with the question: okay, well, what else have you got? And part of the problem is that it’s always very, very difficult to actually sort out how effective any of these things are.
The strongest claims I’ve heard for psychographic persuasion effects are that in careful lab settings, applying a different message to [00:09:00] different audiences with different psychological traits can lead to different results. I buy that; that is probably the case in a laboratory setting. But rolling that out on a mass scale is an entirely different effort, and there’s just no evidence that this company had ever been able to pull it off.
And there’s plenty of evidence that what they were probably doing was saying they could pull it off so they could get the contracts, and then trying to figure it out as they went once they got the money.
Zach: It just seems like, and I think you wrote about this, there’s a trade-off involved in theoretically being able to create all these hyper-targeted, focused ads: it takes so much money and time to create that content, and it’s kind of hard to imagine it being much more effective than just coming up with a generally persuasive bit of content and broadly disseminating that.
I think you’ve talked about that; if you could, talk about that issue a little bit.
Dave: Yeah, I mean, there are two points there that I’d really wanna make. One comes up repeatedly in the film, where they keep [00:10:00] talking to David Carroll, who’s worked very hard on this. I really respect the work that he’s put into this.
But at times in the film, as a political scientist myself, it’s just really clear to me that he comes out of the world of advertising, because he kind of keeps pointing out: look, advertising works. I know it works because I study this stuff. And a point that I think we need to keep in mind here is that advertising in the context of elections is the toughest type of sell for advertising.
The easiest place, where you would have the biggest impacts for any type of advertising, would be when you have a new product that nobody has heard of before, and if they heard about it in the right way, they would actually buy it. They would take an action that they wouldn’t otherwise be taking.
That’s not voting. Everything that we know in political science about people’s voting behavior in the United States is that people have deep impressions of the two parties. They have deep impressions of civic behavior, of whether or not they’re going to vote. The best predictor of whether or not you’re gonna [00:11:00] vote is: did you vote in past elections?
People who tend to vote tend to vote. People who tend not to vote tend not to vote. The best predictor of whether you’re gonna vote for a Democrat or a Republican or a third party is: have you in the past voted for a Democrat, Republican, or third party? There’s very little variance there. And in one of the pieces that I think you’re referencing here, I pointed out that this is a bit like if Coke and Pepsi were the two brands and you could only buy one soda every four years.
Coke and Pepsi do a lot of advertising, not to make you aware that Coke and Pepsi exist, but because they want you to buy more of their soda. You’ve already got a brand loyalty, but you could buy more or less. In the context of American elections, once every four years we get to cast a ballot.
You’re pretty much gonna cast a ballot for the party that you cast a ballot for before; that’s your team. So the effect sizes there are gonna be vanishingly small. When we keep that in mind, the thing that stands out for me about psychographic microtargeting claims is: why the hell [00:12:00] would the advances in advertising be happening in American elections?
Like, if psychographic microtargeting really, really worked, the place where it would make scads of money, ridiculous money, is gym membership sales. Somebody with high anxiety levels: you would market gym memberships to them in a different way than to somebody with a different psychological profile.
And if we had these files on everybody, based on their Facebook likes, then gyms ought to be trafficking in that and giving psychographically micro-targeted messages to get you to sign up for a gym membership you’re not gonna use. The fact that they’re not doing that is a big red flag. And the thing about gym membership sales is they can actually measure, week by week:
okay, we tried these psychographically micro-targeted ads last week; let’s take a look, did we get more signups? You’ve got an actual outcome variable that you can use to run experiments. In the context of elections, the other problem is, since we can only vote on the first Tuesday in [00:13:00] November, all that advertising is kind of a wash. Like, we can come up with some fuzzy measures to have a guess at whether this is making a difference,
but largely that’s a massive polling operation: you drop your ads in one area and not in another, and see whether the numbers for people’s intention to vote moved. And so the problem in American elections, the reason why it’s such an awful place to actually innovate new advertising techniques, is that you’re gonna spend months and months, millions upon millions of dollars, trying out something new, with no way of finding out whether or not it actually changed your outcome variable until the first Tuesday of November.
And then you gotta wait a couple years for another election to try it again. That’s not where you should have innovations of any sort. The innovations that you ought to be seeing in political advertising should basically be grabbing what you’re seeing in other areas of commerce, in other business areas,
and then saying, okay, this is working for car sales, or, you know, this is working for gym [00:14:00] memberships; let’s see how you apply it to politics. So it’s sort of skating behind these places that have an outcome variable where you can actually learn and refine. Since you don’t have that in elections, that’s just not the place where you’re gonna see major innovations.
But what you do see is, since the stakes are so high and we spend billions of dollars on American elections, that creates a huge incentive for vendors, for the ad makers, to make outrageous claims, so they can be the one who actually gets those contracts and can buy a new car.
Zach: That’s a great point. I mean, the fact that you don’t see that in other markets proves your point pretty well. It was eye-opening to me. You mentioned some respected studies showing that political ads basically have like zero impact. That seems relatively lesser known.
Dave: Yeah. So this is one of those things that within the field of political science is bedrock, so well understood, but it’s unknown to everybody else.
What we have found [00:15:00] continuously in political science, for like 20 or 30 years now, is that political advertisements have a very small impact that only lasts a few days. After that, it all washes out. That seems weird; it’s a hard thing for people to wrap their heads around. It is August while the two of us are recording this; the Democratic National Convention is going on right now.
The RNC will be next week. We still have months and months before the election actually happens, which means we are going to sit through hundreds of millions of dollars of political advertising. And it’s a head-scratcher, this notion that, at the end of the day, that’s actually not gonna move that many votes. Again, the main reason it’s not gonna move that many votes is because people already have pretty well-defined preferences about the parties, and particularly at the presidential level, they know an awful lot about the two candidates.
Like, polls never move very much. But what we’ve found continuously is that the motion in the polls tends to be, like, people who were probably gonna vote for Trump, but were kind of uncomfortable with some recent thing, [00:16:00] end up coming home to him. The advertisements helped to remind them why they like to vote for Republicans, so they vote for the Republican.
Same on the Democratic side. So people end up kind of settling back into what they were likely to do anyway. The huge ad buys move the needle a little bit for a few days, and then that effect just kind of vanishes. For listeners, if you’re trying to wrap your head around this, the thing I would ask you to do is open up your calendar right now, pick a date three weeks ago, and try to recall what the hell was in the news cycle three weeks ago.
You probably can’t do it. The news cycle is moving so fast, there’s so much churn in it, particularly in politics, particularly right now, that the things that on any given day seem like the major controversy just kind of end up washing out. They get replaced in our memories. And if you can’t remember what was going on three weeks ago, it stands to reason that three months from now, the advertisement that you briefly saw, either on TV or on Facebook or Twitter, isn’t gonna have that big of an impact on you [00:17:00] in this ocean of advertising that you’re also seeing.
Zach: Do you think it’s the same effect for, say, ads that try to get people out to vote? For example, Trump ads saying Democrats are gonna destroy the country and we really need you to vote. Do you see those as not really having that much effect either, or do you see get-out-the-vote ads as a separate class?
Dave: The effects of those ads are also pretty marginal; they’re pretty small overall. Now, the thing we need to keep in mind is that marginal effects in the US system can determine the entire balance of power, right? Like, a 0.1% change in three states, and we’re talking about whether or not Hillary Clinton can get a second term right now.
So since all of the power turns on razor’s-edge margins, the fact that these are marginal effects doesn’t mean that they don’t matter. You know, one of the arguments that I’ve occasionally had to have, in all the debates over Cambridge Analytica, is people would say, well, you know, the race was so close that it was large enough to make a difference.
And okay, [00:18:00] sure, but let’s rank-order all the different things that were big enough to make a difference, because even a strong wind in a couple of states shifts enough votes to make a difference. Let’s compare Cambridge Analytica to, like, how the New York Times was covering things.
Let’s compare it to structural voter suppression. And Cambridge Analytica ends up well down that list, despite being big enough to change an election that was just so incredibly tight that everything could have. So, yeah, mobilization ads also don’t have huge impacts, like any individual ad. There’s two added things we need to think about here, though. One is, and I think I speak for most of political science when I say this, we are less confident in our predictions for 2020 than in any previous election cycle, because the way you do this as a social scientist is you start from what we’ve learned from previous cases and you make what’s called a ceteris paribus assumption:
the assumption that all other things are equal. And the 2020 election, in this important way, is not gonna be like past [00:19:00] elections, ’cause we have not had a past election in the middle of a pandemic. So in the middle of a pandemic, we’re now forced to raise the specter of: should you be voting by mail, which you haven’t done before, and will that ballot arrive or not?
Or should you be standing in lines where you may get a deadly disease, and you and your family might die? Those are stakes and conversations that would have seemed like really bad science fiction if I had rolled them out to you a year ago. So in that context, it may be possible that the advertisements this election cycle could have a larger impact, because now we’re in the land of people trying to figure out: what do we know and what don’t we know about
COVID-19? Whereas everybody knows how they feel about Democrats and Republicans, and they feel like they know a lot about both parties (they’ve lived with them their entire lives), nobody actually knows anything about COVID-19. Even the experts are still figuring stuff out every single day.
So that’s a place where misinformation effects are much larger. While I’m not particularly worried about misinformation [00:20:00] changing votes on election day, I’m very worried about misinformation on Facebook changing how people relate to masks, and how we deal with the pandemic in the United States, because the baseline knowledge for each individual is so much lower.
And since that baseline knowledge is also probably gonna impact the elections, we should, I’d say, be more concerned there than we would be in a normal time.
Zach: Yeah, I feel like there are all of these separate issues that get combined and make these things hard to talk about. For one, you’ve got political advertising, whether digital or in other media, and how deceptive that is allowed to be; there are questions around how much you can say in that regard. And then you’ve got the whole fake-news-and-disinformation thing, which sometimes gets bundled in there but is a separate issue, because that’s a broad set of things, from ideologically motivated propaganda to, you know, Macedonians making fake news for money.
So that’s a whole separate issue. Then there’s the general topic of the internet and its overwhelming amount of [00:21:00] content and alternative facts, and how that might have weakened our ability to tell fact from fiction. And then there’s the use of people’s data, and that’s a whole broad category of things: what companies are allowed to do, or what they should be allowed to do.
So I feel like it can be hard to talk about these things, because people will start talking about one thing, but then they’re actually talking about another thing; they’re so overlapped, you know?
Dave: Yeah, there’s a bunch of different conversations that all end up kind of layered on top of each other.
You’re exactly right. One of the key points that I often try to make with this stuff: part of the bind that we’re in in the United States is that our regulators stopped regulating a long time ago. So, like, on the question of what sort of deceptions should be allowed in political communication, or in electoral communication, right now there’s a big conversation about what Facebook and Twitter should allow.
Like, should they allow Donald Trump to lie on his Twitter account? Should they allow a political campaign to put out a Facebook ad that has a lie in it? Should they allow random people to take out ads like that? And one of the [00:22:00] problems there is: who the hell elected Facebook or Twitter to make those decisions about what communications are allowed in elections?
And the actual answer is: well, yeah, it’s not supposed to be Facebook and Twitter. It’s supposed to be the Federal Election Commission. The Federal Election Commission hasn’t had a quorum for months, dating back to before the COVID-19 crisis. Back in still relatively normal times, we got to the point where we don’t have enough commissioners on the FEC to have a quorum, which means the regulators cannot regulate.
So, like, Facebook and Twitter are downstream of this regulatory mess, where, since we no longer have the capacity to govern, the platforms are just left being like: I guess we have to make up some rule, and everyone is just gonna yell at us for it. I find it very hard to defend Facebook these days, ’cause they just make it so hard.
But that is the one defense I’d still offer of Facebook: when people point out, hey, Facebook, how dare you, this isn’t your role, right? The answer is: yeah, it shouldn’t be their role. But the people [00:23:00] whose role it is aren’t doing it. So somebody has to.
Zach: That’s a great point.
Dave: Rebuilding regulatory capacity is something that’s gonna take an awful lot of time. But once we’ve rebuilt it, then we ought to have a government that asks questions like: how much data on individual citizens should each of these companies be able to hold? For how long should they be able to hold it? In what ways should they be able to repackage and sell it? And what harms should be prevented along the way? There are a lot of scary, important conversations to have there, none of which we can really have until we’ve got regulators who take the thing seriously, which we haven’t had for at least four years now.
Zach: So that Federal Election Commission not being staffed: was that related to the general pattern of Trump’s administration not filling positions?
Dave: Yes, though it actually dates back before that. The Federal Election Commission was set up to have an even number of commissioners, half Democrat, half Republican.
What happened was (and I can be frank about this, but I can also show you all the reporting; this stuff was news back then) [00:24:00] that back in the Obama years, basically, the Republican commissioners decided: hey, if we just say that everything is fine, any types of lies, any types of new behavior, we’re just gonna say that’s legal,
then people will get away with everything, and we’ll just let that go. And the other commissioners called foul, and then people started quitting, and then they were already pretty toothless, in a way that was a real problem, and in particular in a way that left Google and Facebook saying: okay, well, if you guys won’t make any rulings for how we’re supposed to handle this, we’ve gotta make it up as we go ourselves.
So that was part of the problem going into 2016. But yeah, since then, Trump has refused to appoint a Democrat along with a Republican to keep an even number. And so, as people have dropped off, they no longer have a quorum, and that’s just the way it is now. Now, there’s still staff who work for the FEC,
so existing rules they can at least enforce. But we’re so far behind the times. The question is: what the hell are you allowed to do on Facebook? [00:25:00] And if Facebook goes to the FEC and says, can you give us some advice here, FEC, their basic answer is: no, sorry, we can’t; we don’t have a quorum.
So then Facebook’s gonna have to come up with an answer, and it’s not gonna be a good answer, ’cause that’s not Facebook’s job. It’s not their specialty; it’s not their role. And we’re gonna be left with exactly the mess that we have now.
Zach: It seems like when you watch these Senate hearings, watching the legislators and how little they know about modern social media, and computer stuff in general, it makes you worry, because it seems like nobody’s in charge who knows about the issues.
Dave: Yeah, and I always have some questions about that because, well, like when we see Mark Zuckerberg. Hauled before Congress. That’s a performance on both ends. And often the senator who is giving that softball question where they just look like an clueless out of touch, 80-year-old. What’s going on there is that the senator has decided, I’m going to give you a softball question to take the heat off.
Mm-hmm. Because I don’t wanna ask you the [00:26:00] harder questions right now. Interesting. Um, it’s gonna take a lot of time and work to rebuild our regulatory capacity, but Congress isn’t full of complete fools all the
Zach: time. Yeah. I can relate to that because I’ve interviewed people for various project and sometimes you just want to set them at ease because it can be a good strategy to set people at ease.
Yeah. To get back to the micro-targeting topics: do you know how much actual ad differentiation was done by CA, uh, Cambridge Analytica, or other political firms? Do you know how much has actually been done versus what they've claimed to have done?
Dave: So, I, I don’t know. And systematically no one does know or can know really.
The people who have that data are the people who work at the individual firms. They have an incentive not to share that data publicly. Uh, they have also, also have an incentive to overhype what they’re doing in order to invite in more business. And, you know, people like, uh, like Guy Chris Wiley who became one of the [00:27:00] Cambridge, the, the main Cambridge general analytic whistleblower.
While I’m glad he came forward to blow the whistle, he’s also still marketing that same product. So when he comes out and vouches for how effective it is and how scared everyone needs to be, he’s also trying to drum up interest in his business doing exactly that work. So I think the same grain of salt that we should take to their marketing materials, we probably need to take to him.
Part of the difficulty here, again, is that since there are strong incentives for all of that data to remain private, not only do I not have strong data on how much ad differentiation was really done and how effective it was, but nobody else really does either. All the systematic studies, though, have found time and time again, at least within the realm of electoral communications, that the effects are tiny.

'Cause again, American elections are a massive sea of money focused on a single behavior that you do once every few years, and that you're already pretty well ingrained in. The effect sizes of any given ad are gonna be really small. [00:28:00]
Zach: What do you see as the main downsides to people having the wrong view about these topics?
Dave: So there's two downsides, and here I think it's important to make a distinction between the mass public, your listeners, and political elites. Though, I don't know, maybe some members of Congress listen to your podcast. If they do, they're in group two. So for group one, I think the problem is that we end up focusing on the wrong thing.
It becomes so easy to focus on the simple explanation that the reason why these elections are going the wrong way is because some company hacked your brains. And if that happened, the solution is well, Democrats need to go invent their own company and hack people’s brains just as well, or even better.
And then we'll all be better off. Or at least we, the mass public, should, you know, watch a documentary and feel more informed about how data is all around us. And the difficulty there is, sure, we should all be more aware of how our data is getting trafficked online. That's a nice thing. But the effect sizes [00:29:00] there are so small, and there are bigger things to be worried about.
If you wanna talk about how data gets used in elections, we should probably be focusing on the very real ways that political elites are using data on who votes where, and how, to change the rules: like, how many polling places will be open in an African-American-majority district in Georgia? You know, that happened in the last election.
You have fewer polling places, with shoddier machines, in areas with a lot of Democrats. All of the work that's been done over the years, state by state, every time you have a Republican Secretary of State, to say, oh, well, if you have a gun permit, we're gonna say that counts as identification for registering to vote.
But if you have a student ID, then that doesn't count, and you're gonna need to go get this special application that you can only get if you drive two hours, and the office is only open one hour every week. They are systematically doing that kind of stuff to make it easier for some segments of the population to vote and harder for others.
That's an active outrage. And I don't think [00:30:00] anyone who's worried about Cambridge Analytica would say, oh yeah, that's not an outrage too. But it's a fundamentally depressing outrage, where you learn about it and you're like, well, that's terrible and we should all be mad. But there's no "aha" explanation, no call to action.

Yeah, there's no easy call to action there. It's just depressing as hell, right? And so we end up not focusing enough on the structural voter suppression that's really going on by political elites, and instead getting entertained, getting ourselves thinking, oh, well, if you just had better media consumption habits, if you were just more careful about what you trusted on Facebook, then you're doing your part and the world would be better.
So in that sense, I think there's an opportunity cost. Focusing on the Cambridge Analyticas of the world leads us to not focus enough on things like structural voter suppression, which is a much bigger problem. The second answer, though, for political elites, is that I think there's a very real negative impact on them, because we've never actually had an American public that lived up to our [00:31:00] ideals of what the public ought to be.
We've never had a well-informed, deeply engaged public holding elected officials accountable. What we have had over the decades, particularly, I'd say, for a big chunk of the 20th century during the mainstream-media era, were political elites who believed that a public like that existed and behaved as though it existed.
That belief regulates elite behavior. It leads them to say, well, if you make a campaign promise, you need to at least try to live up to that campaign promise, otherwise bad things will happen.
Zach: Have some shame about lying.
Dave: Yeah, exactly. That sort of shame, and that sense of, well, you know what, Democrats and Republicans need to occasionally reach across the aisle to collaborate on something.
Otherwise we can't just go and say, well, no, it was their fault, 'cause the public will notice that we were at fault, that we weren't willing to collaborate. The big danger with things like Cambridge Analytica, and in general with this rise of political microtargeting, [00:32:00] is that it leads political elites to the realization that they can just lie and get away with it, and nobody cares.
Zach: Yeah.
Dave: Josh Hawley, who is now the senator from Missouri. When he was running for the Senate, he was Attorney General for the state of Missouri, and he was in court bringing a suit trying to strike down the Affordable Care Act and get rid of protections for preexisting conditions. He then ran advertisements saying that he was trying to defend preexisting conditions and that his opponent, the Democrat, was trying to take them away.
That’s just like an outright lie. Blatant.
Zach: Yeah.
Dave: And it worked. And the problem is that when those blatant things work, you're left with political elites who realize, you know what, we can just get away with whatever. And that makes America ungovernable. That's part of how you get an FEC that doesn't function anymore: you have a bunch of commissioners who decide, you know what?
We don't even need to pretend to play this game anymore. We will just sabotage the entire thing, and our side will face no consequences. Mm-hmm. Part of the story of how we got ungovernable enough to [00:33:00] get Trump definitely dates back to Mitch McConnell at the beginning of Barack Obama's term, in 2009, saying outright that his primary goal was to make Barack Obama a one-term president.
He said, strategically, the way we're gonna do this is, no matter how hard Barack Obama works to try to craft bipartisan solutions, we're not gonna give him any Republican votes. And then we're gonna turn around and say, that's 'cause you weren't able to work with us; you were too partisan. So they basically decided, we are going to try to slow down everything.
We're going to try to gum up the economic recovery, under the assumption that if we do that effectively, you'll be blamed for it. And in 2010, Obama was blamed for it. Democrats lost seats. And the danger there wasn't just that Democrats lost seats. It's that it taught Mitch McConnell and other political elites, particularly in the Senate, that there was no cost to that kind of obstruction: that if they obstructed, the other side could get blamed for it.
That's how you become ungovernable. So the challenge here is the more we focus on the microtargeting and this [00:34:00] narrative of a secretive company like Cambridge Analytica. Yeah, exactly, like wizards. Often when I give talks about this, the theme is what I call living in a world without wizards.
If we believe that there are these wizardly micro-targeters who can get the public to believe whatever you want them to believe, then you don't need to do the hard work, as a political elite, of actually doing what you said you were gonna do. You just need to hire a wizard. Or you figure out, okay, there are no wizards, but I'm not being held accountable, so what the hell?
Let me just get my fellow cronies rich.
Zach: And even if the leaders don't believe that, if the public believes it, that weakens the myth of the attentive public, you're saying?
Dave: Yeah, but the myth of the attentive public, to be clear, is a myth amongst political elites. What we need for America to function again isn't a public that is finally attentive in the way that we've always wanted it to be. We're not gonna get that.
Zach: Right.
Dave: But what we need is political elites who either take that seriously in their bones, or who act as if [00:35:00] they do, either because they've come to believe it or because they've all decided, well, if we're gonna have this privilege, we ought to do something proper with it, damn it. And if we don't have those political elites, then we're not gonna have a functioning country.
Zach: Yeah, that's a great point. I mean, reading your pieces and tying them into Newt Gingrich's behavior: I was reading more about him the last couple days, and there was a great Atlantic piece describing just what you're saying, where he basically realized, and taught other people, that the moral or ethical constraints they previously thought were holding them back
(like people actually paying attention to what they were doing) really didn't matter, and that they had more leeway than they thought to just basically do whatever they wanted. And the public, in the end, wouldn't really care that much about lies, or insults, or things that were previously out of bounds for them.
Dave: And there's been a cycle there, 'cause as Newt Gingrich is figuring that out in the mid-1990s, we're leaving the mainstream media [00:36:00] system that we had had, in which there were a small number of channels and therefore a small number of gatekeepers. We add on the internet, but we also add on partisan cable news.
So then you get more gatekeepers, but also these partisan audiences. So now, you know, Matt Gaetz is a Republican member of the House who has correctly calculated that his path to fame is just acting like somebody who's gonna show up on Hannity a lot.
So long as he's acting out in that manner, he's gonna get attention on Fox News, and that's gonna make him a rising star of the Republican Party. And the real problem is that that calculus works out. Another great example is Ted Cruz, back in 2013, when Ted Cruz shut down the government
as the Affordable Care Act was coming online. He shut down the government in order to try to get the House to deny funding, even though he was a senator. And he read Green Eggs and Ham on the Senate floor; it was this whole big thing.
And the government [00:37:00] was shut down for weeks on end with no real plan. He didn't really have a plan for how that was gonna work, how they were gonna get Barack Obama to give up on his signature legislation. So eventually he just gave it all up. If you looked at public opinion polls, everyone was saying, this is dumb, reopen the government already.
But his calculus was: this will make me more popular and more famous in hard-conservative and Fox News circles. This will add fame, attention, and funding to my coffers. And there will be no actual cost, because by the time the next election rolls around, people won't be voting on that.
They’ll be voting on the latest scandal or on their like, deep preferences.
Zach: It's almost like cable news and the fast news cycle, combined with the internet, have made everything about getting attention. Or at least that's a path that people have now, just getting attention, from the sheer amount of coverage that's available across the board.
Dave: Right, and getting attention from the audiences that care the most. Right? When there was only a half [00:38:00] hour of six o'clock news, there was still a dynamic of, okay, you would like to get attention; attention would be nice, please. But that was a pretty slim pool to be sipping from, and the gatekeepers there expected you to behave in this sort of, mm-hmm, lockstep, bipartisan manner.

Once you get a much wider media system, that changes the incentive structure, and then you start realizing, wow, if I'm playing in a manner that leads Donald Trump to retweet me and Sean Hannity to devote segments to me, that's gonna matter in material ways, in terms of your ability to raise money for your next election. That is a thing that gets rewarded.
And this primarily goes back to 1998 with the Republican Party. But, you know, if you are a Republican member in office who is routinely partnering with Democrats on signature legislation, you will face a hard primary challenge from somebody who's well funded.

That sets up a set of incentives, and you run that for a few election cycles, and you'll be left with people who try to say [00:39:00] extreme things to get on Fox News and who never, ever work with the other side on anything.
Zach: How do you see the Internet’s role in weakening our collective agreement on what is fact versus fiction?
Dave: It's an accelerant. I wouldn't say the internet is the central actor in this drama, in particular because we've always had conspiracy theorists; we've always had people who believed things that were wrong. The internet makes that more public and lets it travel in different ways, particularly the current internet.
The internet of 2020, in a way that the internet of 1995 didn't. I was doing some historical research a few years ago: I read the entire back catalog of Wired magazine, chronologically and cover to cover, which was a fun thing to do. One of the stories that stood out for me was from January of 1997. There was this whole article about all the conspiracy theories about candidate Clinton, but it was conspiracy theories on message boards, and it was Bill Clinton instead of Hillary Clinton.[00:40:00]
And it was just this great side-by-side: they were writing about that back in 1997, but saying it didn't really matter, because, look, Clinton won. And then in January 2017, we're writing about Macedonian fake-news factories and how they cost Hillary Clinton the election. So, you know, some of it is, okay, who won?
Who lost? But some of it is also that we have a different internet in 2017 than we did in 1997, because we have Facebook and social media, these new giants that didn't exist back in the dial-up era. But that's kind of a digression. I'd say broadly, the internet has made it easier for our perspectives to be made public and to travel.
And that leads us to be more aware of the people who disagree with us. It's now easier to find someone you disagree with and yell at them. So that can create a set of incentives. But again, I think we need to make a distinction between the mass public and political elites, because particularly in politics, it's okay for people not to have thought and read really deeply on every political issue under the sun.
Like, it's okay if you don't know [00:41:00] an awful lot about the details of public schools in your area. It's okay for me not to know an awful lot about how public health officials deal with crises. We need public health officials to be really good at dealing with crises, so I don't need to think about that.
We need educators who are really good at dealing with education policy, so you don't need to think about that. If we were able to pause time at any point in history and investigate how many wrong beliefs people have, I'm not actually sure that we have more wrong beliefs today, with the internet, than we did in, say, 1957. But we can see more of them.
And again, the real problem is that we are now actively mistrusting our experts, and our elites are no longer behaving as though we need expertise to get things done. And so now we've become the country with such a bad COVID crisis that you can't even travel to Europe anymore, and school's about to start in a few weeks, and it's not gonna work.
That's what happens when we actually take the expertise out. And the internet's not causing all of that, but it's certainly not helping either.
Zach: Would you agree that, 'cause it seems to me that just the sheer, overwhelming amount of [00:42:00] content on the internet, which includes all sorts of weird and wrong stuff,
makes it difficult for people to feel like they've found the truth, or found something correct. And I know you're saying it's an accelerant, not the main factor, but it seems to me that it hurts the myth of the attentive public in a major way. For example, if I'm an elected leader and I know that it's so hard for people to find the truth, and there are so many different points of view, that weakens the myth of the attentive public in a major way.
Would you agree with that?
Dave: I would agree that politicians being able to see that their constituents don't really know what they're doing, that has a corrosive effect. For you and me, though, the thing that stands out for me is that it depends on where you go on the internet.
Like, it's really easy to find your corner of the internet. If you spend all your time in certain Reddit groups or certain YouTube communities, [00:43:00] it's really easy to become convinced that everyone on the internet completely agrees with your weird QAnon or white-nationalist beliefs. Right, your bubble.
Yeah, yeah. It's pretty easy to actually find your people on the internet. If you and I were having this conversation in 2005, we'd be talking about how amazing it is that people could find their people on the internet. From the late nineties into the early aughts, the groups that were finding their people online were, like, gay people.

It used to be really hard. My best friend when I was in high school in the nineties was gay, and there was a lot of friction for him to find his people.
Zach: Right.
Dave: My students (I mean, I'm a college professor), my gay students now, they don't have a sense of what the closet was back then.
Partially ’cause of how social norms have changed, but also partially ’cause it’s now so much easier to find your people.
Zach: Yeah, and positive things too, for like psychological issues and stuff; you can find people talking about that. Right. So it's got these positive and negative effects. Yeah.
Dave: Yeah. And we've seen this also with Black Lives Matter. What used to be sort of
the mundane terror [00:44:00] of white cops shooting young black men, that was a mundane tragedy because of how often it happened, but there was just quiet rage; there was no way to organize around it. And now we're seeing that build into something, because those moments are becoming movements. So all of that is, I think, good.
And then also we're seeing how that same phenomenon can lead QAnon supporters to decide that, actually, all of the Democrats have secret pedophile rings where they also eat children. Then a few of them will show up with a gun at a pizza place, demanding to go down to the basement, even though there is no basement in that pizza place.
That's the Comet Ping Pong "Pizzagate" story.
Zach: It's like an accelerant. I mean, I think the same thing that makes it good that people can find their people is also a radicalizing influence, because they just keep sharing and amplifying the same content. And I'd argue you can find the same instances on the anti-cop [00:45:00] side of Black Lives Matter, because people are sharing things. Like, there was an example of people sharing videos from another country, and everyone thought they were from the US, and it was riling people up online, you know?

Mm-hmm. It's these kinds of bubbles, echo chambers, where people are accepting the stuff from their peers unquestioningly. So it's really easy for these groups, purposefully or not, to get accelerated.
Dave: Right. The ease of sharing can have some downsides. I remember, this was probably 2017 or so, there was a brief story going around that I saw on Twitter: somebody had compared, I think, a photo of Trump, the actual photo versus the version released by the White House.
And in the photo released by the White House, Trump's hands were larger. And Trump has this, you know, longstanding thing about getting called small-handed and then insisting that his hands are big and that that's indicative of other things. So I saw that story going around, and I looked at it for a second, and because it fit so well with a narrative that I had already heard and believed.
Mm-hmm. [00:46:00] And, you know, this is after the Trump campaign insisting that the crowds were larger than they were and all that. So I was like, yeah, ha ha, they got caught doing that thing again. So I retweeted it. And then some reporters noted, uh, actually, this didn't happen.
Right, somebody just goofed on this. Yeah, and we all do it. So those things happen, in particular because sharing is so easy that when something fits a narrative, you just pass it along. And if I had spent five hours making sure that was real before retweeting it, that means there was some important work I was supposed to be doing that I really didn't wanna do.
I'd really be procrastinating hard if I were putting that much effort into it. And what that can lead to is a situation where, depending on what part of the internet you're in, you're either being bombarded by things that silence you, or getting rewarded a lot. You know, for the two of us, as white dudes, we don't encounter an awful lot of hate on [00:47:00] Twitter.
The women I know who have the same job as I do, the amount of shit they have to go through on a daily basis is incredible. Which is one of the ways that, like, I would say Twitter is a toxic cesspool, but not for me. Twitter's really fun for me. It's systematically a toxic cesspool for an awful lot of people who don't look and behave the way I look and behave.

It would be really nice if they fixed that, damn it.
Zach: I'd argue it works fine for me too. I mean, I honestly don't even check my responses anymore. And I got retweeted by the CEO of the Babylon Bee yesterday, for one example. Mm-hmm. I do get a pretty good amount of hate, actually, from both sides of the aisle. But we can talk about the impacts of social media a little later. First, back to Cambridge Analytica: it was true that Cambridge Analytica was doing some pretty creepy manipulations in other countries. I did wanna mention that, because, you know, [00:48:00] them and their parent company, SCL, as mentioned in The Great Hack documentary, were doing some weird things in other countries, and they actually were built on this concept of manipulating people's behavior.

How much of that they actually did, I don't know, but it seemed pretty clear from a little bit of research that they had done some shady campaigns in other countries. So I did wanna throw that in here, and also point out that that's obviously different: the fact that they did shady stuff doesn't mean they were the microtargeting experts and data geniuses they claimed to be, and probably wanted to be. But I think it's good to separate the two. Yeah, they were a bad company, certainly in many ways.
Dave: But they were definitely bad dudes who were engaged in a bunch of pretty old-school, shitty behavior. I mean, Roger Stone-style shitty behavior, but on the global stage and for money. I wouldn't want any of your listeners to come away thinking Cambridge Analytica has gotten such a bad rap,
they were [00:49:00] great. Right, right. No, they were really scummy. And one of the ways they were scummy is that they made a bunch of claims about their magic data to cover up the old-school shitty behavior they were engaged in.
Zach: They were lying across the board, in many ways. Yeah.

So I wanted to ask you this: watching some of Alexander Nix's testimony in front of Britain's, uh, Parliament, I was curious, why do you think he didn't just come out and say, look, we exaggerated this stuff, we didn't actually do that?
Do you think he just wanted to have it both ways and be seen as a genius for later work and he was kind of afraid to shoot himself in the foot?
Dave: Yeah, I mean, it's hard to get into Alexander Nix's head. I don't particularly wanna be there.
Zach: Okay.

Dave: It did seem to me that, at least in the first few months of the Cambridge Analytica scandal, they initially engaged in the normal behavior: there was a surprising win, and we need to claim credit for the win. And there was pushback within the Trump campaign over credit. [00:50:00] Brad Parscale wasn't Cambridge Analytica; he was Trump's data guy. So Brad Parscale wanted to be like, I am the data genius.
And Cambridge Analytica were like, no, no, we were the data geniuses. And the two kind of fought back and forth, right, with Parscale saying, yeah, you were, like, one computer terminal in our office; I was the guy. Mm-hmm.
Zach: Mm-hmm.
Dave: All of them are engaged in myth making.
There was a New Yorker article about Parscale years and years ago, about the biography, the story that he would tell in any public speech, and they just noted all the things that were not even a little bit true. And he was like, yeah, but I'm telling the story.
What are you gonna do about it? You're the lying media. So I think Cambridge Analytica were trying to overclaim credit, the same way they overclaimed credit for their role in the Leave campaign. They had a role in the Leave campaign; they weren't the single engine behind it all, but they were
trying to claim to be the single engine behind it all. And then when everything blew up in their faces, they were stuck saying, yes, we did say that back then, and not wanting to say, [00:51:00] but we're just a bunch of bullshit artists. Right, right. A couple of people like me were saying at the time, that's a massive overclaim.
And other people were saying, if that's true, it's explosively terrible. And really, it's kind of mundane, normal, shitty.

Zach: Watching some of his testimony, I got the feeling he was like, I want to say more, but I also want to retain this myth that I've built.

Dave: You know: I would really like to have another job after this, so let me not say that I lied to people in order to get money.
Zach: Right, right. Are you on Facebook? Yeah, I don't use it, but I have it. Would you have any problem now doing a quiz on there, one of these quizzes from some third-party app company? Would that bug you at all, or not at all?
Dave: I mean, it would bug me, 'cause I'm over 40 and those quizzes were always stupid. I'm not gonna do those quizzes, 'cause I'm 40.

Zach: Yeah, I was never interested in those.
Dave: Now, the problem with those quizzes wasn't that everyone took them. It's that back then, when you took the quiz, the app [00:52:00] got access to all of your data and all of your friends' data, for every friend who hadn't self-selected out of it.
Ian Bogost (I dunno if I'm pronouncing his last name right, because I've only read his work) has this wonderful piece that he wrote, I think, for The Atlantic. He's a computer scientist and a game designer, and he created this Facebook game way back in the day called Cow Clicker.

The point of the game was to show how stupid these games are. All you could do was you had a cow, and you could click on the cow. And it got weirdly popular; everyone downloaded it. Oh my God. And this was back in probably 2014 or so, certainly before Facebook had made the changes to the design.
And so not only were people using it and giving him all of their data, they were also giving him the data of everyone who was friends with them. And he wasn't gonna do anything with it, but he got access to it. Because, back then, here's the phase shift, if we wanna talk about it from Facebook's perspective: for a long time, Facebook had all of this data on what you and I click on,
and [00:53:00] also who our networks are. And Facebook was desperately trying to convince advertisers that that data was worth anything, up until 2012, when they went public, and a little after that. For a long time they were just desperately trying to convince advertisers of that, which means they were giving away access to all of our data, because at the time it was viewed as worthless.
And then later, after they’ve gone public and after it’s looking valuable, they start to shore that up and say: okay, if somebody takes a quiz, maybe you don’t get access to everything they’ve ever clicked on and everything their thousand friends have all clicked on (except for the six friends who said, no, I don’t wanna give access).
Like, maybe we’re not gonna give all this data away, ’cause now it’s viewed as valuable. But that horse had already left the barn. One of the interactions between Facebook and Cambridge Analytica is how Facebook, when they were informed that Cambridge Analytica had all this data they weren’t supposed to have, sent them a cease-and-desist letter, and Cambridge Analytica was like, yeah, yeah, we’re getting rid of it. And they didn’t. On one [00:54:00] level, that’s proof of Facebook’s negligence. But on another level, Facebook doesn’t have the Facebook cops that they can send over to SCL and say, hey, we need proof that you destroyed it. What you do is you have your lawyers send a sternly worded letter, and you assume that that company’s then gonna destroy the data, because otherwise they’ll get sued.
And in this case, this scummy company didn’t do it. But the underlying story here is that all of that data was freely available to anyone who created a game like Cow Clicker, or a dumb quiz, because that data used to be viewed as completely valueless. Now it’s viewed as insanely valuable, and it’s probably somewhere in between those two.
Zach: Right, it’s these extreme viewpoints. And it’s interesting, it’s kind of similar with Cambridge Analytica, because Facebook was probably like, well, we want people to view this stuff as valuable so we can sell it, but we don’t want the backlash where now we can’t really sell it as freely as we want to. It’s interesting how the perception creates these backlashes for the people trying to promote it, you know?
Dave: Yeah. And, you know, first you [00:55:00] wanna create a market for this data, and then after you’ve created a market, you have to deal with the consequences of having really succeeded at creating one.
Zach: If you could make one change to Facebook to help the world, or help our society in some beneficial way, if you were the leader of Facebook, would you have an idea of what that would be? Whether that might be, like, requiring more verification or something like that?
Dave: I wouldn’t do this as the leader of Facebook. I’d get fired, like, so fast. But look, I think we need data protection laws in the United States at least similar to what they have in the EU. But the other thing is, they’ve gotta break up. They’ve gotta break off things like Instagram from Facebook. Part of the real problem with regulating Facebook right now is that Facebook is a monopoly.
It is too big to be run well and effectively and responsibly. You know, people who quit Facebook, when you ask them, okay, where did you go? They went to Instagram, they went to WhatsApp. Those are both Facebook. So while [00:56:00] it would be nice if we had a regulatory state that was good at regulating... Like, you either need regulators or you need markets. When you’ve just got monopolies that don’t fear regulators, you don’t have either of those things. So you should probably be breaking up Facebook, at least the big components that ought to be competing with each other, so you can have some real market dynamics.
And then we should have some basic regulations, at least duplicating what Europe is doing. That would be a start. Mm-hmm.
Zach: So getting back to the social media question: you were part of an interesting moment in the history of social media conflicts. You had a joking tweet comparing New York Times columnist Bret Stephens to a bedbug.
Yeah. And this got some traction. Could you explain in a few sentences how that went down?
Dave: Yeah. So, for listeners who have no idea what you’re talking about, they should just type in “bedbug Twitter.” It was almost a year ago now, late August 2019. It was the first day of the semester at GW, where I work, and there [00:57:00] was a news headline going around Twitter that bedbugs had been found in the New York Times newsroom. Twitter on a good day does this thing, or at least the parts of Twitter that I’m in, where people will see a headline like that and everyone will just offer up riffs.
None of us are real comedians, but we do our best making jokes for each other. And the best joke I could come up with, since I’m in the segment of Twitter that complains every time there’s a new Bret Stephens column... He’s this conservative, Never Trumper columnist (and that’ll be important later) who writes for the New York Times after having written for the Wall Street Journal. And he’s just a giant obnoxious scold who’s often wrong and also often boring. And so my segment of Twitter, for months and months, every time there was a new Bret Stephens column, would be complaining about how, oh my God, this guy is such a pain in the ass, how can they not get rid of him?
So my, like, 30-second riff was: the bedbugs are a metaphor. The bedbugs are Bret Stephens. That got zero retweets; it got [00:58:00] nine likes. Again, I’m not a professional comedian, but I thought that deserved, like, three retweets and 30 likes.
Zach: It’s always the ones you’re most proud of that never get any traction.
Dave: I was like, ’cause this is a good one. Not only is it a burn, but it’s a burn where you think about it, and it’s like, oh yeah.
Zach: That’s what struck me. There’s some things people say where I’m like, that was really mean and uncalled for, but this, to me, wasn’t. It’s not like you said Bret Stephens is a bedbug and he needs to be stamped out.
You know, my litmus test for comedy anyway: if I were a Bret Stephens supporter, or a conservative, could I read that and still find it funny, a funny metaphor? And to me it passed that test, because you can imagine, even as a conservative, seeing Bret Stephens as the beleaguered columnist who everybody abused and complained about, whom the New York Times’ liberal audience viewed as a bedbug. It’s a funny analogy to me. So anyway, leading up to the overreaction...
Dave: Yeah. So, again, not my best joke, but one of my better jokes, and I think it’s nice as a zinger, because the people who at that time [00:59:00] followed me on Twitter all find Bret Stephens to be obnoxious and difficult to get rid of. And those are the qualities of bedbugs. So I make that joke. Nobody pays attention to it. Okay, that’s fine. And that night I get an email titled “From Bret Stephens, New York Times,” from his New York Times account. I’m not gonna read it verbatim for you, but if people look online, they’ll see it.
But he emailed me, CCing the provost of my university. I should note, I didn’t use his Twitter handle in that tweet either, ’cause that seemed rude. I don’t follow him, he doesn’t follow me. I did not use his Twitter handle. This joke was not meant for him. It was meant for other people who are annoyed by him. He claims he saw it because somebody sent it to him. I’m pretty sure it’s ’cause, on a Monday night, he was randomly name-searching himself on Twitter, which, you know, don’t do that. But so he found this, and he emails me and the provost of my university saying that I’ve set a new low for discourse on the internet. Which, like: hi, have you been on the internet?
And [01:00:00] inviting me to come to his house, meet his wife and child, and then call him a bedbug to his face, because then at least I would show some... I forget the exact terms he used, but it was like “genuine courage and intellectual integrity” on my part. That’s it, yes: genuine courage and intellectual integrity. Which, um, is an overreaction.
And the thing that bothered me: if he had emailed me privately, I probably actually would have written back to him and engaged with him, because I’m not someone who normally gets New York Times columnists emailing him. And also, like, what the hell’s wrong with you, man? Let’s see where this goes. But the fact that he was CCing my provost was very clearly a power move. For people who don’t know university administration, that’s my boss’s boss’s boss. That is a high-level “I’m gonna call the manager,” trying to flex that I’m at the New York Times, you are not, and you better watch what you say about people who are at the New York Times.
Zach: And this is a guy who has complained about the, you know, cancel culture kind of stuff before, right? Yeah.
Dave: This is a guy who’s got sort of three or four themes he trots out in columns, [01:01:00] and one of them is how kids these days are so sheltered in their safe spaces and need to encounter ideas that they disagree with. That’s one of his main hobby horses. He gave a graduation speech on that exact topic, and he’s written multiple columns about that exact topic. So he writes me and my provost about this. I’m a professor of strategic political communication, I should note. I teach episodes like this, so I knew exactly how this was gonna go.
I didn’t know it was gonna go down this big, but I was like: oh, Bret. Oh, Bret, no. So I then tweeted, not naming him, saying: this afternoon I made a joke about a New York Times columnist, and he just wrote to me and CC’d my provost. He really didn’t like being compared to a metaphorical bedbug. And all of poli-sci Twitter and all of media Twitter, who I’m connected with, were like: that’s gotta be Bret Stephens.
This is funny. So they all started churning about it and talking about it. And then I followed up about an hour later, saying: okay, fine, here’s the email. And I [01:02:00] included a screenshot of the email, taking his email address out, because, again, I’m classy. And that then just set the internet on fire. The next morning he tried to defend it, and said that actually, me saying he was a metaphorical bedbug was akin to what authoritarian regimes do. Which: hey, dude, don’t go there. And he just got dragged about it for days.
Zach: Even by Breitbart, even by far-conservative sites. Probably because they view him as a liberal’s conservative, so they don’t mind dragging him. But he got dragged...
Dave: Pretty universally, yeah, and that mattered. I mean, Fox News and Breitbart were on my side with this, which, like... weird. It matters a ton, because one thing that stands out for me: this lasted a week. On the Friday of that week, after it had settled down, he wrote a New York Times column where he argued (he didn’t use my name) that whereas the Nazi propagandists had radio, today we have Twitter, and [01:03:00] liberals on Twitter who are mean to moderates are the modern-day Nazi propagandists. Hi, I’m Jewish. So that wasn’t great, and my friends at the JC were like, what’s going on with that? And he got dragged for another few days over that as well. But one thing that stands out for me is that I went from having about 8,000 Twitter followers to having about 40,000 Twitter followers.
Zach: Whoa.
Dave: There was an entire week where, like, I was interviewed on NPR. Bedbug Twitter was a thing. Through all of that, I didn’t get a single death threat. And part of that, at the time, was really...
Zach: Well, yeah, you brought the internet together. I mean, everybody was hating him.
Dave: Yeah. So, yeah. And like part, part of what stood out to me at first was like. Me being at the center of the internet storm. It was weird and I didn’t love it, but it was nowhere near as hard as, you know, since this is what I study for a living. Like it wasn’t as bad as I thought it would be. And part of that, again, it was like checking my privilege like, oh, like snarky white dude is not the prime target of, of social media.
So I’m [01:04:00] like, I think if I had been any of my female colleagues would’ve gone way worse. Then the other, I think the other layer of it though is that’s where it matters, is he is a never, Trumper, Alexandria Ocasio-Cortez tweeted about it and so did Donald Trump, and everybody was just dragging Brett Stevens.
And so I think the other reason I didn’t get death threats is because, if it had been me versus, like, Tucker Carlson, I think I would have had to hide my family. I think people would have found my address.
Zach: Right. He was an easy opponent, in a sense.
Dave: Yeah. The only people who were backing Bret Stephens were, like, baby boomer retirees who were writing me. I got actual letters to my office saying things like: I don’t know how you got tenure when you’re so mean-spirited; you should apologize to the man; you used a bad word. It’s like: oh, grandma, this is adorable. But that was his audience. And so, going back to that theme of we can find our people, we can find our publics online: I got real lucky that the person I fought with there was somebody with such a small online following. Everyone was like, yeah, we’re just gonna [01:05:00] rip this guy. I’m sure he has two friends on the internet who don’t like me, but that was about it. The other thing there is that those moments would have been, I think, so much harsher had I had a different identity, an identity that internet publics like to troll and attack much more readily.
Zach: So here’s what Bret Stephens said when he got off Twitter. This was his tweet: “Time to do what I long ago promised to do. Twitter is a sewer. It brings out the worst in humanity. I sincerely apologize for any part I’ve played in making it worse, and to anyone I’ve ever hurt. Thanks to all of my followers, but I’m deactivating this account.”
Yeah, I mean, it really read to me like he had been getting worked up about various people insulting him, and he was looking for somebody to make a stand against, and he chose a really bad target, probably because he was itching for a fight. And it read to me like he knew, and was kind of ashamed, that he’d chosen the wrong thing, which is why he deactivated his Twitter and felt he had to keep doubling down. Would you agree with that read, basically?
Dave: I would only partially agree with it. Two things there. [01:06:00] One is that he has a long history of sending those sort of call-the-manager emails. This actually came out recently with the changes at the New York Times, when James Bennet stepped down as head of the opinion section and New York Times editors and staffers were talking about what it’s like working there. One thing that came up is that apparently when somebody is hired at the New York Times, a person of color, they’re given the Bret Stephens talk: at some point, Bret Stephens is probably going to email your manager.
Again, as somebody who teaches strategic political communication, the advice after you’ve gotten off Twitter (this happened on a Monday) is that the thing that’s gonna work for you is to keep your head down and write your next column about something else. Anything else. People will move on and not remember you as the bedbug guy, because people have short memories and they don’t care. It feels like you’re at the center of the world and it’s gonna last forever, but just let the story pass you by. [01:07:00] Instead, he devoted his Friday column to calling me a Nazi, and now he will always be known as the bedbug guy. That also, I think, shines through there: the doubling down really says, man, you are not capable of actually realizing that you made a mistake here.
Zach: Right. And I think it’s a fundamental mistake strategically, in the sense that if you’re someone who’s really well known and has a lot of attention, making a big deal out of it gives someone else the opportunity to get that attention. He basically gave you all of those followers, which is just a fundamental misunderstanding of how these things play out, you know?
Dave: Yeah. This is the punching-down-versus-punching-up concept. But also, and I know you’re a card player, we’re not here to talk about cards, but he basically spent a week being on tilt. You know, he lost in a spot against me where he didn’t expect that he would. Then he just doubled down. And then he decided: every single time I see Dave in a hand, I’m just gonna keep raising. I’m just gonna throw all my money at him.
Zach: Yeah, yeah.
Dave: And I could just sit there being like: I’m gonna wait for spots in which I’m very comfortable and strong, [01:08:00] and then accept your money. Thank you.
Zach: Right.
Well, I think, you know, I actually do believe that social media is a big factor in our polarization, for many reasons. I’m actually working on a piece about this. The factors that I see have been mentioned in several pieces, but I haven’t seen a comprehensive list of what makes it that way, you know, kind of in a “the medium is the message” kind of way. But one aspect I see is that people can be made more extreme in their viewpoints by how the crowd reacts to them. Right? So, you know, Bret Stephens, or pretty much anyone: I think we all have these emotional inclinations. When someone attacks us on social media, there’s that first instinct to double down and keep going and say, I wasn’t wrong. And there’s also this emotional hurt that you feel, this visceral reaction, and you wanna do something about it. And these things can lead a lot of people to feeling like the other side is bad. And I’ve felt this before. I’ve been attacked a good amount, ’cause I’ve written some [01:09:00] things that got featured pretty prominently, and you have that feeling of “these people are crazy.” But then you have to take a step back and try to be mature about it and say, oh no, these are just a few people being very rude and very unreasonable. They don’t represent a large percentage of the population. But I think it’s very easy to overreact in those ways, and I think that’s one of the factors in social media being an accelerant. Do you have any thoughts on that? Have you felt that, or seen that, as a factor in polarization?
Yeah.
Dave: So, yeah, I think that’s definitely there. The dimension I would add in: we talk a lot in my field about the concept of the marketplace of ideas, because, again, in political communication, I’m a media and politics scholar, so we talk about that a lot. We’ve never really had a marketplace of ideas. But one thing that I think particularly stands out in cases like this is that we need to think about how market incentives skew how we behave online. Right? What I mean there is: you’re focused [01:10:00] at a level that is, I think, correct and real and needs to be examined, which is how an individual psychologically reacts to and deals with either praise or punishment online, and how that then shapes what we do in the future. Will we retreat from behaviors that otherwise we would engage in? Or will we do more of a bad thing, or more of a good thing? And the layer I would add on is this: when you elevate beyond you and me and your listeners, people who are just going about their lives online and sometimes facing a rebuke they didn’t expect, or cheers they didn’t expect, and you look at, say, YouTube stars or podcasting stars, it’s not just that the incentives are magnified because it’s a larger audience. Those incentives also become monetary, and that then skews things even further. That’s where policy decisions by a company like Facebook can really affect the course [01:11:00] of, you know, a phenomenon
like QAnon. Actually, just today, the day we’re recording, Facebook finally shut down over a thousand QAnon sites, thank God. But if they had done that a year ago... I dunno if you heard the news that there’s a Republican in, I think, the state of Georgia who just won the Republican primary.
She’s a QAnon believer.
Zach: Right?
Dave: And Trump pushed her as well. So she’s gonna be in Congress, and she’s a full-on QAnon conspiracy theorist. That will be a member of Congress. But part of how we get the QAnon phenomenon isn’t just people being able to find misinformation online and believing it.
It’s also this strong incentive where, if you are really good at selling the QAnon story, then you’re making bank. You know, like Alex Jones turning into the Infowars guy who insists that Sandy Hook was a false-flag operation and then sending his listeners, for years and years, to harass the families of the victims. They can’t visit their children’s graves. [01:12:00] The way that you get there isn’t just a psychological story; it’s also a story of financial, economic incentives. Part of the story of Facebook and the social media era, of Facebook, Google, Twitter, and social media: part of it is how it’s changed individuals’ ability to speak and listen and be heard online.
But the other part of it is how it has completely skewed the finances of news and information, where now all of the money flows through Facebook and Google, and they both get their cut, but then it also flows to the actors who are best at playing the Facebook and Google game. Those actors often end up being really bad actors. So that’s, I think, a whole other level of it, if we’re thinking not just about the psychological stuff (and I don’t wanna steer you away from that, because I think that’s also really important to explore, and really rich). There’s also a layer where the way this stuff gets absurd and bad isn’t just because a YouTube star has realized: hey, I get rewarded for this, why don’t I [01:13:00] do more of it? It’s also that they then realize: oh, my ability to keep on making seven figures a year
Zach: Mm-hmm.
Dave: is tied to me being able to continue to own these particular searches. What do I need to do next? And that leads them to be more and more extreme, and more and more awful.
Zach: The algorithms that these companies are using are just so influential. Like, Facebook changes one little aspect of the algorithm and everything shifts. And getting back to your point, it’s like these companies shouldn’t be responsible for this world-changing influence, you know? Like, where are the rules that they should follow? They’re not getting that. So they’re just changing something one day that might make QAnon kind of stuff more likely, or whatever it may be. Yeah, it’s a great point.
Dave: Yeah. And the problem there: there’s this wonderful piece I always think back to by Joshua Micah Marshall at Talking Points Memo called “A Serf on Google’s Farm.” I end up thinking about it like once every few months. It’s a piece about [01:14:00] what it’s like being a publisher on the internet where Google is the entire advertising stack. There have been a couple of times when Talking Points Memo would be covering news on, like, white nationalist violence, and Google’s algorithm would accidentally classify those as white nationalist stories, and so those stories would just be unfindable on the internet and they wouldn’t be able to make money.
The reason I mention that: it’s not that the answer here is “there shouldn’t be algorithms,” because actually, no, algorithms are valuable in a lot of ways as well. The problem is actually the scale. If everything wasn’t relying on, well, really just the two companies, Google and Facebook (and Twitter’s a rounding error in a lot of this stuff), then little changes wouldn’t matter so much. The fact that it is just the two of them means that their market force is so enormous that you live by the algorithm and you die by the algorithm, and that is more power than anyone can safely handle. So the problem is the scale. The problem is that they are monopolies, and as monopolies, little changes in the algorithm just have much bigger [01:15:00] consequences than they should ever have.
Zach: I feel like so many of these problems seem kind of unsolvable to me, in the sense that Facebook and Twitter have such hard problems to solve. No matter what decisions they make to police content, someone’s gonna complain about it, and that leads to those groups feeling persecuted and developing a stronger sense of group identity, those kinds of problems. And the problem of reaching something that most reasonable people would agree on seems so hard. Do you see this as a solvable problem? Or is it more like something we’re gonna have to learn to deal with, whether that’s becoming more mature as citizens, as people, adjusting to the modern fact of these things? Or do you see it as something we can address with better policy?
Dave: I would say that it’s solvable, but there’s no simple solve. What we need to do is move back to a governance system where we take regulation seriously. And taking regulation seriously [01:16:00] means crafting regulatory frameworks that aren’t perfect, but solve the problems in front of you, and then, when new problems emerge, you adjust them to solve those problems. That’s not rocket science. It’s just hard work, and it requires serious commitment from people who take knowledge and expertise seriously, and commitment from elected officials who aren’t just gonna randomly politicize every single little thing because they think they can fundraise better off of it. Now, how do we get there? That’s gonna be a hard trip. I’m a bit of a dystopian there, because we don’t have a lot of time before things like climate change produce spikes that make all of this that much harder. The fact that we are handling COVID so terribly, as a dry run for bigger, harder things that will come in the next 10 to 20 years, that should scare us.
That’s pretty depressing. But it’s not impossible. What it requires is recommitting to saying: [01:17:00] okay, hard problems have hard solutions, so we’re going to work on them, and then when it’s not good enough, we’re gonna make it better. That can work. Other countries are doing better, both with regulating the internet and with handling COVID. So the fact that the United States is doing so piss-poor at all of this stuff doesn’t mean it’s impossible. It means that we should probably look at other countries and say, okay, let’s do what they’re doing.
Zach: Do you want to mention anything you’re working on now before we go? And any ways to contact you?
Dave: As I mentioned, I’m big on Twitter. The best way to find me is usually @davekarpf on Twitter. I mentioned that history-of-the-internet project, reading all of Wired. That’s the big thing I’m working on for the next few years, actually: trying to get a sense of what the future looked like with the internet of the 1990s, and the internet of the early aughts, and the internet of, like, 2012. Because I think we keep on expecting technology to change politics and society in roughly the same way, and it never quite works out that way. So I’m studying those patterns to try to figure out: why do we keep on getting the [01:18:00] future wrong? If people are curious about that, if people wanna riff on that, I love geeking out on those questions, so I’d love to chat.
Zach: Great. This has been Dr. Dave Karpf. Thanks a lot for coming on, and thanks a lot for your work. It’s great work you’re doing. Thanks, great talking with you. This has been the People Who Read People podcast. I’m Zach Elwood. If you liked this episode and found it educational, please consider sharing it on social media. I make no money on this podcast, so increasing the listener count is the main way I’m encouraged to do more interviews, and you can help me do that by sharing episode links and by leaving ratings or reviews for the podcast, if the platform you listen on allows that. If you play poker, you can read about my work examining poker behavior, AKA poker tells, on my website, and you can find me on Twitter at @apokerplayer. Thanks for listening. Hope you enjoyed.