Hey all,
Just a heads-up that I’ll be taking next week off. I’m offline vacationing, and then undergoing a big cornea surgery — which is both exciting and terrifying, but I’ll see y’all once I return. (Hopefully better than I can today, anyway 😉.)
And we’ve got our first podcast interview coming out the week after next, with an amazing couple who’ve done heaps of interesting work at the intersection of AI, tech, and art in India. So look forward to that.
Onto this week’s post — thanks, as always, for reading and engaging.
Ted
Russ Roberts: “Do you think [the internet] will change some of the outcomes of how the world gets molded by… political processes?”
Hal Varian: “Well I think one of the nice things about Google or other search engines is it allows for pretty rapid fact-checking. And if somebody’s out there spouting a bunch of nonsense you can check into it. And if everybody is sitting there with their mobile device verifying what the [political] candidate is saying, I think it’s going to bring a little more accuracy into political activity….”
July 2008 — Excerpt from the EconTalk podcast
Those words were spoken by Russ Roberts, economics professor and host of the EconTalk podcast, and Hal Varian, Berkeley professor and Google’s Chief Economist.
I came across this interview a few months ago, and delighted in it — not because I think Dr. Varian comes across as naive in the conversation (though he does), but because I think his views in 2008 were fairly representative of how many internet enthusiasts felt at the time. As I mentioned last week, I was one of them!
Varian and Roberts discuss — beginning around minute fourteen of the interview — the ways in which the internet allows for a “greater democratization” of information and education. And more democracy, of course, implies more voices — which implies more opinions. Which, it occurs to me now, should have led us to realize that we’d have a lot more truths, or “truths,” to deal with down the line.
So what happened? And where do we go from here?
Concentration and Fragmentation
We talked about the internet’s concentrating impacts in my review of Kyle Chayka’s book, Filterworld. About the many ways the internet’s size and scale — how we’re all using the same apps and websites, all in the same way — may be homogenizing our art, music, food, design, travel, and much more.
But the internet doesn’t only cohere and homogenize — it can also fragment. It fragments ideas, and beliefs, and social groups, and even the notion of “what truth is.”
I don’t want this to turn into a thesis-length discussion about the rise of disinformation and misinformation in our political lives. (There’s a difference between the two, it turns out.) But what I do want to discuss is the unique ways in which those forms of information — good, bad, ugly, true and “true” — wend their way into the lives and minds of some, and bypass others.
It would be a problem if all of us were receiving the same incorrect information — at one point in time, much of humanity believed the planet was flat. But when some of us are told it’s flat, and others that it’s round, and still others that it’s, I don’t know, an impossible triangle, that can create real conflict.
Enter the Filter Bubbles
Eli Pariser coined this term when he warned us of the increasingly partisan nature of online media consumption in his 2011 TED Talk. He pointed out how, even thirteen years ago, he started seeing (or really, not-seeing) political content he disagreed with filtering out of his online experience.
This occurs in large part because the algorithms of Facebook, Google, and others “reward” our clicks by showing us more content similar to what we click on. If we interact with posts from the New York Times and NPR, we’re more likely to be shown more posts from places on the web like them — and the same applies, of course, to those more likely to click on posts by Fox News, Breitbart, and others.
As Pariser says, “There is no standard Google anymore.” In his book, he wrote:
“A world constructed from the familiar is a world in which there's nothing to learn ... (since there is) invisible autopropaganda, indoctrinating us with our own ideas.”
Another way of saying this, of course — since we’ve come to so heavily rely on Google and similar sites to glean information, and thus form a sense of truth — is that “there is no standard truth anymore.”
The US has been polarizing by political geography since at least the seventies, and the internet only exacerbates this social polarization. I think it’s reasonable to blame this in part for why Donald Trump’s election shocked so many in 2016 — many of us didn’t even know, online or off-, anyone who openly supported him.
It shouldn’t surprise any of us to learn that research consistently shows people of different political leanings are more likely to trust different sources of news, but it’s worth realizing how stark these differences are — not only in the US, but everywhere in the world, as we’ll soon discuss.
Just look at the incredible gulf between trust levels of even “mainstream” news organizations like CBS, ABC and NBC by those with different political leanings in this 2022 YouGov poll. Only ~15% of Republican respondents trust these organizations, while ~65% of Democrats do.
And only about 50% of Republicans even said they trust something as anodyne as the Weather Channel!
We’re accustomed, I think, to recognizing that different news sources apply various partisan spins on stories. But increasingly, different media sources will cover entirely different stories altogether. One person may think Story A is consuming everyone’s minds, while another has never even heard of it — and thinks Story B is all the action.
Pew Research found in 2019 that two in every ten American adults got political news primarily from social media. And in 2020, they discovered these same people were the “least knowledgeable about key facts during” Covid, and more likely to “hear more about unproven theories that Vitamin C and 5G technology are connected to COVID-19.”
Consider what we know about filter bubbles and online dis- and misinformation, and read more from Hal Varian in that 2008 interview. Again, I don’t quote this to single him out, but to show how far our understanding of the internet’s impacts on a shared “truth” has changed since:
“Well one of my political science friends once told me that the best predictor of your political views are your neighbor’s political views. So the people you associate and talk with will have similar, reinforcing political attitudes. And of course, I only have a few neighbors, so then you’ve got a little circle of self-reinforcing points of view. And now I’ve got the world as my neighbor, so now you’ve got some of these blogs, you can experience a much larger variety of viewpoints. And I think only have a much better chance of reaching an intelligent view.”
Isn’t that remarkable — that fifteen years ago, we thought the internet would lead us all to “only have a much better chance of reaching an intelligent view!” Not a different view, or a conspiracy-minded view. Boy, were those the good old days.
There have been efforts to combat the partisan effects of filter bubbles, of course. Around the time Pariser was first warning us of them, Sean Munson, a professor I worked with at the University of Washington, created Balancer, a browser plug-in that would “nudge” users to diversify their online media consumption if they’d been reading too much of a certain political leaning. And the organization AllSides runs an Instagram account under the same name, striving to provide balanced news and media ratings to “tackle media bias head on.”
These efforts are admirable. But they do feel, today — in a world of TikTok and Truth Social, X and Instagram, NewsMax and OAN and whatever Tucker Carlson’s doing now — a bit like fighting a forest fire with a pipette, don’t they?
The Whole World 🔥
It’s easy to think that our collective siphoning off into different information cohorts is a uniquely American problem. But it’s not — at all! Which may make us feel better about America, but perhaps much worse about the state of the world at large.
Foreign Policy invites readers to look “Inside Latin America’s Fake News Problem.” Conspiracy theories spread among elderly Korean conservatives on YouTube, and via Telegram in Singapore. And the Africa Center for Strategic Studies (a division of the US Department of Defense) highlights that “Disinformation campaigns seeking to manipulate African information systems have surged nearly fourfold since 2022.”
Rest of World is tracking incidents of AI-generated election content around the globe too, as “more than 2 billion people in 50 countries head to the polls” this year(!).
According to Reuters, “Audiences in the Global South are more worried about misinformation than those in the Global North. According to the Digital News Report 2021, percentages of people concerned are higher in Africa (74%) and Latin America (65%) than in North America (63%) and Europe (54%). Increasingly polarised politics, a growth in messaging apps and the COVID-19 pandemic may have exacerbated these concerns.”
Indeed, the growing disparity of “what truth is” may have even greater impacts in developing countries than it does in the US and Europe. In 2018, academics, journalists and activists began highlighting stories of “WhatsApp Lynchings” in India, and New York Times journalists working there described how “false rumors about child kidnappers have gone viral on WhatsApp, prompting fearful mobs to kill two dozen innocent people since April.”
I actually met one of those NYTimes journalists in India, and he told me how he’d recently interviewed a farmer with a sick cow. But the farmer, he said, was afraid to bring it to the veterinarian in his truck, fearing others would accuse him of taking it to slaughter — and attack him for it.
Violent WhatsApp messages spread like wildfire across Myanmar around that time too, as Myanmar security forces systematically murdered, raped, and burned the homes of the Rohingya Muslim population. Over 700,000 people fled northern Rakhine State as a result.
Following this and the reports from India, WhatsApp set limits on message forwarding in certain countries. But the damage was already done. Amnesty International later published a report tying direct blame for Myanmar’s anti-Rohingya violence to Facebook’s negligence around rampant misinformation on its platforms. It accused the company of creating “an anti-Rohingya echo chamber,” and went so far as to say the company owes reparations to the affected populations.
How Do We Go Back?
It’s worth distinguishing here between rampant rumors fueling violence in India and Myanmar via chat messages, and the pervasive question of “media polarization” across the world.
In the US, at least, some smart people are arguing that we can’t go back at all — and that maybe we shouldn’t even want to.
And I’m increasingly inclined to agree.
This feels to many of us like a wild idea. I can almost hear the voices in the back of some readers’ brains here screaming “BUT THE TRUTH!!”
We had the chance to interview the brilliant scholar of conservative media A.J. Bauer on our podcast last year, and he convinced me of this point. Maybe the universal understanding of truth, in the “everyone huddles around to watch Walter Cronkite” sense of the term, was a mere artifact of broadcast television. A fifty-or-so-year anomaly in the millennia-long history of human culture. And maybe the idea of a “shared truth” across wide swaths of our population was an anomaly, just like it.
It’s a terrifying idea — but also, perhaps, a freeing one. No longer, says Bauer, are we beholden to an old, complacent set of ideas. He argues that truth, and hegemony, and what we view as “common sense,” are produced:
“In the mid-twentieth century — at least among middle class, to some extent working class and rich white people — there wasn’t a lot of conflict of facts, right? They got their news from television… a lot of people tended to get their news from the same sources. That created a sense of stability. There wasn’t a lot of fighting about facts — there were differences of opinion. I don’t know how you put that toothpaste back in the tube.
“And I don’t think maybe that it’s even good to…. Because I think that, despite the fact that we’re living in a much more complicated media environment, that it’s allowing for a much more nuanced and sophisticated conception of… facts! What they are, right? How truth functions in our society. I would much rather have a conversation about how… you need to engage in the political in order to ensure that your truths are salient and convincing.”
He continues:
“You’ve got one side, the [political] right, which has a gigantic infrastructure and funding that is fixated on culture war. They understand that hegemony, and what we count as common sense, is produced. It is politically constructed, and emerges through conflict — and winning conflicts.
“And then on the left… you have people who are still holding onto a conception of the public sphere from the twentieth century….. which is that there is one set of facts, that are self evident. And that as long as everybody believes those, there are self-evident policy conclusions that emerge from those. And that is a ludicrous thing to believe. It’s never been true — it only appeared true briefly in the twentieth century because of regulation and media and how it was structured. It isn’t, and never will be, true again.
“Conservatives know this, and they’re making funding and strategic decisions based on it… there are a significant number [on the left] who say ‘let’s fund fact-checking,’ ‘let’s fund media literacy,’ and things like this… That’s fighting, you know, fire with—”
“A hammer,” I suggested to him then. Which is about as useful as a pipette.
Song of the Week: Sayyedul Hassan and Nurul Amin — Rohingya Songs performed at a refugee camp near Cox’s Bazar, Bangladesh. Watch for the neighbors peeking their heads in for the live show.
Excellent post, loved hearing a throwback to where so many of us were years ago. The trust in media sources chart was awesome to see as well. Turns out republicans just don't trust much of anything.
I wonder though, is it really a problem if we all believe in something that's false? I don't want to get too philosophical about what truth is and all that, but I'd argue that we do all believe in things that are false right now and just don't know it.
Thank you sir, great points! I like AJ Bauer's similar points in there, referencing Gramsci, that truth is created by hegemony — ie that it's not necessarily "true" so much as manufactured to be viewed as true. (I might be butchering that a bit, but that's the gist.) Definitely good fodder for thought.