Last week’s election told us many things. Perhaps chief among them is how divided we are. We don’t agree on our most pressing problems, nor their solutions. Many say these divisions come from our increasingly segregated social media universes. We self-select news and information to reinforce our respective worldviews, a development that’s particularly troubling given that much of what’s on Facebook and other social media isn’t news at all. It’s manufactured false information that brings in lucrative ad revenue but leaves us sorely misinformed. Join us to discuss how social media is shaping our world.
- Joshua Benton Director, Nieman Journalism Lab
- Deepa Seetharaman Reporter, Wall Street Journal
- Caitlin Dewey Reporter, Washington Post
MS. DIANE REHM: Thanks for joining us. I’m Diane Rehm. Donald Trump was the winner in last week's presidential election, but it's clear that Facebook and Twitter were winners as well. According to recent research, social media played an important role in shaping public opinion on candidates and the issues in the 2016 campaign, a troubling trend, given how easily misinformation can be spread.
MS. DIANE REHM: And welcome back. We now turn to the role social media played in shaping public opinion on candidates and the issues in the 2016 campaign. Here with me in the studio is Caitlin Dewey, she's a reporter for the Washington Post, Joshua Benton is director of the Nieman Journalism Lab, and Deepa Seetharaman is a reporter for the Washington -- sorry, for the Wall Street Journal. And throughout the hour we are going to be taking your calls, your comments. Do join us, 800-433-8850. Send us an email to firstname.lastname@example.org. Follow us on Facebook or Twitter.
MS. DIANE REHM: And Deepa, let me start with you. I know in a piece published last week in the Wall Street Journal, you wrote that the 2016 presidential campaign was the first social media election. How so?
MS. DEEPA SEETHARAMAN: Hey, Diane, thanks for -- thanks for having me.
SEETHARAMAN: I think this is the first time you've seen this number of people on social media. In 2012, Barack Obama and the -- and his campaign were able to use social media to great effect, but this time you -- just the volume of people is much, much higher. You know, Facebook has a quarter of the world's population on it. That didn't happen -- that wasn't the case four years ago.
REHM: And a growing percentage of voters now rely on Facebook for their news.
SEETHARAMAN: Absolutely. I mean, it -- Pew says that some 62 percent of Americans get some kind of news from social media and 44 percent from Facebook, and the entire product is designed just to get people to spend more and more time on it, to give them more things, in Facebook's parlance, to give them more things that they want to read to improve their lives.
SEETHARAMAN: But it is -- it's a huge source of news, and a lot of people get their news exclusively from what's in their feed.
REHM: And Joshua Benton, you wrote that American political discourse in 2016 seemed to be running on two self-contained, never-overlapping sets of information. Tell me what was happening and why.
MR. JOSHUA BENTON: Well, what I meant there was that on one hand you had a circle of information among people who were operating the media that were absolutely shocked that Donald Trump was actually elected. And on the other hand you had this sort of isolated information universe that happened for a lot of conservative voters, a lot of Trump voters, who were, along with seeing real, legitimate, critical news stories about Hillary Clinton, also seeing a lot of made-up news.
MR. JOSHUA BENTON: I can speak from my own experience. You know, I live in Cambridge, Massachusetts, a pretty liberal place. The universe that I was seeing on Facebook was very different from the one that my family in a small town in South Louisiana were seeing. And I looked on the Facebook page of the mayor of my hometown in Louisiana, and he was sharing stories claiming that Barack Obama had finally confessed that he was born in Kenya, that Hillary Clinton had led to the murder of an FBI agent who was investigating her, that Pope Francis had endorsed Donald Trump.
SEETHARAMAN: And these were being shared alongside real news stories but at an incredible volume. Some of those stories were shared hundreds of thousands of times. And that's a sort of, you know, shadow information universe in some ways that is putting these untruths out, you know, to large numbers of people.
REHM: But shadow information that has extraordinary power, Joshua, to influence not only voters but the population at large.
BENTON: It's true. I mean, Facebook is not some scrappy young upstart. It's the single biggest driver of attention to news in the United States and arguably in the world. And when you get to that kind of scale -- you know, when Mark Zuckerberg made a statement after he started to get some pushback on this issue, saying that less than one percent of Facebook content is hoax or false information, well, less than one percent of Facebook is still an awful lot of Facebook. It's still a lot of content.
BENTON: And, you know, Facebook has built an enormous advertising business around the idea that the content that you put in front of people on Facebook influences them. You know, it influences buying decisions, it influences political decisions. Both presidential campaigns used Facebook advertising to a great extent. For them to say that, you know, Facebook didn't really influence anybody in this election is, I think, a tough sell.
REHM: And turning to you, Caitlin, what about the hunt for ad revenue from online information, and how does that propel this kind of spread of misinformation?
MS. CAITLIN DEWEY: Yeah, that's really interesting, and it goes right to some of the changes that we're seeing Google and Facebook make in response to this. You know, I think one of the reasons this phenomenon has gotten so out of control is that it's been industrialized by this emerging group of Internet entrepreneurs who realize that they can use Facebook to drive enormous amounts of traffic to their fake news websites, and there they can sell display advertising.
MS. CAITLIN DEWEY: So, you know, I've spoken to people who run these websites who will make upwards of $10,000 a month on advertising that's sold through Google. In fact, it's so profitable that you can make that your primary occupation, just writing fake news intended for Facebook.
REHM: And they deliberately put that news out there to mislead. And yet, you said that they -- that Facebook and other organizations, Twitter even -- how do you deal with that misinformation? Do you reluctantly, the word that Joshua used, put some editorial control in? Do you put some brakes on? How do you do it without crossing that line that says, I have First Amendment rights?
DEWEY: That's a really difficult question, and I think it's computationally much more complicated than people realize. So there are a lot of calls for Facebook to introduce some sort of algorithmic change that will hide fake news in the news feed, for instance. And then you get into these questions of, you know, how does Facebook determine what is fake and real? And then even further than that, I mean, do we want Facebook to address sites that are so partisan they are verging on the fake but maybe are not, you know? So there are some questions there.
DEWEY: I think what they've done so far in terms of trying to damage the monetization model of these fake sites -- both Google and Facebook have announced they're not going to let fake news sites use their ad platforms -- that seems like potentially a good first step, but the big question is what they'll do more aggressively in the future.
REHM: But Deepa, how do they know, how do they track, how do they follow so-called fake news sites?
SEETHARAMAN: Well, from the ad-selling perspective, Facebook hasn't really set any kind of criteria to tell you -- to tell its users what is and isn't fake, and how they're going to make that determination is an open question. And if you think about the feed, that's an even more complicated question. Like Caitlin was saying, there's so much partisan news out there, stuff that is kind of vaguely true or partially true or has some truthiness to it but isn't true.
SEETHARAMAN: And I think, you know, you -- you see this a lot from the conservative side, but this exists on the liberal side, too. I mean, that meme that has been going around for a few months, it's an apparent quote from "People," from "People" magazine about Donald Trump saying if I were to ever run for president, it would be the Republican Party because they're so stupid.
SEETHARAMAN: And there's no evidence that that -- that he ever said that. So you see this issue on both sides, and there are some clear-cut cases of right and -- of true and false, like the "People" magazine example I just gave you, but there's a lot of -- a lot of gray, and that's going to be very difficult to sort out.
REHM: And Caitlin, you once wrote a regular report on fake news stories, but you gave up on it. How come?
DEWEY: It's a little disheartening, Diane. So for a little over a year we ran a weekly column on the subject, but as time went on, we found, number one, that the volume became too great to address. And there's also this really interesting effect that's been observed in academic research on the subject, a sort of backfire effect: when you correct certain types of deeply partisan fake news or hoaxes that go to people's fears or beliefs, not only do people not believe the correction, but it enhances their belief in the false information to begin with.
DEWEY: So at some point the column became almost a fool's errand, and we began to worry we were doing more harm than good and felt maybe there was a different way to address misinformation that wasn't that.
REHM: Do you still feel that way?
DEWEY: I do, and, you know, I think especially in the past six months we've seen how very true that is. I mean, the amount of fake news on Facebook and elsewhere has proliferated to levels I never thought possible, and, you know, the share counts suggest that many people believe even extraordinarily outlandish information.
DEWEY: Well, such as the pope endorsing Donald Trump. I think an easy Google search would show you that wasn't the case. That has been shared upwards, I believe, of 1.4 million times on Facebook. There were some memes going around, especially in Pennsylvania, that insisted Clinton voters could vote from home, online, the idea being that those people then wouldn't actually show up to vote.
DEWEY: You know, there's some really ridiculous stuff out there, but the share counts suggest a lot of people are falling for it.
REHM: Joshua Benton, do you believe that fake news had an effect on this 2016 presidential election?
BENTON: I think it had an effect. It's, I think, impossible to say whether it had a decisive effect. When you have an election that came down to, you know, the placement of 107,000 votes, there are a lot of things that you could imagine that might have changed the course of how things go.
BENTON: I do think that a very large share of the people who were sharing these fake news stories and these memes are people who already are -- they're just confirming their own existing perspective. In other words, the people who were sharing that fake Donald Trump quote were probably people who were not going to vote for Donald Trump, and the same is true for the people who were sharing the Hillary Clinton information.
BENTON: So I don't know who -- the degree to which it changed votes. I'm sure it changed some, but there's no way to know how many. What I think we can say is that it really increases polarization, because it does create a narrative that is so much cleaner than the actual, real narrative of news, right? An actual news story isn't going to paint someone as the enemy and as evil to quite the same degree that a fake news story can, because it's not constrained by existing in the factual universe.
BENTON: So if you have a story that -- you know, there was a story going around that Hillary Clinton was an actual demon or a Satan worshipper. These are things that are not true, but those are the kinds of things that make it unthinkable to compromise with the other side. And I think the fact that we are moving from a two-party system, where there used to be a fair amount of overlap, to one where there's not much at all, this feeds into that.
REHM: Deepa, do you want to comment?
SEETHARAMAN: Yeah, I mean, I completely agree. I think there's no direct line that you can draw between somebody reading a fake news story and then somehow being duped into voting for either Donald Trump or Hillary Clinton. But what it does is it shapes the broader political discourse. It makes people less willing to listen, and it makes them, you know, just a little bit -- it hardens the divisions.
SEETHARAMAN: And I think that can have a caustic effect on political discourse. I think you saw some of that this year. But there's no way to really isolate the effect of social media, because it exists within all these other dynamics that were happening in the election this year. It's just that social media played such a big role in amplifying some of those divisions.
REHM: And you're listening to the Diane Rehm Show. I want to open the phones. There are folks who would like to join us with their own thoughts, their own ideas. Do join us, 800-433-8850. First to Joe in Cleveland, Ohio. You're on the air.
JOE: Hi, yes, yes, thank you. I think the desire and the demand for other sources like social media and what may be called alternative media means the mainstream media has to look at itself, because they've lost their own credibility. And, you know, look, the Washington Post, NPR, New York Times, CNN, whatever, they do not lack for outlets. But I think it's been absolutely proven, and I'm not a Trump supporter, I'm a third-party supporter, but this election has proved, if nothing else, that what used to be, you know, supposedly, it never was, but what used to be supposedly objective media really took sides. And once you take sides, just like one of your guests said, once you start constantly demonizing one side, you stop being listened to.
JOE: And the Washington Post and the New York Times, and NPR, your show included, Diane, constantly demonized Donald Trump. I mean, he deserves some of it, but the coverage was unbelievably one-sided. So people go elsewhere. And as far as the social media with Facebook, well, Facebook is a marketplace, right? Somebody puts up a stupid right-wing story, somebody can counter it with facts, and it shouldn't be incumbent on -- you know, Facebook is like a telephone line. This is where people talk and communicate, and if somebody puts up something untrue, somebody else can counter it with something that's true.
REHM: All right, and Joshua Benton, how do you respond to the notion that the major media outlets, the Washington Post, the New York Times, NPR, all became so slanted in one direction, anti-Trump, that people were forced to turn elsewhere?
BENTON: Well, I think that it's absolutely true that the path that people are taking in consuming their media is of a piece with a general distrust of institutions, distrust of the government, distrust of mainstream media, and that -- you know, distrust of institutions has been growing markedly for some time now.
BENTON: You know, I look at the coverage of the major news outlets in this country, and I think there was a lot of great investigative reporting on both sides. I think that this election cycle was unique because we've never had a candidate like Donald Trump before, and I would question whether or not the shift to social media is driven by the aggressive reporting against Donald Trump, because that's a trend that's been happening for a very long time, and I don't think you can say quite the same thing about, for example, Obama-Mitt Romney or Obama-John McCain.
BENTON: But there's certainly no doubt there is a very large and growing share of the American population that doesn't believe that the media is trustworthy, and that's a question the media can respond to in a few ways. One would be to try and earn back trust. You know, that's one way to approach it, and that's what they've been trying to do for some time, and it doesn't seem to have worked.
BENTON: Another approach would be to sort of have a system much more like Europe's, where news organizations have a specific ideological identity. You know what The Guardian is, and you know what The Telegraph is in the U.K. Maybe that's what the New York Times and the Washington Post are going to turn into.
REHM: Joshua Benton, director of the Nieman Journalism Lab. We'll take a short break. I do want to say briefly that I think Donald Trump made his own news, and we as journalists did report on it. Stay with us.
REHM: And welcome back. We're talking about fake news, how it showed up on Facebook, how it reverberated throughout the 2016 election campaign. Let's go now to Steve in Germantown, Maryland. You're on the air.
STEVE: Hello, Diane. Thanks for taking my call.
STEVE: You know, I just want to follow up on a previous caller, who I think said a lot of what I'd like to say, and very succinctly. But this discussion I'm hearing now is partisan. The examples of people influenced by false information all suggest Trump supporters, when it is on both sides; there's plenty of false information that people just simply believe. And those people that believe...
REHM: Steve, I do think, excuse me, I do think -- I know you have lots more to say, but I do think you've heard, already, examples of how it is surmised that the Clinton campaign may also have put out fake news. Maybe you missed that.
STEVE: Well, how about that Trump is a racist, that he's a xenophobe and he's a misogynist? That is fake.
REHM: I think that what they were doing there was picking up on some of the things that Mr. Trump said.
STEVE: Well, right. And you, in your commentary, often use that as a presumption. It's underlying your questions while you act like an objective observer, which you are not, Diane Rehm.
REHM: I think what I'm doing, Steve in Germantown, Maryland...
STEVE: It's a presumption. It's a presumption, and I heard it today in the presumption that these people, somehow, Trump supporters, are just going to blindly believe Bigfoot stories. They can discern the difference, Diane. And it's not just liberals that can. They're smarter than the liberals, because they voted in real change for all. And you will see how smart they are by voting in Donald Trump.
REHM: All right, Steve. Thanks for calling. Joshua, a comment?
BENTON: Well, I think, you know, we have more evidence that this is a very polarized universe, a very polarized country and a very polarized media ecosystem. I mean, the caller is correct that this sort of behavior happens on both sides, and happens in a lot of other directions that could be divided into sides.
BENTON: But, you know, there have been some analyses that do indicate that it does seem to be a bigger factor on the right than on the left. Both sides do it, it happens on both sides, but Buzzfeed did an analysis looking at right-leaning and left-leaning Facebook pages and the kinds of information they were sharing and found a significantly higher share of false stories on the right-leaning Facebook pages. And there was reporting a couple of days ago that Facebook had done its own internal analysis of this question.
BENTON: And it found that filtering out fake news disproportionately affected conservative points of view and conservative pages, and that's one reason why they didn't implement that technology. So it is absolutely a phenomenon that exists in lots of directions, and I think we've tried to acknowledge that, but it's not equal on both sides.
REHM: All right, to John in Ann Arbor, Michigan. You're on the air.
JOHN: Hi, Diane. Long-time listener, and I'm really going to miss you.
JOHN: I'm going to miss everything you do for us. I was really going to take issue with one of your guests talking about how the mainstream media's coverage was slanted. Because from all of what I listened to, and I try to listen to a broad spectrum, they were basically, and I don't say this pejoratively, but they were basically taking dictation. They were repeating the man's words. And if people are upset about what he was saying, or what the media was saying, well, what the media was saying was what Trump was saying. So I don't get their distrust of the media for basically taking dictation.
REHM: Caitlin, do you want to comment?
DEWEY: Yeah, you know, Diane, as we listen to these callers, I'm thinking, you almost don't need us as guests. They so perfectly illustrate all of the fraughtness of this fake news issue, right? So, you know, we have John call in and say, you know, this is objective truth that we understand: they're just taking dictation of what Trump said, which, of course, as a member of the mainstream media, is also my impression. But, you know, then you had Steve call immediately before that and say, you know, that sort of dictation is, in fact, false, is in fact partisan.
DEWEY: And this illustrates the difficulty that Facebook and others are going to have. I mean, we can't even agree on what is factually true or not anymore. And so how are we expecting a technology giant or an algorithm to do that? I mean, it's really a mess. I can't imagine.
REHM: And at first, Mark Zuckerberg, you know, downplayed the whole argument, but more recently he has talked about making some changes. Here we have an email from Noah that I want to read to you. He says, like it or not, Facebook, Google and Twitter are media outlets. They are obligated to behave like one. At a minimum, they should post to each and every such instance, quote, false, mostly false, or unsubstantiated. Posters and commentators should be notified when changes are made to these posts.
REHM: Perhaps you could use the users to identify false information and provide proof that they are not true. How do you react, Deepa?
SEETHARAMAN: I think we run into this issue of what is and what isn't true. You know, I think one effect of having us get so much of our information from social media is that people are operating from completely different sets of facts. And who's going to make that call if something is mostly true or true? Who's the arbiter of that? I mean, when you think about the way Facebook internally thinks about it, they're very, very reluctant to look partisan in any way.
SEETHARAMAN: They want to maintain this standpoint of being a neutral platform and an exchange for all ideas. And deciding what is and isn't true is a partisan activity in their eyes. And it's also difficult. You know, all of us do this professionally. It takes a lot of work, and we don't always make the right call, and it's a thankless task in many cases, because people will still criticize you for making certain decisions or emphasizing certain things. This sounds like a nice and easy solution, but it isn't.
SEETHARAMAN: Because the underlying mechanism of deciding what is and isn't true, especially through software, is incredibly difficult.
BENTON: I think that's right. And that's why I would think that Facebook should probably not be messing around with deciding whether a piece from Breitbart or MSNBC or Alternet is correct or not. I think there is a broad sweep of information that is partisan, but Facebook shouldn't be messing with it. I mean, there is a very threatening edge to the possibility of one company deciding what is true and what is not. That's why I think the only piece of this that I would advise them to be focusing on is the obviously untrue.
BENTON: The -- not the ones where this is a matter of dispute, but the ones...
REHM: Okay, all right, let me stop you right there.
REHM: For years, Donald Trump said that President Obama was not born in the United States. During the campaign, he made a single-sentence statement, saying President Obama was born in the United States. Now, how to figure out how news organizations are going to deal with just that simple statement, when Donald Trump himself had been putting forth the opposite all those years. And to my listener who was so critical of me, that may sound like an anti-Trump statement. However, it comes from Donald Trump himself. Joshua Benton.
BENTON: I think you're hitting on a difficult set of calls that editors have had to make. I mean, Dean Baquet, the executive editor at the New York Times, said some time ago that Trump, by being such an extreme figure within our political history, had, in his words, given the New York Times a little bit of extra courage in calling things that are not true false, calling them lies in some cases. In that particular case, a story by Michael Barbaro and others called that a lie.
BENTON: Which is something that the New York Times has been very cautious about doing historically, even in cases where there is a clear truth and untruth. So news organizations are changing how they think about this, and I'm sure that, you know, we're going to see significant changes in that over the next four years as we try to cover a Trump administration.
REHM: What about the Wall Street Journal, Deepa?
SEETHARAMAN: I think, you mean, when it comes to discerning between truth and fiction?
SEETHARAMAN: And truth and lie.
SEETHARAMAN: I think -- I think news organizations, generally speaking, are always a little hesitant to call something an outright lie. I mean, we say that there's a lack of evidence, or this is something that, you know, he did say, he did tweet this four years ago. You know, we'll provide evidence. But I think what some of these callers are reacting to, and potentially why you see more bogus stories on the conservative side than the liberal side, is this broader feeling that mainstream media does not speak to them, right?
SEETHARAMAN: I mean, we heard this explicitly just now from a caller. They feel like it's a hostile environment for people who might have different views. And I think there are a lot of -- you know, I get a lot of reader email and a lot of reader feedback, especially these days. You know, in any story I've written at the intersection between the election and Facebook, I've gotten a tremendous amount of reader feedback. And the feeling is, you know, you guys don't speak to us.
SEETHARAMAN: So we're going to go somewhere else. And they read the lack of coverage that would, kind of, speak to them as a kind of condescension. And I think it's very obvious that a lot of people feel let down, and that's why they go to these alternative news sites. Which, you know, some of them are very interesting and bring to light things that I wouldn't necessarily have noticed or covered. And I can't say that mainstream media doesn't have some type of bias, but there's a lot of fake news there too.
SEETHARAMAN: And I think what some callers and some of our readers are reacting to is, well, how can you possibly -- why are you making such a big fuss about fake news now? Is it just because you lost the election? I mean, literally, as I've been sitting here, I've gotten emails just telling me to get over it, that my candidate lost, even though I have in no way said who I did and didn't support in this election, and my coverage doesn't, I don't think, reflect that. But they read a deeper meaning into it.
REHM: Caitlin, tell us about the trolls. Who are they? What do they do?
DEWEY: Who are the trolls? That's a great question. I would argue, in this election, that perhaps everyone is a troll. Traditionally, it's a term that has applied to a certain type of antagonist on the internet who tries to provoke people, largely to upset them. There's a sort of schadenfreude there. Frequently, they just want to disrupt other discussions and things that are going on. And, you know, I recently wrote a piece that argued that this was the year that politics became a form of trolling.
DEWEY: That trolling culture, in fact, went offline and began to permeate other areas of our political discourse, because, of course, we're sort of perpetually trolling each other all the time. Arguably, fake news is sometimes a part of that.
REHM: And you're listening to "The Diane Rehm Show." All right, let's go to something that Deepa was talking about. To Becky in Hudson, Ohio. You're on the air.
BECKY: Hi, good morning, Diane and guests. And Diane, I'm going to miss you so much.
BECKY: Good luck to you.
BECKY: You know, there are so many interesting things that you're discussing this morning, but I just wanted to share a little bit about what I'm observing in my family and my life. You know, there seems to be a huge disconnect for folks. It's like there's this notion that if I believe it, even if it's not true, you have to accept it. I mean, I've had conversations with family members and relatives around facts, real facts that can be verified from numerous sources.
BECKY: And when I share those facts, it's as if I'm attacking their beliefs. And I suppose, in some way, I am, because I'm trying to shift their thinking. But the belief structure seems to be greater than just political theory. It's something more personal, something more like faith. And I think in Ohio, you know, I have relatives in Youngstown and Steubenville, you know, all along these corridors that voted heavily for Trump. They're really, really looking for someone to save them.
BECKY: And having conversations about facts around Mr. Trump, his personal comments in the past, and now the people who are surrounding him, pushes so hard on the desire for a better life that they've expressed through their vote for him that they're just not interested in listening.
REHM: And I think you've touched on the key word, Becky: listening. It strikes me we're all going to have to learn to listen better, to listen more broadly, to understand that, exactly as you said, you're speaking out of what you say are facts that you sourced from many different places, and they believe they are speaking out of facts and their own beliefs. And that's going to be hard to change. Wouldn't you say, Joshua?
BENTON: Absolutely. And social media certainly amplifies that, you know. Facebook is designed, as was said earlier, to follow what you're interested in, follow your click patterns, follow the things that you spend time with. And that means that it ends up sending you more information that confirms your own point of view. And I think having fake news injected into that really kind of weaponizes that space. But there's one other thing that I think is a factor here.
BENTON: The newspaper media, for example, in the United States, used to be very geographically distributed. Every town had a paper, every city had a good-sized newsroom. But local news has really fallen on very hard times and is going to continue to fall on hard times. And the national outlets are becoming more dominant, and increasingly, news is being created by folks in places like where all of us are, in Washington, D.C. or San Francisco or Boston or New York.
BENTON: And that's something that I would hope the national news organizations would think about addressing, doing a better job of covering the rest of the country.
REHM: Indeed. And I think that the diminishment of local news, the diminishment of local newspapers, has helped to divide communities. I mean, we're fortunate enough to have The Washington Post, the Washington Times, the Wall Street Journal, the New York Times, but we need more local newspapers. I want to thank you all. Caitlin Dewey, Joshua Benton, Deepa Seetharaman. Forgive me. Seetharaman. I think I got it. Thanks, all.
SEETHARAMAN: You got it.
REHM: For being with us. I'm Diane Rehm.