Eli Pariser

A quiet revolution is taking place on the Internet. The top 50 websites collect an average of 64 bits of information each time we visit. The personal data they track, from our politics to the shoes we just browsed on Zappos, helps advertisers tailor offers just for us. But one online pioneer believes we pay a big price for that customized experience: living in our own information universe. In our so-called “filter bubble,” we receive mainly familiar news that confirms our beliefs. And we don’t know what’s being hidden from us. Diane and her guest, Eli Pariser, talk about understanding the costs of online personalization.

Guests

  • Eli Pariser, board president and former executive director of MoveOn.org

Author Extra: Eli Pariser Answers Your Questions

Eli Pariser stayed after the show to answer a few more questions.

Q: Can you speak to the long-term retention of social media data, such as tweets and Facebook status updates? The value of tweets to the individuals who make them diminishes over time. Do you advocate for deleting tweets as they become stale?
– From Neil via Email

A: I think it’s a little different for Twitter than for Facebook, because Twitter is an inherently public medium – everything you do is public. And in general, I’m for retaining public data – the good, the bad, and the in-between.

Facebook should give users a choice in the matter when the data in question is private – if you want to take down your Facebook account and the data associated with it, there should probably be a process for that. Right now, Facebook seems to be willing to make private data (like when you said you were a fan of something in 2009) public without your consent. So it’d be good if there were a way to take that data out of their systems entirely.

Q: Google and Facebook are free and nobody is compelled to use them. Why shouldn’t they construct their algorithms to best suit their business needs?
– From Mark via Email

A: Mark Zuckerberg often mentions that he wants to make Facebook an indispensable utility, like a phone company. If that’s what he wants to do, more power to him – but utilities have a responsibility to the public to serve them well. This is like a phone company saying, “we’re going to tap your calls and use them however we want, and if you don’t like it you can use another communication platform.”

Q: I tend to be fairly conservative when it comes to personal information I put out on the web, and I also use firewall software such as PeerBlock, which is supposed to block most IP addresses and HTTP requests that try to connect to my computer… is this doing anything in the way of protecting my information, or is it just making me feel better?
– From Steve via Email

A: I did compile a list of the 10 quickest fixes here on my website, www.thefilterbubble.com. But the truth is that there’s no permanent way to protect yourself or opt out – the technology that does the data mining and personalization is way ahead of the tech that protects most users.

Q: Knowing this happens, how is writing a book that will be recommended on Amazon to people who already agree going to change things?
– Michael via Email

A: Well, that’s where the existing broadcast media come in – hopefully, I’ll get out of the bubble of people interested in personalization. Thank goodness for Diane Rehm.

Read an Excerpt

From The Filter Bubble. Copyright 2011 by Eli Pariser. Excerpted by kind permission of Penguin Press.

Transcript

  • 11:06:55

    MS. DIANE REHM: Thanks for joining us. I'm Diane Rehm. With little fanfare, the Internet is fundamentally changing. Some users are happy that websites like Amazon and Netflix now make recommendations based on individual tastes. Others worry about the cost of website personalization.

  • 11:07:24

    MS. DIANE REHM: A pioneer of online politics, Eli Pariser has written a new book about what the Internet is hiding, what it means for our daily lives and our society. The book is titled "The Filter Bubble." Eli Pariser joins me in the studio. We'll welcome your calls, questions, 800-433-8850. Send us your email to drshow@wamu.org, join us on Facebook or send us a tweet. Eli, it's good to meet you.

  • 11:08:14

    MR. ELI PARISER: Good to meet you, Diane.

  • 11:08:15

    REHM: You've been concerned about what you call a filter bubble. Explain precisely what that is.

  • 11:08:25

    PARISER: Well, it used to be that when you went to a website like Google or, you know, Facebook or Yahoo News, everybody saw the same thing. If you queried Obama in Google, everybody got the same results. And very quietly that has changed. Now, the results that I might get would be very different from the results that you might get. And...

  • 11:08:52

    REHM: How come?

  • 11:08:54

    PARISER: Well, because Google is actually looking at everything I click on, what kind of computer I use, where I'm logging in from, and it's trying to personalize my results. It's trying to give me the results that Google thinks I'll like the best, but not necessarily the results that are the most authoritative for everyone.

  • 11:09:12

    PARISER: And, you know, this is spreading across the Internet. Nearly every top website now has this kind of personalization built in. The filter bubble is the kind of private, unique universe of information that this creates around you. You don't even know it's happening; it's invisible. All this editing is going on and yet, you know, it's very hard to escape.
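
To make the mechanism Pariser is describing concrete, here is a minimal sketch of engagement-based re-ranking in Python. It is purely illustrative -- the topics, titles, and scoring are invented, and any real system is far more complex:

```python
# Illustrative only: a toy re-ranker in the spirit Pariser describes,
# not Google's actual algorithm. It boosts results whose topics match
# what a user has clicked on before, so two users see different orders.

from collections import Counter

def rerank(results, click_history):
    """Order results by overlap with the user's past clicks.

    results: list of (title, topic) tuples, in 'neutral' relevance order.
    click_history: list of topics the user has clicked on before.
    """
    affinity = Counter(click_history)  # crude per-topic interest score
    return sorted(results, key=lambda r: affinity[r[1]], reverse=True)

results = [
    ("BP oil spill: environmental impact", "news"),
    ("BP share price and stock tips", "finance"),
]

# Same query, different histories, different "top result":
print(rerank(results, ["finance", "finance", "news"]))  # finance first
print(rerank(results, ["news", "news"]))                # news first
```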

  • 11:09:37

    REHM: But you know, here people are listening to this program this morning. They might be listening to Rush Limbaugh, Glenn Beck, whoever, they're making a choice. But they know it and that's the difference you're pointing out.

  • 11:09:57

    PARISER: That's right. I mean, it's one thing to turn on MSNBC or Fox and you sort of know what the editing rule is. You know what kinds of things you're likely to see on Fox and what kinds of things you're not likely to see on Fox.

  • 11:10:11

    PARISER: When you go to Google or you go to Facebook, you don't know who they think you are. And you don't know, you know, what that means about what kind of results you're getting, and as a result, you don't know what's being left out. And when you look at the dynamics of these algorithms, sometimes the things that are being left out are the most important things. The things we all really need to know about.

  • 11:10:34

    REHM: So how are they making their choices on what they think you want?

  • 11:10:42

    PARISER: Well, you know, there's this massive kind of behind-the-scenes race on the Web to collect as much personal data about every single one of us as possible. And the databases that now exist are really pretty stunning. Acxiom, the sort of 800-pound gorilla in the field, has about 1,500 data points on every single person in the country. They knew more...

  • 11:11:07

    REHM: 1,500?

  • 11:11:09

    PARISER: That's right. They know if people are right-handed or left-handed, what kind of dog they have. They knew more about the 9/11 hijackers the day after -- on September 12 -- than the federal government did. And the reason that this is all happening is that this personal data is very valuable, because you can use it both to customize ads, but also actually to customize content.

  • 11:11:33

    PARISER: Because people have figured out that if you show people the content that is going to be most kind of personally relevant for them, the things that they're most likely to click on, you can make a lot of money.

  • 11:11:43

    REHM: So you're saying here that what's happening is all of this is invisible?

  • 11:11:53

    PARISER: That's right. You know, and when I first heard about this with Google, I was shocked, because I use Google all of the time and I think of it as a way of getting what the Web considers to be the best result for, you know, whatever I'm looking for.

  • 11:12:12

    PARISER: And it was only when I actually got two friends to sit down and Google BP, you know, at the same time, that I realized how different they were. Because, you know, one person got a search result that had lots of information about the oil spill, about the environmental consequences, and another person just got stock tips. That was the only thing on her whole top Google results.

  • 11:12:44

    REHM: For BP?

  • 11:12:45

    PARISER: For BP. You know, and more recently I did this with Egypt. So I had two friends -- and these are fairly similar people. These were people whose politics were pretty aligned, they're around the same age, they were even both logging on from the New York City area. One of them got lots of information about the protests in Egypt and what was going on. The other one got travel websites. You know, get a ticket to Egypt.

  • 11:13:09

    REHM: And were they both going to the same website for information?

  • 11:13:16

    PARISER: Yes, no, they both logged into Google, they were both using Google. You know, so this is the problem, essentially: in some ways Google's algorithm is very smart, but in some ways, you know, this kind of personalization is pretty dumb.

  • 11:13:32

    PARISER: It just shows you what you're the most likely to click on, and sometimes what you're the most likely to click on isn't really the thing that you need to know about. You know, so one of the examples I talk about is Afghanistan, right? Here's something, the war in Afghanistan, that's incredibly important.

  • 11:13:49

    PARISER: We have people, you know, dying on the front lines there. We owe it to them to know about that. But Afghanistan stories are never going to be the stories that people click on the most. And, in fact, you know, I talked to someone at The New York Times who said when they run stories from Afghanistan, they know, you know, that it's not going to do as well as, you know, the story about Donald Trump. But they run it anyway because it's important, and these algorithms don't have that built in.

  • 11:14:14

    REHM: Now, one emailer says, "I disagree with your guest. If you don't want personalized queries, just use an alternative search engine."

  • 11:14:27

    PARISER: Well, I think, you know, that's not really a full solution. Partly because it's very hard to tell when this is happening and when it's not, and it's increasingly built into a lot of these search engines. But I would say, you know, when Google has a huge amount of the Web's traffic flowing through it, it has a responsibility to point people in the right direction in a civic-minded way.

  • 11:14:52

    PARISER: And it has a responsibility to let consumers know what it's doing and give us some control so that, you know, we can see this happening. I think hiding it from people, you know, makes it very difficult for people to respond to this, because they don't even know that it's happening.

  • 11:15:08

    REHM: But isn't it, in some sense, our own fault because we're putting all this information about ourselves up on, for example, Facebook, so that you've got all this cross-pollination going on?

  • 11:15:25

    PARISER: Well, you know, I think we all have a kind of tension inside us between our more impulsive short-term selves and our more aspirational long-term selves. The people who want to be well-educated citizens of the world and the people who want to read celebrity news. And the best media and the best sort of information services give us a bit of both.

  • 11:15:47

    PARISER: You got a little bit of information dessert as a reward for eating your information vegetables. And these algorithms just kind of strip out the vegetables. You get all the junk food; you don't even see the information vegetables. You don't even see the stories that you need to know but that you might not click on first.

  • 11:16:06

    PARISER: And the thing is that actually, when you look back, you remember those stories more. Those are the things where, you know, you hear some point of view that initially was alarming, when you, you know, hear some idea that you'd never heard before. Those are the things where you actually kind of grow as a human being. You don't remember, like, the ninth time that you read an article about some celebrity marriage breaking up.

  • 11:16:29

    REHM: So what you're saying is we're being influenced by the information that's coming to us, that we never sought out, that we didn't ask for, and we're not seeing the full picture?

  • 11:16:46

    PARISER: That's right, and, you know, I think -- I used to be, I mean, I still do believe in the power of the Internet to create a more democratic society, and I used to be convinced that it just sort of naturally was. I'm no longer convinced of that, because I think when you have the kind of consolidation that we're seeing now, where all of the information online runs through a few big companies who are adjusting that information for the purpose of maximizing their profits, you know, it may not be as good a replacement for, you know, the previous era of broadcast news and information as a lot of people thought.

  • 11:17:25

    PARISER: And instead of sweeping the sort of editorial gatekeepers out of the way and allowing everyone to communicate with everyone, which was sort of the mythology that a lot of us who love the Internet like to talk about, really what we have is a new set of gatekeepers, who are Google and Facebook, and whose algorithms control what we see and what we don't.

  • 11:17:46

    REHM: Eli Pariser is the board president, former executive director of MoveOn.org. His new book is titled "The Filter Bubble: What the Internet Is Hiding from You." And when we come back after a short break, we're going to take your calls, 800-433-8850. Send us your email to drshow@wamu.org, join us on Facebook or send us a tweet.

  • 11:20:03

    REHM: And we're talking about a new book. It's titled "The Filter Bubble: What the Internet Is Hiding from You." Eli Pariser is the author. He's also the board president, former executive director of MoveOn.org. He's worried about privacy on the Internet, what happens when you go to a search engine. The information that comes back to you may be what the search engine, very powerful, wants to send back to you based on what it already knows about you. This, you say, is not entirely new, but there are three major differences.

  • 11:20:57

    PARISER: That's right. Well, you know, one difference is it's invisible, so you don't see this happening. You don't know that your search engine results or Yahoo News are being edited and reconfigured based on who they think you are. And so you don't know what's being left out. You don't know who they think you are. You don't even know that there's editing going on.

  • 11:21:21

    PARISER: The second is that you don't choose it. It's not like turning on FOX or MSNBC, because when you turn those on, you kind of know what filter you're using. With this, it kind of follows you around everywhere that you go. And the third is that it's very hard to escape or turn off. You can't step outside of it. You know, this is something where, in a way, the whole web is working together behind the scenes to collect information about you. Many of the sites are sharing information with each other, and they're using that information to present the Internet that you want to see, but not necessarily the things that you need to see.

  • 11:22:04

    REHM: Here's an email from Sean who says, "Please don't forget to mention the data mining that's being done by Apple's iOS and Google's Android smartphone platforms. Very scary that they're tracking users' locations, creating marketing material based on where you've been."

  • 11:22:31

    PARISER: That's right. And, you know, this is sort of the animating struggle of the Internet industry right now -- to see which of these companies, Apple, Google, Facebook, Yahoo, Microsoft, can compile the most complete dossier of information about each of us. Because they know that the more complete that information is, the more they can serve content that keeps us coming back, and the more they can serve ads that get us to buy specific products. And, you know, some of that is fine actually.

  • 11:23:07

    PARISER: I find it useful sometimes to have a restaurant recommended to me, or a movie. But when it's happening invisibly everywhere, and when we don't have any kind of control over it, when we can't tell Google this is how I want you to direct this kind of search, it's a problem.

  • 11:23:23

    REHM: Here's another email that says, "Can you reset the information these websites accumulate by clearing your history and cache?"

  • 11:23:36

    PARISER: Well, you can in some cases, and Google does make it possible to clear your web history and, you know, to visit Google without some of that information attached. The problem is that you actually only need a few pieces of information to start to make recommendations that are profitable. So I was talking to someone at the company Hunch, which does personalization. And he said that if you can ask someone five questions, you can then guess the answers to almost any other question with about 80 percent accuracy, you know, if you have the right math.

  • 11:24:20

    PARISER: And so the problem is that it really doesn't take much, it doesn't take long, to rebuild enough of a sense of who you are to personalize your data. Now, that doesn't mean that it's an accurate portrait of who you are. It certainly doesn't mean that it reflects all of your complexities, all of the nuances of your personality. It just means that it's good enough that they can show you one link instead of another and get you to click on it a bit more.
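
The Hunch claim can be pictured as simple nearest-neighbor inference: match a new person to the most similar known profile, then read off that profile's answers. The sketch below is a guess at the general technique, not Hunch's actual method; all questions and profiles are invented:

```python
# A hedged sketch of the idea Pariser attributes to Hunch: with answers
# to a handful of questions, guess a person's other answers by matching
# them to the most similar previously profiled user. Data is made up.

def similarity(a, b, keys):
    """Fraction of the given questions two people answered the same way."""
    return sum(a[k] == b[k] for k in keys) / len(keys)

known_users = [
    {"dog": True,  "mac": True,  "urban": True,  "reads_news": True},
    {"dog": False, "mac": False, "urban": False, "reads_news": False},
]

def predict(new_user, question):
    """Guess an unanswered question from the most similar known user."""
    asked = list(new_user)  # the few questions we did ask
    best = max(known_users, key=lambda u: similarity(new_user, u, asked))
    return best[question]

# We asked only three questions, then guess a fourth:
print(predict({"dog": True, "mac": True, "urban": True}, "reads_news"))
```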

  • 11:24:47

    REHM: All right. And here are many questions from listeners. Let's go first to St. Augustine, Fla. Good morning, Charles, you're on the air.

  • 11:25:01

    CHARLES: Yes. Would you say that this process of personalization would lead more towards increased confirmation bias?

  • 11:25:09

    PARISER: Yeah, so confirmation bias is, you know, the psychological phenomenon where you tend to agree with data that is presented to you that confirms your opinions -- you feel good about it. And, you know, that I think is in some ways the foundation of why content sites do this, you know, because it feels very cozy and very comfortable to visit a site and go, gosh, you know, the whole world agrees with me. I am so great.

  • 11:25:43

    PARISER: You know, that's a really wonderful feeling. And feeling like you might be wrong, and like your opinions may not be, you know, the be-all and end-all, well, you know, that's more uncomfortable. So in a way, you know, that's the dynamic that these sites are playing on in order to get people to come back.

  • 11:26:02

    REHM: Here is an email from Stan, who's listening on KWMU in St. Louis, who says, "Does using a private browser avoid the problem of personalization?" Now, explain what a private browser is.

  • 11:26:21

    PARISER: So most web browsers do allow you to do private browsing, which basically means that some of the signals that your computer sends out to tell the Internet who you are -- you're leaving those to the side. And that definitely does reduce the degree of personalization significantly. It means that you're seeing more what a lot of other people are seeing. But with Google, for example, even then, an engineer told me that there are 57 signals, 57 different pieces of data, that Google looks at to personalize the information you get. Everything...

  • 11:26:58

    REHM: Like what?

  • 11:26:59

    PARISER: Like what your IP address is. So that's the unique address that your computer has. Like what kind of computer you have. You can imagine that people who use a Mac visit different websites from people who use a PC. Like where you're located. And, you know, what the settings are on your computer. There are lots of these things, and together they give away more about you than you might think. So the point is, even when you're in private browsing mode, that's not necessarily a certain stop to personalization.
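
Those signals work roughly like a fingerprint: even with cookies gone, they combine into something fairly distinctive. A toy illustration in Python, with invented signal names (the actual 57 signals were never published):

```python
# A toy illustration of why private browsing only helps so much: even
# without cookies, ambient request signals can be hashed into a stable
# pseudo-identifier that links one machine's visits together.

import hashlib

def fingerprint(signals):
    """Hash a dict of ambient signals into a stable pseudo-identifier."""
    canonical = "|".join(f"{k}={signals[k]}" for k in sorted(signals))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visitor = {
    "ip": "203.0.113.7",         # network location
    "user_agent": "Mac/Safari",  # computer and browser type
    "timezone": "UTC-5",
    "screen": "1440x900",
    "language": "en-US",
}

# No cookie needed: the same machine tends to produce the same hash,
# which is enough to link visits and keep personalizing.
print(fingerprint(visitor))
```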

  • 11:27:30

    REHM: Let's go now to Baton Rouge, La. Good morning, Eric. Eric, are you there?

  • 11:27:41

    ERIC: Yes, I'm here.

  • 11:27:42

    REHM: Okay, sir. Good.

  • 11:27:44

    ERIC: Okay. I just wanted to make a comment that, you know, I think that it's a problem because it hurts us being more well-rounded individuals. And, you know, it hurts us being accepting of other people's views if we only see, you know, one side of the news. And I was just -- my question was, is there anything that we can do as consumers or as, you know, people that surf the Internet?

  • 11:28:09

    PARISER: Well, yeah. I mean, I think, first off, as individuals -- you know, as the sort of old broadcast society, where a few editors decided what went on the front page of the newspapers, what went on the TV screens, starts to shift into this new world, it's more incumbent on each of us to make sure that we're building a good set of information.

  • 11:28:37

    PARISER: Plus, when you can choose any set of information that you want, then it falls to you to some degree to take responsibility for this and to actually seek out views that may be challenging or uncomfortable. But I also think ultimately a lot of responsibility does rest with these companies, Google and Facebook, who are building the algorithms that determine a lot of what people see online.

  • 11:29:02

    PARISER: And there's no reason that those algorithms couldn't include data about what people find important, what people find newsworthy, what people think is a matter of public interest, as well as data about what people like, you know, and click on. And, you know, so I think if you want to change this, we need to call on Facebook and Google, and get in touch with them, and ask them to be more responsive to consumers who really want this.
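
Pariser's proposal here has a simple algorithmic reading: add an editorially assigned importance term to the engagement score. A hedged sketch, with invented numbers and an invented blend weight:

```python
# A sketch of Pariser's proposal in scoring terms: rank not only by
# predicted clicks but also by a civic-importance rating. The scores
# and the blend are made up for illustration.

def score(story, importance_weight=0.5):
    """Blend click-probability with an importance rating, both in 0..1."""
    return ((1 - importance_weight) * story["p_click"]
            + importance_weight * story["importance"])

stories = [
    {"title": "Celebrity wedding", "p_click": 0.9, "importance": 0.1},
    {"title": "Afghanistan war update", "p_click": 0.2, "importance": 0.9},
]

for w in (0.0, 0.5):  # w=0 is pure engagement ranking
    ranked = sorted(stories, key=lambda s: score(s, w), reverse=True)
    print(w, [s["title"] for s in ranked])  # at w=0.5, the war story wins
```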

  • 11:29:30

    REHM: Of course, that filter has a certain appeal to people who don't want to be bombarded with everything.

  • 11:29:41

    PARISER: Right. I mean, again, I sort of think all of us have that part of ourselves that wants to just be right and to hear our own views reflected back. And it's easier than ever to indulge that part of our brain. It's the part that just wants to hear, you know, stuff that we like and agree with. And, you know, I think it would be a real tragedy, though, if that was the only thing that we used to get information. And actually, you know, we need our information sources to feed the other part of us, the good citizen, the part that wants to be knowledgeable about the world as well.

  • 11:30:21

    REHM: All right. To Kathleen. She's in Traverse City, Mich. You're on the air.

  • 11:30:28

    KATHLEEN: Hi. I have a question for the author.

  • 11:30:30

    REHM: Sure.

  • 11:30:31

    KATHLEEN: During the primary season for the 2008 election, when the author was the head of MoveOn.org, MoveOn held an online poll to see if members would support Obama or Clinton in the primary. When Obama won, MoveOn removed all those who voted for Clinton from further election communication, and I was one of those people.

  • 11:30:52

    KATHLEEN: I contacted him, 'cause I'd been a volunteer, and he defended it, saying he didn't think MoveOn members who voted for Clinton would be interested in further information about the direction they were taking. I even asked to remain on, because I was sort of like, well, you know, I still wanted to work on the election. And my question is, isn't that the same kind of screening he's written about in his book, but on a smaller scale?

  • 11:31:18

    PARISER: Well, you know, I -- you remember the details of this more than I do. I don't think we removed anyone from the MoveOn list. I mean, I'm certain that we didn't. But...

  • 11:31:28

    KATHLEEN: You did. I have the email communication on my computer. I can send it to you.

  • 11:31:32

    PARISER: Well, that would be great. You know, that probably -- you know, that is an example of this kind of thing. And I think that, you know, in general, what we tried to do at MoveOn was to actually, you know, give people a broad sense of what is going on, and to let people know, you know, everything that the organization is doing, while at the same time trying to provide people with the stuff that's most interesting to them.

  • 11:32:01

    PARISER: But I think, you know, the important thing is to do it transparently, to say what you're doing, and to make it possible for people to adjust their own, you know, information streams.

  • 11:32:13

    REHM: But why would it make sense to remove Kathleen from that list?

  • 11:32:21

    PARISER: Well, I don't think that we did remove her from the whole email list. But, you know, I think what we were trying to do was not be sending a whole bunch of emails saying, get out the vote for Obama, to people who, you know, had voted that they were supporting Clinton, until after the primary was over. We thought that would be a nuisance to them.

  • 11:32:43

    PARISER: And, you know, in some ways this is the same dynamic. It's, you know, not wanting to show people something that, you know, is going to infuriate them more than it's going to serve them. At MoveOn, you know, we were trying to give people opportunities to take action. We thought that wouldn't probably be a very good opportunity to take action.

  • 11:33:01

    REHM: Eli Pariser. His new book is titled "The Filter Bubble." And you're listening to "The Diane Rehm Show." To Ray, who's in Dallas, Texas. Good morning to you.

  • 11:33:20

    RAY: Hello.

  • 11:33:21

    REHM: Yeah, go right ahead, Ray.

  • 11:33:23

    RAY: Just a brief question, and I know the guest touched on part of this already. If you had deleted -- just as a theoretical question -- but if you deleted the browser history so there were no cookies left in the browser, and if you were also, at the same time, not logged into, just say for example, the Yahoo homepage or your Google Gmail -- if you were not logged into anything and no cookies on the browser, would you still get these modified results?

  • 11:33:53

    RAY: And then the second part of it is, how does that compare to using a public computer, say at a library or, you know, at a kiosk that offers Internet?

  • 11:34:01

    REHM: Interesting.

  • 11:34:03

    RAY: How would your results be then?

  • 11:34:06

    PARISER: Well, even if you remove the cookies, even if you're in this private browsing mode, there still are ways that Google can get information about what kind of person you are, what kind of computer you're using, and use that to tailor your results. It's not as strong as if it has your whole web history at its disposal.

  • 11:34:27

    PARISER: On a public computer, you know, it's sort of an interesting question, because, again, the algorithm is just gonna look at the signals it sees, and probably it sees a kind of pretty strange picture. This doesn't look like a normal human being. You know, different people are searching for very different things every 30 minutes. And, you know, that's probably not going to typecast you or profile you in any way that's particularly narrowing.

  • 11:34:58

    PARISER: But one of the interesting things about this is, when I talked to the Google engineer who was responsible for this personalization product, he told me that it's hard for even Google to say how this will all pan out in any given individual instance. You know, all they can say is that, in general, it seems to have people clicking on things more and using the service more. But the algorithm is so complicated, several hundred thousand lines of code, that, even for Google, it's sort of a mystery why something in particular happens on any given person's computer.

  • 11:35:31

    REHM: So -- okay. So let's, for one moment, put aside the narrowing of the information stream and turn to the business side of it and what it could mean for me.

  • 11:35:49

    PARISER: Well, you know, one of the interesting things about this is that all of these personalization filters are built around this idea of relevance. And relevance, in personalization speak, really means just the things that are most closely connected to the thing that you're currently clicking on or doing. So, for example, on Amazon -- you know, we all see the Amazon recommendations, and they're very narrow. If you liked Godfather Part II, then you'll probably like Godfather Part III. Not true, as the case may be.

  • 11:36:24

    PARISER: But, you know, the problem is that a lot of the best ideas actually come from sort of serendipitous encounters outside of that zone of relevance. In other words, if you're searching for the solution to a problem rather than searching for information about something, what Google is going to give you is the most closely related concepts, whereas the solution often actually involves information from way outside of the sort of close perimeter of the thing that you're searching for.

  • 11:36:54

    REHM: One of the things you say in the book is, quote, "after you visit a page about third world backpacking, an insurance company with access to your web history might decide to increase your premium."

  • 11:37:13

    PARISER: That's right. You know, the paths that this data travels aren't very clear to us, and they aren't in our control. You know, once a company knows something about you -- definitely there are certain federal laws that say you can't use information in certain ways to deny someone a home or to deny someone a financial thing. But you can use it to make really important decisions about their life.

  • 11:37:42

    REHM: We're talking about a new book. It's titled "The Filter Bubble." Short break, right back.

  • 11:40:03

    REHM: For Eli Pariser, who's the author of the new book "The Filter Bubble: What the Internet Is Hiding from You," here is an email from Doug in Independence, Kentucky. He says, "I'm very disturbed by this trend. Are there ways to keep this from happening, similar to something like a Do Not Call list?"

  • 11:40:32

    PARISER: Well, there's certainly a lot of talk right now about a Do Not Track list that would allow you to take yourself off of the list for advertisers tracking who you are. But there aren't really any proposals that systematically address this. And my concern about a Do Not Track list that is merely, you know, sort of an all-or-nothing choice is that I actually think, you know, this personalization is useful a lot of the time. It is useful.

  • 11:41:03

    PARISER: And what we need is for it to embed in the code the kind of principles that our old editing mechanisms had in them, to make sure that people were well-served. So, you know, I don't actually want to go back to the sort of good old days of, you know, some people in a room somewhere deciding what you get to see and hear. I want to make sure that the code is actually up to the task of informing citizens, and I think, you know, it needs to broaden what it's focused on, rather than just this very narrow idea of what you're gonna click next.

  • 11:41:40

    REHM: So have you talked with the founder of Google? Have you talked with the founders of Microsoft? To whom have you spoken?

  • 11:41:52

    PARISER: Well, I had a brief encounter with Larry Page, who now runs Google, where I sort of described this concern to him. To be honest, you know, he didn't seem, you know, very interested. He actually said, you know, this isn't a very interesting problem to me. And, you know, that may not be interesting to him, but for users of the product, it's really important actually that Google give us a good view of the world.

  • 11:42:24

    PARISER: And a lot of the engineers that I talked to, sort of at lower levels at Google, you know, were more seriously grappling with what it meant that they were now responsible for a lot of the world's information and how it reached other people. And for them, I think it was more of a question of -- you know, this was a real problem, but it never reached the level of priority that, you know, would get it on the list to actually fix.

  • 11:42:53

    REHM: What about Facebook, how does that enter in here?

  • 11:42:58

    PARISER: Well, you know, Facebook is a really interesting case because it now touches one in 14 people on the earth, you know. It's a huge number of people who are getting their information through the Facebook newsfeed. And yet the newsfeed algorithm that decides which friends' updates you see, what links you see, you know, is totally opaque. You don't know why you're seeing what you're seeing, and you don't know what's being left out.

  • 11:43:25

    PARISER: And I personally had an experience with this where I, you know, have gone out of my way to make friends with people whose political views are different from mine. I like hearing what people have to say who, you know, have very different views. And I turned on Facebook and noticed that all of my right-leaning friends had disappeared. They were gone from my newsfeed. As it turned out, Facebook had been looking at what I was clicking on and what I wasn't, and it sort of said, we know better than you.

  • 11:43:54

    PARISER: We know that, you know, you are more likely to click on links to the New York Times than on links to Fox News. You're more likely to click on this friend's links than that friend's, and so we're gonna show you the stuff that you're most likely to engage with.
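
What Pariser describes can be pictured as threshold filtering on per-friend click rates. This is a toy model, not Facebook's actual EdgeRank-era algorithm; the names, rates, and threshold are invented:

```python
# Illustrative pseudo-newsfeed: if updates are kept only when your
# historical click rate for that friend clears a threshold, friends
# you rarely click on quietly vanish from the feed.

click_rate = {"alice": 0.40, "bob": 0.02}  # fraction of their links you click

def filtered_feed(updates, threshold=0.05):
    """Drop updates from friends below the engagement threshold."""
    return [u for u in updates if click_rate.get(u["friend"], 0) >= threshold]

updates = [
    {"friend": "alice", "text": "New York Times link"},
    {"friend": "bob", "text": "Fox News link"},
]

# Bob never shows up, and nothing tells the user he was filtered out.
print(filtered_feed(updates))
```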

  • 11:44:09

    REHM: And here's a question from Kristen who says, "Your guest keeps referring to the fact that we get what we want as opposed to what we need. Who decides what we need? And therein lies the fascination of this technology."

  • 11:44:33

    PARISER: Well, right now, Facebook and Google are making these kinds of values decisions about what you get to see and what you don't.

  • 11:44:41

    REHM: But who is behind it?

  • 11:44:43

    PARISER: Oh. Well, you know, it's these engineers, you know -- we don't even know many of their names -- who are building this code. And, you know, my concern is that because these are fundamentally institutions that prioritize making money right now, you know, they're not doing a good job of designing code that gives us good sets of information. So for example, you know, on Facebook, a lot of what you see is based on what people click like on.

  • 11:45:15

    PARISER: Now, like has a very particular kind of valence to it. You're not gonna click like on, you know, more children killed in Darfur. You are gonna click like on I just finished a marathon. And so there are some kinds of news that you hear more about on Facebook than other kinds of news. Facebook could add a dislike button. And, in fact, according to one person I talked to, they've considered it. But advertisers don't want a dislike button, because you could dislike their products, and that would be a problem for them.

  • 11:45:48

    REHM: Wow. Here is an email from Diane. She says, "Another problem with this is that we don't share a common understanding of issues. At one time, we could have public discussions because we all listened to Walter Cronkite or Roger Mudd, or one of the other networks for news. We all received the same information. Today it's as though we're all living in informational parallel universes and we don't realize it."

  • 11:46:29

    PARISER: That's right. That's a really astute point. And it's not even just that we're living in, you know, filter bubbles that are filled with our own politics, our own political views. You know, I'm concerned about the sort of lack of ability to see the same problems on the left and the right, but I'm even more concerned about the fact that this makes it very easy for people to ignore the sort of common problems, the public sphere, entirely -- you know, that if you're interested in baseball and, you know, Kanye West, you can live in a universe that only has those things and doesn't have the important public policy issues that, you know, you need to know about to be a good citizen.

  • 11:47:15

    REHM: All right. To Charlotte, North Carolina. Good morning, Carl.

  • 11:47:22

    CARL: Good morning. It just seems to me what you're talking about, it's kind of like the opposite of George Orwell's "1984," the big brother in "1984," where he -- the big brother government told everybody what the government wanted you to know. Now we're being told what they think we want to know.

  • 11:47:48

    PARISER: That's right. I mean, actually, there's a great book called "Amusing Ourselves to Death" that draws a distinction between the sort of "1984" view, in which a dictator is kind of controlling what we see, and the "Brave New World" scenario, where actually people just sort of lull themselves into a sense of kind of entertained complacency and lose touch with the world. I think this is sort of more in the "Brave New World" vein.

  • 11:48:16

    REHM: But here is another view from Kate in Falls Church, Va. Good morning to you.

  • 11:48:24

    KATE: Good morning. I hate to say it, but Eli, I think you're vastly underestimating the intelligence of the average Internet user. I mean, I'm 35 years old, so I don't think I'm too old, not too young -- an average Internet user. I go on Facebook. I know how to change my settings so that the newsfeed lists all of my friends, and not just the friends that I'm mostly in touch with, or whose views I mostly like. That's an easy thing to do.

  • 11:48:57

    KATE: Also, on Google, when you get a result that isn't necessarily what you wanted -- say if I Google BP and I wanted to hear the news of the oil spill, and I got the stock tip -- well, I Google something else, or I go up to the news tab and click on that, so I only get news stories about BP. Is it really this difficult? I just disagree that this is something we need to fix.

  • 11:49:24

    PARISER: Well, you know, I think the challenge is the dynamics of these things are really hidden. So, for example, you know, you mentioned clicking over to the most recent tab and seeing all of your friends on Facebook. What a lot of people don't know, and what Facebook actually never says, you know, explicitly as far as I can tell, is that that also is filtered, that it's not a complete view of what your friends are doing. And in fact, in a lot of cases, it's pretty heavily filtered.

  • 11:49:49

    PARISER: And there was a great article on the Daily Beast website that explored this at some length. You know, similarly with Google, I think the problem is that if you Google BP and you get stock tips, you don't know to Google oil spill, because you don't know about it. You don't have a sense that it's happening. And so there's some kind of starting point where, if you're getting most of your information through these sources, and if they are tailored to what they think you want to see, you really can miss large parts of the world.

  • 11:50:21

    REHM: Thanks for calling, Kate. Here's an email asking, "Can you talk about the new face recognition software by Facebook?"

  • 11:50:33

    PARISER: Well, Facebook and Google both are getting better and better at recognizing people's faces in photographs. And this is another way of collecting data about people. You know, if you can actually identify this is Eli in this snapshot, and you notice that there's another person next to him whose last name is the same, then you can start to build a map of my family. You can start to build a map of who I hang out with.

  • 11:51:00

    PARISER: And of course, if you apply this to all of the video footage that's streaming online all the time, you know, you start to get a lot of data about people, where they are, who they are, who they hang out with. So both companies have a lot of interest in this. In fact, Google says that with a few photographs, it can now identify someone with about 95 percent accuracy. And the reason that it hasn't rolled out something like a Google people-image search, where you search for Eli and you get images that Google has recognized as me, is that Google thinks it'll be creepy to people.
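
The "map of who I hang out with" that Pariser mentions falls out of simple co-occurrence counting once faces are labeled. A sketch, with hand-made labels standing in for a recognizer's output:

```python
# Illustrative only: given (photo -> people recognized) data, counting
# co-appearances yields weighted edges of a social graph -- who appears
# with whom, and how often. Names and photos are invented.

from collections import Counter
from itertools import combinations

photos = [
    ["eli", "anne"],          # shared surname might suggest family
    ["eli", "anne", "dana"],
    ["eli", "dana"],
]

pairs = Counter()
for people in photos:
    for a, b in combinations(sorted(people), 2):
        pairs[(a, b)] += 1  # edge weight in the inferred social graph

# Strongest ties first: the pairs seen together most often.
print(pairs.most_common())
```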

  • 11:51:35

    PARISER: And so, while they're using it on the back end, they're not providing it to consumers on the front end.

  • 11:51:40

    REHM: Let's go to Tulsa, Oklahoma. Hi there, David. You're on the air.

  • 11:51:46

    DAVID: Hey, hello. I was just calling to see how Eli would respond to the argument, especially from the side of advertisers and Google, that we are entering a realm of digital property. And just like with these services that Google provides -- just like if someone were to come to my house, there's an understood set of rules, and it's just part of the service. Like, how do you respond to that -- that if people don't like this, they should just stop using it?

  • 11:52:11

    DAVID: If someone were to come over to my house, there are certain guidelines, there are certain underlying rules that they, like, obey. And Google provides this information in the terms of service, so how do you respond to that?

  • 11:52:25

    PARISER: Well, I think, you know, it's a good question. But when you think about something like Facebook, at this point, a lot of our public life, for better or not, happens in this one place, Facebook. And I think, you know, there's a sense that Facebook has some responsibility to offer a good place for people to communicate with each other, because actually, you know, there isn't another network where most of my friends are, or most of most people's friends are.

  • 11:52:51

    PARISER: Facebook is a monopoly in a lot of senses, and we've always had some sense that when companies have a monopoly on something of great public import, there are responsibilities that go along with that. And I think that's kind of, you know, where Google and Facebook are.

  • 11:53:04

    REHM: And you're listening to "The Diane Rehm Show." Here's an email from Mark who says, "Google and Facebook are free. Nobody is compelled to use them. Why shouldn't they construct their algorithms to best suit their business needs?"

  • 11:53:25

    PARISER: Well, you know, this is the dance that they do. They say, you know, you can go anywhere. But these are also companies that say -- in Facebook's case, Mark Zuckerberg says he wants Facebook to be a social utility, like a telephone. And if you think about that metaphor, it's as if your phone company said, you know, by the way, we're gonna record everything that you say, and we're gonna use it to send you mail that's advertising, and if you don't like it, you can use a different service besides a phone.

  • 11:53:56

    PARISER: You know, these kinds of arguments are great ways of avoiding the sense that these are companies that were once kind of insurgencies. Now they are the central places where people find out about things online, and we need them to take responsibility for that, and to step up.

  • 11:54:10

    REHM: So, Eli, the bigger picture: how do you think that this concern that you have about the filter bubble is affecting our democracy as a whole?

  • 11:54:26

    PARISER: Well, that's what worries me, you know. I grew up hoping and being very excited about the Internet as this thing that was gonna connect us all together and introduce people to new ideas, and actually make politics a sort of more democratic process. And what I think is happening instead is that the Internet is very good at kind of bonding like-minded groups of people together.

  • 11:54:48

    PARISER: It's very bad at doing that bridging function of introducing us to things that we don't know about, that are new, that are different, and especially things that are uncomfortable or important but unlikely to get clicked on. And the problem is, you know, again, to go back to the war in Afghanistan, there are these really important things that, if you leave it just to these algorithms in their current state, can slip out of view.

  • 11:55:10

    PARISER: There was a study that showed that Google News was showing way more stories about Apple products than Afghanistan. And, you know, that's concerning to me at a time when we, the people in this country, are making decisions and voting in a way that affects people's lives on the front lines in a war. You know, we need to know about that stuff even if it's not the thing that we're gonna click on the most.

  • 11:55:35

    REHM: So if you could wave your own magic wand over the search engines, what would you do?

  • 11:55:44

    PARISER: Well, you know, I would get the engineers to embed a sense of civic ethics in the algorithms that they're making -- a sense that they need to inform people not just about what they want to know, but what they need to know. A sense that you can value other things besides this sort of very narrow version of relevance, that you can value social importance, that you can value opposite views and a diversity of opinion. And then, you know, I would ask them to make it more obvious to us what they're doing and make it more transparent.

  • 11:56:18

    PARISER: And I would ask them to give us some control, so that we can decide when things are being filtered and how they're filtered and who they think we are.

  • 11:56:28

    REHM: Do you mean messaging what they're doing?

  • 11:56:34

    PARISER: Yeah, no -- Google could easily have a slider at the top that allowed you to go from sort of Google search results for people like you to Google search results for people not like you. You know, that would be an easy technological fix, but, you know, it would require making obvious what they're doing, and right now it's invisible.
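
The slider Pariser imagines is, in effect, one exposed parameter that blends two rankings. A minimal sketch, assuming made-up scores for "people like you" and "people not like you":

```python
# Hypothetical personalization slider: interpolate between results
# scored for users similar to you and results scored for everyone
# else. URLs and scores are invented for illustration.

def blended_rank(results, slider):
    """slider=0.0 -> fully personalized; slider=1.0 -> fully unlike-you."""
    def blend(r):
        return (1 - slider) * r["like_you"] + slider * r["unlike_you"]
    return sorted(results, key=blend, reverse=True)

results = [
    {"url": "familiar-take.example", "like_you": 0.9, "unlike_you": 0.1},
    {"url": "opposing-view.example", "like_you": 0.2, "unlike_you": 0.8},
]

for slider in (0.0, 1.0):  # the two ends of the dial
    print(slider, [r["url"] for r in blended_rank(results, slider)])
```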

  • 11:56:54

    REHM: Eli Pariser, his new book titled "The Filter Bubble: What the Internet Is Hiding from You." Very, very interesting. Congratulations.

  • 11:57:07

    PARISER: Thanks for having me on.

  • 11:57:08

    REHM: Thank you. And thanks for listening. I'm Diane Rehm.
