Keynote Conversation with Eli Pariser

September 14, 2011
By Lisa Barone in Internet Marketing Conferences

GOOD MORNING, SMXers (and non-SMXers, as well)! Are you AWAKE? Did you have a fantastic night last night? We certainly did, attending the SMB Influencer awards and then running into Lady Gaga at dinner. No. Really. That happened.

I’ve almost begun breathing again.

But don’t worry, just because I’m TOTALLY LADY GAGA’S BFF NOW doesn’t mean you can’t still expect a jam-packed day of liveblogging coverage.  We’ll be bringing all the dirt, the hits, and the SEO knowledge.  To get us started, up first we have a keynote with Eli Pariser.

Chris Sherman starts things off with a good morning. And then another good morning when he decides the first one wasn’t chipper enough.  We do it again, Chris is happy with that, Eli takes the stage.

Morning, Eli.

Eli wants to talk about the moral consequences of living a life that is mediated by search algorithms. And just like that it’s almost as if my father has appeared in the room. Eli says it was a quote from Mark Zuckerberg that got him thinking about this. Mark said a squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa. Eli wants to talk about what a Web based on that idea of relevance may look like. I want to talk about who thinks it’s a good idea to let Mark Zuckerberg speak publicly.

Just sayin’.

Eli used to live in a magical world where he believed the Web was something that was going to connect him to other people. He wonders if the Web is as connective as he thought. The first place he started noticing this was on Facebook.

Eli leans to the left politically, but he’s always tried to get to know people who think differently than him. He likes different ideas and seeing what other people are posting. One morning he logged onto Facebook and he noticed that his conservative friends weren’t there. They disappeared. Facebook was watching his behavior on the site, what links he was clicking on, etc. Facebook noticed that he was only clicking the links of people similar to him, so just like that his conservative friends disappeared.

And it’s not just Facebook that’s doing this. Google is doing it, as well. One engineer told him that even if you’re logged out of Google, there are 57 signals that Google tracks – the computer you’re using, where you’re sitting, the browser you’re using, etc. – to make some guesses about you and what you want to see. He asked some friends to do an experiment to see what it looked like when they searched for [Egypt] and he got back 30 screenshots.

They were all very different.

Increasingly, the Web is showing us what it thinks we want to see, not necessarily what we need to see or the world as it is.

“It will be very hard for people to watch or consume something that has not in some sense been tailored for them.” – Eric Schmidt

We’re surrounded by this membrane of personal filters. You don’t choose what gets in your filter bubble and, more importantly, you don’t know what’s been edited out.

The best media does a great job of balancing what you want to know about (Justin Bieber) and what you should know about (Afghanistan). But because our information is being filtered by what we clicked on, we’re only getting the information junk food.

This suggests that the picture he had of the Internet was wrong. As he was growing up he was pretty excited about the Web theory that in the 20th century there were these gatekeepers who controlled the flow of information. And then the Internet came along and swept that out of the way and people could go direct to one another. We didn’t need those gatekeepers. But that’s not really what’s happening at all. There’s a new set of gatekeepers and they’re not people, they’re code. That code is making the same value judgments as the old gatekeepers. It’s deciding what’s important but it doesn’t have any kind of civic sense built into it. It can show us what we like, but not what actually matters.

We shouldn’t have to choose between the squirrel and the kid dying in Africa. We should be getting both. And if they are making these edits for us, we need to make sure they’re not focusing on this very narrow idea of relevance. They need to build in other signals, as well. We need the Internet to be that thing that connects us to new ideas and new ways of thinking. But it won’t do that if we’re stuck in a personalized bubble of one.

From here, Danny and Chris start asking Eli questions.

Do you find there’s still some degree of commonality between the results for everyone?

There’s a great study that came out since he wrote the book. The researcher took three different philosophers and built Google profiles based on the indexes of their books. Then he looked at how the search results changed as the profiles went from 100 queries to 1,000 to 10,000. If you do the Egypt experiment, there’s a lot of variety. What he found interesting was that at the beginning, Wikipedia is very dominant. But as personalization takes off, Wikipedia drops in the rankings and people start seeing very different things. The higher the personalization, the less common the experience, even if the search is generic.

When you did the experiment, did people feel like they should get the same results?

Yes. Most people don’t know that Google is doing this at all.

I remember when Google first launched personalization. Marissa Mayer said it was subtle, that it would surface more long-tail results, and that it’s only going to be tied to your personal Web or search history. Do you think that’s accurate now?

It’s hard to say. He interviewed the engineer doing personalization and he said it’s hard to say in any given case what the algorithm is doing. They don’t really know themselves. For some searches it’s subtle and for some searches it is not. He thinks Google undersells how significant it is. He doesn’t think Google has malicious motives in doing this. They genuinely think a more personalized experience will keep people coming back to their search engine. They also look at this as something that makes it harder to game the results. One of the strongest places where you see personalization is for vanity searches, especially the more you do it. [Heh] He thinks Google takes a perverse pleasure in that. [Double heh!]

When Danny first started writing about search engines, he’d be dealing with librarians doing complicated queries. He’d tell them it wasn’t about getting every document back, it was about getting a good collection.  Don’t you think the bubble gets popped a bit – Republican sites are going to link to Democratic sites. Does that mitigate it?

The more exploratory you are, the more you get out of your own bubble. The problem is you may not know if you’re in or out of it.

You raise an interesting question when you say even Google doesn’t know what’s going on. Should Google be the one controlling this?

There’s no ombudsman at Google like there is at the New York Times. There’s no sense of accountability. He thinks the key changes are these: it would be good if people knew what Google was doing with the information they’re handing over, and you should be able to decide which data Google can use and when it’s on and when it’s off. Google says they don’t want to make it too complicated. That may have been true before, but as people become more literate they should be able to direct their search experience. Resetting those expectations is important and probably needs to happen at a regulatory level.

You had talked about creating different dashboards where people could edit their information – I’m not a surfer anymore, I’m not a Republican anymore, etc. Google doesn’t classify you in that regard; it groups you by what topics you’re interested in. It sounds like you think they should be like Amazon and let you say “that was a bad choice.”

Different companies do it in different ways. DoubleClick tries to reduce it to categories that are human intelligible. The important thing is to give people the tools to understand and play with it. He suggests a slider that allows you to turn personalization up or down.

You’re talking to a group of marketers. These are people who believe that personalization is the holy grail because you can get in front of people in a targeted way. You’re also advocating for searchers who need a more balanced diet. Where’s the middle ground?

From a marketing standpoint it’s a double-edged sword. It’s another hoop to jump through to get to consumers. His position isn’t that personalization is bad overall, it’s that we have to be careful and pay a lot of attention to how it’s done. To do that, it needs to be done in a way that lets you figure out what it’s doing. Google could do a lot more to explain its philosophy about this without making it susceptible to gaming. There needs to be more research about what this is doing and what the consequences are. The balance comes when people are able to use these tools the way they want to use them, and to decide that right now they just want the information junk food, and right now they want to see articles that provoke them and get them thinking in a different way.
