I recently viewed Eli Pariser’s TED talk – which I’ll summarize so you don’t have to watch the video itself (or read the book, for that matter). Essentially, Facebook, Google, et al. look at your clicks, searches, likes, friends, replies, etc., and algorithmically treat what you do as what you prefer. And over time, Google & Facebook filter what they display to you based on these assumed preferences.
One example he gives in his TED talk is that, on Facebook, he has Democrat friends and he has Republican friends. Over time, even though he wants to stay abreast of what his Republican friends are up to, he naturally prefers what his Democrat friends say, clicks on their links more, “likes” their posts more. And what he noticed is that his “un-liked” friends have slowly been filtered out of his Facebook activity stream completely.
So… the facts on the ground here are very real. I’ve often had the experience where, even on my own computer, with cookies intact, changing Wi-Fi networks yields (slightly) different search results.
Eli Pariser sees this as a problem to be blamed on the big networks (Google, Facebook, etc.). He finds it dreadfully wrong, something very Big Brother. He is looking outside of himself for something external to blame, which can therefore be rallied against and called upon to change. I disagree only with where his finger is pointing.
I actually see this as something very natural, highly reflective of how our minds work in the real world. I’ll give this entire phenomenon the label of “What you focus on, you see more of” and allow it to cover both this ‘filtering’ issue online, and the ‘filtering’ issue that occurs in all of our lives.
The classic example is when you’re considering buying a car, you suddenly see that car everywhere on the road. It’s not that the car became more popular; it’s that your mind is dwelling on it, and therefore filters your awareness accordingly.
But this also happens to Democrats and Republicans. Democrats focus on all the liberal banter; Republicans focus on all the conservative banter. Both sides, over time, grow more and more distant in terms of styles and preferences, and eventually simply filter each other out. The people at the Glenn Beck rally don’t go to the Jon Stewart/Stephen Colbert rally, nor vice versa.
So what Eli Pariser has pointed out is not a flaw in the system but, in fact, an opportunity in the system. Let’s take his own case, for example. While in the real world it’s easy for him to not even notice that he’s filtered out those of differing viewpoints, on Facebook he’s been able to compare his current filter with a past filter, and therefore become aware of the filter itself. This is a big deal. In the real world, it’s much, much harder to even notice our filters at all.
Second, Google & Facebook have no skin in the game, no ego to protect. Since our own perceptive biases may be covering up certain insecurities, those filters are protecting something deeper within us; there’s an internal conflict of interest that isn’t easily swayed. Google & Facebook, by contrast, rely on simple pattern recognition: what you choose, you will be served more of.
So if you’re like Eli Pariser and you WANT to expand your horizons beyond your more animalistic “more-like-me” herd-like preferences, all you need to do is make that very conscious choice and figure out how to semi-trick Google & Facebook into allowing your horizons to expand.
Eli could make certain conscious choices to inflate Facebook’s assumed importance of those Republican friends of his, in order to see them more often in his stream. I haven’t tested this out, but I know where I’d start experimenting: make a list of those people who I know are different from me, regularly go over to their profile page, click on links, “like” certain posts, etc. This simple activity, done on a regular basis, would surely send signals to Facebook to “show me these friends’ activity more often”.
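I haven’t built this, but the routine above is simple enough to sketch. Here’s a minimal, hypothetical nudge script – the friend names are placeholders, and the rotation scheme is just one way to make sure no “different-from-me” friend gets skipped:

```python
import datetime

# Hypothetical list of friends whose viewpoints differ from mine.
DIFFERENT_FRIENDS = ["Friend A", "Friend B", "Friend C"]

def todays_nudge(friends, day=None):
    """Pick one friend per day, deterministically, so each friend
    comes up in rotation rather than at random."""
    day = day or datetime.date.today()
    return friends[day.toordinal() % len(friends)]

print(f"Today, visit {todays_nudge(DIFFERENT_FRIENDS)}'s profile "
      "and click or 'like' a post or two.")
```

A browser plugin (like the one suggested below) could run something like this each morning; the hard part isn’t the code, it’s remembering to act on the nudge.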
Look, we have conflicting sets of preferences within us. Each of us wants to validate our current selves, so we click on things that reflect our current state of being. But each of us also has an element of wanting to expand beyond our current selves, and so we sign up for that charity, follow people different from us, and read articles that make us uncomfortable (or at least, we WANT to).
In pointing the finger solely at Google & Facebook, Eli Pariser relinquishes control of this issue, and it falls into a classic “victimized by the aggressor” model. And this may be due to his brilliant past success with MoveOn.org, where he learned that a problem can always be solved by rallying many people around a cause.
In this particular situation, I’m sure he’ll succeed in rallying people to heed his message; but will it actually lead to the perceptual growth he’s after?
The situation might be better served if he made a slight re-adjustment and built/promoted a Facebook/Firefox/Chrome plugin that, in some easy-to-use way, nudges you and reminds you to take those certain actions that will let Facebook & Google know how to help you expand your viewpoints.
On the ‘net as in our minds, the question is not about knowing how to expand, but remembering that you want to.