
Saturday, March 03, 2012

An Unintended Profiled Life

By Gary Berg-Cross

In May of 2011 I wrote a blog post called The Unfiltered Life about the idea of Filter Bubbles, a phrase coined by internet activist Eli Pariser. Simply put, Google’s technology guesses what you want and gives it to you. It can do this because Google knows things about us from our past use of its services, and it can adapt searches (and target ads) to reflect that knowledge. Google’s new privacy policy is in the news just now because it consolidates some 70 previous policies into one. This allows Google to assemble the data it holds about us into a coherent behavioral profile. Some are questioning whether the new policy is at odds with the company’s famous motto, “Don’t be evil.”

I like its summary of altruistic intentions, but the creation of a behavioral profile has potential downsides. In a sense it reflects the reality that our “free” Google services have a hidden cost. We, or rather our information, are the product that Google and others sell to advertisers. Gmail activity is one source that is used, but so are our search patterns, our conversations and chats on Google+, and the videos we watch on YouTube. There is even more, I’d guess, if you are an Android user. Google has already been sued by a privacy activist who demanded that it pay to replace his Android smartphone because he won’t consent to the new privacy policy.

I’ve no experience with creating a profile from such activity, but IT professionals have been working on this, and a company like Google can leverage their work. Its revenue comes from ads, and behavioral ads are known to be effective. One recently cited example is the case of Target, which was able to identify that a teenager was pregnant before she and her father had even discussed it (Target knew more about his daughter than he did in that regard). Target (no pun on targeting intended) has apparently identified certain purchasing patterns in expecting mothers, which it uses to assign shoppers a “pregnancy prediction score.” And this is the worry with Google’s knowledge of us through our patterns of use.
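To make the scoring idea concrete, here is a minimal sketch of how such a purchase-pattern score might work, assuming a simple hand-weighted model. This is my own toy illustration, not Target’s actual method; the products and weights are invented, and a real system would be statistically fitted to purchase histories rather than written by hand.

```python
# Toy "prediction score": certain purchases count as weak signals, and a
# shopper's score is just the sum of their signal weights.
# Products and weights below are invented for illustration only.
SIGNAL_WEIGHTS = {
    "unscented lotion": 0.4,
    "calcium supplement": 0.3,
    "oversized tote bag": 0.2,
    "cotton balls": 0.1,
}

def prediction_score(purchases):
    """Return a naive score; higher means the purchase pattern looks
    more like the profiled behavior (here, an expecting mother)."""
    return sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in purchases)

shopper = ["unscented lotion", "calcium supplement", "cotton balls"]
print(prediction_score(shopper))  # 0.8
```

The unsettling part is not the arithmetic, which is trivial, but the data behind it: a score like this only works because every purchase is logged against an identity.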

I’m not sure if this is a conscious grab for power and knowledge or just a continued slide down the slippery slope of “we are just a business that wants to make a profit.” It is another confrontation between business values and broader humanistic values, and I fear that the culture is slipping, not just sideways but downward.

Friday, May 27, 2011

The Unfiltered Life


The other day, as I was searching for something about humanism using Google’s Gmail, I noticed a list of topics on the right side of my screen under "More about...". I rarely look at such things when my browser prompts me with linked topics, but in an ideal Internet world they could point to something I might like to know more about. One of these, labeled Secular Humanism (SH), seemed appropriate. So, probably for the first time, I clicked on the topic-link. Maybe a path to a new Google experience. About a dozen things popped up, and I was so surprised by the results that I saved the page so I could comment on what I saw.

The first thing was a link labeled "New Spiritual Magazine"! Not exactly my idea of an SH topic. Next came one called New Age Book Publisher, which said “Share Your Spiritual Journey with Your Inspirational Book. Free Guide” and listed a web site. Then I was enticed to “Earn a Ph.D. in 3 Months". Only after this did I get a real SH topic, Discover Humanism!, connecting to The American Humanist Association (The voice of Humanism since 1941) at www.americanhumanist.org. There was one other item I’d consider properly appropriate for an SH person. The rest was a distracting (and maybe a bit disturbing) list of non-SH topics:

• Ancient & Mystical Order - Find the Key to Universal Wisdom.
• South African Diamonds - We will not be undersold!
• Expand Your Mind - The Key to Change Your Life is to…
• How to Do Meditation?
• Study Psychology Online - Launch Your Career in Psychology
• Practice Futures Trading - Forget The Old Boys Club!
• Find Truth or Reality
• EthicShare: Research Site Find, Share, Collaborate, Network
• Spiritual Seeker.

Clearly the Secular Humanist “ads” were in the minority, and most of the competition was hogging my space and crowding my time. Weren't there enough Secular Humanist things for me to learn about? Apparently not the paying kind. My free Google service was guiding me onto an easy path to meditation, education and futures trading. This is the Internet at work, making money. Sure, there is perhaps a bit of a reason to be concerned, but why should I expect Google to understand that I'm not interested in such things?

Later in the day I found a reason to be more disappointed, because Google and other apps may know too much about me and still not shield me from things I'm not interested in. A friend from a discussion group I’m in sent me a link. It was something he found both interesting and frightening (“the dark side of Internet personalization”), and he wanted to discuss it at our next meeting. The topic was Filter Bubbles, a term for the automated, personalized filtering of the information you get from Facebook, Google, Yahoo! and the like. The idea is that such sites personalize what we see based on the info we have provided, which features we have turned on, the nature of our searches, and what we have chosen to view. Automated filters hide stuff we typically ignore and show search results similar to the kinds we've preferred in the past. The result is that we are each in our own bubble while online. This wasn't exactly my experience with the Google ads, but I decided to look into what was being discussed as filter bubbles.
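As a rough sketch of the mechanism (my own toy illustration, assuming a made-up topic-matching scheme, not how any of these sites actually work), a personalizing filter might simply rank items by how well their topics overlap with what a user has clicked before, quietly dropping everything below the cut:

```python
from collections import Counter

def personalize(stories, clicked_topics, top_n=2):
    """Rank stories by overlap with topics the user clicked before;
    whatever falls below top_n silently disappears from view."""
    prefs = Counter(clicked_topics)
    def score(story):
        return sum(prefs[topic] for topic in story["topics"])
    return sorted(stories, key=score, reverse=True)[:top_n]

# Invented stories and click history, for illustration only.
stories = [
    {"title": "Egypt: protests and aftermath", "topics": ["politics", "egypt"]},
    {"title": "Nile cruise deals", "topics": ["travel", "egypt"]},
    {"title": "Pyramids: a travel guide", "topics": ["travel", "egypt"]},
]
travel_fan_clicks = ["travel", "travel", "egypt"]

# The travel fan gets two vacation stories; the protests never surface.
for story in personalize(stories, travel_fan_clicks):
    print(story["title"])
```

Nothing in the sketch is malicious; the filter just optimizes for past clicks, and the political story drops out as a side effect. That, as I understand it, is Pariser's point about how the bubble forms without anyone deciding to build it.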

These activities and their possible implications are discussed in the new book by Eli Pariser (the MoveOn co-founder), The Filter Bubble: What the Internet Is Hiding from You. I haven’t read the book, but you can see Eli's roughly 8-minute TED talk on The Filter Bubble and the effects of online personalization at http://www.wimp.com/filterbubbles/.

From this and reports I’ve read, you can understand Pariser’s concern. "Smart" but invisible customization of one’s Internet experience can limit exposure to the broad set of information that we expect from the Internet. The example Pariser uses is a search on the keyword “Egypt”. One user, who in the past has looked at political pages, may get the latest news, such as on the revolution or its aftermath. Another might see only search results about Egyptian vacations, since they usually search for such things by naming a place. Aren’t they both entitled to all the facts? I for one was not aware that searches could give such different results based on how an Internet site classifies us. Pariser’s research suggests that the top 50 websites collect an average of 64 pieces of personal information each time we visit. Some of these are explicit responses, say clicking a Like button. Others are not, but all of them are in the cloud of data we produce, which apps may use as they want. The total set is used to custom-design sites to conform to a simple, perhaps naive view of our perceived preferences. After all, Internet apps need not be into deep analysis of our philosophies and intents. They exist in a synthetic environment, not the one we evolved from.

As a person interested in transparency (see http://secularhumanist.blogspot.com/2011/05/transparency-and-tcamp-shining-little.html) I agree with Pariser's concern about the implications of site personalization. The worry is that:

“invisible, unaccountable, commercially driven customization turns into a media-bias-of-one, an information system that distorts your perception of reality, parochial, exploiting your cognitive blind-spots to make you overestimate the importance or prevalence of certain ideas, products and philosophies and underestimate others.”

After watching the TED video, I quickly ran my own test of my bubble. I did a Google search for “Secular Humanism” logged into my own PC. A minute later I did the same search using my wife’s login on her computer. The results were different, although, since we are similar, I assume not as different as they would be for a random person. I got 691,000 hits while she had 706,000, so they might be filtering me out of a few of those spiritual hits! The first 2 hits were the same for each of us, but 3 and 4 were different, and my wife got to the Wikipedia listing before me (and I thought I was the big Wikipedia user!). She also got BeliefNet on the first page and I didn’t.

I’m not sure what it all means yet, and like Pariser I’m not convinced that all of it is intentionally malicious. It may be more of a candy-like phenomenon on both ends. Automation offers me some benefits, and it is too sweet and tempting an idea for the Internet companies to resist. They want to do things to bring us to their sites, and appealing to what we “like” is an easy way to ensure that. If I see something that I "like", I can help promote it. So Facebook has a ‘like’ button which is easy to click, especially for simple ideas. But consider something complex like war and conflict, say news about Afghanistan or Iraq. When Pariser talked to the people who run news websites like Yahoo!, they said that the war in Afghanistan doesn’t “perform” well. To them that means it doesn’t get a lot of clicks, and remember, clicks are how they make money. Yet this is the type of information a citizen needs.

I share Pariser's concern that the big Internet companies are by default a new type of gatekeeper. The way the system works, they have a financial reason to dumb down and repackage content. But the Internet needs to handle complicated, important topics like ‘war in Afghanistan enters its 10th year’. Whose job is it? It's not Facebook’s mission; they are just doing the simple stuff that makes a profit. Long-term implications are not a problem they wrestle with. That sounds like the lack of wisdom we sometimes get from markets. Unless we as a society think through the implications, we may wind up in bubbles that are echo chambers for simple ideas in a very non-simple real world. That's not the future vision of transparent access that I signed up for.