Did you ever hear the phrase, "Give the people what they want"?  Well, it would seem as though Google (and very likely other search engines as well) is looking to take that statement to a more intimate level, and give the PERSON what s/he wants.  The customization of search results to the person doing the search and his or her likes and dislikes is creating a "filter bubble."  The problem is that said bubble, while it may filter out garbage, may also sift out important information that the user simply has no taste for.

Ordinarily, I'd post up an excerpt of an article with a link to the complete piece.  Frankly, considering the current ubiquity of Google, the use it gets, and the potential impact of such a phenomenon, I felt that this particular piece required more attention and consideration.  That said, here is today's complete entry from delanceyplace.com on Eli Pariser's book, The Filter Bubble:


==================================


In today's selection -- from The Filter Bubble by Eli Pariser. Because of the personalization of the internet, an internet search of the same term by two different people will often bring very different results. We are each increasingly being served not only ads for what we are more likely to want, but also news and information that is familiar and confirms our beliefs. The issue is that we are increasingly unaware of what is being filtered out and why -- leaving us each more and more in our own unique and self-reinforcing information bubble. Author Eli Pariser calls this "the filter bubble" -- and it is leaving less room for encounters with unexpected ideas:

"Most of us assume that when we 'google' a term, we all see the same results -- the ones that the company's famous Page Rank algorithm suggests are the most authoritative based on other pages' links. But since December 2009, this is no longer true. Now you get the result that Google's algorithm suggests is best for you in particular -- and someone else may see something entirely different. In other words, there is no standard Google anymore.

"It's not hard to see this difference in action. In the spring of 2010, while the remains of the Deepwater Horizon oil rig were spewing crude oil into the Gulf of Mexico, I asked two friends to search for the term 'BP.' They're pretty similar -- educated white left-leaning women who live in the Northeast. But the results they saw were quite different. One of my friends saw investment information about BP. The other saw news. For one, the first page of results contained links about the oil spill; for the other, there was nothing about it except for a promotional ad from BP.

"Even the number of results returned by Google differed -- about 180 million results for one friend and 139 million for the other. If the results were that different for these two progressive East Coast women, imagine how different they would be for my friends and, say, an elderly Republican in Texas (or, for that matter, a businessman in Japan).

"With Google personalized for everyone, the query 'stem cells' might produce diametrically opposed results for scientists who support stem cell research and activists who oppose it. 'Proof of climate change' might turn up different results for an environmental activist and an oil company executive. In polls, a huge majority of us assume search engines are unbiased. But that may be just because they're increasingly biased to share our own views. More and more, your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click. ...

"For a time, it seemed that the Internet was going to entirely redemocratize society. Bloggers and citizen journalists would single-handedly rebuild the public media. Politicians would be able to run only with a broad base of support from small, everyday donors. Local governments would become more transparent and accountable to their citizens. And yet the era of civic connection I dreamed about hasn't come. Democracy requires citizens to see things from one another's point of view, but instead we're more and more enclosed in our own bubbles. Democracy requires a reliance on shared facts; instead we're being offered parallel but separate universes.

"My sense of unease crystallized when I noticed that my conservative friends had disappeared from my Facebook page. Politically, I lean to the left, but I like to hear what conservatives are thinking, and I've gone out of my way to befriend a few and add them as Facebook connections. I wanted to see what links they'd post, read their comments, and learn a bit from them.

"But their links never turned up in my Top News feed. Facebook was apparently doing the math and noticing that I was still clicking my progressive friends' links more than my conservative friends' -- and links to the latest Lady Gaga videos more than either. So no conservative links for me.

"I started doing some research, trying to understand how Facebook was deciding what to show me and what to hide. As it turned out, Facebook wasn't alone.

"With little notice or fanfare, the digital world is fundamentally changing. What was once an anonymous medium where anyone could be anyone -- where, in the words of the famous New Yorker cartoon, nobody knows you're a dog -- is now a tool for soliciting and analyzing our personal data. According to one Wall Street Journal study, the top fifty Internet sites, from CNN to Yahoo to MSN, install an average of 64 data-laden cookies and personal tracking beacons each. Search for a word like 'depression' on Dictionary. com, and the site [automatically collects and stores information about your computer or mobile device and your activities] so that other Web sites can target you with antidepressants. Share an article about cooking on ABC News, and you may be chased around the Web by ads for Teflon-coated pots. Open -- even for an instant -- a page listing signs that your spouse may be cheating and prepare to be haunted with DNA paternity-test ads. The new Internet doesn't just know you're a dog; it knows your breed and wants to sell you a bowl of premium kibble."

Tags: Eli Pariser, Google, filter bubble, search engine


Replies to This Discussion

I have become increasingly aware of 'confirmation bias' in my searches and selections.  I also realized that it was at work in solidifying the opinions of everyone else. What never dawned on me was that even when I did a search, I was not getting the same results as everyone else!  Those who have read any posts from me will heartily agree that I am a person of extreme views.  In an attempt to modify my own biases, and learn about other people's points of view, I read web newspapers like Glenn Beck's "The Blaze," and make other efforts to read my opponents' views. Also, like Loren, I read the blogs and posts of those with whom I strongly disagree on various subjects.  I thought I had only my own prejudices to dispel; I had no idea that my passions were being fanned, rather than controlled, by my reading on the internet.  It's really rather alarming!  Here we are, millions of people, all having our opinions, knowledge of facts, and even our emotions confirmed, encouraged, and unchallenged.  The web has become a simpering, obsequious servant.  Like the story of the Emperor who had no clothes, we adjust our outfits, asking our servant how we look.  "You look wonderful!" says our ever-evolving servant.

Hmmm, "servant?"  There's a question for ya: who is serving WHOM here???

That gentle caress you feel is your search engine's hand in your pocket! 

Oh, is THAT what that was!!!

I find this article very, very scary.

Point on the curve - the following is the result of a Google search on the phrase, "religion is":

The obvious question is: does EVERYONE get the same thing?  Invitations to others' results are herewith solicited.

And Dogly, thanks for your input!

Loren, I got identical results from Google.

So, we birds of a feather may get very similar results, but what do those of diametrically opposite world views get?

That, as they say, is the $64 question...

They were more generous with me.  I got 'evil' and 'stupid', too.

Well, GEE!  Ain't YOU special!
