By Blake Deppe
People's World
May 19, 2011
It's a little-known fact that the Internet not only hides things from the general public (sometimes intentionally, sometimes not), but also constantly feeds and accommodates mass consumerism in order to suit corporate interests.
The problem begins with Google's supposedly "better marketing techniques and search parameters." If someone is signed into Google, every search they make is tweaked according to their interests and location. The same goes for advertisements: every little ad on the side of a Google search results page, and every pop-up and pop-under ad they see. If they search frequently for a particular store, for example, Google recognizes this and shows them mostly ads for that store. This is a method of targeted marketing that is especially effective when aimed at young and vulnerable kids and teens. But where do people draw the line between the luxury of having websites do their searching and shopping for them (if indeed that can be considered a luxury) and falling victim to misinformation, or a simple lack of information to begin with?
Eli Pariser, author of the new book The Filter Bubble, says that the online social landscape is changing. In an interview with Amazon, Pariser said that "We're used to thinking of the Internet like an enormous library, with services like Google providing a universal map. But that's no longer the case. Sites from Google and Facebook are now increasingly personalized - based on your web history, they filter information to show you the stuff they think you want to see. That can be very different from what everyone else sees - or from what we need to see.
"In general, the things that are most likely to get filtered out are the things you're least likely to click on. Sometimes, this can be a real service - if you never read articles about sports, why should a newspaper put a football story on your front page? But apply the same logic to, say, stories about foreign policy, and a problem starts to emerge. Some things, like homelessness or genocide, aren't highly clickable but are highly important," he concludes, pointing out the dangerousness of not having vital knowledge of world events.
Moreover, Siva Vaidhyanathan, author of The Googlization of Everything (And Why We Should Worry), says the truth is that "we are not Google's customers, we are its products. We are what Google sells to advertisers. When we use Google to find out things on the Web, Google uses our searches to find things out about us."
So, when does Google stop being a helping hand and start turning into "Big Brother"?
Facebook poses another problem. A common trend on Facebook is seeing a status update or news article posted by a friend and clicking the "Like" button to show one's approval of it. However, people tend to click "Like" less on grave news items (such as the BP oil spill or the tsunami in Japan), because "liking" something so tragic can be misconstrued as making light of the event. As a result, trivial or comic news floats to the top of Facebook, while important events get pushed to the bottom - and ultimately, this can be detrimental.
The Internet should be a tool for advancing positive movements, but when sites tailor themselves to personal interests in order to make money, that becomes increasingly difficult.
Pariser, who has a background as a political organizer for the website MoveOn.org, notes the risk of excessive personalization. "A more democratic society has yet to emerge," he says. "I think it's partly because while the Internet is very good at helping groups of people with like interests band together (like MoveOn), it's not so hot at introducing people to different people and ideas. Democracy requires discourse and personalization is making that more and more elusive."
So just what is a "filter bubble"?
Pariser says, "Your filter bubble is this unique, personal universe of information created just for you by this array of personalizing filters. It's invisible and it's becoming more and more difficult to escape."