The Problem With Micronetworks

11 10 2010

As mentioned in this post on GeekWithLaptop, Facebook has introduced the concept of creating groups of online “friends.” These groups can range from something as small and intimate as college suitemates or an extended family to something as large and diverse as the North American Man/Boy Love Association (mentioned by Lora Bentley on her IT Business Edge blog as a prank pulled on Facebook founder Mark Zuckerberg by TechCrunch’s Michael Arrington).

The prank highlights what is so discomforting about the Facebook Groups platform: the idea that anyone can be invited and included in a group without permission. Before this gets misconstrued, that is only partially true. While a user can invite or be invited into a group by someone else, should the invitee leave that group, the inviter cannot automatically add that person again without issuing a new, manual invitation. Even so, being included in a group without express consent is one of Facebook’s fundamental problems regarding user privacy.

TechCrunch’s Michael Arrington used the privacy loophole in the NAMBLA Facebook Group to add Mark Zuckerberg as a member.

Fast Company expert blogger Brian Solis notes in his post about Facebook Groups that “privacy is now a process of boundary management. It is in our control to define how much other people know about us, what they see, and the impressions they form.” Solis is absolutely right. However, a large portion of Facebook users may not share that mindset. As a longtime Facebook user, I know firsthand how easy it is to invade someone’s privacy simply because they erred in managing their privacy settings, such as friend lists and tagged photographs.

These micronetworks only make the privacy problem worse: when users are unwittingly included in Groups, a potentially unlimited number of people gain access to any and all information not explicitly locked down in Facebook’s navigational gauntlet of privacy settings. Photographs from the weekend, unflattering status updates: it’s all accessible to family, employers, coworkers, and so on.

In a world where Facebook is becoming exponentially more influential and omnipresent (it’s even on your TV!), users have a responsibility to themselves to manage what they share, and especially what they upload (for example, if you have a picture of yourself hugging the toilet after a night out, it’s probably best not to post it).

Or, in the most extreme scenario, you can just opt out of Facebook altogether and close your account. I’m continuing to lean in that direction; it’s just a matter of coming to terms with communicating with people by email and phone again. One day.

For those of you who love corporate propaganda:





The Filter Bubble

11 10 2010

While some first found it alarming, many of us are now used to Facebook and Amazon serving us targeted advertisements and product recommendations based on the interests we’ve indicated through our searches and browsing. Now Google and Yahoo! are doing the same thing to our search results, according to an interview with MoveOn.org board president Eli Pariser. MoveOn.org states on its About page that it is a cluster of organizations that “work together to realize the progressive promise of our country. MoveOn is a service — a way for busy but concerned citizens to find their political voice in a system dominated by big money and big media.” One of Pariser’s goals, as a renowned Internet activist, is the passage of net neutrality legislation. He also wants to raise awareness of the problem created by customized search results, an issue he calls “the filter bubble.”

 

MoveOn.org Board President Eli Pariser

 

Pariser describes the filter bubble in his interview with news and entertainment site Salon: “Since Dec. 4, 2009, Google has been personalized for everyone. So when I had two friends this spring Google ‘BP,’ one of them got a set of links that was about investment opportunities in BP. The other one got information about the oil spill. Presumably that was based on the kinds of searches that they had done in the past. If you have Google doing that, and you have Yahoo! doing that, and you have Facebook doing that, and you have all of the top sites on the Web customizing themselves to you, then your information environment starts to look very different from anyone else’s. And that’s what I’m calling the ‘filter bubble’: that personal ecosystem of information that’s been catered by these algorithms to who they think you are.”

Pariser explains that while these personalized search filters serve a purpose in helping us navigate the vast amount of information available online (an issue I delved into in one of last week’s posts), they provide us with a plethora of information on the subject of our search query, but nothing else. He says, “There’s a looping going on where if you have an interest, you’re going to learn a lot about that interest. But you’re not going to learn about the very next thing over. And you certainly won’t learn about the opposite view.” The result is a feedback loop that creates an informationally restricted environment: the filter bubble.
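To make that feedback loop concrete, here is a minimal toy sketch of how a personalization algorithm could narrow what a user sees over time. The articles, topics, and weights are entirely hypothetical; this illustrates the looping effect Pariser describes, not how Google or Facebook actually rank results.

    from collections import Counter
    import random

    # Hypothetical articles and topics, for illustration only.
    ARTICLES = [
        ("BP oil spill cleanup", "environment"),
        ("BP stock outlook", "investing"),
        ("Net neutrality debate", "policy"),
        ("New smartphone review", "gadgets"),
    ]

    def rank(articles, click_history):
        """Score each article by how often the user has clicked its topic before."""
        topic_counts = Counter(topic for _, topic in click_history)
        return sorted(articles, key=lambda a: topic_counts[a[1]], reverse=True)

    def simulate(rounds=10):
        """Each round, the user usually clicks the top result, reinforcing that topic."""
        history = []
        for _ in range(rounds):
            ranked = rank(ARTICLES, history)
            if history and random.random() < 0.9:
                clicked = ranked[0]                # click whatever personalization put on top
            else:
                clicked = random.choice(ARTICLES)  # occasional stray interest
            history.append(clicked)
        return Counter(topic for _, topic in history)

    print(simulate())  # after a few rounds, one topic dominates everything the user sees

Run it a few times and the same pattern emerges: whichever topic gets clicked first keeps getting promoted, and the other topics all but disappear from the user’s view.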

Pariser suggests a reasonable legal alternative to keep this from becoming a locked-in part of social media standard operating procedure: requiring websites to let users design their own privacy agreements. Rather than having users “read” pages of legal disclosures and then agree to become a site member, Pariser suggests “a standard format by which customers can have their own policy for how they want their data used.”
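As a purely hypothetical sketch of what such a user-authored policy might look like: no standard format actually exists, and every field name below is invented for illustration.

    # Hypothetical user-authored data policy; a participating site would consult
    # it before personalizing, advertising, or sharing data.
    user_data_policy = {
        "personalize_search_results": False,   # do not tailor results to my history
        "behavioral_advertising": False,       # no ads based on my browsing behavior
        "share_with_third_parties": False,     # do not share or sell my data
        "activity_log_retention_days": 30,     # delete my activity logs after 30 days
    }

    def site_allowed(action, policy=user_data_policy):
        """Check the user's own policy before taking an action with their data."""
        return bool(policy.get(action, False))

    print(site_allowed("behavioral_advertising"))  # False

The point of the sketch is simply that the policy travels with the user, instead of the user clicking through whatever terms each site dictates.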

Implementing this, along with the passage of net neutrality legislation, would help keep the Internet from becoming dominated by a few major media corporations, the disappointing fate of every telecommunications technology of the last century. This, Pariser claims, “is the project of the next couple of years.”

Pariser discusses the filter bubble at the 2010 Personal Democracy Forum: