We’ve heard it called many things: Confirmation bias. Influence bubble. Like-minded people flock together.
Some interesting data is being pulled from the cloud to map the linkage between our social connections and our political views. And even more data is being organized and sold so businesses can sell you more products and services. Ad infinitum.
Here’s a chart, from a study of tweets and hashtags, analyzed for RED or BLUE political content. The colors (BLUE vs. RED political leanings) indicate which “side” of the controversy a person is on, and the position and distance indicate how closely two accounts are related (by following one another).
And if you take that information at face value, you can see the RED folks are much more diverse in their topics of affinity. And they appear to be less likely to engage in conversations with the BLUE folks. Nothing new there, you might think? But the abstracts from the studies paint an even more interesting picture of our internet mind.
A deeper study, this one of Facebook, looked at political opinion and our growing reliance on social media for our connections and … yes, news.
And here’s the abstract of that report.
Exposure to ideologically diverse news and opinion on Facebook
Exposure to news, opinion, and civic information increasingly occurs through social media. How do these online networks influence exposure to perspectives that cut across ideological lines? Using deidentified data, we examined how 10.1 million U.S. Facebook users interact with socially shared news. We directly measured ideological homophily in friend networks and examined the extent to which heterogeneous friends could potentially expose individuals to cross-cutting content. We then quantified the extent to which individuals encounter comparatively more or less diverse content while interacting via Facebook’s algorithmically ranked News Feed and further studied users’ choices to click through to ideologically discordant content. Compared with algorithmic ranking, individuals’ choices played a stronger role in limiting exposure to cross-cutting content.
But wait, it gets more interesting, as the scientists are looking for something bigger than trends and influences. Something called:
The rise of the social algorithm
Humanity is in the early stages of the rise of social algorithms: programs that size us up, evaluate what we want, and provide a customized experience. This quiet but epic paradigm shift is fraught with social and policy implications. The evolution of Google exemplifies this shift. It began as a simple deterministic ranking system based on the linkage structure among Web sites—the model of algorithmic Fordism, where any color was fine as long as it was black (1). The current Google is a very different product, personalizing results (2) on the basis of information about past searches and other contextual information, like location. On page 1130 of this issue, Bakshy et al. (3) explore whether such personalized curation on Facebook prevents users from accessing posts presenting conflicting political views.
Yep, you got it. We are mainly seeing the views of people we agree with on social networks. Facebook’s algorithm is tuning our experience to give us more content from people we LIKE and content like the content that we LIKE. As our Facebook profiles evolve, we are weeding out the people we don’t LIKE and, in effect, creating a bubble of like-minded friends.
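The feedback loop is simple enough to sketch in a few lines. This is a toy illustration only, not Facebook’s actual (proprietary) ranking; the post and LIKE-history structures are invented for the example.

```python
from collections import Counter

def rank_feed(posts, like_history):
    """Toy LIKE-driven ranker: score each post by how often we've
    LIKEd its author and its topic in the past, then sort.
    (Illustrative sketch only -- not Facebook's real algorithm.)"""
    author_likes = Counter(p["author"] for p in like_history)
    topic_likes = Counter(p["topic"] for p in like_history)

    def score(post):
        # Familiar authors and familiar topics float to the top.
        return author_likes[post["author"]] + topic_likes[post["topic"]]

    return sorted(posts, key=score, reverse=True)

# Hypothetical history: we've only ever LIKEd "blue" content.
liked = [
    {"author": "alice", "topic": "blue"},
    {"author": "alice", "topic": "blue"},
    {"author": "carol", "topic": "blue"},
]
feed = [
    {"author": "bob",   "topic": "red"},
    {"author": "alice", "topic": "blue"},
]

# The like-minded post ranks first; each new LIKE on it reinforces
# the scores further -- that self-reinforcement is the bubble.
print(rank_feed(feed, liked)[0])  # {'author': 'alice', 'topic': 'blue'}
```

Run it and the “red” post sinks to the bottom even though nothing ever hid it; our own clicks did the filtering, which is exactly the point the Bakshy et al. abstract makes about individual choice.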
Of course, the sparks can still fly when a political topic spurs cross-group commenting. Say one of your friends, one who doesn’t follow you or agree with you on about 90% of your opinions, is looking around and sees your comment on healthcare reform. If the debate burns strongly enough in them, they might venture a slap at your ignorant and naive perspective. And then, if the topic takes, a Facebook shit storm might erupt.
It’s true that very few opinions are changed in the process, but it is usually good fun to watch when one of these Facebook debates boils over into name-calling and Fox News links given as reference points. (Yes, it is easy to tell which side of the ideological debate I’m on.) But the fun comes from the banter that also takes place. Sparring is fun as long as everyone fights fair.
Again, I’m not going to convince you that Obamacare is a good thing on a Facebook post. I might get in a few of my ideas. (Reform is necessary. Obamacare is better than what we had, which was nothing.) And you might get in a few of your talking points as well. (Insurance companies are the winners, the president lied, the website sucked, Obamacare is evil incarnate.) And while these political opinions could open up into a healthy (pun intended) debate, on Facebook they usually devolve into mud-slinging. And in many cases the Conservatives who argue Fox News bullet points will unfriend or even block some of their more diversely referenced liberal friends.
It’s a shame we can’t take a few of these and create the debate, with live people, in a real setting. Invite all of the folks on this incendiary thread to a dinner party and see what happens.
- The conversation will be much more civil.
- Fox News may be used as a reference site, but only for comedic purposes. (I rarely see The Daily Show cited as fact.)
- When the conversation goes beyond social-media-sized soundbites, we can see there are valid points within many of the arguments.
In the end, my social political experiment may not yield any spectacular agreements between the two sides, but I bet we’d have fun and keep our wigs on. On social media it’s more fun to burst into flames, run screaming bloody murder, and see if we can burn the other person.
But our social algorithm is being built anyway. And it is being sold to Amazon, Google, Target, Time Warner, AT&T, and any other corporate machine that wants to keep you in a consumer state of mind. I recommend using tools like AdBlock and Privacy Badger, but in the end your profile is out there, and your preferences and buying habits are up for bid. Let’s hope our healthcare data isn’t available in the same type of auction.
- Red vs. Blue: Twitter Controversies Vividly Visualized in Study – NBC News
- Study Shows You Make Your Own Political ‘Filter Bubble’ on Facebook – NBC News
- Charts and abstracts from ScienceXpress