The inner workings of Facebook are a mystery to most of its users, yet they actively divide social groups. Facebook gives the impression that when a user posts a status update, all of her friends receive it in their timelines, but that is not always true. The EdgeRank algorithm decides what is shown in the timeline based on three variables:
- Affinity: the interaction rate between the person who posted the update and the person who receives it.
- Weight: the relevance of the update, based on its content type (picture or text) and the interactions gathered around it (likes and comments).
- Decay: the content’s age. The older the update, the lower its score.
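The three variables above are usually summarized as a product per update. Facebook has never published the actual formula, so the following is only an illustrative sketch: the exponential decay and the half-life value are assumptions, not Facebook's real parameters.

```python
import math

def edge_score(affinity, weight, age_hours, half_life_hours=24.0):
    """Illustrative EdgeRank-style score: affinity x weight x decay.
    The exponential decay curve and 24-hour half-life are assumptions;
    the real formula is not public."""
    decay = math.exp(-age_hours * math.log(2) / half_life_hours)
    return affinity * weight * decay

# A fresh photo from a close friend outranks a two-day-old text post
# from a distant acquaintance, even with similar engagement.
close_fresh = edge_score(affinity=0.9, weight=1.5, age_hours=1)
distant_old = edge_score(affinity=0.2, weight=1.0, age_hours=48)
```

With scores like these, the timeline simply shows the highest-ranked updates first, so a low-affinity friendship quickly drops out of sight.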
This abstract description does little to convey the consequences of the EdgeRank algorithm for the formation of social groups. For example, someone who likes content only from right-wing friends might receive less and less content from left-wing friends. This is what Eli Pariser calls a filter bubble.
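The bubble emerges from a feedback loop on the affinity variable: liking raises affinity, ignoring lowers it, and low affinity means fewer chances to interact at all. A toy update rule makes the loop visible; the lift and drop rates here are arbitrary assumptions chosen only to illustrate the dynamic.

```python
def updated_affinity(affinity, liked, lift=0.2, drop=0.3):
    """Toy affinity update: liking a friend's post raises affinity,
    ignoring it lowers it. The 0.2 lift and 0.3 drop are assumptions."""
    if liked:
        return min(1.0, affinity + lift)
    return max(0.0, affinity * (1 - drop))

# A friend whose updates are never liked fades out of the timeline...
ignored = 0.8
for i in range(10):
    ignored = updated_affinity(ignored, liked=False)

# ...while liking even one update in three keeps the connection alive.
occasional = 0.8
for i in range(10):
    occasional = updated_affinity(occasional, liked=(i % 3 == 0))
```

After ten rounds the ignored friend's affinity has collapsed toward zero while the occasionally liked friend stays visible, which is the self-reinforcing isolation the filter-bubble argument describes.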
To discuss the impact of EdgeRank on the formation of political groups, I proposed a parlour game to be played at the Brazilian Free Software Forum, where the Social Participation Lab was organizing public experiments with current and future social participation technology. Facebook is the channel most used by Brazilian politicians to interact with their electorate, and perhaps also the main channel for citizens to discuss politics with each other. The goal of the public experiment was to make Facebook’s mediation of the political debate explicit.
In the game, players share updates written on Post-its with other players to whom they are connected by a wool thread (a friendship connection). Every player starts with a set of 21 like stickers to put on updates from other players. When a player receives an update, she may either put a sticker on it (the equivalent of pressing the like button) and forward it to another friend, or return it to the sender. Once an update reaches its author again, it can no longer be distributed, but as long as everybody keeps liking it, it circulates across the network. The winner is the player with the highest score: the likes collected on her own updates minus the like stickers remaining in her hand.
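The scoring rule can be stated in a few lines. The player names and numbers below are hypothetical, but the arithmetic is exactly the rule above: likes received on your own updates minus the stickers you kept.

```python
STICKERS_PER_PLAYER = 21  # every player starts with 21 like stickers

def final_score(likes_received, stickers_left):
    """A player's score in the parlour game: likes collected on
    their own updates minus the like stickers still in hand."""
    return likes_received - stickers_left

# Hypothetical endgame: a generous player who gave away all 21 stickers
# beats a hoarder who collected more likes but kept 10 stickers back.
generous = final_score(likes_received=15, stickers_left=0)
hoarder = final_score(likes_received=18, stickers_left=10)
```

Penalizing unused stickers rewards exactly the behavior the real algorithm rewards: constant interaction.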
In the experiment at the Free Software Forum, we asked players to post updates about the reduction of the penal age, a constitutional change currently being discussed by the Brazilian Congress. The updates that got the most likes were those in favor of the reduction, although there were more updates against it. The popular updates circulated so much that at some point nobody knew who the original author was, so the updates could not be returned to him or her. We considered this analogous to viral spread.
We could also observe changes in friendships. If all of a player's updates were rejected by a friend, he would stop sending updates to that friend. In contrast, a player who liked everything had better chances of making new friends and spreading his own updates. Not surprisingly, the winner was the player who gave away all his like stickers at the beginning of the game.
After playing, we held a debate about the algorithm. The participants were surprised to learn how quickly they can be isolated into bubbles if their behavior is consistent. They learnt that if they want to stay aware of what others are thinking, they need to like content that they do not really like; otherwise the algorithm stops showing it to them. This rather counterintuitive behavior is the only way to avoid being isolated with people of the same opinion on Facebook.
The Facebook parlour game proved a good approach for bringing the controversy around social network algorithms to the public. We expect the game to also be useful for designing algorithms that prevent people from becoming isolated.