Now I know what Facebook does, thanks to the recent controversy over the academic article by Cornell researchers collaborating with a research team at Facebook Inc. See Experimental evidence of massive-scale emotional contagion through social networks.
On your personal Facebook home page there’s an invitation to “Update Status,” which means entering text into a field to create a message (post). You can also add uploaded images, as well as links to other media and websites. These posts appear in reverse chronological order, and the arrangement on the page is called a Timeline. Visitors to the page can click Like on a post to indicate they’ve seen it, or acknowledge it, whether they really like the post or not. They can also attach comments, and add a link in their own Timeline to one of your posts.
There’s also a News Feed, which is a page that looks very similar to the Timeline, but contains posts from the Timelines of people with whom you have agreed to link up as Facebook friends. I used to think the News Feed showed all the posts of my friends, but there are likely too many of these, and the posts that appear by default are selected by a Facebook algorithm.
The News Feed also includes targeted advertisements and other information, and changes dynamically as new posts and events are created. Other people don’t see the same News Feed that I do as they probably have a different set of friends, and I can’t view other people’s News Feeds. I can however see other people’s Timelines, if I am their friend, and/or their permissions are set appropriately.
So the News Feed is an important feature of Facebook. It’s a channel for receiving and passing on personalised information — everything from alerts from friends to mainstream news items, local gossip, announcements, pronouncements, opinions, compliments, comfort, insults, abuse, scandal, quips and jokes.
With (conventional) mainstream media, broadsheets, local newspapers, and circulars, everyone in the catchment community receives the same news items and in the same configuration. As media theorists such as James Carey tell us, one of the values of the mass media, and the news, is that people can gather round and talk about the same information. There’s solidarity in that, and comfort. Facebook News Feeds don’t meet that particular need. We all see different news.
To the extent that anyone relies on their Facebook News Feed as a serious source of sociability, it’s a bit isolating. If you happen to have a jolly circle of friends who are always posting up-beat messages and pictures then you might be more inclined to join in and contribute to the positive melee with your own up-beat contributions. The reverse might apply if you are surrounded by melancholic poets and doom-sayers. Mostly we encounter a combination of posts that are bright, gloomy, tragic, euphoric or angry. But the items that appear in our default News Feeds are selected algorithmically.
What does the News Feed algorithm in default mode do? Facebook says “The News Feed algorithm uses several factors to determine top stories, including the number of comments, who posted the story, and what kind of story it is (ex: photo, video, status update).”
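Facebook’s actual algorithm is not public, but the three factors it names can be illustrated with a toy ranking function. Everything here — the weights, the per-friend affinity scores, and the way the factors are combined — is invented for illustration, not a description of Facebook’s real scoring.

```python
# Toy illustration of ranking posts by the three factors Facebook names:
# number of comments, who posted the story, and what kind of story it is.
# All weights and the scoring formula are hypothetical.

TYPE_WEIGHT = {"photo": 1.5, "video": 1.3, "status": 1.0}  # invented weights

def score(post, affinity):
    """Combine the three named factors into a single (made-up) score."""
    return (affinity.get(post["author"], 0.1)     # who posted the story
            * TYPE_WEIGHT.get(post["type"], 1.0)  # what kind of story it is
            * (1 + post["comments"]))             # number of comments

def top_stories(posts, affinity, n=10):
    """Return the n highest-scoring posts, best first."""
    return sorted(posts, key=lambda p: score(p, affinity), reverse=True)[:n]

posts = [
    {"author": "alice", "type": "photo", "comments": 4},
    {"author": "bob", "type": "status", "comments": 12},
    {"author": "carol", "type": "video", "comments": 0},
]
affinity = {"alice": 0.9, "bob": 0.2, "carol": 0.5}  # hypothetical closeness

feed = top_stories(posts, affinity, n=2)
```

The point of the sketch is simply that “top stories” is a ranking, not a chronology: a post from a close friend with a few comments can outrank a busier post from a distant one.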
The Cornell experiment involved nearly 700,000 Facebook users, and found that exposure to positive messages has an effect: individuals are likely to continue the line of positive messages they see in their News Feed, and negative messages prompt further negativity. These contagion effects are apparently more pronounced than the reverse, i.e. where happy messages make you feel a bit gloomy by comparison, or where people feel a bit better in the face of other people’s distress.
The researchers assert that the “contagion” effect is independent of the content of the news being reported. The effect is different to what happens when people pass on bad news. The researchers studied the words used and therefore the transmission of emotions (or moods), rather than information.
The research is controversial in part because, as part of the experimental protocol, the researchers deliberately adjusted the News Feeds of the targeted Facebook users. This enabled the researchers to analyse News Feeds and users’ posts under different conditions — News Feeds with positive and negative posts filtered in or out — to see if that influenced the users’ own subsequent posts.
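The manipulation described in the paper can be sketched in a few lines: posts are classified by counting emotion words (the study used the LIWC word lists; the tiny word sets below are stand-ins), and some fraction of posts of one valence is withheld from the feed. The function names, word lists and suppression rate here are all illustrative assumptions, not the study’s actual code.

```python
# Sketch of the filtering protocol described in the paper: classify each
# post's valence by word counting (stand-in lists below; the study used
# LIWC), then withhold a fraction of posts of one valence from the feed.
import random

POSITIVE = {"happy", "great", "love", "wonderful"}   # stand-in for LIWC lists
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def valence(text):
    """Crude word-count classification of a post's emotional valence."""
    words = set(text.lower().split())
    if words & POSITIVE and not words & NEGATIVE:
        return "positive"
    if words & NEGATIVE and not words & POSITIVE:
        return "negative"
    return "neutral"

def filtered_feed(posts, suppress, rate, rng):
    """Omit each post of the suppressed valence with probability `rate`."""
    return [p for p in posts
            if valence(p) != suppress or rng.random() >= rate]

rng = random.Random(0)
feed = filtered_feed(
    ["what a wonderful day", "this is awful", "meeting at noon"],
    suppress="negative", rate=1.0, rng=rng)
```

Note that nothing here measures mood directly: as in the study, only the words in posts are observed, which is one reason to be cautious about how much the results say about what users actually felt.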
The researchers could have conducted the experiment with a “living lab” of willing participants who had signed up to the project. Instead, it was conducted without people knowing that their behaviours were being subtly manipulated. The project was not just mining data but deliberately influencing individuals’ behaviour, without their explicit consent.
Even if the protocol involved consent, many Facebook users are alarmed that the streaming of personalised news and information can be controlled in this way, and countless comments about the experiment in online news reports and social media attest to a concern that Facebook is able to manipulate people’s emotions without our knowing.
The mood effects in the experiment may of course be real but insubstantial. What people say in their Status Updates may not always reflect their moods. Maybe the propagation effect is negligible, i.e. comments indicating joy and gloom don’t necessarily circulate around networks like viruses.
In any case we are used to the idea of mood manipulation in the mass media and in advertising, and in public events like festivals, the Commonwealth Games and the Olympics. Politicians and corporations can deploy public events to buoy people along towards some major decision point, like an election or referendum outcome.
Prominent amongst the complaints against the study, and Facebook, is the idea that the mood control in this case is directed at individuals, their behaviours and how they feel. Because the information stream (News Feed) is so personalised you can’t stand round the water cooler and share the mood, or talk about a shared anxiety. Apart from the privacy issues, the experiment shows that, for all its sociability, social media can leave you out in the cold.
Carey, James W. 1989. Communication as Culture: Essays on Media and Society. London: Routledge and Kegan Paul.
Kramer, Adam D. I., Jamie E. Guillory, and Jeffrey T. Hancock. 2014. Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111 (24), 8788-8790.
- Sometimes people comment on these blog posts on my Facebook Timeline.
- Nicholas Koumentakis passed on to me a link to a blog post discussing the Facebook algorithm and the problems of auditing algorithmically created content: http://civic.mit.edu/blog/natematias/uncovering-algorithms-looking-inside-the-facebook-news-feed [added 6 August 2014]