So while "emotional contagion" may sound like more of a sickness, it is really just a metaphor for describing large-scale transfers of emotion.
This research addresses emotions in the context of Facebook, and that connection gave the research team the luxury (academically speaking) of access to an enormous pool of test subjects - nearly 690,000 people, in fact. Facebook users agree to this kind of experimentation as part of the terms of service.
The study ran control and experimental groups for one week. All data was collected by computer, and the researchers never saw the posts themselves. Instead, they relied on a pre-selected set of keywords indicating a "positive" or "negative" emotional tone in a Facebook post, with software compiling the results into statistical data.
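Purely as an illustration, this kind of keyword-based counting might be sketched as in the Python below. The word lists and function names are our own placeholders, not the actual text-analysis software or dictionaries used in the study.

```python
# A toy sketch of keyword-based tone counting. The word lists below are
# illustrative assumptions only; the real study used dedicated text-analysis
# software with far larger dictionaries.

POSITIVE_WORDS = {"love", "nice", "sweet", "happy"}   # illustrative only
NEGATIVE_WORDS = {"sad", "angry", "hate", "awful"}    # illustrative only

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative' or 'neutral' by keyword presence."""
    words = set(text.lower().split())
    has_positive = bool(words & POSITIVE_WORDS)
    has_negative = bool(words & NEGATIVE_WORDS)
    if has_positive and not has_negative:
        return "positive"
    if has_negative and not has_positive:
        return "negative"
    return "neutral"

# The software only ever aggregates labels into statistics; no human reads the posts.
posts = ["What a sweet surprise!", "This day was awful", "Off to the shops"]
counts = {label: 0 for label in ("positive", "negative", "neutral")}
for post in posts:
    counts[classify_post(post)] += 1
print(counts)  # e.g. {'positive': 1, 'negative': 1, 'neutral': 1}
```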
Facebook doesn't control what you write; it controls what you see. To vary how often the different groups saw posts containing emotional words, the researchers removed a proportion of emotional posts from news feeds, testing whether seeing more (or less) emotional material would lead users, in turn, to produce more (or less) of it.
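Again only as an illustration, the manipulation can be sketched as below, with a toy word list and an arbitrary removal probability standing in for whatever parameters were actually used.

```python
# A toy sketch of the news-feed manipulation: each post judged "negative"
# (here by a crude keyword check) has some chance of being withheld from an
# experimental group's feed. The word list and probability are assumptions
# for illustration only.

import random

NEGATIVE_WORDS = {"sad", "angry", "hate", "awful"}  # illustrative only

def is_negative(text: str) -> bool:
    return bool(set(text.lower().split()) & NEGATIVE_WORDS)

def filter_feed(posts, removal_probability, seed=None):
    """Withhold each negative post from the feed with the given probability."""
    rng = random.Random(seed)
    return [p for p in posts
            if not (is_negative(p) and rng.random() < removal_probability)]

feed = ["This day was awful", "What a sweet surprise!", "Off to the shops"]
print(filter_feed(feed, removal_probability=1.0))
# -> ['What a sweet surprise!', 'Off to the shops']
```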
The researchers found that removing a proportion of negative content from news feeds tended to make users post less negatively in the week that followed. Likewise, users whose news feeds carried fewer positive posts engaged in a less positive manner over that week. Simply put, if you see less bad stuff, you tend to say fewer bad things. While this might seem innocuous, the prospect of broadly influencing the way more than a billion people think is a fairly scary one.
Fortunately, the effect the researchers found is small. In fact, the total change was as little as 0.1 per cent, far less than what we are accustomed to describing as significant.
The researchers argue, though, that this effect is still significant because of Facebook's gigantic user base. After all, 0.1 per cent of the total user base still amounts to more than a million of the 1.28 billion active monthly users.
Certainly, 0.1 per cent of Facebook's user base is a large number of people, but the effect itself is incredibly small. Indeed, with an effect this small, the result cannot confidently be attributed to the experimental manipulation. The change could instead be produced by incidental effects, such as copying and pasting bits of text, or by insincere or sarcastic remarks.
The other issue with the research is that the emotional spectrum is limited to "positive" or "negative" attributes. It's easy to think of "sad" and "happy" as being opposite to each other, but this doesn't quite work in other situations, nor do such emotions really tie to particular individual words all that well.
To feel "desire" could be on either end of the spectrum, with positive or negative emotional consequences depending on the context. So too with words such as, "love", "nice" and "sweet", which are all a part of the software used in the research.
It is quite easy to imagine these words appearing out of context, or in deliberately negative or vindictive ways ("I would love you to be quiet", "Isn't she 'nice'" and so on). Yet big data research has to cut corners such as these, and it is up to each reader to decide whether to accept such an approach.
The most interesting aspects of the article are not, in fact, a consequence of the research hypothesis. One is the fact that Facebook users appear to be weighted towards positive status updates:
- 47 per cent of posts contained positive words
- 22 per cent contained negative words.
The other finding that drew our attention is the "withdrawal effect". Facebook users who were exposed to fewer emotional posts - whether positive or negative - were less likely to engage (by posting, liking and so on) with Facebook later on.
Of course, the closer we are to people, the more likely we are to have other channels of communication: phone, email or even speaking in person.
This study is a reminder that the view of the world we get through Facebook is a partial one, and one that software engineers are tinkering with all the time. We can't necessarily rely on any single channel of communication to keep us emotionally connected to the world. The researchers note that our emotions are not determined by Facebook alone but by everything else going on in our lives. Then again, if we assume it is in Facebook's interest for users to be more engaged, it seems likely that emotional content will be favoured by the news feed's algorithm.
Facebook is not a complete project. It's always changing and developing, and research like this - particularly research produced in tandem with Facebook Inc. - gives us an insight into the direction that Facebook might head in the future, and the protections we might need to develop to use it safely.
Luke van Ryn and Robbie Fordyce are PhD candidates at the University of Melbourne.