CHAPTER 4

HOW SOCIAL INFORMATION CHANGES THE WORLD

Information about what other people are doing is everywhere on social media. On Facebook, users can see the most intimate details of what their friends are up to, who liked or shared what, when, and where, and simultaneously, on another part of the screen, find out what is most popular on Facebook across the nation or across the world. For any person users decide to follow on Twitter, they will also know how many other people have made the same choice; for any piece of information they view, they will also know how many times it has been shared; and at the same time, they will be made aware of what is trending on Twitter in real time. On YouTube, they will see how many people have watched any video they care to view and also the most popular videos on the YouTube platform as a whole, or in any category they select. Platforms vary in the way in which they alert people to what others are doing and in the amount of such information they provide. YouTube, Facebook, and Twitter abound with it, while more private messaging systems like Snapchat and WhatsApp have less. But all provide some kind of social information about the actions of other people.

What effect on behaviour does all this social information have? In previous chapters, social information has been identified as a form of influence exerted by social media on people deciding whether to participate. When citizens are invited to take part in a small way in some kind of political campaign or mobilization by friends, campaigning organizations, or some remote part of online social networks, these requests are accompanied by some kind of social information, and this information will influence their decision making. In the last chapter, we started to investigate the effect of one particular kind of social information, trending information, on people’s propensity to sign petitions. We used a natural experiment and aggregate data to establish that petitions included in a ‘trending’ list receive disproportionately more signatures than those that are not, making popular petitions even more popular and unpopular petitions even more unpopular, reinforcing inequality in the competition for public attention. Here we develop our analysis of social information effects at the individual level, investigating how social information acts upon individual subjects in an experimental setting.

We borrow the term ‘social information’ from social psychology, where ‘social information processing’ refers to the study of the informational and social environments within which individual behaviours occur and to which they adapt.1 Social information may be presented in a variety of ways: as an aggregate count of participants, as the total amount contributed to a cause, or as some indication of relative popularity.2 Potential participants construe this information as representing the behaviour of a ‘generalized other’ or social aggregate and take it into account when they are deciding whether to participate.3 Social information may crowd out contributions (by making it appear that enough have been made already) or crowd them in (by making potential contributors feel that they too should contribute), with evidence pointing to the latter as a result of conformity, social norms, or reciprocity.4 Social information can also operate by affecting the perceived viability of a political mobilization, thereby altering the perceived benefits of joining and the incentives of individuals to participate.

As discussed in Chapter 2, in online environments there is a far greater possibility of potential participants receiving real-time feedback about how many other people have undertaken any kind of activity, however small: how many have signed, participated in, donated to, supported, shared, liked, or downloaded something. Before social media became so ubiquitous, such information was likely to be available only when undertaking a high-cost act of political participation, such as joining a political party or attending a demonstration. Even then, the accuracy of the information was often questionable, with political parties always inclined to overstate their membership numbers and organizers of demonstrations often arguing with other bodies (such as the police) about how many demonstrators were involved. In a 2003 London demonstration against the Iraq War, police estimated the number of protesters at three-quarters of a million, while the organizers claimed two million.5

In this chapter we start to probe the mobilization patterns analysed in Chapter 3. Can the provision of social information and its effect on individual decision making at different points in time explain the shape of the mobilization curves? We present two experiments designed to tackle this question, by investigating social information effects on the propensity of people to engage with political representatives and to support global issues relating to public goods. First, however, we review previous work on social information and collective action, identifying four lines of argument on social information effects from across the social science disciplines of political science, economics, communication, psychology, and sociology. Second, we discuss some of the strengths and flaws of previous experimental work in this field, and review more generally the experimental method. Third, we outline and report the results of our first experiment, an online field trial to test the effects of social information on people’s propensity to write to their representatives on the social enterprise website writetothem.org. Fourth, we discuss our second experimental design, a quasi-field experiment where participants were asked to support global political issues under varying levels of social information. Finally, we discuss the implications of these experiments for political participation on social media more generally.

HOW SOCIAL INFORMATION PROMOTES COLLECTIVE ACTION

What do we know already about the impact of social information on political participation? There is a body of work on collective action in political economy, sociology, and psychology, where theorists and empirical researchers have considered the informational context of participatory decision making. We consider below four distinct arguments regarding the effects of social information that have been identified in previous work: social pressure, conditional cooperation, tipping points, and bandwagon effects. Research that uses these perspectives identifies different causal mechanisms to explain how citizens react to social information.

The first argument, that social information can exert social pressure on individuals’ decisions about whether to participate, can be linked to Olson’s classic work The Logic of Collective Action, in which he argues that individuals take into account information about the potential size of the group when they consider whether to participate.6 If they perceive the group to be small, they consider it worthwhile to contribute; but in a large latent group no member will be significantly affected by the actions of another, so no one will have an incentive to contribute. Although Olson does not discuss social information explicitly, he does consider the effect of social pressure in incentivizing group members to participate in small groups, but discards it for larger groups: ‘In general social pressure and social incentives operate only in groups of smaller size, in groups so small that the members can have face-to-face contact with one another’.7 As discussed in Chapter 1, there is a body of work discussing how the widespread use of the Internet could affect Olson’s thesis, particularly by reducing the costs of coordination, which makes large groups more viable.8 Lupia and Sin point to a possible effect of the Internet’s capacity to provide social information as a form of coercion,9 highlighting a footnote in The Logic of Collective Action: ‘If the members of a latent group are somehow continuously bombarded with propaganda about the worthiness of the attempt to satisfy the common interest in question, they may perhaps in time develop social pressures not entirely unlike those that can be generated in a face-to-face group, and these social pressures may help the latent group to obtain the collective good’.10 Writing in the pre-Internet era, Olson argued that such social pressures would be prohibitively expensive for groups to exert, but decades later ‘evolving technologies reduce substantially the costs of communicating with large audiences’.11 We join Lupia and Sin in viewing this hypothesis as worthy of testing, a test we perform below.

Economists studying the effect of information about the contributions of others on people’s willingness to undertake pro-social behaviour, in particular making charitable donations, have labelled it ‘conditional cooperation’ (that is, cooperation dependent on evidence of the contribution of others). Such work has shown that social information increases charitable giving and willingness to participate in public goods provision and that this effect increases with larger numbers of additional participants.12 It has also been shown that people are likely to increase their contribution (by donating more money, for example) if they know that other people are increasing the size of their commitment.13 This work provides robust evidence of social information effects, and we follow these researchers in using experiments to vary randomly the existence and level of social information provided to participants. But the experiment presented here diverges from this work in a number of ways. First, most of the work undertaken thus far on conditional cooperation looks at charitable donations, rather than an explicitly political context as we do here. Second, work on conditional cooperation has tended to focus on the influence of social information on contribution amount, rather than participation per se. Psychological research shows that ‘decisions about whether to act and about how much to act, although positively correlated, may be caused by different psychological motivations’,14 indicating that the specific question of how social information affects people’s decision whether or not to act politically is worthy of further investigation.

Other claims about social information effects can be drawn from sociological work on critical mass and tipping points. The sociologists Marwell and Oliver claim (in contrast to Olson) that larger groups find it easier to form, as their size ensures a critical mass of activists who organize around public goods and any indication of large group size will act as a sign of viability and increase the rate of mobilization.15 They claim that the costs of collective action do not vary with group size because they are the same regardless of the number of potential contributors, so it is irrelevant how many others there are over and above the critical mass, removing the freeriding problem: ‘It is not whether it is possible to mobilize everyone who would be willing to be mobilized.… Rather, the issue is whether there is some social mechanism that connects enough people who have the appropriate interests and resources so that they can act’.16 In this view, evidence of a critical mass sends a vital signal to potential participants about the viability of the group, and social information of this kind could act as a tipping point, making a movement self-sustaining and bringing a rapid increase in participation. As discussed in Chapter 1, both Schelling and Granovetter used threshold models to develop the concepts of critical mass and tipping points and to explain why some mobilizations succeed and some do not.17

The fourth argument supporting the idea of a dynamic relationship between collective action and social information is based on the bandwagon effect, a label given to situations where information about majority opinion causes individuals to rally to that opinion. In the same way, some authors have argued that individuals who perceive themselves to be in the minority will feel pressure either to express the majority opinion or to remain silent, in what has been termed the ‘spiral of silence’,18 which would act to reinforce the bandwagon effect. Conversely, an underdog effect is held to exist if low levels of social information cause some people to support a minority view.19 Studies of the bandwagon effect are usually carried out on voting behaviour—where opinion polls provide the social information—and have also been applied to public opinion on key policy issues,20 reflecting the concern of such research with opinion formation. Researchers of the bandwagon effect are interested in whether potential participants change their views in response to knowing the views of others, rather than in people’s willingness to participate at all, as in the research reported here. However, given that the effects of social information on these different parts of the decision-making process can be difficult to distinguish, we use the bandwagon idea to provide an alternative hypothesis for the effect social information might have. Empirical support for the bandwagon effect is sparse,21 and where an effect has been identified it seems to apply only to social information about trends rather than to current levels of support. Nadeau et al. find an effect only for information about trends presented without actual numbers.22 Marsh also shows that information about ‘static’ public opinion (that is, absolute percentages of support) has no effect on people’s willingness to participate, although information about dynamic public opinion trends (e.g., that support is rising, as in trending information) has an effect on support.23 A meta-analysis of survey studies from spiral of silence research finds little support for the theory.24

Previous research thereby provides four alternative arguments about the possible effects of social information on collective action: social pressure, critical mass, conditional cooperation, and the bandwagon effect. We test how social information provided online affects political participation by fostering conditional cooperation,25 by applying social pressure,26 by sending a signal of viability or critical mass,27 or by generating a bandwagon effect.28 We focus on anonymous information about other people rather than on effects deriving from individuals’ social and personal networks. Our expectation is that social information about the preferences of others will affect the decision whether to incur costs in the pursuit of collective action, with the influence varying according to the levels of participation. Previous work provides us with alternative expectations about the effect of different levels of social information. That is, work on conditional cooperation, critical mass, the bandwagon effect, and Lupia and Sin’s revision of Olson’s social pressure argument leads us to predict that social information showing high numbers of participants would have a positive effect, either because it is an indicator that other people are cooperating and therefore encourages reciprocity or compliance with social norms; because it exerts social pressure; because it acts as a signal of viability, indicating the likelihood of attaining critical mass; or because it indicates majority opinion and exerts a bandwagon effect. Where social information is low, Olson would argue that evidence of a small number of petitioners could encourage individuals to incur the costs of signing up, while the other views would claim that low levels of social information would have a negative effect, in the early stages of a petition for example.

Within this generalised pattern, the different arguments discussed above would lead us to expect different effects from social information at the higher end. For conditional cooperation, we expect social information to have a greater effect for reports about high numbers of other participants. However, experimental work has indicated that the differential effects would not be very large. Frey and Meier found, for example, that for two treatment groups given information about a relatively high percentage of contributors to a charitable campaign (64 percent) or a relatively low percentage (46 percent), participation rates varied by only 2.3 percent, which was not a statistically significant difference.29 For arguments about social pressure, with the hypothesis that indications of large numbers of participants could exert the same type of social pressure as Olson observed for small groups, there is little available evidence to inform our expectation of the relative weights of such pressure; large numbers might have the same effect as small numbers, meaning that the effect of social information would be relatively consistent, or could dip for middling numbers. For bandwagon effects, we expect that there would be a continuously positive effect of information about the participation of others, which would increase in a cumulative way, yielding an exponential curve if the percentage of people participating were plotted against the percentage expected to participate.

EXPERIMENTING WITH SOCIAL INFORMATION

Experimental methods are ideally suited to investigating the effect of social information. As discussed in Chapter 1, by randomly varying the information provided to subjects and observing the resulting effect on their behaviour, we can obtain unbiased estimates of how different kinds of information affect participation. Provided they have been implemented without threats to internal validity, experiments support causal inferences about the effect of one variable on another.

Experiments have already been used to test some elements of the four groups of arguments outlined above, first to test the social pressure claim, particularly to investigate people’s willingness to undertake environmentally conscious behaviour, but also for charitable donations and voting turnout. In Goldstein et al.’s widely reported experiment with the recycling of towels in hotels, a treatment group received a message to say that 75 percent of guests had recycled their towels.30 This group was 26 percent more likely to recycle than those who saw the basic pro-environmental message about not laundering towels unnecessarily. Where participants were given more local information, which was feedback information on the past recycling behaviour of guests who had used the same room, the difference with the control group was even greater. Schultz conducted a randomized controlled trial examining the impact on recycling behaviour of providing written feedback on individual and neighbourhood recycling behaviour, finding statistically significant increases from baseline in the frequency of participation and total amount of recycled material.31 The most influential treatments were door hangers informing households of the average amount of material collected from householders and the percentage participating in recycling in their immediate locality. Similarly, Gerber, Green and Larimer ran a large-scale field experiment to show the positive effect of social pressure on voter turnout, by telling subjects which of their neighbours had voted,32 which has been followed up by a series of papers and tests.33 For conditional cooperation, economists have used laboratory experiments involving public goods and cooperation games,34 and, more recently, field experiments in which subjects are provided with varying levels of information about the participation of others.35

There is much less experimental research investigating either critical mass or bandwagon effects. Marwell and Oliver give a largely theoretical argument for the existence of critical mass and do not attempt to put a numerical value on it, either in terms of absolute numbers or percentages, nor do they test its existence empirically. However, empirical support for the bandwagon effect does come from an experiment, suggesting that there is an effect of around 5 to 7 percent;36 when subjects were told that opinion was growing for an issue, they were 5 to 7 percent more likely to support this issue themselves compared with a control group. The meta-analysis of spiral of silence research mentioned above found the field to be dominated by survey-based studies, but noted that ‘experimental studies are perhaps better suited’ to answer the type of questions asked.37

In the next section, we present the first of our two experiments aimed at answering these kinds of questions by testing the effect of social information on people’s participatory behaviour. We examine how social information influences people’s propensity to engage with politicians by contacting them about a concern or a policy issue.

AN EXPERIMENT ON CITIZEN ENGAGEMENT WITH POLITICIANS

The potential influence of social information extends to the whole range of political behaviours. One common action is for citizens to contact their politicians, which they may do for a variety of reasons: to express a viewpoint on a policy issue, to complain about something in the jurisdiction, to comment on the quality of a public service after a recent experience, or to make a representation about a wider policy issue that they care about. Websites and social media make the process much easier than before, as previously it was harder to find out who one’s local elected representative was. It may have been possible in the past to phone the local council to ask for the information; or perhaps a leaflet had been dropped off and the citizen had kept it; or maybe the details were found in a telephone book. It is likely that many people who thought about writing to their representative did not follow through because of the associated costs, unless they were very angry. Now such information is readily available online.

As introduced in Chapter 2, the nonprofit organization mySociety runs a number of web applications that aim to make it easier for citizens to contact or engage with representatives. For politicians in the United Kingdom, the associated website is WriteToThem. Users fill in their postcode and the screen lists their representatives, such as the local MP, members of the European Parliament, and local government councillors. The website indicates the activities these representatives are responsible for. It is possible to click on a name, and the website generates a form starting with, for example, ‘Dear Emily Thornberry …’, an MP for Islington (where one of the authors lives), followed by blank space for completing the letter. All the user has to do is fill out some personal details, then click on the ‘Preview and Send’ button, and the letter goes straight to the MP. Here is a case where an Internet-enabled application reduces the costs of participation.

The experiment utilizing WriteToThem was conducted from October to December 2013. It took advantage of a common phenomenon when people examine information and decide what to do next—only some follow through and make an enquiry through the website. The mySociety tools enabled us to vary randomly the information that the respondents saw on their screen. Respondents were randomly allocated to two groups. For the control group, the screen remained as before, with no social information; the intervention or treatment group was shown social information in the form of how many others had already written to the MP (see an example in Figure 4.1). Over the three-month period, this field experiment produced a sample of 18,818 people in the control group and 11,818 in the treatment group.


FIGURE 4.1 Screenshot of example of the intervention in the WriteToThem experiment. Courtesy of Tom Steinberg, director, mySociety.

Our expectation is that overall, social information will not affect the numbers of people in the treatment group who write to MPs, as different levels of social information will have different effects. That is, the information that many other people have written to an MP could increase the propensity to write for those users who saw information about that MP in comparison with the control group, while information that few other people have written could decrease that propensity, with the potential that the effects of high and low levels of social information will cancel each other out. The social information shown varied widely: at the lower end, fifty-five subjects saw MPs to whom nobody had written, and at the top end, fifty-one subjects saw an MP to whom 560 others had written (Gerald Kaufman, a former minister). The mean is 169 and the spread (standard deviation) is 83 (see Figure 4.2, a histogram of the numbers shown to the treatment group). The effect of positive social information should increase the numbers of people going forward, but we would expect this to be countered by the drag from people who saw low numbers and would be discouraged.


FIGURE 4.2 Variations in the social information given to users in the treatment group

Moving to the results, overall 39 percent of visitors who went to the message page for their MP went on to send a letter. Within this total, 32.6 percent of the control group sent a message, compared to 49.5 percent of those who saw social information, a significant difference (z = 20.4, p < .001). This is a substantive social information effect, and it runs contrary to our expectation that social information would not make a significant difference overall until we controlled for the actual numbers shown. The finding is all the more remarkable because we might expect a third variable to be dampening the difference between the control and treatment groups. That is, some MPs have higher profiles than others, for example through ministerial roles or high media exposure. We might expect these high-profile MPs to receive more contacts from members of both the control and treatment groups anyway, regardless of whether social information is provided, and this ‘third variable effect’ could dampen the impact of social information by reducing differences between groups in a non-random way. We proceeded, therefore, to test the effect of varying levels of social information, which might control for the effect of this third variable.
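To make the form of this comparison concrete, the sketch below shows the kind of two-proportion test reported here. The counts are illustrative only, chosen to approximate the reported rates; the actual denominators (and hence the exact z statistic) come from the page-visit records held by mySociety, and the Python code is our own illustration rather than the analysis scripts used in the experiment.

```python
from statsmodels.stats.proportion import proportions_ztest

# Illustrative counts only: letters sent and page visits for the
# treatment and control groups, chosen to approximate the reported
# rates (49.5 percent vs 32.6 percent); the true denominators behind
# z = 20.4 may differ from these.
sent = [5850, 6135]        # letters sent: treatment, control (hypothetical)
visits = [11818, 18818]    # page visits: treatment, control

z, p = proportions_ztest(count=sent, nobs=visits)
print(f"z = {z:.1f}, p = {p:.3g}")
```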


FIGURE 4.3 Percentage of messages sent to MPs grouped into low, medium, and high

Grouping the numbers shown for each MP into three bins of equal size (low, medium, and high), we find that a greater percentage of citizens viewing MPs in the medium and high groups go on to send their message than of those viewing MPs in the low group (Figure 4.3). Within the treatment group, however, the percentages of citizens going on to submit their letter do not differ significantly between the groups.38 That is, while the overall effect of social information is substantive, a high level of social information has no greater effect than a low level. Again, the finding that all levels of social information had more or less similar effects is contrary to what we expected.
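The binning step can be read in more than one way; the sketch below assumes equal-count bins (terciles) over the numbers shown to the treatment group, with hypothetical file and column names, and simply tabulates the proportion going on to send a letter in each bin.

```python
import pandas as pd

# Hypothetical treatment-group data: one row per visitor, recording the
# number shown as having already written to that MP and whether the
# visitor went on to send a letter. File and column names are ours.
treat = pd.read_csv("writetothem_treatment.csv")
treat["bin"] = pd.qcut(treat["num_already_written"], q=3,
                       labels=["low", "medium", "high"])
print(treat.groupby("bin", observed=True)["sent_letter"].mean().round(3))
```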

So social information seems to be affecting people’s propensity to contact their representative, in a field experiment setting that provides us with external validity and allows us to generalize to other real-life contexts. Social information has a strong positive effect overall, but we have not identified any variations in effect from different levels of social information, something observed in some of the experiments discussed in the second section. There are three possible explanations for this finding, all of which derive from the field experiment setting. First, the third variable noted above, the profile of MPs, might also be having a confounding effect, in that it affects the social information (the independent variable) as well as the propensity for citizens to write to that MP (the dependent variable). A second possible explanation, operating at the higher levels of social information, is provided by the small numbers of participants in the higher groups; only fifty-one participants saw social information numbering over five hundred (those who saw Gerald Kaufman), which reduces our ability to test for the effect of social information at this level, particularly as even this level might not be regarded by participants as especially high, in a constituency of over seventy-six thousand. A third explanation is provided by the distinctive context of contacting a representative in a UK setting, where this action is an official part of the citizen redress process. Constituents who are writing to complain about an issue relating specifically to themselves, in this sense a private rather than public good, are unlikely to be affected by the information that very few or no other citizens have contacted the representative, and in this way are immune to the social information effect.

We proceed therefore to our next experiment, using a more controlled environment that allowed us to design out some of these issues deriving from the field setting and to test for varying levels of social information with no third-variable effect of the kind introduced by the profile of the representative. We could also ensure that there were sufficient numbers of participants receiving higher levels of social information, and use a context, support for global political issues concerning public goods, where the mechanism for reacting to social information was likely to be similar across issues.

AN EXPERIMENT ON WILLINGNESS TO SUPPORT GLOBAL POLITICAL ISSUES

In order to test for the effect of different levels of social information, we designed a quasi-field experiment,39 drawing on experiments that have tested for social information effects in non-political contexts, particularly those conducted by Goel et al. and Salganik et al.40 As discussed in Chapter 3, Salganik et al. explored the effect on cultural markets of information about other people’s preferences through an Internet-based field experiment that encouraged people to participate by providing them with free downloads of songs.41 For political activity, however, such incentives are difficult to build into an Internet-based environment, so we adopted a quasi-field design, where we used subjects from our own subject database, but they participated remotely in their own homes using a custom-built interface, making the environment more ‘natural’ and allowing us to lower the incentives in comparison with the laboratory. In this way, we could (to some extent) control people’s exposure to other forms of social information during the course of the experiment (which was strictly timed), but were able to recruit a far larger sample than would have been possible in a laboratory setting, thereby overcoming some of the external validity problems of laboratory experiments and some of the internal validity problems of field experiments.42

From our OxLab subject database we recruited a subject pool of 668 people who participated in the experiment remotely by considering six global issues successively through a custom-built web interface. They were invited (1) to express their willingness to sign a petition supporting the issue and (2) to donate a small amount of their participation fee to either supporting or opposing the issue. To express their willingness to sign a petition, subjects were required to provide their name, email, and address. This meant they had to incur some costs to support their statement even though they did not sign the petition. As a second step, we asked participants if they would like to donate twenty pence towards or against each issue, a sum that the experimenters matched in the final donation. We randomly allocated the subjects to a control group (of 173) and a treatment group (of 495).

All participants received the same six petitions with different levels of social information. In the control condition, participants received no information about how many people had already signed. For social information, we defined three categories: high, medium, and low. We defined high numbers as over one million (having observed a distinctive effect for numbers over one million in the pilot laboratory experiment), low numbers as below one hundred, and medium numbers as between one hundred and one million, after observing a weaker but still statistically significant effect of medium numbers in the pilot. We randomized the social information (low, medium, or high numbers of other signatories) across subjects for each petition, allowing us to control for effects specific to particular issues as well as for specific levels of social information.

In the treatment groups, subjects were shown two petitions in each of the following categories:

•  Petitions with very large numbers of signatories (S > 1 million)

•  Petitions with medium numbers of signatories (100 < S < 1 million)

•  Petitions with very low numbers of signatories (S < 100)

We randomized the order in which participants saw the six petitions to eliminate systematic biases relating to ordering. Members of the control group were shown the same petitions, with no social information, again in random order.
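As one way of picturing this design, the sketch below shows how the assignment of social-information levels and presentation order might be generated; the petition labels, the per-subject shuffling, and the use of Python are our own illustration under stated assumptions, not the software actually used to run the experiment.

```python
import random

PETITIONS = ["tibet", "cluster_bombs", "whaling", "darfur", "climate", "fair_trade"]
LEVELS = ["high", "high", "medium", "medium", "low", "low"]  # two petitions per level

def assignment(subject_id, treated):
    """Return the (petition, social-information level) sequence for one subject."""
    rng = random.Random(subject_id)        # reproducible per-subject randomization
    order = PETITIONS[:]
    rng.shuffle(order)                     # presentation order is random for everyone
    if not treated:
        return [(p, None) for p in order]  # control group: no social information shown
    levels = LEVELS[:]
    rng.shuffle(levels)                    # which petition carries which level varies by subject
    return list(zip(order, levels))

print(assignment(subject_id=42, treated=True))
```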

The issues were all selected to be of international significance, and petitions were drawn from across different geographical spaces and points in time within the previous three years. The petitions were shown in a generic format (to control for the reputation effect that different web platforms would bring), but the numbers of signatories shown to the participants were taken from existing online petitions that had been created on these issues with different numbers of signatories (low, medium, and high). In this way we ensured that there was no deception in the experiment, because each was a representation of a real online petition, with numbers that were current for that petition in some context.

The petitions were (with the high, medium, and low numbers shown in parentheses) the following:

1.  National governments should put pressure on the Chinese leadership to show restraint and respect for human rights in response to protests in Tibet (high: 1,682,242, medium: 1,189, low: 76)

2.  National governments should negotiate and adopt a treaty to ban the use of cluster bombs (high: 1,200,000, medium: 330,000, low: 7)

3.  Governments should lobby the Japanese government to stop commercial whaling of the humpback whale (high: 1,082,808, medium: 57,299, low: 98)

4.  Governments should support a stronger multinational force to protect the people of the Darfur region of Sudan (high: 1,001,012, medium: 5,978, low: 16)

5.  World leaders should negotiate a global deal on climate change (high: 2,600,053, medium: 575,000, low: 53)

6.  Governments should work to negotiate new trade rules—fair rules to make a real difference in the fight against poverty (high: 17,800,244, medium: 22,777, low: 25)

Subjects did not sign the (real) petitions in the experiment, but at the end of the experiment the interface directed them to a site where they could. The research team made the donations to the causes after the experiment. We incentivized the participants with a small payment (six to eight pounds, paid in Amazon.co.uk vouchers), which varied according to the amount they chose to donate. A pre-experiment questionnaire established the extent to which participants agreed (or not) with the issues in the petitions. We anonymized all subject information and did not collect addresses, in order to isolate the social information effects (visibility is investigated with a different experimental design in Chapters 5 and 6).

To calculate our results, we examined the variance across the 4,008 petitions completed by the control and treatment groups, having stacked the data (one record for each person-petition). Overall, participants in the control group signed 61.5 percent of the petitions. When presented with low numbers, participants in the treatment group signed slightly less often (0.9 percentage points less) than those in the control group, and when presented with medium numbers they signed slightly more often (1.9 percentage points more). Neither of these results is statistically significant, however, so we do not find evidence to support our hypothesis that small numbers per se discourage participation. When presented with high numbers, the treatment group signed 66.7 percent of the time, 5.2 percentage points more than the control group. This is a statistically significant difference (p = .015), confirming our first hypothesis that high numbers (in this case of one million or more) increase participation. Figure 4.4 shows the percentages of the treatment group signing each petition, compared with the control group, shown as the baseline. We found the effect of the high numbers treatment to be strongest for the petition on fair trade, which also had by far the highest number of signatories in this category (17.8 million), leading to a further hypothesis that the effect of high numbers varies according to the number of signatures in a linear way. But when we tested this hypothesis by using the logarithm of the number of signatures in a regression, we found no statistically significant effect.
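As an illustration of how such a stacked person-petition file can be analysed, the sketch below compares signing rates in each treatment condition with the control group; the file and column names (‘condition’, ‘signed’) are hypothetical, and the code is our own reconstruction of the general approach rather than the original analysis.

```python
import pandas as pd
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical stacked file: one row per person-petition (668 subjects
# x 6 petitions = 4,008 rows), with a condition label and a 0/1 outcome.
df = pd.read_csv("stacked_petitions.csv")

control = df.loc[df.condition == "control", "signed"]
for level in ["low", "medium", "high"]:
    treated = df.loc[df.condition == level, "signed"]
    z, p = proportions_ztest([treated.sum(), control.sum()],
                             [len(treated), len(control)])
    diff = 100 * (treated.mean() - control.mean())
    print(f"{level}: {diff:+.1f} points vs control, p = {p:.3f}")
```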


FIGURE 4.4 How different levels of social information affect signing behaviour

Note: Percentages of treatment group participants signing each petition, compared with the control group, shown as the baseline. Low, medium, and high refer to the social information provided.


FIGURE 4.5 Logistic regressions showing the effect of varying levels of social information

Note: The figure combines three separate logistic regressions: one for high numbers, one for medium numbers, and one for low numbers. The two terms marked with asterisks (constant and agree with issue) are the averages from these three regressions. The remaining terms appeared in separate regressions but are shown on the same plot for readability.

To delve further into the relationship between social information and willingness to support an issue, we ran separate logistic regression models for subsets of the data comprising participants of one treatment together with the control group, using the high, medium, and low numbers as independent variables to represent social information.43 It seemed likely that the effect of social information on an individual’s likelihood to sign would vary according to the extent to which the person supports the issue at stake. So we looked at the variable indicating subjects’ agreement with a given petition, which we had measured in a pre-experiment questionnaire. Initial support varied across the issues; for example, climate change (92 percent) and fair trade (91 percent) were by far the most popular issues, while protect Darfur (77 percent) and end whaling (79 percent) were the least popular and also had the highest numbers of undecided subjects (11 percent).

As expected, initial support for the issue has a strong positive effect on people’s willingness to sign, so we controlled for this variable in all subsequent analysis. The summary of the three logistic regressions for different levels of social information is shown in Figure 4.5. For each variable, the point shows the size of the coefficient (measured on the x-axis) and the width of the line shows a 95 percent confidence interval for the coefficient. Only variables with confidence intervals that do not cross zero are significant at the 95 percent confidence level. Only for high numbers did we observe a consistent and statistically significant effect on the likelihood of signing, which confirms our prediction, and the results from the descriptive statistics shown in Figure 4.4.
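A minimal sketch of the kind of model behind Figure 4.5 follows, fitting a separate logit for each treatment level against the control group while controlling for prior agreement with the issue; the column names and the use of the statsmodels formula interface are our own assumptions, not a record of the original estimation code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Same hypothetical stacked file as above, with an additional 0/1
# 'agree' column from the pre-experiment questionnaire.
df = pd.read_csv("stacked_petitions.csv")

for level in ["low", "medium", "high"]:
    subset = df[df.condition.isin([level, "control"])].copy()
    subset["treated"] = (subset.condition == level).astype(int)
    fit = smf.logit("signed ~ treated + agree", data=subset).fit(disp=False)
    lo, hi = fit.conf_int().loc["treated"]
    print(f"{level}: coefficient = {fit.params['treated']:.2f}, "
          f"95% CI [{lo:.2f}, {hi:.2f}]")
```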

We carried out a range of other tests, looking for results that would lend support to the bandwagon argument discussed earlier in the chapter: that the higher the number of signatures, the greater the social information effect. First, we investigated whether the number of signatories as an independent variable (rather than dummy variables for low, medium, and high numbers) yielded statistically significant results. However, neither the number of signatories for each petition nor its natural logarithm yielded a statistically significant effect.44

We also tested the effect of the ordering of petitions. Findings from social psychology and behavioural economics suggest that the order of presentation will affect respondents’ decisions.45 As participants were shown petitions with varying numbers of signatories in random order, we investigated whether—for instance—the fact of being prompted to consider an ostensibly unpopular petition just after a highly successful one could significantly alter a decision. Our tests did not substantiate such effects, but it remains a relevant hypothesis to examine in future research.

SOCIAL INFORMATION FUELS POLITICAL PARTICIPATION

We have investigated the effects of various types of social information on the willingness of a person to participate in an online political mobilization, applying models of social pressure, conditional cooperation, tipping points and viability, and the bandwagon effect. Internet-based social information does not challenge the assumptions of these models but, as with previous research, is shown to make a statistically significant difference to the likelihood that a person will participate in politically motivated collective action, varying across context and level of social information. We see how the social pressure exerted in the setting of small groups that Olson described but considered prohibitively expensive for large groups can now have much more widespread application in the age of social media, when real-time information about other participants is readily available. Our findings lend support to Lupia and Sin’s hypothesis that by facilitating communication with large audiences, the Internet and social media can replicate the social pressure applied in Olson’s small groups to very large groups indeed.46

Our results from both experiments extend the findings of previous work on social information effects to more explicitly political contexts. That is, when potential participants are provided with an indication of how many other people have participated, they are more likely to participate themselves, confirming previous findings on conditional cooperation in charitable giving in a more political context.47 In the context of contacting representatives, as in the first experiment, this effect operated at all levels of social information. The field setting provided us with high external validity, but also introduced some dampening and moderating third variable effects, limited the range of social information applied to the treatment group, and, given the different motivations for contacting representatives distinctive to the UK setting, made it difficult for us to investigate fully the effect of different levels of social information. In our second, quasi-field experiment we were able to design away some of these issues, and in the context of supporting global policy issues, found evidence of social information effects only at the highest level. These effects seemed to be stronger than the effects on charitable giving observed by Frey and Meier,48 perhaps the most directly comparable of the studies undertaken thus far because it also looked at the rate of participation rather than the contribution amount. In this second experiment, the social information effect was observed only when the number of participants reached one million, suggesting the existence of a threshold below which social information does not influence the behaviour of potential participants in this context.

We also found evidence in the second experiment for some kind of critical mass or tipping point, when participation reaches a million people. This figure seems to be higher than Oliver and Marwell envisaged (but did not enumerate) as the critical mass for a large group, but then the level of commitment required to sign an online petition in an experimental setting is far lower than anything they discussed.49 The importance of the one million figure and the mechanism by which it has an effect on behaviour remain open questions. The figures over a million were high in relation to the other social information provided to participants, but we did not ask subjects about expected levels of participation or whether they themselves considered these numbers to be high. We did not provide subjects with an estimate of the size of the latent group (that is, the potential number of people who might sign such a petition), so we could not expect them to estimate the figure as a percentage of potential participants. In any case, one million is less than two percent of the UK adult population and insignificant in terms of the global population (which is more relevant, given the international nature of these petition issues), so it seems unlikely that this was the mechanism at work. It could be just that one million is a large, significant, and memorable number, likely to attract media attention and to act as a signal of viability for that reason. However, a further possibility is that participants make some calculation of the absolute number of participants that a petition must attain to make a difference. One UK petition that has been shown to have a significant policy effect is a petition against road pricing, with 1.8 million signatories (noted in Chapter 3). In future experiments of this kind, this possible effect could be tested by telling participants beforehand, ‘Research suggests that petitions that attain two million signatures do lead to policy change’, which would give a much clearer indicator of viability.

Advocates of the bandwagon effect will find little comfort in the results from either experiment. Evidence of a uniform effect of social information (as in the first experiment), or of crucial points where social information makes a difference, others where it does not, and others where it has a negative effect (as in the second experiment), goes against the bandwagon hypothesis, although the somewhat dampening effect of low numbers in the second experiment might lend some support to the ‘spiral of silence’ argument. As discussed above, researchers looking for bandwagon effects have tended to test the effect of dynamic information about trends rather than static information about actual numbers, meaning it is unclear where they would expect the bandwagon effect to start. Even if we were to hypothesise from our results that the bandwagon begins at one million, we would have expected to see a continuously increasing influence of social information after the crucial million was reached, which we did not. This finding corroborates our finding in Chapter 3 that it is the rank of a petition in the trending box shown on the government’s petition website (that is, relative popularity) rather than the actual number of signatures shown against the petition that has an influence.

These results provide insights into the influence of one type of social information—the number of participants—but there is potential for further investigation into the influence of other types of social information, such as the number of participants still required for a successful outcome, progress towards a target set by a pressure group, or what potential participants are willing to pledge if other people also participate, as in some pledge bank platforms. Other types of social influence might involve information about the personalities and preferences of other participants, their socio-demographic statuses, or their experiences of past participation. Social media regularly support the provision of feedback information, including recommender systems, reputation systems, user feedback applications, blogs, video sharing sites, and discussion streams, such as Twitter. When used for political activities, these applications allow participants to see many other types of social information, including comments and feedback in real time or information about people with similar preferences.

These experiments also highlight the sensitivity of collective action to the way that social media platforms are designed, in terms of what social information they present. In the context of contacting representatives, as in our first experiment, it seems that social information, whatever the level, increases participation. There may be different mechanisms at work, according to the level of social information, which reduce susceptibility to social influence. But regardless of the mechanisms at work, it seemed that social information had a positive effect at all times in this context, and platforms that provide social information will be more successful in raising participation than those that do not. In the rather different context of expressing support for global political issues, we saw how different levels of social information may have different effects. In designing this second experiment we found a large range of online petitions set up by non-governmental organizations and individuals, some of which provided no information at all about how many people had participated and some of which gave full information. Our findings suggest that platforms that provide social information when numbers reach critical mass, for example when one million people have participated, but withhold such information when numbers are low, may be more successful. Similarly, it may make sense for those who do not want a mobilization to succeed to withhold numbers, which may be one reason why police organizations and official authorities generally play down estimates of the number of people involved in demonstrations and protests, as noted above.

A key finding is the evidence of tipping points in online mobilization, as identified in our second experiment. Tipping points lead to instability, a flipping from inactivity to activity, from inaction to a flood of action. If collective action intermediated by social media is characterized by tipping points, then it fuels the argument, introduced in Chapter 3, that Internet-based collective action is a source of instability, unpredictability, and turbulence in political systems. Both this chapter and Chapter 3 used experimental methods to establish the importance of social information in shaping the way that people participate online. In the next chapter we compare the way that this information influences people’s political behaviour with the second form of social influence that we identified in Chapter 2 as an important feature of social media environments: the visibility of our actions to others.