Today, I had the opportunity to give a presentation based on some early analysis from my research at the International Communication Association Regional Conference in Brisbane. I've uploaded the slides and my presentation notes (which do not include the extemporaneous changes or comments I made during the presentation, or the questions and answers at the end of my presentation).
Today I’ll be discussing Social Networks and their Governmentality. I’ll talk briefly about my PhD research. Then I’ll give an example from one of the cases I’ve been examining in this research, and then use that case to talk more broadly about governmentality in the context of social media.
So, I think it’s fair to say that social media platforms like Facebook, Twitter and Google+ are growing significantly in both use and public prominence. Given their growth and their presence in public discourse, I think it’d be fairly uncontroversial to suggest that it is important for us as researchers to add some detail to the ways in which we understand these platforms as complex social systems.
My doctoral research focuses on the strategies of governance used by social media platform operators and the tactics of dissent used by social media users. I frame this research with a series of case studies that focus on user-platform conflict and inter-user conflict as significant events through which we can better understand the larger systems at play. My research methods include close readings and analysis of policies, and I will be engaging in interviews with users and platform operators in the near future.
One of these conflicts or 'significant events' is the long-running set of controversies surrounding Facebook's policies towards photos of mothers breastfeeding. Facebook's policies here have been cause for user protest since their first effects on users in 2008, with affected users often receiving regular attention from mainstream media outlets like the Sydney Morning Herald, the Guardian, and Canada's National Post.
The way these policies have developed over time is, to my mind, a useful insight into Facebook’s attempts to rationalise and systematise their governance of user conduct and by extension, user expression. That is to say, they’re a key example of governmentality in social media platforms.
In 2008, a number of users found that Facebook had begun removing photos of mothers breastfeeding. Irked by this, they protested outside of Facebook's campus, attracting a storm of media attention to the policy.
A spokesperson for Facebook told the media at the time that [1]:
“Photos containing a fully exposed breast, as defined by showing the nipple or areola, do violate [our] terms (on obscene, pornographic or sexually explicit material) and may be removed. The photos we act upon are almost exclusively brought to our attention by other users who complain.”
Three years later, the same spokesperson contested the idea that they had a policy at all, telling a reporter who argued that Facebook had a censorious influence on discourse that they had [2]:
“No policy against this at all. […] We have removed some pictures of naked women who happen to be holding a baby.”
In 2013, a formalised policy was announced in Facebook's community guidelines. It told users [3]:
“We agree that breastfeeding is natural and beautiful and we're glad to know that it's important for mothers to share their experiences with others on Facebook. The vast majority of these photos are compliant with our policies. Photos that show a fully exposed breast where the child is not actively engaged in nursing do violate the Facebook Terms. These policies are based on the same standards which apply to television and print media. It's important to note that the photos we review are almost exclusively brought to our attention by other Facebook members who complain about them being shared on Facebook.”
And most recently, in May this year, Facebook amended this formalised policy, excising both the distinction between photos that showed 'active nursing' and those that did not, and the attribution of its policies to broader media standards [4].
You can see some distinct changes in attitude between each of these statements. Where the first two are confrontational, stating and enforcing the 'Facebook' perspective, the third is deferential: it furtively blames the standards of print and screen media, and the users who use the 'report' tool to flag the content for a moderator's attention. At the same time, it couches Facebook's intermediary role in shallow platitudes.
We can definitely see change in the outward, public-facing communication of the platform’s policies. At present we can only really guess whether or not any substantive changes have occurred in the internal processes involved with these policies.
You might recall headlines in the mainstream media late last year or early this year celebrating Facebook's developing policies. As users have continued to come into conflict with the new policies, it might be fairer to suggest that the policy changes are largely strategic, representing a change in policy rationale more than a change in policy deployment.
The difference between the first policy's description of nipples and areolae and the more recent policy's mandate that a child be 'actively engaged' in nursing is largely arid semantics. A child 'actively engaged' in nursing serves the same ends as censorship, acting as a tiny human censor-bar for the verboten nipple.
What's interesting is perhaps Facebook's developing awareness that it can use less direct language to appease users, and perhaps even to escape some degree of scrutiny.
The rationale also no longer alludes to Facebook's policy on nudity and pornography. Where the former policies mentioned this, users seemed to interpret it as Facebook implying that nursing was nudity best not seen, or that breastfeeding was a pornographic act.
The new policies from 2013 and 2014 offer platitudes in their place. Beyond the platitudes, they also offer others to blame. This, I find, is a particularly interesting development: it is as if Facebook is telling its users "Sorry! It's not our fault!", as if its policies and their enforcement were mandated by the mainstream media or by prudish users.
The reality is that Facebook is the primary arbiter and executor of power in these conflicts. They are not beholden to the standards or norms of the societies around them; instead, they can create and enforce their own.
So Facebook has changed the letter of the policy but not necessarily its intended effect, and has then blamed both the existence of the policy and its enforcement on an ill-defined Other. This allows Facebook's operators the opportunity to hide their own strategic decision-making processes by adopting a 'technocratic pose'. I borrow this term from Evgeny Morozov [5], who in turn borrowed it from the historian Kenneth Alder [6].
In their use, the technocratic pose is a rhetorical stance that conceptualises a politically neutral relationship between technology, its users and its engineers. The pose, in a variety of forms, appears regularly in the communications of social media and IT platforms, especially when their operators seek to minimise their influence or their responsibility. Google Chairman Eric Schmidt, in particular, has described technology as a mirror for humanity, which is perhaps a disingenuous metaphor if you consider Google's business model...
None of this is to castigate Facebook for daring to police its platform. It is reasonable, and to be expected, that platforms like Facebook create and enforce regulations and make judgements about the content that users contribute to the platform in response to pressure from aggrieved users, but this does not absolve Facebook of responsibility for its policies and their consequences.
Facebook has taken a custodial role [7] in its users' public and semi-public discourse, and is in a determining position that allows it to create and enforce standards of acceptable expression for over a billion users.
Of course, Facebook has a conventional right to govern their platform as they see fit, and without consultation with users. They are the conventional owners of the software, and the hardware that it runs on. This is complicated when we discuss the rights of users as Cynthia discussed in relation to the Universal Declaration of Human Rights in her keynote on Wednesday. This is also complicated when we consider that the participation of users can be understood as an affective form of labour, and the effects that labour can have on ownership. These platforms are complex social constructs with many stakeholders. Moreover, these are new communicative intermediaries in the social and political lives of hundreds of millions of people, worldwide. Our regulatory structures have not caught up with the changes that systems like Facebook bring.
Conflicts between users and operators can be useful for research — they can show us how systems of governance are orchestrated in social networking platforms — because they occasionally result in a pulling back of the curtain or a frank(er) engagement with the platform’s public about how these spaces are governed.
I've found it helpful to consider social media platforms as a type of non-electoral governed space: the software itself forms a jurisdiction of sorts, the engaged users a citizenry, and the platform's owners and operators a governing body (or government).
It's important to recognise that insofar as we can discuss governance and government on social media platforms, we must understand that these governments are not like our own. They are neither electoral nor representative. The governmental operators in this space are only afforded authority by their continued ownership of the space. The extent to which internal or external regulation provides or fails to provide checks and balances on their power is not yet well understood.
Government (in its typical form, or in the form that I discuss in a social media context), insofar as it is afforded authority and power by its citizens, is a structure that attempts to shape aspects of the behaviour of those citizens according to certain types of norms and in pursuit of a variety of goals.
We must be concerned with the forms of knowledge, techniques and means they use, the types of authority and agency they exhibit, the entities they attempt to govern, the ends they seek, the outcomes of their attempts, and their broader consequences.
These constitute frames of research for what Michel Foucault terms 'governmentality'. Foucault struggled to define governmentality consistently as a concept and area of knowledge, cycling through a number of concepts, theories and descriptors of governance and governmentalisation in lectures before his death. At the risk of offending Foucault purists, I use O'Farrell's synthesis of the term [8], "the rationalisation and systemisation of a particular way of exercising political sovereignty through the government of people's conduct", for my research.
The very notion of government and governing entails the possibility that the governed are "to some extent capable of acting and thinking otherwise" [9]; that is, capable of dissent and disagreement. Governments and their dissidents often have unequal power and unequal opportunities to effect change in the systems they operate or operate within, so studies of governmentality are thereby concerned with the strategic rationalisation and deployment of governmental power, and how these influence the governed spaces.
One of the most fascinating aspects of governmentality in the context of social media is that whilst the platforms often present themselves as open and inviting of users' free expression, their policies and practices encourage a sort of classical self-regulation, whereby users are mindful of the possibility of censorship and punishment and moderate their public conversation accordingly.
This is a recurring theme in online platforms. To borrow from Torres [10]:
“The paradox and contradiction of our contemporary governmentality is: on the one hand, it is open and tolerant, while on the other hand it deploys much more flexible, penetrating and exhaustive forms of control”.
Given the time constraints of this presentation I can't go into great detail here, but you can also see aspects of Facebook's governmentality by looking at what behaviours they see fit to police and regulate. The 'What Happens After You Click Report' document [11] shows a number of teams that police the platform, including a Safety Team, an Abusive Content Team, an Access Team and a Hate and Harassment Team.
Much in the way that a police force's 'Vice Squad' or 'Drug Squad' is symptomatic of a government's law-enforcement prerogatives (and of the public's commission of crimes in a jurisdiction), so too are Facebook's teams likely to be representative of the types of behaviour that users participate in, and the types of behaviour that the platform operators seek to police.
I hope to accomplish two key things with this research: firstly, to create a clearer understanding of what's going on behind the closed doors of these systems of governance; and secondly, to better understand how users interact with these quasi-governments and their structures.
I’m looking at the three big social platforms — Facebook, Twitter and Google+, and different types of user-platform and user-user conflict in each case.
Some of my colleagues were kind enough to tweet my presentation: Jean Burgess (@jeanburgess), Axel Bruns (@snurb_dot_info), and Emma Potter (@ejpott), October 3, 2014.
1. Sweney, M. (2008). Mums furious as Facebook removes breastfeeding photos. The Guardian. Retrieved October 10, 2013, from http://www.theguardian.com/media/2008/dec/30/facebook-breastfeeding-ban
2. Ingram, M. (2011). The downside of Facebook as a public space: Censorship. Giga Om. Retrieved October 10, 2013, from http://gigaom.com/2011/06/21/the-downside-of-facebook-as-a-public-space-censorship/
3. Dumenco, S. (2013). Facebook Now OK with Gory Beheading Videos. Ad Age. Retrieved December 19, 2013, from http://adage.com/article/the-media-guy/facebook-gory-beheading-videos/244886/
4. Facebook. (2014). Facebook Community Standards. Facebook. Retrieved December 10, 2013, from https://www.facebook.com/communitystandards
5. Morozov, E. (2013). To Save Everything, Click Here: The Folly of Technological Solutionism (Kindle Edition). PublicAffairs.
6. Alder, K. (1999). Engineering the Revolution: Arms & Enlightenment in France, 1763-1815 (2nd ed.). Princeton, NJ: Princeton University Press.
7. Gillespie, T. (2012). The dirty job of keeping Facebook clean. Culture Digitally. Retrieved October 22, 2013, from http://culturedigitally.org/2012/02/the-dirty-job-of-keeping-facebook-clean/
8. O'Farrell, C. (2005). Michel Foucault. London: SAGE Publications.
9. Dean, M. (1999). Governmentality: Power and Rule in Modern Society (1st ed.). London: SAGE Publications.
10. Gehl, R. W. (2013). What's on your mind? Social media monopolies and noopower. First Monday, 18(3). doi:10.5210/fm.v18i3.4618
11. Facebook Safety. (2012). What Happens After You Click "Report." Facebook. Retrieved June 30, 2013, from https://www.facebook.com/notes/facebook-safety/what-happens-after-you-click-report/432670926753695