Issue 1, 2017

Embodiment and Affect in a Digital Age
Understanding Mental Illness among Military Drone Personnel

Alex Edney-Browne

Drone Personnel: Digital Age Soldiers

The US-led coalition’s increasing reliance on drone technology has provoked concern about the ‘virtualisation of violence’ (Der Derian 2009, 121). It is feared that technological mediation in drone warfare de-humanises victims and distances drone personnel, physically and psychologically, from the violent reality of their actions. The human victim of drone surveillance or attack is ‘reduced to an anonymous simulacrum that flickers across the screen’ (Pugliese 2011, 943), while drone personnel perpetrating that violence are ‘morally disengaged from [their] destructive and lethal actions’ (Royakkers and van Est 2010, 289). Discussions about physical and psychological distance are not new to twenty-first-century violence. Hannah Arendt (1964 [orig. 1963]) and Zygmunt Bauman (1989), in their efforts to better understand the Holocaust, pointed to an intrinsic link between technology, distance and twentieth-century genocides. The Nazis, they argued, relied heavily upon technologies and techno-scientific discourses to justify, sanitise and commit mass violence. Perpetrators of violence were distanced from their victims: government bureaucrats and medical professionals became hyper-rational murderers with the help of techno-science’s distancing and de-humanising effects. This was one of Arendt’s (1964 [orig. 1963], 26) famous insights in Eichmann in Jerusalem: A Report on the Banality of Evil. Katherine Hall Kindervater has elucidated the historical trajectory of military drones, tracing their emergence back to the UK’s development of the ‘Larynx’ and the ‘Ram’ in World War II. These unmanned aircraft were designed in the hope that they would, among other objectives, ‘extend the range and kind of attack’ to ‘limit risk to the pilot’s life’ and overcome conditions in which ‘the human pilot was at a disadvantage’ (2016, 4). Snipers, aerial assaults and other long-range weapons all attempt to distance perpetrators of violence from their victims, lowering the risk of return fire and, potentially, making killing physically, emotionally and psychologically easier for soldiers.

Killing-from-a-distance appears only to have intensified in the ‘Digital Age’, as globally-networked technologies allow violent perpetrators to maim and kill their victims from a completely different spatio-temporal sphere. Academics of a more techno-utopian bent celebrate networked technologies for their ‘democratising’ effects: in the digital age, citizens can transcend the spatio-temporal borders of the nation-state to communicate with each other worldwide (Castells 1996; Held 1999; Beckett and Mansell 2008). The nation-state’s power to include and exclude people from the community – to manage national identity and the body politic – is undermined, as people use networked technology to create communities and mobilise politically across the globe. Conversely, nation-states can harness networked technologies to bolster their power, increasing their surveillance capacity and ability to inflict violence worldwide. This is what the United States has achieved through its National Security Agency programs, cyberattacks and military drones. As James Der Derian (2000, 775) puts it, ‘sovereignty […] now regains its vigour virtually’. Caren Kaplan likewise argues that despite ‘all the flashy theorising about cyberculture and its utopian potential, the technologies of war may seem to be the epitome of triumph of a world without boundaries or limits – where the subjects eliminate their objects without regret or discomfort of embodied proximity’ (Kaplan 2006, 397). Military drone technology is particularly effective at bolstering the US’s power in its amorphous ‘war on terror’. The mobility of drones, and their dual capability of surveillance and assassination, are perfectly suited to meeting the US government’s changing security imperatives, as ‘de-territorialised’ militant organisations shift locations and new organisations (or individuals) gain traction. From roughly 7000 miles away, drone personnel can collect vast amounts of signals intelligence and geolocation data in Iraq, Afghanistan, Pakistan, Syria, Yemen, Somalia and Libya, and maim and kill people when this surveillance identifies ‘suspicious’ activity.

Drone personnel – the soldiers of the digital age – are often constructed in academic literature as present-day Eichmanns or videogame players. Drone technology is allegedly ‘distancing soldiers from the consequences of their actions’ (Benjamin 2013, 87). Drone teams may be connected to the battlefield ‘via a wireless signal or fibre optic cable’, but they are not connected ‘emotionally or psychologically’ (Singer 2009, 335). As Joseph Pugliese (2016, 3) writes, ‘tele-techno mediations work to generate a type of causal disconnect […] of the US-based drone operator’s relation to the killing’. Medea Benjamin (2013, 86) warns that ‘undertaking operations entirely through computer screens and remote audio feed’ can ‘blur the line between the virtual and the real worlds’. ‘Suburban pilots’ work from the Nevada desert ‘in air-conditioned units and scan video screens, adjusting their soda straw digital view of the world with a joystick’ (Ian Shaw 2013, 545). The videogame analogy is also common: ‘from Afghanistan to Iraq, virtuous war has taken on the properties of a game, with high production values, mythic narratives, easy victories and few bodies’ (Der Derian 2011, 272). In his report to the UN, Special Rapporteur Philip Alston states that ‘because operators are based thousands of miles away from the battlefield […] there is a risk of developing a “Playstation” mentality to killing’ (2010, 25). A 2010 non-government organisation (NGO) report titled Convenient Killing: Armed Drones and the Playstation Mentality likewise warns of ‘a culture of convenient killing’ whereby ‘at the touch of a joystick button the operator can fire missiles or drop bombs on targets showing on a computer screen’ (Cole, Dobbing and Hailwood 2010, 4; 6). ‘Rather than seeing human beings’, drone personnel ‘perceive mere blips on screens’ (Cole, Dobbing and Hailwood 2010, 4). Mediation is equated with an instrument of psychological distancing: one that allows drone personnel to de-humanise their victims and disconnect themselves from the violent reality of their actions.

The alleged psychological ease with which drone personnel carry out their work is undermined by psychological studies and the handful of available personal testimonies from drone personnel (more on these testimonies later). The phenomenon of drone personnel suffering Post-Traumatic Stress Disorder (PTSD) has been well-reported by journalists over the last five years. Psychological studies reveal prevalence rates of PTSD among active-duty drone pilots that equal, and sometimes exceed, those of manned-aircraft pilots, despite drone personnel’s complete spatio-temporal removal from the battlespace (Asaro 2013, 217). Otto and Webber’s study of 709 US drone pilots finds ‘1 of every 12 pilots received at least one incident MH [Mental Health] outcome’ (defined as diagnoses or counselling for anti-social behaviour, depression, anxiety or PTSD) between 2003 and 2011 (2013, 5). They conclude that there was ‘no significant difference in the rates of MH diagnoses, including post-traumatic stress disorders, between RPA [Remotely Piloted Aircraft] and MA [Manned Aircraft] pilots’ (2013, 3). Another comparative psychological study of 670 drone pilots and 751 manned-aircraft pilots found that 5% of drone pilots presented with symptoms that placed them at high risk of PTSD (Chappelle and McDonald 2012, 6). This was higher than their findings for manned-aircraft pilots, of whom only 1% were at high risk for PTSD (Chappelle and McDonald 2012, 6). A 2014 study of 1084 USAF drone operators found that 4.3% of respondents reported ‘clinically significant PTSD symptoms’ (Chappelle et al. 2014, 483). This was considered to be ‘on the low end of rates (4-18%) of PTSD among those returning home from the battlefield’ (Chappelle et al. 2014, 483). Despite this, it is still clear that drone personnel cannot be homogeneously characterised as psychologically removed videogame players. Psychological studies on PTSD prevalence in drone personnel complicate the popular notion of the unfeeling videogame warrior.

There is a danger, however, in constructing PTSD prevalence rates as the primary point of entry for discursive engagement with drone personnel’s psychological health. The researchers of the above-mentioned studies note that their findings are limited, as respondents – particularly active-duty personnel – may avoid self-reporting PTSD symptoms. PTSD diagnoses require ‘severity and persistence’ of a cluster of symptoms (intrusive recollections of traumatic event/s, avoidance of stimuli and increased arousal) for over a month, and are considered more serious than other diagnoses such as anti-social behaviour, anxiety and depression. Active-duty drone personnel could be concerned that in-service PTSD counselling or a PTSD diagnosis would jeopardise their careers (increased at-work monitoring, stalled promotion or temporary disqualification). There is also a ‘strong cultural and community stigma’ towards mental health diagnoses in military institutions (Chappelle, McDonald and Salinas 2011, 5). Individuals are likely to downplay PTSD symptoms to avoid career-damaging effects and possible social stigma arising from a serious mental health diagnosis. Due to the secretive nature of their missions, drone personnel are additionally limited in who they can approach for psychological support both within and outside the military (Linebaugh 2012).

Not only are PTSD prevalence rates too contentious to invest heavily in, but there is also a much broader range of emotional and psychological harm that ought to be treated as equally significant when considering drone personnel’s emotional and psychological responses to mediated killing. Twenty percent of drone personnel reported suffering ‘high emotional distress’, defined as ‘anxiety, depression, emotional adjustment difficulties’, severe enough to indicate the ‘need for mental health care’ (Chappelle and McDonald 2012, 6). Ouma, Chappelle and Salinas’s study found that ‘approximately one out of every five active duty operators were twice as likely to report high levels of high emotional exhaustion when compared with National Guard/Reserve operators’ (2011, 12). A different study, on the necessary psychological attributes for drone personnel, states that the work ‘can be very taxing and stressful’, so it is important for recruits to possess ‘the ability to compartmentalise the emotional rigours of one’s job’ (Chappelle, McDonald and King 2010, 19; 20). Compartmentalisation, the study finds, ‘is an important trait for long term stability’ (Chappelle, McDonald and King 2010, 20). All of these studies state that long hours, shift work and shift changes contribute to high emotional distress, but it is important to also consider emotional and psychological stressors that active-duty personnel would feel less comfortable reporting in studies led by military psychologists – stressors that are far more likely to require ‘compartmentalisation’ than shift work fatigue.

This article will proceed in an interdisciplinary manner, drawing upon work within media, screen and cultural studies – including media theory, science and technology studies (STS) and phenomenology – to offer theoretical tools for understanding drone warfare’s mediating effects. It will argue that drone personnel’s psychological illnesses and emotional testimonies problematise common assumptions about mediated, high-technology war. Two hypotheses will be provided for why drone personnel experience unexpectedly high rates of emotional distress, anxiety, depression and PTSD, both of which challenge the idea that technology inherently causes psychological distancing. It is my intention that this paper be speculative in the best sense of the term, and that the hypotheses I suggest will prompt further empirical research into drone personnel. My first hypothesis engages with media theory to consider how empathy can develop through surveillance technologies, thereby humanising the supposedly de-humanised victims of drone attacks. The second hypothesis draws on STS and phenomenology to suggest a ‘boundary’ collapse between drone personnel’s bodies and their equipment. I argue that this ‘boundary’ collapse, or ‘leakiness’, could cause psychological distress when it comes to technologies of killing. These hypotheses do not attempt to offer a complete explanation for how and why drone personnel experience mental illness, nor are they mutually exclusive. They do, however, seek to offer possible answers for phenomena evidenced in former drone personnel’s personal testimonies and those hinted at, but likely under-reported, in psychological studies of active-duty personnel.

Perhaps most importantly, this paper aims to encourage discursive acknowledgement and investigation of drone personnel’s mental illnesses. The psychological health of drone personnel has become a site of conflict for academic, NGO and activist critiques of drone warfare. Any academic research on violent perpetrators raises ethical concerns. Feminist Standpoint theory has demonstrated the social and political importance of situated knowledge, and the discursive power that comes with focusing on the lives of marginalised peoples and giving voice to the voiceless in academic research (Collins 1990; Smith 1990). To give voice to the perpetrator of violence (particularly state-sanctioned violence) can reinforce their power in knowledge production, and can offer legitimacy to their actions. It can also draw attention away from the victims of violence and their pain and suffering. A recent review of Good Kill (a film about a US Air Force drone pilot) seems motivated by this concern, with its provocative title: ‘Drone Operators Get PTSD, Civilians Die Nameless’ (Gharib 2015). In the case of drone warfare, the mostly Muslim victims of drone strikes are already largely invisible in Western public discourses, where the deaths of white, non-Muslim Westerners are far more likely to be grieved (Butler 2003, 27). There is the risk, then, of playing into colonial ideologies whereby war is only worth protesting once harm to Western (mostly white and non-Muslim) lives is evidenced (Gregory 2015, 207).

The controversy surrounding the study of drone personnel is partly motivated by the same concerns as Feminist Standpoint theory, but is more pronounced for reasons unique to drone warfare. The ‘radical asymmetry’ of drone warfare has become the linchpin of criticism of drone warfare, and it is this objection that resonates with the public (Enemark 2014, 367). Regardless of one’s knowledge of drone warfare, it is easy to identify the moral problem with US coalition soldiers being geographically removed from the battlespace and safe from physical and psychological harm, while people in targeted countries are vulnerable to both. Academic or journalistic work that takes interest in drone personnel’s psychological health is seen to complicate this neat asymmetry argument, broadening current understandings of risk and harm to include psychological harm and its physiological manifestations. In Drone Theory – to date the most popular theoretical book on drone warfare – author Grégoire Chamayou expresses his scepticism towards counter-representations of drone personnel, in particular what he calls the ‘media picture of empathetic drone operators suffering psychic trauma’ (Chamayou 2015, 109). He writes that ‘whereas the attention drawn to soldiers’ psychic wounds was in the past aimed at contesting their conscription by state violence, nowadays it serves to bestow upon this unilateral form of violence an ethico-heroic aura that could otherwise not be procured’ (Chamayou 2015, 109). A joint report by numerous NGOs released in October 2016 echoes Chamayou, stating that it is ‘drone advocates’ (emphasis added) who challenge the ‘Playstation Mentality’ thesis (Drone Campaign Network 2016, 14). Highlighting drone personnel’s suffering thus becomes a pro-military move, as it undermines one of the most communicable and resonant ethical objections to drone warfare: its radical asymmetry.

It is important, however, that academics who find the US coalition’s use of drones objectionable draw on all available resources to mount their critique. This includes taking seriously psychological harm to drone personnel. Pentagon spokespeople and military academics argue that governments have a duty of care to protect their soldiers from unnecessary risk of harm (Strawser 2010; Weiner and Sherman 2014; Plaw 2012). Drone technology’s alleged ability to protect soldiers from harm is invoked to justify its use. A key weakness of these arguments is that their conceptions of harm do not account for psychological harm (anti-social behaviours, anxiety, depression and PTSD). As Alison Williams (2011, 387) argues, these commentators ‘mistakenly assume that it is only the physical body that can be damaged by warfare’. Furthermore, they advance a mind/body dualism that ignores the physiological effects of these psychological illnesses (including muscle tension, rapid heartbeat, breathlessness, increased blood pressure, gastrointestinal problems, nausea and body shaking) (McFarlane et al. 1994; Stahl 2002; Aldao et al. 2010; Craske 2012). Contrary to Chamayou’s argument in Drone Theory, there is often no difference in intention between those who draw attention to soldiers’ psychological wounds today and those who did in past wars. The purpose is still to contest state violence: the (false) promises made to recruits to attract them into the drone program, the psychological illnesses they suffer as a result of their work, and the ways they are (mis)treated by the institution if or when they become psychologically unwell. Rather than comply with a false dichotomy of care – for either military personnel or civilians – or engage in debate about who suffers more, this article gives discursive attention to drone personnel out of concern for all human suffering in war. This shares common ground with a growing body of scholars, such as Alison Williams, Caroline Holmqvist, Lauren Wilcox, Ian Shaw and Majed Akhter, who highlight the importance of thinking about humans on both sides of drone technology. Holmqvist (2013, 536; 541) writes of the ‘need to centre human experience to the study of […] war’ and states that ‘drone warfare is “real” also for those staring at a screen and, as such, the reference to videogames is often simplistic’. Shaw and Akhter (2012, 1501) argue that academics must ‘intervene to dismantle the production and maintenance of the drone fetish […] to reinsert a disavowed corporeality’ into drone warfare discourse. It is crucial that academics increase the visibility of the bodies maimed, killed and psychologically tormented by drone attacks and surveillance. Discourse on the psychological and physiological effects of drone warfare on military personnel, however, does not inhibit this work. Instead, it plays another important role in de-fetishising the drone and reinserting corporeality into drone-warfare discourse.

The Need for an Interdisciplinary Approach

Most research on drone warfare has come from the field of International Relations (IR) (with critical geography a close second). Critical research on drone warfare would be significantly enriched through an interdisciplinary engagement with media, screen and cultural studies. IR scholars have not been completely blind to media, screen and cultural studies. Since the First Gulf War, IR has taken an interest in the application of media theory to the study of mediated, high-technology war. This application, however, remains preoccupied with Information Age debates characteristic of the 1990s, when many scholars thought that information technology led to a ‘loss of social bonds’ and ‘the demise of the proximate human being’ (Virilio 1999, 86). With this outdated view of media technologies still influential and often invoked in IR, drone technology is commonly described as an instrument of US coalition hegemony that can only de-humanise victims and turn users into unfeeling hyper-militarised warriors. The degree to which a victim is proximate and embodied is assumed to have a causal relation to drone personnel’s psychological and physiological responses to killing. Mediation is considered a barrier to affect, emotions, psychological reactions and physiological sensations. This neglects a plethora of earlier media, screen and cultural-studies theory and more recent ‘pervasive media’ theory that argues the opposite. Media, screen and cultural studies has a decades-long engagement with mediation, human-technology interaction, embodiment, phenomenology and affect. This work offers useful theoretical frameworks for making sense of mediated, high-technology war.

In developing its two hypotheses, this article engages with the above-mentioned range of media, screen and cultural theory to consider why drone personnel experience high emotional distress and other psychological illnesses. This interdisciplinary contribution is timely, as critical and feminist international relations scholars lead an ‘affective turn’ within the IR discipline. The discipline’s Realist tradition of privileging the nation-state as the most appropriate unit of analysis is coming under close scrutiny. Many feminist and critical IR scholars argue that Realism has always been a limited approach to understanding the complexity of world politics and security, but is even more limited today when globally-networked technologies and de-territorialised political problems stretch the boundaries of nation-states. Diana Coole and Samantha Frost (2010, 5), in demonstrating the necessity of their ‘New Materialisms’ approach to studies of international political economy, write that traditional theoretical models fail to consider the ‘significance of complex issues such as climate change or global capital and population flows […] or the saturation of our intimate and physical lives by digital, wireless and networked technologies’. Coole and Frost draw attention to the interrelatedness of things (subjects and objects), emphasising the instability of categories assumed in Realism to be fixed: ‘the relationships of humans to the world, the very definition of the human to the nonhuman and the way shifting definitions of nature and life affect subjective experiences of selfhood or the forms and domains of politico-juridical regulation’ (2010, 21). In these critical/feminist IR frameworks, emotions, affect and the relationality of subjects and objects are recognised as having a significant bearing on political agency, mobilisation and violence. As Linda Ahall and Thomas Gregory (2016, 2) argue, ‘rationalist prejudices have traditionally dominated the discipline of IR’ to the point where ‘the role of emotions in global politics has been downplayed, ignored or denigrated’. Only recently have IR scholars ‘sought to re-centre emotions in our study of international politics’ (Ahall and Gregory 2016, 2). In the digital age, world politics, conflicts and security are deeply enmeshed with media technologies, making media, screen and cultural studies a necessary interdisciplinary engagement – particularly for academics interested in discovering the political implications of emotions, embodiment and affect.

Hypothesis 1: In the digital age, mediation and disembodiment do not inhibit recognition and empathy

In the words, tone and body language of former drone personnel, it is often difficult to identify the digital-age Adolf Eichmanns or videogame players evoked by many academics, journalists, NGOs and politicians. Derek Gregory (2011, 200), Caroline Holmqvist (2013, 542) and Lauren Wilcox (2016, 12) have written on drone personnel’s ‘identification of and […] with’ the ground troops they are supporting: how they are ‘emotionally and affectively connected’ to colleagues on the ground despite the technological mediation at play. It is clear from their testimonies that drone personnel can also recognise and empathise with their so-called ‘enemies’ as humans, and that this is profoundly affecting, too. These testimonies come from a small group of former drone personnel, but offer rich empirical information that may be generalisable to a wider group of active-duty and retired personnel (who, for the above-mentioned reasons, either cannot or do not want to speak publicly about how drone warfare has psychologically affected them). In her 2013 opinion editorial for The Guardian, former sensor operator Heather Linebaugh opens by asking: ‘How many women and children have you seen incinerated by a Hellfire missile? […] How many men have you seen crawl across a field trying to make it to the nearest compound while bleeding out from severed legs?’ She goes on to say:

‘I watched dozens of military-aged males die in Afghanistan, in empty fields, along riversides, and some outside the compound where their family was waiting for them to return home from the mosque.’ (Linebaugh 2013)

Former drone pilot Brandon Bryant, who suffers from PTSD, recounts one of his traumatic experiences of killing:

‘The smoke clears […] and there’s this guy over here, and he’s missing his right leg above his knee. He’s holding it, and he’s rolling around, and the blood is squirting out of his leg, and it’s hitting the ground, and it’s hot. His blood is hot. But when it hits the ground, it starts to cool off; the pool cools fast. It took him a long time to die. I just watched him. I watched him become the same colour as the ground he was lying on.’ (quoted in Power 2013)

In another description of the same experience, Bryant mentions that he ‘imagined his [victim’s] last moments’ as he watched him dying (Democracy Now! 2015). Former drone pilot Matt Martin describes an experience of similar emotional and psychological magnitude in his book Predator: his realisation that two young boys were in the firing line of a missile he had already fired. The older boy was riding his bike while the younger boy sat on the handlebars. When the missile struck metres away from the boys, killing them, Martin vividly remembered riding his sister around on the handlebars of his bike as a child. He recalls ‘smelling her hair’ and ‘hearing her laughter’ (Martin 2010, 211). This flashback to childhood suggests Martin had the empathetic realisation that, in another reality, that could be me.

These testimonies undermine constructions of drone personnel as people who do not recognise or empathise with their victims, whereby technological mediation and disembodiment turn victims into ‘ones and zeros’ (Pugliese 2011, 64). To make better sense of technological mediation, recognition and empathy in drone warfare, it is important to consider the media technology environment of the twenty-first century. High-technology, mediated interaction is part of the fabric of everyday life in today’s digital age. This is the environment within which drone personnel live, work and play. It is therefore crucial for IR (and critical geography) academics interested in the lived experiences of drone warfare to engage with media, screen and cultural studies. Media scholar William Merrin (2009, 17; 22) argues that we live in a ‘post-broadcast era’, where ‘bottom-up, many-to-many, horizontal, peer-to-peer communication’ is commonplace due to the proliferation of networked media technologies. He writes that, where broadcast media were concerned with ‘informing and uniting “the social”’, networked media technologies allow people to ‘make their social’ in ‘media worlds […] of interaction, communication, mediation, experience and information’ (Merrin 2009, 24-25). Mark Deuze (2011, 137) uses the term ‘media life’ rather than ‘media worlds’, but similarly writes that media technologies are so pervasive in the twenty-first century that it makes better sense to think of our lives ‘lived in rather than with media’. Media technologies are imbricated so deeply in our lives – professional, social and intimate – that ‘they are becoming invisible’: ‘people in general do not even register their presence’ (Deuze 2011, 143). This means an ‘increasing immateriality of one’s experience of reality’ whereby the mediated and the unmediated, the ‘virtual’ and the ‘real’, inform one another so closely that it is difficult to tell where one ends and the other begins. In the 1990s, when ‘Information Age’ debates were rife, the internet was known as ‘cyberspace’: a ‘coherent place that you could immersively inhabit’ that was distinct from ‘reality’ (McCullough 2004, 9). Now, the ubiquity of networked media technologies undermines our ability to clearly distinguish between unmediated, non-networked spaces and ‘cyberspace’. It is this pervasive media environment, wherein disembodied interaction is frequent even with the most intimate of contacts, that we must keep in mind as we attempt to understand drone personnel’s lived experiences of their work.

Understanding the effects of pervasive networked media technologies on surveillance practices, in particular, can help make better sense of drone personnel’s emotional and psychological experiences. The pervasiveness of networked media technologies has led to ‘always-on, ubiquitous, opportunistic ever-expanding forms of data capture’ (Andrejevic and Burdon 2015, 19). Where it previously made sense to think of an ‘unblinking, totalitarian Big Brother’ (the government) conducting surveillance, today there are ‘more like ten thousand little brothers’ (McCullough 2004, 15). Mark Deuze argues that surveillance has moved away from the centralised control of the state ‘to the much more widespread and distributed gaze of the many’ (2012, 126). Contacts made in the digital age, ranging from the professional to the intimate, are often initiated, maintained and monitored with and through media technologies. Social media platforms allow (even encourage) close monitoring of friends’ movements; dating apps inform users of the geographical proximity of their matches; keystroke-monitoring software alerts employers to employees’ procrastination; and GPS tracking apps (such as ‘Find My Friends’ and ‘Couple Tracker’) provide the real-time location of partners, children and friends. Message-read receipts, social media geolocation tags and ‘last active’ information are further evidence of the normalisation of surveillance in the era of pervasive media, as the distinction between our public and private lives is increasingly blurred. It is likely that drone personnel use one or more of these media technologies in their domestic lives, and these experiences could have a significant impact on how they encounter their work. The US-led drone program is, of course, a vertical (or ‘top-down’) form of surveillance: drones collect vast amounts of Intelligence, Surveillance and Reconnaissance (ISR) on people across the Middle East, North Africa and South Asia without their permission; secretive National Security Agency programs enable this, and other coalition governments contribute through information-sharing and intelligence-processing support. It is important, however, to consider how peer-to-peer, horizontal surveillance practices might interact with this hegemonic, top-down form of surveillance in ways that could allow (even encourage) drone personnel to recognise or empathise with their victims as humans.

Mark Andrejevic (2006, 2010, 2015) has written extensively on how generalised suspicion and widespread data collection post-9/11 have intensified and normalised ‘mutual monitoring’ practices. He writes that the culture of suspicion has been transposed ‘from the realm of post-9/11 policing to that of personal relations’ (2006, 400). Andrejevic provides a useful framework for thinking about the militarisation of everyday life: how post-9/11 military and policing practices have permeated our private, domestic lives. In the case of drone personnel, however, it is useful to think about how this permeation might occur in the other direction: how they might find it difficult to disentangle surveillance practices in their domestic spheres from their work surveilling the so-called ‘enemy’. Drone personnel are sometimes tasked with surveilling a potential target ‘for more than eight hours a day’ (Asaro 2013, 205). From their surveillance, they can ‘see and recognise the personal details and daily activities’ of the people they are ordered to kill (Asaro 2013, 205). One former pilot writes that ‘you start to understand people in other countries based on their day-to-day patterns of life. A person wakes up, they do this, they greet their friends this way, etc.’ (quoted in Bergen and Rothenberg 2014, 115). Brandon Bryant admits to having watched ‘targets drink tea with friends, play with their children, have sex with their wives on rooftops, writhing under blankets’ (Power 2013). Depending on the altitude of the drone and the feed that is watched (surveillance footage, thermal imaging, etc.), drone personnel see their victims from a bird’s-eye view as tiny dots, pixelated blobs or heat signatures. It is clear from their testimonies that this does not prevent them from recognising, and in some cases empathising with, their victims as humans engaging in human activities. Imagination is crucial in this regard, but we also need to consider the possibility that humanisation occurs because drone surveillance shares visualities with horizontal, peer-to-peer surveillance practices. The aerial viewpoint and use of digital signifiers to denote a human’s presence are a common visuality in myriad peer-to-peer monitoring interfaces, such as Foursquare, Swarm, Uber, UberEats, Find My Friends, Couple Tracker, MapMyFitness and Facebook’s ‘nearby friends’ feature. These peer-to-peer surveillance interfaces (where humans who are already, or are about to be, known to the user in an embodied sense are represented as disembodied digital signifiers) may be difficult for drone personnel to differentiate from the visuality of the drone. It is therefore important to consider how drone personnel’s experiences with media technologies outside their work could inform their experiences at work. Drone personnel’s emotional and psychological reactions to surveillance and killing could be informed by peer-to-peer, domestic surveillance practices in the digital age.

In addition to considering surveillance cultures in today’s media technology landscape, it is also useful to think about how mediated imagery is understood and experienced by viewers. The work of media, screen and cultural studies can offer useful insight into how drone personnel might experience the mediated imagery of drone surveillance, in ways that increase their likelihood of suffering psychological illness. Derek Gregory (2011, 190) and Kyle Grayson (2012, 123) both refer to the ‘scopic regime’ of drone surveillance: a modernist visual regime that empowers viewers, giving the impression of ‘hypervisibility’ and ‘epistemological and aesthetic realism’ (Gregory 2011, 193; Grayson 2012, 123). Grayson (2012, 123) takes this further, arguing that drone surveillance’s scopic regime ‘produces a form of pleasure that can be addictive for the one with the privilege of viewing’. ‘Scopic regime’ is a term first coined by film scholar Christian Metz in 1982 to explain how the cinematic apparatus encourages particular viewing behaviours (identification and voyeurism) (Metz 1982, 61). It was later applied to technological apparatuses beyond the cinema by scholars such as Allen Feldman (1997, 30), who used the term to refer to any ‘ensemble of practices and discourses that establish truth claims […] of visual acts and objects and correct modes of seeing’. Metz (1999 [orig. 1974], 79) also wrote, however, that the visual elements of the moving image ‘are indefinite in number and undefined in nature’: ‘one can decompose a shot, but one cannot reduce it’. Johanna Drucker (2011, 6) similarly argues that ‘graphical features organise a field of visual information, but the activity of reading follows other tendencies’, according to the viewer’s ‘embodied and situated knowledge, cultural conditions and training [and] the whole gamut of individually inflected and socially conditioned skills and attitudes’. A scopic regime may direct certain viewing behaviours, but it cannot dictate them: there is a whole gamut of factors, indefinite in number and undefined in nature, that can provoke alternative modes of viewing. This is what the theories of ‘negotiated’ and ‘oppositional’ reading developed by cultural theorists Stuart Hall (1980 [orig. 1973], 136-138) and bell hooks (1992, 117) refer to: possibilities (outside the hegemonic or dominant reading) for unintended, subversive or counter-hegemonic readings of mediated content. The scopic regime of drone surveillance may direct drone personnel to feel omniscient and powerful and to experience pleasure, but this is by no means the only available reading.

Thinking about alternative readings of mediated texts can help make better sense of why former drone personnel suffer from psychological illnesses, despite being directed (by the scopic regime’s dominant/hegemonic reading and military culture) to feel emboldened by their work. Drone personnel suffering from psychological illness have likely engaged in alternative readings of drone surveillance’s scopic regime – readings that encouraged recognition of and empathy with their victims, or otherwise sowed the seeds of doubt regarding the (im)morality of their work. Ruptures in the scopic regime would encourage these alternative readings: moments where the so-called omniscience of the drone apparatus comes into question. Alison Williams (2011, 386) and Lauren Wilcox (2016, 9) question the supposedly ‘god-like’ vision of drone surveillance, pointing to its ‘imperfect’ nature: the operator’s or analyst’s eye ‘cannot remain unblinking in its gaze, nor can the drone assemblage provide peripheral vision’. Furthermore, Wilcox writes, ‘the visual imagery in drone warfare is often not as clear as purported’ (Wilcox 2016, 11). These ruptures – which increase the likelihood of alternative or counter-hegemonic readings – are evident in Heather Linebaugh’s personal testimony, where she recounts feeling far from omniscient:

“The feed is so pixelated, what if it’s a shovel, and not a weapon?” I felt this confusion constantly, as did my fellow UAV analysts. We always wondered if we killed the right people, if we endangered the wrong people, if we destroyed an innocent civilian’s life all because of a bad image or angle. (Linebaugh 2013)

In addition to these ruptures, there are also the ‘individually inflected and socially conditioned skills and attitudes’ that drone personnel bring to their viewing of drone surveillance imagery (Drucker 2011, 6). The modernist assumption that the documentary image provides ‘epistemological and aesthetic realism’ is increasingly uncommon in the digital age (Grayson 2012, 123). Digital media technologies allow users to ‘read, edit and write their codes, programs, protocols and texts’ (Deuze 2011, 137). ‘Reality’ is revealed to be ‘malleable’ by digital media technologies: it can be ‘manipulated, fast-forwarded, panned, scanned and zoomed in on’ (Deuze 2011, 137). It is this postmodern understanding of the malleability of reality – an awareness of the ‘constructed-ness’ of mediated images – that drone personnel might bring to their reading of drone surveillance’s scopic regime.

Hypothesis 2: Drone personnel experience a boundary collapse or ‘leakage’ with drone equipment

Possible causes of drone personnel’s psychological illnesses could also be identified by examining their relationship with their technological equipment. A cyborgian ‘leakage’ between human and technology could be particularly affecting when it comes to technologies of killing. Such a leakage would encourage drone personnel to transcend the self/other ‘boundary’, recognising and empathising with their victims. I draw on Donna Haraway’s work to elucidate this hypothesis. There is also the possibility that experiences of proximity with and through drone equipment are felt in relation to experiences of distance, and vice versa. Transitions between states of proximity and distance would increase the likelihood of recognition and empathy, and would provoke drone personnel to confront the violent reality of their actions. I use Martin Heidegger’s and Vivian Sobchack’s phenomenological work to develop this idea.

Posthumanist and cyborg theory scholars argue that instrumentalist accounts of technology fail to understand the porousness of the human/technology ‘boundary’. Learning from these scholars, it would be mistaken to try to make sense of drone personnel’s relationship with drone technology through an instrumentalist framework. Instrumentalist accounts of the human-technology relationship establish a false binary between bodies and technology. Marshall McLuhan (2013 [orig. 1964], 64-70) wrote in Understanding Media that technology can be thought of as an ‘extension’, an ‘amplification’ and an ‘amputation’ of the human body: a multitude of porous formations united only in their imbrication of humans and technology. Rather than discrete entities, humans and technology are enmeshed with one another in myriad ways and often lack clear definition. In ‘A Cyborg Manifesto’ (1991 [orig. 1985]), Donna Haraway offers the provocation that ‘we are cyborgs’: ‘theorised and fabricated hybrids’ of ‘human and animal’ and ‘machine and organism’ (150). Haraway invites the reader to think of distinctions between humans and machines, humans and animals, and the physical and non-physical, as ‘leaky’ (Haraway 1991, 152). Haraway’s cyborg is not a literal figure, contained within a clearly defined human-technology assemblage, although often misinterpreted as such (Phan 2015, 5). Haraway’s cyborg is political, referring to human-technology ‘leakages’ that facilitate feminist boundary-crossings between militaristic, patriarchal and colonialist dualisms: ‘self/other, mind/body, culture/nature, male/female, civilised/primitive, reality/appearance…’ (Haraway 1991, 177). To approach humans and technology as discrete entities, as the instrumentalist logic does, is to ignore this cyborgian leakiness between humans and technology. Military equipment is often spoken about as a set of discrete tools, contained within non-leaky formations. Academics will name different types of technologies used by soldiers, without further argument, as if their presence is evidence enough that de-humanising processes are at work. Going to war in the twenty-first century involves sitting ‘behind computer screens’, ‘pushing a button’ and ‘dragging a mouse’ to kill people who appear as ‘infrared heat-sensored images and laser-guided targets’ (Royakkers and van Est 2010; Singer 2009; Masters 2005). An instrumentalist approach is not useful for understanding why drone personnel suffer from psychological illness, as it assumes an inherent link between high-technology work, de-humanisation and psychological distantiation. While controversial, it is useful to consider the feminist boundary-crossing leaks Haraway describes occurring within the (otherwise highly militaristic and patriarchal) human-technology assemblage of the drone apparatus. Drone personnel may be experiencing cyborgian leaks with the drone apparatus whereby the militaristic and colonialist dualisms of self/other and civilised/primitive are transgressed. This seems to be the case for an anonymous active-duty drone pilot, who writes: ‘you feel like you are a part of what they’re doing every single day’ (quoted in Bergen and Rothenberg 2014, 115).

Joseph Pugliese has already initiated the application of posthumanism and cyborg theory to the study of drone technology, drawing on Donna Haraway’s work. In State Violence and the Execution of the Law, Pugliese argues that drone personnel develop a ‘prosthetic’ relationship with their equipment (2013, 203). Prosthesis is the process by which drone personnel’s bodies are extended through the technology in use: the joystick and controls are experienced, through sustained interaction, as extensions of their arms and hands. Thus the alleged ‘boundary’ separating drone personnel from drone technology is revealed as myth. Pugliese acknowledges Haraway’s utopian reading of human-technology assemblages, wherein the cyborg’s boundary-crossing nature offers opportunities to transgress militaristic, patriarchal and colonialist dualisms. However, he aims to ‘recode’ the cyborg descriptor to ‘evidence its violent assimilation and co-option by the very […] militaristic and instrumentalist authorities it was designed to contest’ (2013, 204). He argues that human-technology leakiness, rather than opening up opportunities to challenge the dualisms at work in the ‘war on terror’ (self/other, civilised/primitive, male/female, and so on), simply ‘instrumentalises’ drone personnel’s bodies into ‘lethal machines’ (2013, 205). Pugliese (2013, 204) is still convinced by an instrumentalist logic, whereby drone personnel are turned into hyper-militaristic robot warriors who are emotionally ‘disassociated’ and ethically ‘disjoined’. The relationship between human and machine is posited as unidirectional, with drone technology permanently ‘injecting’ personnel with colonialist militarism. This constructs drone technology as all-powerful – a fetishising discourse – and misrepresents Haraway’s cyborg (which sees humans and technology as porous and non-discrete).

Martin Heidegger’s phenomenological work offers a different interpretation of the human-technology relationship, but also considers it porous. Heidegger’s work could provide another useful theoretical framework for understanding why drone personnel suffer psychological illness. Heidegger (1978, 97) argues that ‘there “is” no such thing as an equipment’ because any ‘piece’ of equipment belongs to a ‘totality of equipment’. Assertions that drone personnel are merely ‘fighting from behind a computer’ neglect this (Royakkers and van Est 2010, 292), opting instead to describe equipment in isolated terms. Heidegger contends that the totality of equipment works together ‘in order to’ carry out a function (1978, 97). Drone personnel work with their computers, joysticks, keyboards, chairs and headsets, which all refer to each other, in addition to referring to the room, the building, the military, the US government, its counterterrorism discourses, and myriad other physical and non-physical influences. Any supposed ‘boundary’ separating drone personnel’s bodies and the technology in use is surpassed: the body is extended through the totality of equipment in order to carry out the surveillance or killing of a person or people. Drone personnel’s concern is therefore not with a single piece of equipment (the mouse, the joystick, the computer screen, etc.), but subordinates itself to the in-order-to – regardless of physical distance from the person surveilled or killed. Likewise, when we Skype loved ones overseas, they feel – in every relevant sense – more proximate than the cup of tea or coffee merely an arm’s reach away. Our concern subordinates itself to the in-order-to – communication with our family member, partner or friend – and we become immersed in that activity. This subordination to the in-order-to seems evident in former drone personnel’s surprisingly detailed descriptions of the people they observed. An active-duty drone pilot, referred to simply as ‘Mike’, talks about watching ‘an old man startled by a barking dog’ (quoted in Hurwitz 2013). Brandon Bryant describes watching a group of three men through the drone’s thermal-imaging camera. ‘The two individuals in the front were having a heated discussion’, he says, and ‘the guy at the back was kind of watching the sky’ (Democracy Now! 2013). The detailed descriptions of these moments – the old man’s ‘startled’ reaction, or the man ‘watching the sky’ while his friends had a heated argument – suggest that drone personnel are immersed within the lifeworlds of the people they surveil. Instrumentalist logics fail to explain these immersive experiences.

In addition to examining how immersion might increase drone personnel’s likelihood of psychological illness, it is also useful to consider how moving between experiences of proximity and distance could be particularly traumatic when it comes to technologies of killing. Heidegger further develops his concept of the ‘in-order-to’ with the terms ‘ready-to-hand’ and ‘present-at-hand’ (1978, 97). Drone equipment is ‘ready-to-hand’ when its pieces all refer to each other harmoniously in order to surveil and kill (1978, 103). If there is a breakage or disruption, the ready-to-hand equipment withdraws and ‘reveals itself’ as obtrusive, becoming ‘present-at-hand’ (1978, 103). Human and technology thus undergo temporary distantiation. For example, a pen ‘reveals itself’ when it runs out of ink; a pair of reading glasses when they fog up. As Mark Weiser (1994, 7) once put it, ‘a good tool is an invisible tool […] it does not intrude on your consciousness’. Moments of breakage could thus be highly anxiety-inducing when it comes to technologies of killing, as drone personnel are provoked to confront the violent reality of their action (the ‘in-order-to’) and question the extent of their culpability within that action. Studies have found that a significant source of stress for drone pilots is ‘human-machine interface difficulties’, particularly the ‘ergonomic design of equipment and Ground Control System’ (Ouma, Chappelle and Salinas 2011, 11). These moments of digital friction are likely to be highly stressful for drone personnel because they are required to move from experiencing their equipment as ready-to-hand to confronting it as present-at-hand. Drone personnel are therefore repeatedly encouraged to reflect upon their body’s imbrication with technologies of killing. Heidegger’s phenomenology allows us to think about how moments of separation from drone technology are likely felt in relation to moments of proximity. The constant transitioning between proximity and separation is likely to be a highly emotional experience for drone personnel, as they struggle to situate the ‘boundaries’ of their bodies in relation to, and their culpability within, a technological apparatus of killing.

Media, screen and cultural studies theorist Vivian Sobchack’s (2004) work on the phenomenology of inter-objectivity, and the theory of empathy she derives from this, is also useful for thinking about how experiences of proximity and distance might interact to psychologically affect drone personnel. Sobchack argues that empathy results from a person’s recognition that they are both an ‘objective subject’ and a ‘subjective object’ (Sobchack 2004, 288). That is, we are most capable of empathy when we see ourselves as subjects but also acknowledge the capacity for other things (animate or inanimate) to treat us as objects. We experience objectification when we are ‘acted on and affected by external agents and forces, usually adversely’ (Sobchack 2004, 287). An earthquake that destroys one’s house, for example, is an external force, putting the homeowner into a situation whereby their objectivity becomes apparent. A thief who steals one’s car is an external agent who spares no thought for one’s need to get to an important meeting. Sobchack suggests that our ‘reversibility as subjects and objects’ is what allows us to empathise with others (human or otherwise) external to ourselves, as we know what it is like to lose our subjecthood at times of objectification (Sobchack 2004, 287). It is possible, then, that empathy is provoked rather than undermined when drone personnel experience moments of distance between themselves and their victims. Drone personnel are aware, from moments of proximity, that their targets are humans (subjects). Moments of distantiation could therefore render objectifying processes unavoidably obvious for drone personnel, highlighting their role as an external agent. Sobchack’s theory radically changes the way empathy and distance are thought about. She develops a clear link between our capacity for empathy and our recognition of objectifying processes. It will be fruitful for future research on drone personnel to consider how transitions between cyborgian immersion (as subjectifying) and distantiation (as objectifying) could encourage them to empathise with their victims.

Conclusion

This article has suggested two hypotheses for why drone personnel suffer psychological illness: first, that technological mediation and disembodiment do not inhibit recognition and empathy – particularly in our digital age – and, second, that drone personnel may experience a ‘boundary’ collapse or ‘leakiness’ between their bodies and their equipment. This leakiness could encourage drone personnel to recognise and empathise with their victims, and provoke them to confront the violent reality of their actions. This article has advocated further empirical research on people who work(ed) in the drone program, despite their psychological health becoming a site of conflict for ethical discussions about military drones. Drone personnel require discursive attention, as they are also victims of drone warfare (albeit in a different way to the people surveilled, maimed and killed by drones across the Middle East, North Africa and South Asia). It is pertinent to acknowledge that drone personnel face psychological harm, and that this harm is also manifested physiologically, particularly when the responsibility to protect soldiers from harm is invoked by the Pentagon and its military academics to convince the public of drone warfare’s virtues.

Continued journalistic and academic investigation into drone personnel will also help to uncover alternative (possibly subversive and counter-hegemonic) readings of the drone apparatus. People can interact with articulations of hegemonic power in ways that expose ‘their porousness and malleability, their incompleteness and their transformability’, and this is no different for drone personnel (Butler 2006, 533). As Judith Butler argues, there is always the possibility for ‘radical rearticulations’ of power through counter-normative relations (1990, 16). It already seems clear, from the handful of testimonies cited in this article, that many drone personnel are far less convinced by the mythology of the drone – as an ethical and omniscient technological apparatus – than the public is. It is therefore important that their experiences are discovered and communicated. Counter-hegemonic potential can be found within drone personnel’s testimonies, but that potential is foreclosed when academic and journalistic discourses construct the drone apparatus as invulnerable. Opening up this potential aligns with Caroline Holmqvist’s (2013, 542) project to consider how drone personnel’s experiences can ‘seep out in a wider social corpus’, and with Ian Shaw and Majed Akhter’s (2012, 1502) directive to ‘dismantle the production and maintenance of the drone fetish’. Lauren Wilcox (2015, 11) similarly compels us to think about how ‘bodies are both constraining (insofar as they are imposed upon by relations of power) and enabling (as they possess creative or generative capacities to affect the political field)’. Drone personnel’s embodied experiences possess generative capacities to affect the political field, but they first have to be taken seriously by journalists, academics, NGOs and anti-drone politicians before that affective potential can be realised.

This article has also argued for the usefulness of an interdisciplinary approach to studying mediated, high-technology warfare. Media, screen and cultural studies’ decades-long engagement with mediation, human-technology interaction, embodiment, phenomenology and affect offers many useful theoretical tools with which international relations scholars can make better sense of mediated, high-technology war. Moreover, the introduction of recent media theory – particularly work on pervasive media in the digital age – into a discussion currently dominated by Information Age debates is necessary. It is not that international relations theory has completely ignored media, screen and cultural studies, but that it continues to draw upon 1990s literature focusing on the role of high-speed, high-technology media in enacting biopolitical control. To a large extent, this remains relevant; indeed, nation-states and corporations have increased their reliance on big data mechanisms to measure, map and control citizen-consumers. The relentlessly instrumentalist logic of such work, however, neglects the leakiness of human-technology interaction, including the possibility for counter-hegemonic resistance within hegemonic technological apparatuses. Lastly, this article’s posthumanist and phenomenological approaches represent an important contribution to the affective turn led by feminist and critical theorists within IR theory. Embodiment, emotions and affect are burgeoning areas of inquiry in international relations, complicating age-old realist and instrumentalist understandings of agency, mobilisation and power. A unified approach between feminist/critical international relations and media, screen and cultural studies would be most effective in uncovering human experiences of high-technology, mediated war.

Acknowledgments

I would like to thank Thomas Gregory and Thao Phan for reading and providing excellent feedback on earlier drafts of this article. Luke Goode and Neal Curtis were inspiring teachers and great sounding-boards during my Honours year, when a very early version of this piece was written: thank you both. Lastly, I would like to thank the anonymous reviewers at Krisis: Journal for Contemporary Philosophy, whose comments on my initial submission have been very helpful.

 

References

“A Drone Warrior’s Torment: Ex-Air Force Pilot Brandon Bryant on His Trauma from Remote Killing.” 2013. Democracy Now!, Oct 25.

Åhäll, Linda, and Thomas Gregory. 2015. Emotions, Politics and War. London/New York: Routledge.

Aldao, Amelia, Douglas S. Mennin, Eftihia Linardatos, and David M. Fresco. 2010. “Differential Patterns of Physical Symptoms and Subjective Processes in Generalised Anxiety Disorder and Unipolar Depression.” Journal of Anxiety Disorders 24: 250-259.

Alston, Philip. 2010. Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions; Addendum: Study on Targeted Killings. Geneva: United Nations.

Andrejevic, Mark. 2006. “The Discipline of Watching: Detection, Risk, and Lateral Surveillance.” Critical Studies in Media Communication 23, no. 5: 391-407.

Andrejevic, Mark. 2010. “Reading the Surface: Body Language and Surveillance.” Culture Unbound 2: 15-36.

Andrejevic, Mark, and Mark Burdon. 2015. “Defining the Sensor Society.” Television & New Media 16, no. 1: 19-36.

Arendt, Hannah. 1964 (orig. 1963). Eichmann in Jerusalem: A Report on the Banality of Evil. New York: Viking Press.

Asaro, Peter. M. 2013. “The Labour of Surveillance and Bureaucratised Killing: New Subjectivities of Military Drone Operators.” Social Semiotics 23, no. 2: 196-224.

Bauman, Zygmunt. 1989. Modernity and the Holocaust. Cambridge: Polity Press.

Beckett, Charlie, and Robin Mansell. 2008. “Crossing Boundaries: New Media and Networked Journalism.” Communication, Culture & Critique 1, no. 1: 92-104.

Benjamin, Medea. 2012. Drone Warfare: Killing by Remote Control. London: Verso.

Bergen, Peter L., and Daniel Rothenberg. 2014. Drone Wars: Transforming Conflict, Law and Policy. Cambridge: CUP.

Butler, Judith. 1990. Gender Trouble: Feminism and the Subversion of Identity. New York & London: Routledge.

Butler, Judith. 2003. “Violence, Mourning, Politics.” Studies in Gender and Sexuality 4, no. 1: 9-37.

Butler, Judith. 2006. “Response to Special Issue – Troubling Identities: Reflections on Judith Butler’s Philosophy for the Sociology of Education.” British Journal of Sociology of Education 24, no. 7: 529-534.

Castells, Manuel. 1996. The Rise of the Network Society: The Information Age: Economy, Society and Culture. Oxford: Blackwell Publishing.

Chamayou, Grégoire. 2015. Drone Theory. London: Penguin Books.

Chappelle, Wayne, Kent McDonald, and Raymond E. King. 2010. “Psychological Attributes Critical to the Performance of MQ-1 Predator and MQ-9 Reaper U.S. Air Force Sensor Operators.” Air Force Research Laboratory Report: 1-30.

Chappelle, Wayne, Amber Salinas and Kent McDonald. 2011. “Psychological Health Screening of Remotely Piloted Aircraft Operators and Supporting Units.” USAF School of Medicine Department of Neuropsychiatry Report: 1-12.

Chappelle, Wayne and Kent McDonald. 2012. “Prevalence of High Emotional Distress and Symptoms of Post-Traumatic Stress Disorder in US Air Force Active Duty Remotely Piloted Aircraft Operators.” Air Force Research Laboratory Report: 1-15.

Chappelle, Wayne, Tanya Goodman, Laura Reardon, and William Thompson. 2014. “An Analysis of Post-Traumatic Stress Disorder in United States Air Force Drone Operators.” Journal of Anxiety Disorders 28: 480-487.

Cole, Chris, Mary Dobbing and Amy Hailwood. 2010. Convenient Killing: Armed Drones and the ‘Playstation’ Mentality. Oxford: Fellowship of Reconciliation.

Collins, Patricia H. 1990. Black Feminist Thought: Knowledge, Consciousness and the Politics of Empowerment. Boston: Unwin Hyman.

Coole, Diana, and Samantha Frost. 2010. New Materialisms: Ontology, Agency, and Politics. Durham: Duke University Press.

Craske, Michelle G. “Cognitive Behaviour Therapy for Anxiety Disorders.” American Psychological Association. APA PsycNET.

Der Derian, James. 2000. “Virtuous War, Virtuous Theory.” International Affairs 76, no. 4: 771-788.

Der Derian, James. 2009. Virtuous War: Mapping the Military-Industrial-Media-Entertainment Network. New York: Routledge.

Deuze, Mark. 2011. “Media Life.” Media, Culture & Society 33, no. 1: 137-148.

Drone Campaign Network. 2016. Drone Wars: Out of Sight, Out of Mind, Out of Control. Oxford: Drone Campaign Network.

Drucker, Johanna. 2011. “Humanities Approaches to Interface Theory.” Culture Machine 12: 1-20.

Enemark, Christian. 2013. Armed Drones and the Ethics of War: Military Virtue in a Post-Heroic Age. London: Routledge.

“Exclusive: Air Force Whistleblowers Risk Prosecution to Warn Drone War Kills Civilians, Fuels Terror.” 2015. Democracy Now!, Nov 20.

Feldman, Allen. 1997. “Violence and Vision: The Prosthetics and Aesthetics of Terror.” Public Culture 10, no. 1: 24-60.

Gharib, Ali. 2015. “‘Good Kill’: Drone Pilots Get PTSD, Civilians Die Nameless.” The Nation, May 15.

Grayson, Kyle. 2012. “Six Theses on Targeted Killing.” Politics 32, no. 2: 120-128.

Gregory, Derek. 2011. “From a View to a Kill: Drones and Late Modern War.” Theory, Culture & Society 28, no. 7-8: 188-215.

Gregory, Thomas. 2015. “Drones, Targeted Killing and the Limitations of International Law.” International Political Sociology 9: 197-212.

Hall, Stuart. 1980 (orig. 1974). “Encoding/Decoding.” In Culture, Media, Language: Working Papers in Cultural Studies, edited by Stuart Hall, Dorothy Hobson, Andrew Lowe and Paul Willis, 128-138. London: Hutchinson.

Haraway, Donna. 1991. “A Cyborg Manifesto: Science, Technology and Socialist-Feminism in the Late Twentieth Century.” Simians, Cyborgs and Women: The Reinvention of Nature, 149-181. London: Free Association Books.

Heidegger, Martin. 1978. Being and Time. New Jersey: Wiley-Blackwell.

Held, David. 1999. Global Transformations: Politics, Economics and Culture. California: SUP.

Holmqvist, Caroline. 2013. “Undoing War: War Ontologies and the Materiality of Drone Warfare.” Millennium 41, no. 3: 535-552.

hooks, bell. 1992. Black Looks: Race and Representation. Boston: South End Press.

Hurwitz, Elijah S. 2013. “Drone Pilots: ‘Overpaid, Underworked and Bored’.” Mother Jones, June 18.

Kaplan, Caren. 2006. “Mobility and War: The Cosmic View of US ‘Air Power’.” Environment and Planning A 38, no. 2: 395-407.

Kindervater, Katharine H. 2016. “The Emergence of Lethal Surveillance: Watching and Killing in the History of Drone Technology.” Security Dialogue: 1-16.

Linebaugh, Heather. 2013. “I Worked On The US Drone Program. The Public Should Know What Really Goes On.” The Guardian, Dec 29.

Masters, Cristina. 2005. “Bodies of Technology: Cyborg Soldiers and Militarised Masculinities.” International Feminist Journal of Politics 7, no. 1: 112-132.

McCullough, Malcolm. 2004. Digital Ground: Architecture, Pervasive Computing and Environmental Knowing. Massachusetts: MIT Press.

McFarlane, Alexander, Michelle Atchison, E. Rafalowicz and P. Papay. 1994. “Physical Symptoms in Post-Traumatic Stress Disorder.” Journal of Psychosomatic Research 38, no. 7: 715-726.

McLuhan, Marshall. 2013 (orig. 1964). Understanding Media: The Extensions of Man. California: Gingko Press.

Merrin, William. 2009. “Media Studies 2.0: Upgrading and Outsourcing the Discipline.” Interactions: Studies in Communication and Culture 1, no. 1: 17-34.

Metz, Christian. 1982. The Imaginary Signifier: Psychoanalysis and the Cinema. Bloomington: Indiana University Press.

Metz, Christian. 1999 (orig. 1974). “Some Points in the Semiotics of the Cinema.” In Film Theory and Criticism: Introductory Readings, 5th edition, edited by Leo Braudy and Marshall Cohen. Oxford: OUP.

“‘Numbing and Horrible’: Former Drone Operator Brandon Bryant on His Haunting First Kill.” 2015. Democracy Now!, Nov 20.

Otto, Jean L., and Bryant J. Webber. 2013. “Mental Health Diagnoses and Counselling Among Pilots of Remotely Piloted Aircraft in the United States Air Force.” MSMR 20, no. 3: 3-8.

Ouma, Joseph A., Wayne Chappelle, and Amber Salinas. 2011. “Facets of Occupational Burnout Among U.S. Air Force Active Duty and National Guard/Reserve MQ-1 Predator and MQ-9 Reaper Operators.” Air Force Research Laboratory Report: 1-17.

Phan, Thao. 2015. “A Manifesto for Cyborgs at 30: Introduction.” Platform: Journal of Media and Communication 6, no. 2: 4-7.

Plaw, Avery. 2012. “Drone Strikes Save Lives, American and Other.” The New York Times, Nov 14.

Power, Matthew. 2013. “Confessions of a Drone Warrior.” GQ, Oct 22.

Pugliese, Joseph. 2013. State Violence and the Execution of the Law: Biopolitical Caesurae of Torture, Black Sites, Drones. New York: Routledge.

Pugliese, Joseph. 2016. “Drone Casino Mimesis: Telewarfare and Civil Militarisation.” Journal of Sociology: 1-22.

Royakkers, Lamber, and Rinie van Est. 2010. “The Cubicle Warrior: The Marionette of Digitalised Warfare.” Ethics and Information Technology 12, no. 3: 289-296.

Shaw, Ian. 2013. “Predator Empire: The Geopolitics of Drone Warfare.” Geopolitics 18: 536-559.

Shaw, Ian, and Majed Akhter. 2012. “The Unbearable Humanness of Drone Warfare in FATA, Pakistan.” Antipode 44, no. 4: 1490-1509.

Singer, Peter. 2009. Wired for War: The Robotics Revolution and Conflict in the Twenty-First Century. New York: Penguin Press.

Smith, Dorothy. 1990. The Conceptual Practices of Power: A Feminist Sociology of Knowledge. Boston: Northeastern University Press.

Sobchack, Vivian. 2004. “The Passion of the Material: Toward a Phenomenology of Interobjectivity.” Carnal Thoughts: Embodiment and Moving Image Culture, 286-318. Berkeley: University of California Press.

Stahl, Stephen M. 2002. “Does Depression Hurt?” Journal of Clinical Psychiatry 63, no. 4: 273-274.

Strawser, Bradley J. 2010. “Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles.” Journal of Military Ethics 9, no. 4: 342-368.

Virilio, Paul, Friedrich Kittler and John Armitage. 1999. “The Information Bomb: A Conversation.” Angelaki: Journal of the Theoretical Humanities 4, no. 2: 81-90.

Weiner, Robert, and Tom Sherman. 2014. “Drones Spare Troops, Have Powerful Impact.” The San Diego Union-Tribune, Oct 9.

Weiser, Mark. 1994. “The World is Not a Desktop.” ACM Interactions: 7-8.

Wilcox, Lauren. 2015. Bodies of Violence: Theorising Embodied Subjects in International Relations. Oxford: OUP.

Wilcox, Lauren. 2016. “Embodying Algorithmic War: Gender, Race, and the Posthuman in Drone Warfare.” Security Dialogue: 1-18.

Williams, Alison. 2011. “Enabling Persistent Presence? Performing the Embodied Geopolitics of the Unmanned Aerial Vehicle Assemblage.” Political Geography 30: 381-390.

Biography

Alex Edney-Browne

Alex Edney-Browne is a PhD researcher in International Relations at the University of Melbourne. She graduated with a BA Hons (First Class) in Media, Film and Television and Politics and International Relations from the University of Auckland in 2015. Her thesis investigates people’s lived experiences of drone warfare: the emotional, psychological and physiological effects of military drones on people living in targeted areas and US Air Force drone personnel. It posits the drone as an ‘affective interface’, which facilitates human-technology interaction and cross-cultural human-to-human interaction – sometimes in unexpected and subversive ways. Alex’s research is interdisciplinary, engaging with critical international relations, science and technology studies and media, screen and cultural studies.