
- 248 pages
- English
- ePUB (mobile friendly)
- Available on iOS & Android
eBook - ePub
About this book
What happens when media technologies are able to interpret our feelings, emotions, moods, and intentions? In this cutting-edge new book, Andrew McStay explores that very question and argues that these abilities result in a form of technological empathy. Offering a balanced and incisive overview of the issues raised by 'Emotional AI', this book:
- Provides a clear account of the social benefits and drawbacks of new media trends and technologies such as emoji, wearables and chatbots
- Demonstrates through empirical research how 'empathic media' have been developed and introduced both by start-ups and global tech corporations such as Facebook
- Helps readers understand the potential implications on everyday life and social relations through examples such as video-gaming, facial coding, virtual reality and cities
- Calls for a more critical approach to the rollout of emotional AI in public and private spheres
Combining established theory with original analysis, this book will change the way students view, use and interact with new technologies. It should be required reading for students and researchers in media, communications, the social sciences and beyond.
1 Introducing Empathic Media
Emotions matter. They are at the core of human experience, shape our lives in the profoundest of ways and help us decide what is worthy of our attention. The idea behind this book is to explore what happens when media technologies are able to interpret feelings, emotions, moods, attention and intention in private and public places. I argue this equates to a technological form of empathy. As we will see, there are many personal and organisational drivers for using technologies to understand how individuals and groups of people feel and see things. These include making technologies easier to use, evolving services, creating new forms of entertainment, giving pleasure, finding novel modes of expression, enhancing communication, cultivating health, enabling education, improving policing, heightening surveillance, managing workplaces, understanding experience and influencing people. This is done through 'capturing' emotions. In computer science parlance 'capture' simply means causing data to be stored in a computer, but 'capture' of course has another meaning: taking possession by force. This book is in many ways an account of the difference between these two understandings.
Overall I suggest that we are witnessing a growing interest in mediated emotional life and that neither the positive nor the negative dimensions of this have been properly explored. This situation is becoming more pressing as society generates more information about emotions, intentions and attitudes. As a minimum, there is the popularity of animojis, emojis and emoticons on social media. These facilitate non-verbal shorthand communication, but they also allow services insight into how content, brands, advertising campaigns, products and profiles make people feel. The vernacular of emoticons increasingly applies both online and offline, as we are asked for feedback about our perspectives, how we are and what's happening. The emotionalising of modern mediated life is not just about smileys, however. Rather, the sharing of updates, selfies and point-of-view content provides valuable understanding of life moments, our perspectives and individual and collective emotions.
Interest in feelings, emotions, moods, perspectives and intentions is diverse. Political organisations and brands trace how we feel about given messages, policies, candidates and brand activity through online sentiment analysis. Similarly, advertising agencies, marketers and retailers internally research what we say, post, listen to, our facial expressions, brain behaviour, heart rate and other bodily responses to gauge reactions to products, brands and adverts. Increasingly, digital assistants in the home and on our phones are progressing to understand not just what we say, but how we say it. In terms of affective media experience, virtual reality has raised the bar in unexpected ways. As well as generating emotional responses that can be measured, this also tells analysts a great deal about what captures our attention. Augmented reality promises something similar, albeit in public and commercial spaces. Wearables attached to our bodies track all sorts of biofeedback to understand emotions and how we feel over short and extended periods of time. As we will see, this potential is being applied in novel, surprising and perhaps alarming ways. Indeed, some of us even insert 'technologies that feel' into our bodies to enhance our sex life. At a macro-level, cities are registering the emotional lives of inhabitants and visitors. This book assesses all of these phenomena, and more.
I call for critical attention and caution in the rollout of these technologies, but I should state upfront that I do not think there is anything innately wrong with technologies that detect, learn and interact with emotions. Rather, the practice of reading and detecting emotions is a step forward in improving how we interact with machines and how they respond to us. For example, as will be explored, games are enhanced through use of biofeedback and information about how we feel. The issue is not the premise of using data about emotions to interact with technology, but the nature of engagement. In short, while all might enjoy and appreciate the focus on 'experience' (user, consumer, patient and citizen), it is paramount that people have meaningful choice and control over the 'capturing' of information about emotions and their bodies.
This book was researched and written during an interesting period. I have been writing about moods and technology since 2009 (McStay, 2011) and introduced the principle of empathic media in another book (McStay, 2014). This refers to the capacity for emergent media technologies to sense and discern what is significant for people, categorise behaviour into named emotions, act on emotional states, and make use of people's intentions and expressions. With financial assistance from the UK's Arts and Humanities Research Council from mid-2014 through to the end of 2016 I began researching and interviewing high-value individuals developing and employing emotion-sensitive technologies. Over the course of the research period the technology sector has begun to address emotions and affective computing in a much more serious fashion. When I first started interviewing it was largely start-ups finding commercial opportunities in technologies sensitive to emotional life. As the project progressed, I found that more recognisable names such as Amazon, Apple, Facebook, Google, IBM and Microsoft are now publicly developing emotional AI and empathic media products. Many of the original start-ups I spoke with are now looking forward to lucrative exits.
Technologically, the rise of interest in emotional life is indivisible from the increase in applications of artificial intelligence (AI) and machine-learning methods. While we are undergoing a hype cycle that brings with it inflated expectations, this should not detract from the fact that these technologies are here to stay. Of course, they will also improve. Indeed, in as far as AI systems interact with people, one might reason that AI has no value until it is sensitive to feelings, emotions and intention. This includes home assistants and headline-grabbing humanoid robots, but the important development is how emotion recognition systems are progressively permeating human–computer interactions. If the reader agrees there is personal, inter-personal, organisational, economic and surveillance value in understanding emotional engagement with self, others, objects, services and content, emotional AI and empathic media are worth our attention.
Artificial emotional intelligence is achieved by the capacity to see, read, listen, feel, classify and learn about emotional life. In slightly more detail, this involves reading words and images, seeing and sensing facial expressions, gaze direction, gestures and voice. It also encompasses machines feeling our heart rate, body temperature, respiration and the electrical properties of our skin, among other bodily behaviours. Together, bodies and emotions have become machine-readable. What I am not arguing is that these systems experience emotions. Instead, I am interested in the idea that the capacity to sense, classify behaviour and respond appropriately offers the appearance of understanding. I suggest this form of observation involves a form of empathy. To develop the thesis that media and technologies are progressively showing signs of empathy, I begin with two propositions:
- We increasingly 'live with' technologies that feel and these are sensitive to human life in ways hitherto not seen.
- Empathic media provide opportunities for new aesthetic experiences that not only draw upon information about emotions, but also provide new means for people to 'feel into' aesthetic creations.
If the reader agrees that technologies are increasingly capable of gauging emotional behaviour and that there is personal, inter-personal, commercial and other organisational value in understanding emotions, we should agree that 'datafication' (Mayer-Schönberger and Cukier, 2013) of emotional life is unavoidable. Deconstructed, proposition 1 suggests that: a) technologies that make use of data about emotions are increasing; b) we live alongside technologies such as digital assistants rather than simply 'use' them; and c) we will encounter these technologies in unexpected places (such as shops). Proposition 2 is based on the simple fact that new media technologies offer content creators new affordances. Although much of this book addresses the scope and implications of emotion tracking, the principle of empathic media encompasses applications that allow people to viscerally understand places, periods, cultures, objects and real and fictional worlds.
Good Enough: Verisimilitude and Emotional Truth
One or all three of the following questions might be in the reader's mind: first, 'are machines really capable of empathy ... isn't this a bit of a stretch'; followed by, 'isn't this a very limited view of emotions'; and lastly, 'what about compassion and sympathy in empathy?' The last question is most easily dealt with. Although we typically connect empathy with sympathy and compassion, the connection is not a necessary one. To interpret an emotional state and make predictions about a person's perspective and disposition does not require that we want the best for that person. Sympathy is not a necessary criterion for empathy, but instead empathy is simply an interpretive act. Cognitive empathy, which may entail sadism and mental as well as physical cruelty, is a brutal example of this.
This is a 'theory-theory' approach to empathy, where emotion is theorised through observation (Goldman, 2008). Put otherwise, it is to understand another person's condition by means of what we survey, measure and remember as well as what rules are made for subsequent engagement. This neo-behaviourist approach means that systems sense, discern patterns of behaviour, make judgements by means of algorithms and heuristics (if person A is behaving in X manner then do Z), provide content and feedback and learn from people's reactions. What is key here is that empathic media systems do not employ 'mentalistic' processes. Instead they 'simply' observe, classify, allocate, adapt and modify their behaviour.
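To make the flavour of such heuristics concrete, the sketch below is a deliberately toy illustration of the observe, classify and respond loop described above. It is not drawn from the book or from any named product; the sensor readings, thresholds, emotion labels and responses are hypothetical, chosen only to show the 'if person A is behaving in X manner then do Z' pattern.

```python
# A toy sketch only: observe -> classify -> respond, in the neo-behaviourist
# spirit described above. All sensor names, thresholds and labels are invented;
# real systems would use trained classifiers over far richer signals.
from dataclasses import dataclass


@dataclass
class Observation:
    heart_rate: float        # beats per minute
    skin_conductance: float  # microsiemens (galvanic skin response)
    smile_intensity: float   # 0.0-1.0, e.g. from facial coding


def classify(obs: Observation) -> str:
    """Map raw bodily observations to a named emotional state via heuristics."""
    if obs.smile_intensity > 0.6:
        return "joy"
    if obs.heart_rate > 100 and obs.skin_conductance > 8.0:
        return "stress"
    return "neutral"


def respond(state: str) -> str:
    """Rules of the form: if the person is behaving in X manner, then do Z."""
    return {
        "joy": "keep current content",
        "stress": "lower game difficulty or soften recommendations",
        "neutral": "make no change",
    }[state]


if __name__ == "__main__":
    reading = Observation(heart_rate=112, skin_conductance=9.2, smile_intensity=0.1)
    print(respond(classify(reading)))  # -> lower game difficulty or soften recommendations
```

The point of the sketch is only that nothing 'mentalistic' is required: the system observes, classifies and adapts its behaviour.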
Accordingly, it is reasonable to say that computers can recognise emotions when 'the group of computers and the group of humans respond with the same distribution of answers' (Picard, 1997: 51). Implicit in this is the recognition that people do not judge correctly each time, and nor should we expect machines to either. This is a simple but important point. If we are to critique machines and say they do not have access to our 'authentic emotional states' (whatever this may denote), it cannot be because they misdiagnose and sometimes read people incorrectly.
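Picard's criterion can also be stated operationally: compare the distribution of labels given by a group of human judges with the distribution produced by a machine over the same stimuli. The sketch below, with invented labels and a simple total variation distance, is one hypothetical way of making that comparison; it is an illustration, not a method proposed in the book.

```python
# Illustrative sketch of the distribution criterion: do humans and a machine
# assign emotion labels to the same stimuli in roughly the same proportions?
# The labels below are invented for the example.
from collections import Counter


def label_distribution(labels):
    """Normalise a list of labels into a probability distribution."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}


def total_variation(p, q):
    """Half the L1 distance between two distributions; 0 means identical."""
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)


human_labels = ["joy", "joy", "sadness", "anger", "joy", "neutral"]
machine_labels = ["joy", "neutral", "sadness", "anger", "joy", "neutral"]

distance = total_variation(label_distribution(human_labels),
                           label_distribution(machine_labels))
# A small distance suggests the machine's answers are distributed like the
# humans' answers, even though neither group judges "correctly" every time.
print(round(distance, 3))  # 0.167
```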
On whether this is a limited view of emotions, it is not clear what emotions are. Empathic media employ a particular account of what emotions are through their use of psychological, anthropological and neuroscientific research, largely deriving from Paul Ekman and his forerunners. As we will see, there is an attractive simplicity to 'basic emotions' (Ekman and Friesen, 1971) that technologists have latched onto. To an extent this is a matter of expediency, and because this account of emotional life works well with sensing techniques that classify facial and bodily behaviour. Indeed, in a telling line, early proponents of emotion-sensing technology say, 'Choosing a physiological or behavioural measure can be relatively easy, in that technology or methodology will often dictate a clear preference' (Bradley and Lang, 1994: 49). The 'basic emotions' view contrasts with the messier idea that emotions might not be fixed objects, but culturally constructed experiences and expressions defined through historical and situational circumstances.
On whether machines can really understand us, we have two possibilities: a) genuine empathy (which is the capacity to truly know what another is undergoing); and b) simulated empathy (the capacity to approximate, to contextualise within what one can comprehend, to make educated judgments and to respond in an appropriate manner). Much of this has been rehearsed in debates about whether machines can really think, what it is for a person to think and the philosophical knots associated with knowing the lives of others. Yet, if we allow for the possibility of a simulated and observational version of empathy, the door is very much open for machinic empathy.
This has less to do with authenticity and more to do with what I have termed elsewhere 'machinic verisimilitude', or the appearance of intimate insight (McStay, 2014). It allows us to elide the debate of true versus false because a simulated and theory-based form of empathy may be tested on the basis of appropriateness of feedback. As such, we do not have to engage with the question of whether people have privileged access to a real understanding of emotions and intentions. Rather, we can simply judge by effectiveness. Of course, people clearly have the upper hand in instinctively reading and perceiving the significance of events for other people. However, machines have strong cards of their own because they can record, remember and interpret detail that is inaccessible to human senses (such as physiology). In fact, under closer inspection, perhaps the real question is not 'can machines empathise?' but 'is machinic empathy that different from human empathy?' I propose that it is reasonable to say that empathy is an interpretive act for people and machines involving observation, identification, contextualisation, learning and fast reactions.
Aims and Methods
My goal is to explore life with technologies that are sensitive to emotions, assess their political and social implications, and consider the ethical, legal and regulatory consequences. I do this by balancing empirical observations with insights from three sets of literatures: media and critical theory; science and technology studies (STS); and the works of a diverse range of philosophers. While the first two sets of literatures are fairly obvious starting points for a book on the phenomenon of empathic media, my recourse to the philosophical literature perhaps needs some explication. In short, the philosophers I have consulted help unpack the social and experiential significance of empathy. Often those selected have phenomenological interests (such as Husserl, Merleau-Ponty and Heidegger), but others such as Lipps, Scheler, Hume, Bentham and Adam Smith help situate the discussion of 'feeling-into'. Foucault also assists through his insistence that knowledge should be tested in relation to the context and interests that generated it.
My main corpus of data comes from over 100 open-ended one-hour interviews conducted to elucidate views on emotion detection. These draw from industry, national security, law, policy, municipal authorities and privacy-oriented NGOs (for a list of these organisations, see Appendix 1). Although many interviews do not explicitly feature in the book, each has implicitly shaped my thinking. In-person interviews primarily took place in Europe, the United States and the United Arab Emirates, but they also included face-to-face Skype calls with companies from Israel, Russia and South Korea. The scale of companies ranged from Alphabet (Verily), Facebook and IBM to start-ups by students.
Interview questions were co-created with key stakeholders from: the UK's Information Commissioner's Office (a data protection regulator), who were interested in implications for data protection; the advertising agency M&C Saatchi, who were interested in creative opportunities; the UK's Committee of Advertising Practice (a self-regulatory body), who were concerned about protecting the reputation of the advertising industry; and the NGO Privacy International, who were interested in meaningful consent, data ethics and data security. Interviewees were mostly chief executive officers (CEOs) and people in strategic positions from companies working on: sentiment analysis; virtual and augmented reality; facial coding; voice analytics; social networking; the emotion-enhanced Internet of Things (IoT); emotion-enhanced smart cities; and a wide range of companies developing wearables that track users' moods through respiration, electroencephalograms (EEG), heart rates and galvanic skin responses (GSR). End-user sectors include: advertising; policing; national security; education; insurance; human resources; the sex-tech industry; psychosexual therapy; experiential marketing; mental health; branding agencies; media agencies; ethical hackers; venture capitalists; artists; interactive film-makers; games companies; in-car experience and navigation companies; and sports software companies.
Each interviewee was selected on the basis of current work in emotion detection, or likelihood of interest in these applications. In addition to industrialists and public sector actors, I interviewed people working in privacy-friendly NGOs (Electronic Frontier Foundation, Open Rights Group and several staff members from Privacy International) to obtain a critical perspective. I also met with media and technology law firms to discuss the legal dimension of these developments, and European policy-makers in the field of data privacy to ascertain their awareness of the topic. A multi-tiered consent form was employed that allowed interviewees to select a level of disclosure they were comfortable with. Options ranged from willingness to speak in a named capacity on behalf of an organisation, to full anonymity.
Other research tools include a workshop with industrialists, regulators, NGOs and academics to develop codes of conduct for using data about emotions (discussed in Chapter 12). I also conducted a demographically representative UK nationwide online survey (n=2067). This assessed citizen attitudes to the potential of emotion detection employed in contexts they are familiar with.1 (I will discuss this where relevant but see Appendix 2 for the overview.) Approaches also include analysis of patent filings, which affords critical media scholars insight into the objectives, hopes, technical intentions, and worldviews of companies and owners. Similarly, textual analysis of product packaging and promotional content also reveals assumptions about ideal users and the ideological outlooks of organisations.
Chapter Breakdown
The arc of the book begins with a theoretical, historical, philosophical and technological framing in Chapter 2. Clarifying principles that will recur in this book, it identifies that empathy is a social fact of living in groups. It also addresses the industrialisation of emotions by noting not only that 'emotion' is a surprisingly recent psychological premise and that emotions are economically valuable, but also that emotional life is undergoing 'biomedicalisation' due to applications of emotional AI and affective computing. Although the book's emphasis on machine-readable emotions may appear somewhat novel, the roots are relatively old. Technological antecedents reach back to the 1800s. The chapter accounts for these, the debates that surrounded them and their significance for my own case study of modern empathic media.
Chapter 3 addresses collective emotions by considering sentiment analysis. Unlike later chapters, this does no...
Table of contents
- Cover
- Half Title
- Title Page
- Copyright Page
- Contents
- About the author
- Acknowledgements
- 1 Introducing Empathic Media
- 2 Situating Empathy
- 3 Group Sentimentality
- 4 Spectrum Of Emotions: Gaming The Body
- 5 Leaky Emotions: The Case of Facial Coding
- 6 Priming Voice-Based AI: I Hear You
- 7 Affective Witnessing: VR 2.0
- 8 Advertising, Retail and Creativity: Capturing the Flâneur
- 9 Personal Technologies that Feel: Towards a Novel form of Intimacy
- 10 Empathic Cities
- 11 Politics of Feeling Machines: Debating De-Identification and Dignity
- 12 Conclusion: Dignity, Ethics, Norms, Policies and Practices
- Appendices
- Appendix 1: Table of organisations and numbers of people interviewed
- Appendix 2: Tables of Results from UK National Survey on Emotion Detection in Existing and Nascent Media Technologies
- References
- Index