Summary
What happens when media technologies are able to interpret our feelings, emotions, moods, and intentions? In this cutting-edge new book, Andrew McStay explores that very question and argues that these abilities result in a form of technological empathy. Offering a balanced and incisive overview of the issues raised by ‘Emotional AI’, this book:

• Provides a clear account of the social benefits and drawbacks of new media trends and technologies such as emoji, wearables and chatbots
• Demonstrates through empirical research how ‘empathic media’ have been developed and introduced both by start-ups and by global tech corporations such as Facebook
• Helps readers understand the potential implications for everyday life and social relations through examples such as video-gaming, facial coding, virtual reality and cities
• Calls for a more critical approach to the rollout of emotional AI in public and private spheres

Combining established theory with original analysis, this book will change the way students view, use and interact with new technologies. It should be required reading for students and researchers in media, communications, the social sciences and beyond.
Chapter 12: Conclusion: Dignity, Ethics, Norms, Policies and Practices
This book has accounted for the ways in which media technologies are showing qualities of ‘empathy’: that is, the capacity to gauge emotions, intentions and attention through analysis of writing, images, speech, voice, facial expressions, bodily movement and physiology. Although some of the technologies in question pre-date the 1990s, the subject of empathic media is inseparable from technical developments in ‘affective computing’, which involves computational processes that sense and respond in kind to people’s emotions (Picard, 1995, 1997, 2007). The purpose of this brief final chapter is to recap the key themes of the book, clarify its ethical position, consider future research questions, and state what needs to be ...