Perry / Cameron | Empathy and Artificial Intelligence | Book | 978-1-009-63903-3 | www.sack.de

Book, English, 302 pages

Series: Studies in Emotion and Social Interaction

Perry / Cameron

Empathy and Artificial Intelligence

Challenges, Advances and Ethical Considerations
Year of publication: 2026
ISBN: 978-1-009-63903-3
Publisher: Cambridge University Press



As artificial intelligence chatbots offer increasingly sophisticated emotional support, society faces a profound question: can a machine truly empathize? Empathy and Artificial Intelligence provides the first comprehensive roadmap for this pivotal moment. Moving beyond simple binaries of 'hype' or 'doom,' this interdisciplinary volume unites leading psychologists, philosophers, and engineers to explore the tangled web of synthetic care. Key chapters investigate the 'AI Advantage' – where machines often outperform humans in perceived empathy – alongside the 'AI Penalty,' where discovering the artifice corrodes trust. The text navigates the distinct landscapes of text-based LLMs and embodied robots, addressing urgent ethical dilemmas and exploring whether reliance on AI risks the atrophy of our moral capacities or enables synthetic agents to scaffold stronger human relationships. Essential for researchers, students, and curious observers, this book investigates whether outsourcing our emotional labor saves us time, or costs us our humanity.


Further Information & Material


Introduction. AI and empathy at a crossroads: a primer for an interdisciplinary field (C. Daryl Cameron and Anat Perry)

Part I. Understanding Empathy
1. A very human history of artificial empathy (Shai Satran)
2. Try to see things from my point of view: empathy, AI, and the right to be an exception (Will Kidder, Jason D'Cruz and Kush Varshney)
3. Does it take two to empathize? (Sean Laurent and Iris Sooyun Chung)
4. Models match or surpass objective human performance on various tasks (Ariel Goldstein and Gabriel Stanovsky)
5. Synthetic support: empathy and human–machine communication (Austin Beattie and Andrew High)
6. Is empathic AI possible? (Mohammad Atari, Firat Seker and Aliah Zewail)

Part II. Perceiving AI Empathy
7. Get real (Paul Bloom)
8. Humans and AIs may fulfill different empathic needs (Anat Perry and Jamil Zaki)
9. Empathic AI will undermine human kindness (Madhulika Shastry, Sharlene Fernandes and Kurt Gray)
10. What LLMpathy can tell us about received empathy (Eliana Hadjiandreou, Tatiana Lau and Desmond Ong)
11. Machines that care: on receiving and providing AI-driven empathy (Elena H. Lee, Yidan Yin and Cheryl Wakslak)
12. What's so special about human empathy? Examining AI empathy and its trade-offs (Joshua D. Wenger, C. Daryl Cameron, Martina Orlandi and Michael Inzlicht)

Part III. Beyond Text: Embodiment and Robots
13. Examining the capacity of human beings to experience empathy towards AI (Lasana Harris)
14. Trust and social affiliation may influence evacuee decisions during robot-guided evacuation (Alan R. Wagner and Colin Holbrook)
15. Does embodiment influence empathy toward artificial agents? (Ilkay Ari and Agnieszka Wykowska)
16. Your robot will feel you now: empathy in robots and embodied agents (Angelica Lim and Özge Nilay Yalçin)
17. Children's judgments of and interactions with empathetic AI (Teresa Flanagan and Tamar Kushnir)
18. Anticipating children's beliefs about artificial empathy (Madeline G. Reinecke)
19. From social media to empathic artificial intelligence: applying past lessons to future technologies (Micaela Rodriguez, Matt Motyl and Juliana Schroeder)

Part IV. Moral Implications
20. Contemporary AI and the value of empathy (Carlos Montemayor)
21. Can empathy guide ethical AI? Opportunities, challenges, and the path forward (Özge Nilay Yalçin)
22. Does AI need empathy in medicine? (Walter Sinnott-Armstrong)
23. AI in healthcare: comparing ethical standards for humans and AI (Michael Laakasuo, Kathryn Francis, Marianna Drosinou and Ivar Hannikainen)
24. AI, empathy, and moral status (Joshua August Skorburg and Dylan White)
25. AI should develop human empathy, not replace it (Ethan Landes and Jim A. C. Everett)
26. Practical capacities, empathy, and human-centered artificial intelligence (Brett Karlan)
27. Stories of empathy: empathy, interactions with AI, and identity building (Leda Berio)


Cameron, C. Daryl
C. Daryl Cameron is an Associate Professor in Psychology, a 2023–2026 Sherwin Early Career Professor, and Senior Research Associate in the Rock Ethics Institute at Penn State University. He directs the Empathy and Moral Psychology Lab, and studies why and how people choose to empathize during interactions with humans, animals, and AI. He also directs the interdisciplinary Consortium on Moral Decision-Making. In 2022, he received the Early Career in Affective Science Award from the Society for Affective Science.

Perry, Anat
Anat Perry is an Associate Professor of Psychology at the Hebrew University of Jerusalem, and is a 2025–2026 Harvard Radcliffe Fellow. She studies various facets of empathy and related social processes, and, most recently, what human–AI interactions can teach us about what we value in human empathy and relationships. Her work appears in leading journals, including Nature Human Behaviour and Nature Machine Intelligence.


