Tettegah / Gartmeier | Emotions, Technology, Design, and Learning | E-Book | sack.de

E-Book, English, 332 pages

Series: Emotions and Technology

Tettegah / Gartmeier Emotions, Technology, Design, and Learning


1st edition, 2015
ISBN: 978-0-12-801881-1
Publisher: Elsevier Science & Techn.
Format: EPUB
Copy protection: ePub watermark




Emotions, Technology, Design, and Learning provides an update on emotional responses and on how technology can alter both what is learned and how it is learned. The design of that technology is inherently linked to those emotional responses. This text addresses emotional design and pedagogical agents, and the emotions they generate. Topics include design features such as emoticons, speech recognition, virtual avatars, robotics, and adaptive computer technologies, all as they relate to emotional responses in virtual learning.

- Addresses the emotional design specific to agent-based learning environments
- Discusses the use of emoticons in online learning, providing a historical overview of animated pedagogical agents
- Includes evidence-based insights on how to properly use agents in virtual learning environments
- Focuses on the development of a proper architecture for having and expressing emotions
- Reviews the literature in the field of advanced agent-based learning environments
- Explores how educational robotic activities can divert students' emotions from internal to external


Further Information & Material


Front Cover
Emotions, Technology, Design, and Learning
Copyright
Contents
Contributors
Foreword
  References
Preface
  Emotions, Technology, Design, and Learning
    Envisioning the Empathic Learning Environment
  Emotions and Affect Recognition Systems
  Reviews on Emotions, Affect, and Design
  Interactions, Design, and Learning
  References
Section I: Emotions and Affect Recognition Systems
  Chapter 1: Emotions in Adaptive Computer Technologies for Adults Improving Reading
    Affect-Sensitive ITSs for College Students Learning STEM Topics
      What Affect States Do Learners Experience?
      Automated Tracking of Affective States During Learning
      How Does the Computer System Respond to Learners' Affect States?
    Building ITS with Dialogs and Trialogs for Struggling Adult Learners
      Dialogs
        Give Students Choices
        Assign Texts Within the Student's Ability
        Give Supportive Short Feedback
        Gauge the Prospects of Active Learning
        Gauge Who Should Summarize
      Trialogs
    Challenges and Limitations
    Acknowledgments
    References
  Chapter 2: A Real-Time Speech Emotion Recognition System and its Application in Online Learning
    Introduction
    Real-Time Speech Emotion Recognition System
      Graphic User Interface
      Major Functions
      Methods and Procedure
        Voice Activity Detection
        Speech Segmentation
        Signal Pre-processing
        Acoustic Features
        Support Vector Machines (SVMs)-Based Learning Model
        Statistics Analysis of Emotion Frequency
    Experiments
      Offline Experiment
      Real-time Recording and Recognition
    Application in Online Learning
      Use of Emotion Detection in Online Learning
      Challenges
      Experiment Scenario and Results
        Experiment Scenario
        Speech Data
        Results
        Discussion
        Future Work
    Conclusion
    References
  Chapter 3: Pedagogical Agents and Affect: Molding Positive Learning Interactions
    Introduction
    Pedagogical Agents: A Brief History
    Design and Implementation of Pedagogical Agents
    Emotions During Tutorial Interactions
      Politeness
      Emotion and Empathy
    Assessing the Impacts of Pedagogical Agents
      Cognitive Outcomes
      Affective Outcomes
      Self-perceptions and Feelings Toward Learning
      Motivation, Interest, and Self-efficacy
    A Path Forward for Pedagogical Agents
      Increase Nonverbal Fidelity
      Strengthen Links Between Agent Behaviors and Learner Emotions
      Build Real Relationships
    Conclusion
    References
  Chapter 4: Implementation of Artificial Emotions and Moods in a Pedagogical Agent
    Introduction
    Theoretical Approaches to Emotions
      The Discrete Approach
      The Two-dimensional Approach: Circumplex Model
      The Three-dimensional Approach
      Plutchik's Multidimensional Approach
      The OCC Model
    Capturing the Student's Emotions in the Learning Process
    Artificial Emotions
    Architecture of Emotional Agent
    The Emotional Pedagogical Agent for a Multiple Choice Questions Test
    Conclusions
    Acknowledgment
    References
Section II: Reviews on Emotions, Affect, and Design
  Chapter 5: Measuring Emotions: A Survey of Cutting Edge Methodologies Used in Computer-Based Learning Environment Research
    Introduction
    Emotions: A Primer
    How Are Learners' Emotional States Measured in Research with CBLEs?
      Facial Expressions
      Body Posture
      Physiological Patterns
      Self-Report Measures
      Log-Files
        Context Feature Mining
        Language and Discourse Feature Analyses
    Multimethod Emotion Classification: Is It Worth It?
    Theoretical and Analytical Considerations in Measuring Emotions
    Conclusions and Recommendations
    Acknowledgments
    References
  Chapter 6: Designing Tools that Care: The Affective Qualities of Virtual Peers, Robots, and Videos
    The Integral Nature of Affect and Cognition
    Virtual Peers
      Virtual Peer Affect
      Affective Role Models
    Humanoid Robots
      Robots and Affect
      The Robot Friend Atti
        Designing for Children: Interplay Between Designers and Children
    Online Videos
      Instructor-Learner Interaction in Online Learning
      Compensatory Strategies
    Conclusion
    References
  Chapter 7: Emotional Design in Digital Media for Learning
    Introduction
    Defining Emotion, Mood, Affect
    Emotion and Cognition
    Emotions and Learning
    Emotional Design in Digital Media for Learning
      Emotional Design Through Information Representation
      Emotional Design Through Interaction Design
        Situational Interest
        Guided Activity Principle: Animated Pedagogical Agents
    The Theoretical Foundation of Emotions and Learning
      Pekrun's (2000) Control Value Theory of Achievement Emotions
      Moreno and Mayer's (2007) Cognitive Affective Theory of Learning with Media
      Picard's (1997) Affective Computing
    Toward an Integrated Cognitive-Affective Model of Multimedia Learning
    Research Agenda for the Study of Emotional Design
      Research on Design Factors Impacting Learners' Emotion
      Research on the Impact of Emotion on Learning Outcomes
      Research on Measuring Emotions
      Research on Appropriate Responses to a Learner's Emotional State
    Conclusion
    References
  Chapter 8: What Sunshine Is to Flowers: A Literature Review on the Use of Emoticons to Support Online Learning
    Introduction
    Method
    Emoticons and Electronically Mediated Communication
    Emoticons and Group Differences
    Emoticons and Social Contexts
    Emoticons and Online Learning
      Improving Communication
      Enhancing Social Presence
      Building Community
    Limitations and Gaps
    Future Research
    Instructional Recommendations
    Conclusion
    References
  Chapter 9: Robots, Emotions, and Learning
    The Role of Emotions in Education
      Why Should We Value Emotions Along with Cognition in Education?
    What Role Can Technology Play?
    Cases Where Emotions Influence Students' Learning in a Technology Environment
      Fixing Work Instead of Fearing Learning
      Displacement of Emotions
      Affection and Gratitude Display Toward a Robot Versus Disengagement
      Pity: a Caring Emotion from Student to Robot
      Existing Emotions Will Emerge even in Robotics Technology
    Conclusion
    References
Section III: Interactions, Design, and Learning
  Chapter 10: Virtual Avatar as an Emotional Scaffolding Strategy to Promote Interest in Online Learning Environment
    Introduction
    Emotional Scaffolding and VAs
      Emotional Scaffolding
      Virtual Avatars
      Interest Development in Online Learning
        Individual Interest
        Situational Interest
    VA Design as an Emotional Scaffolding Strategy
      VA Persona
      VA Message
      Virtual Agent Scaffolding Model for Interest Development
    The Case Study
      Aim
      Participants
      Study Design
      Measures
      Study Material
      Procedure
      Data Analysis
      Findings
    Conclusion
    References
  Chapter 11: Animated Pedagogical Agents and Emotion
    Introduction
    What Are Animated Pedagogical Agents?
    The Role of Animated Pedagogical Agents
    Benefits of Animated Pedagogical Agents
    Examples of Animated Pedagogical Agents
      Facilitate Tutoring Systems Architecture
      Provide Assistance in a Virtual World
      Act as a Co-Learner
    Design of Animated Pedagogical Agents
    Agent Design and Emotion
    Research Evidence
    Conclusion
    References
  Chapter 12: Investigating Students' Feelings and Their Perspectives Toward Web 2.0 Technologies in a Teacher Education Course
    Introduction
    Theoretical Background
      Social Presence
    Types of Web 2.0 Technologies
    Methodology
      Participants and Context
      Data Collection
        Instrument
        Interviews
      Data Analysis
    Results
      The Quantitative Results
        Demographic Data
        Research Question 1: What Were Students' Feelings in Use of Each of the Web 2.0 Technologies?
      The Qualitative Results
        Research Question 2: What Were the Students' Perspectives on the Different Types of Web 2.0 Technologies?
    Discussion
    Affective Dimensions of Web 2.0
    Conclusion
    References
  Chapter 13: Engagement, Emotions, and Relationships: On Building Intelligent Agents
    Introduction
    The Role of Engagement in Collaborations
      Affective Expression
      Affective Expression and Collaboration
    Relationships with Intelligent Agents
    Closing Thoughts
    Acknowledgments
    References
Index
Back Cover


Chapter 1: Emotions in Adaptive Computer Technologies for Adults Improving Reading

Arthur C. Graesser, Whitney Baer, Shi Feng, Breya Walker, Danielle Clewley, and David P. Hays (University of Memphis, Memphis, Tennessee, USA); Daphne Greenberg (Georgia State University, Atlanta, Georgia, USA)

Abstract
In the United States, more than 15% of adults struggle with low literacy skills. Adults who struggle with reading are a heterogeneous group with an extremely varied set of skills and experiences. Difficult circumstances often limit their ability to attend classes regularly, leading to problems in their ability to engage with and retain learning strategies. The Center for the Study of Adult Literacy (CSAL) has designed an adaptive, web-based instructional tutor that can meet struggling adult learners' needs for individualized instruction and emotional support. Each CSAL AutoTutor lesson uses two computerized agents (a peer agent and a tutor agent) in a trialogue conversation with the adult learner. These agents give the learner guidance, support, and a narrative through which to view each lesson. The agents also give the adult learner the opportunity to interact and collaborate in a range of scenarios. Texts and examples have been carefully selected for multiple difficulty levels. Based on performance, struggling learners are given easier passages and increased feedback to support them through moments of confusion, while learners who are succeeding with the strategies receive more challenging material with less feedback, in order to counter boredom and improve motivation.

Keywords: Conversational agents; Intelligent tutoring systems; Trialogs; Learning; Tutoring

Acknowledgments
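The performance-based adaptivity described in the abstract (easier passages with more feedback for struggling learners, harder passages with less feedback for succeeding ones) can be sketched roughly as follows. This is an illustrative sketch only; the function name, accuracy thresholds, and difficulty scale are assumptions for exposition, not the actual CSAL AutoTutor implementation.

```python
# Hypothetical sketch of a performance-based adaptivity rule: struggling
# learners get an easier passage and extended feedback, while succeeding
# learners get a harder passage and brief feedback. Thresholds (0.5, 0.8)
# and the 1-5 difficulty scale are illustrative assumptions.

def adapt_lesson(accuracy, level, feedback, min_level=1, max_level=5):
    """Return (new_level, new_feedback) given recent answer accuracy in [0, 1]."""
    if accuracy < 0.5:            # struggling: ease the text, add support
        level = max(min_level, level - 1)
        feedback = "extended"
    elif accuracy > 0.8:          # succeeding: raise the challenge, step back
        level = min(max_level, level + 1)
        feedback = "brief"
    # otherwise keep the current passage difficulty and feedback style
    return level, feedback
```

For example, a learner at level 3 who answers 30% of items correctly would drop to level 2 with extended feedback, while one answering 90% correctly would move up to level 4 with brief feedback.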
This research was supported by the National Science Foundation (SBR 9720314, REC 0106965, REC 0126265, ITR 0325428, REESE 0633918, ALT-0834847, DRK-12-0918409, 1108845); the Institute of Education Sciences (R305H050169, R305B070349, R305A080589, R305A080594, R305G020018, R305C120001); the Army Research Lab (W911INF-12-2-0030); and the Office of Naval Research (N00014-00-1-0600, N00014-12-C-0643). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of NSF, IES, or DoD. The Tutoring Research Group (TRG) is an interdisciplinary research team comprising researchers from psychology, computer science, physics, and education at the University of Memphis (visit http://www.autotutor.org, http://emotion.autotutor.org, http://fedex.memphis.edu/iis/).

One in six adults in the USA has low-level literacy skills (OECD, 2013). Some of these adult readers come from poor populations; others come from countries with a different language. All face difficulties with daily literacy tasks (National Research Council, 2011). This chapter explores how intelligent tutoring systems (ITSs) with conversational agents (i.e., talking heads) have the potential to help them overcome these barriers in a way that is sensitive to both their cognitive and emotional states. Imagine being in charge of helping a group of struggling adult readers (hereafter called "adult readers," as a technical term). One of the adult readers is 40 years old, holds two full-time jobs, and makes less than the minimum wage; there is very little time to learn or practice reading. A second adult has many hours of free time because she is unemployed and trying to find a job, but has trouble comprehending and filling out job application forms. A third adult comes from China, barely speaks English, and depends on relatives for support.
A fourth adult is 18 years old, but never received an adequate education because he dropped out of high school at the age of 15. He finds it difficult to pass the driving test for the license he needs to get to work. These four adults have very different profiles, but all of them experience stress as they attempt to reach their potential in a print-rich environment. Technology can rise to the occasion to help these adult readers. During the last 3 years, we have been developing an ITS with conversational agents to help these readers in our Center for the Study of Adult Literacy (CSAL, www.csal.gsu.edu), a major research effort that includes the University of Memphis, Georgia State University, University of Toronto, and Brock University. The computer can recommend texts that fit their interests and reading abilities; help them communicate with peers for social interaction; and deliver interventions that optimize learning. These computer applications are interactive and adaptive to the learner. Interestingly, the adaptivity is not confined to the knowledge, skills, and cognitive capacities of the learner: the computer application also adapts to emotions. All of the computer applications discussed in this chapter attempt to accommodate the emotions and moods of the learners. This chapter describes our attempts to build affect-sensitive computer technologies that help these adults improve their reading comprehension. We have already documented, in previous publications, how we have developed, tested, and successfully applied affect-sensitive ITSs to help college students learn STEM (science, technology, engineering, and mathematics) topics (D'Mello & Graesser, 2012; Graesser & D'Mello, 2012; Graesser, D'Mello, & Strain, 2014; Lehman et al., 2013). The first section summarizes this body of research.
The second section describes how we have used what we learned from college students on STEM topics to develop computer applications for adult readers attempting to improve their reading comprehension. In this chapter, we focus on ITS applications with conversational agents (talking heads, hereafter called "agents") that communicate with the learner in natural language. Agents have become increasingly popular in ITSs and other contemporary learning environments. Examples of adaptive agents that simulate dialogs and that have successfully improved student learning include AutoTutor (Graesser, 2011; Graesser et al., 2004, 2012; Nye, Graesser, & Hu, 2014); DeepTutor (Rus, D'Mello, Hu, & Graesser, 2013); Coach Mike (Lane, Noren, Auerback, Birth, & Swartout, 2011); Crystal Island (Rowe, Shores, Mott, & Lester, 2011); My Science Tutor (Ward et al., 2013); and Virtual Patient (Ferdig, Schottke, Rivera-Gutierrez, & Lok, 2012). These systems have covered topics in STEM (e.g., physics, biology, computer literacy), reading comprehension, scientific reasoning, and other difficult topics and skills. These systems engage the learner in a dialog, in which the human learner interacts with only one agent. The agent can be a peer (at approximately the same level of proficiency as the human); a student agent with lower proficiency (so that the learner can teach the agent); or an expert tutor agent. AutoTutor is a pedagogical agent that simulates the dialog moves of human tutors, as well as ideal pedagogical strategies. Getting agents to simulate a tutor is a sensible first design because human tutoring is known to be a very effective environment for improving student learning and motivation. Meta-analyses that compare tutoring to classroom teaching and other suitable comparison conditions report effect sizes between 0.20 and 1.00 (Cohen, Kulik, & Kulik, 1982; Graesser, D'Mello, & Cade, 2011; VanLehn, 2011).
Empirical evidence also supports the claim that AutoTutor and similar computer tutors with natural language dialog yield learning gains comparable with those of trained human tutors, with effect sizes averaging 0.8 (range 0.3-2.0) (Graesser, 2011; Nye et al., 2014; Olney et al., 2012; Rus et al., 2013; VanLehn, 2011; VanLehn et al., 2007). In addition to dialogs, there can be multiple agents interacting with the human. For example, three-party conversations, called trialogs, involve two agents and a human learner (Graesser, Li, & Forsyth, 2014). The two agents take on different roles, such as peers and tutors. For example, learners can vicariously observe two agents interacting; converse with a tutor agent while a peer agent periodically comments; or teach a peer agent while a tutor agent rescues a problematic interaction. Agents can argue with each other over issues and ask what the human learner thinks about the argument. Examples of trialogs appear in the CSAL tutoring environments for adult readers, as well as several successful ITSs with multiple agents, such as Betty's Brain (Biswas, Jeong, Kinnebrew, Sulcer, & Roscoe, 2010); Tactical Language and Culture System (Johnson & Valente, 2008); iDRIVE (Gholson et al., 2009); iSTART (Jackson & McNamara, 2013; McNamara, O'Reilly, Rowe, Boonthum, & Levinstein, 2007); and Operation ARA (Forsyth et al., 2013; Halpern et al., 2012; Millis et...
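The trialog configurations described above can be summarized as a small role table. The mode names and role labels below are illustrative assumptions for exposition, not identifiers taken from the cited systems.

```python
# Hypothetical summary of three trialog configurations: the human observes
# two agents, converses with a tutor while a peer comments, or teaches a
# peer while a tutor rescues problematic interactions. All labels are
# illustrative, not drawn from AutoTutor or the other cited systems.

TRIALOG_MODES = {
    "vicarious_observation": {"human": "observer",   "tutor": "demonstrator", "peer": "demonstrator"},
    "tutor_led":             {"human": "respondent", "tutor": "questioner",   "peer": "commenter"},
    "teachable_peer":        {"human": "teacher",    "tutor": "rescuer",      "peer": "learner"},
}

def roles(mode):
    """Return the role played by each party in a given trialog mode."""
    return TRIALOG_MODES[mode]
```

A lesson designer could, for instance, start a struggling learner in vicarious observation and shift to the teachable-peer mode once confidence grows.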


