Proceedings of the 2005 International Conference on Active Media Technology (AMT 2005), 2005
This paper reflects on some of our research on embodied agents from the viewpoint of non-verbal behavior. In previous studies we aimed to investigate the utility of embodied interface agents by applying novel evaluation methods. One study tracked bio-signals in order to evaluate the impact of affective agent behavior on the stress level of users [13]. In another study, users' eye movements were recorded to demonstrate the benefit of an embodied interface agent as a navigational guide [6]. Since the encouraging results of these two studies mostly relied on the use of non-verbal agent behaviors, including non-verbal means to express affect and empathy as well as deictic gestures, we answer the question posed in the title of the current paper in the affirmative: non-verbal agent behavior matters for effective human-computer interaction.
Lecture Notes in Computer Science, 2004
In this paper, we report on our efforts in developing affective character-based interfaces, i.e. interfaces that recognize and measure affective information of the user and address user affect by employing embodied characters. In particular, we describe the Empathic Companion, an animated interface agent that accompanies the user in the setting of a virtual job interview. This interface application takes physiological data (skin conductance and electromyography) of a user in real-time, interprets them as emotions, and addresses the user's affective states in the form of empathic feedback. We present preliminary results from an exploratory study that aims to evaluate the impact of the Empathic Companion by measuring users' skin conductance and heart rate.

1 Introduction

The idea of a computer 'sensing' the user's autonomic nervous system (ANS) activity is becoming increasingly popular in the human-computer interface community, partly because of the availability of affordable high-specification sensing technologies, and also due to the recent progress in interpreting users' physiological states as affective states or emotions [Picard, 1997]. The general vision is that if a user's emotion could be recognized by a computer, human-computer interaction would become more natural, enjoyable, and productive. The computer could offer help and assistance to a confused user or try to cheer up a frustrated user, and hence react in ways that are more appropriate than simply ignoring the user's affective state, as is the case with most current interfaces. Our particular interest concerns interfaces that employ embodied agents, or lifelike characters, as interaction partners of the user. By emulating multi-modal human-human communication and displaying social cues including (synthetic) speech, communicative gestures, and the expression of emotion, those characters may also trigger social reactions in users, and thus implement the "computers as social actors" metaphor [Reeves and Nass, 1998; Bickmore, 2003]. This type of 'social interface' has been demonstrated to enrich human-computer interaction in a wide variety of applications, including interactive presentations, training, and sales (see [Prendinger and Ishizuka, 2004] for a recent overview).
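For illustration, here is a minimal Python sketch of the kind of signal-to-emotion mapping the abstract describes, reading skin conductance as arousal and electromyography (EMG) as valence; the thresholds, labels, and function names are assumptions for illustration, not the paper's actual model.

```python
# Hypothetical sketch: skin conductance is read as arousal, EMG as
# valence, and the resulting quadrant is labeled with an emotion.
# Thresholds and emotion labels are illustrative assumptions.

def classify_emotion(skin_conductance: float, emg: float,
                     sc_baseline: float, emg_baseline: float) -> str:
    """Map baseline-relative physiological readings to an emotion label."""
    arousal_high = skin_conductance > sc_baseline * 1.2   # assumed threshold
    valence_negative = emg > emg_baseline * 1.5           # muscle tension ~ negative affect

    if arousal_high and valence_negative:
        return "frustrated"
    if arousal_high and not valence_negative:
        return "engaged"
    if not arousal_high and valence_negative:
        return "bored"
    return "relaxed"

# Example: elevated skin conductance with relaxed facial muscles
print(classify_emotion(6.2, 1.0, sc_baseline=5.0, emg_baseline=1.1))  # engaged
```

In a real-time setting such a rule would be applied to a sliding window of sensor readings rather than single samples, but the quadrant idea stays the same.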
Revista Informatica Economică, 2008
Until recently, the field of human-computer interaction was based on a traditional, cognitive approach that separated the study of usability from that of emotions. Recent research has shown that emotions play an important role in our lives, which has led to a focus on the need to study emotions in the domain of interaction design. This paper underlines the role of emotions as part of the interactive human-computer process, reaffirming the importance of nonverbal communication in this domain. The main issues of this paper concern aspects such as: the emotional design approach, the importance of nonverbal behavior as an instrument of usability evaluation, and the role of emotions in human-computer interaction.
Intelligent Virtual Agents, 2010
Since embodied agents are considered equally usable by all kinds of users, not much attention has been paid to the influence of users' attributes on the evaluation of agents in general and of their (nonverbal) behaviour in particular. Here, we present evidence from three empirical studies with the agent Max, which focus on the effects of participants' gender, age, and computer literacy. The results show that all three attributes influence the feelings of the participants during their interaction with Max, their evaluation of Max, and the participants' own nonverbal behavior.
Intelligent Virtual Agents, 2007
Against the background that recent studies on embodied conversational agents demonstrate the importance of their behavior, an experimental study is presented that assessed the effects of different nonverbal behaviors of an embodied conversational agent on users' experiences and evaluations as well as on their behavior. 50 participants conducted a conversation with different versions of the virtual agent Max, whose nonverbal communication was manipulated with regard to eyebrow movements and self-touching gestures. In a 2x2 between-subjects design, each behavior was varied in two levels: occurrence of the behavior versus absence of the behavior. Results show that self-touching gestures have positive effects on the experiences and evaluations of the user compared to no self-touching gestures, whereas eyebrow raising evoked less positive experiences and evaluations than no eyebrow raising. The nonverbal behavior of the participants was not affected by the agent's nonverbal behavior.
International Journal of Affective Engineering, 2013
The visual representations of intelligent agents have drawn increasing attention recently. A user's first impression of an agent is formed by its visual representation, and these man-made creatures' looks help determine users' willingness to interact with them. This research focuses on the appearances of embodied agents to investigate the affective interactions between users and agents. The study conducted one experiment on the affective influences of embodied conversational agents (ECAs) on users in learning tasks, employing a one-factor, three-level counterbalanced within-subjects design. It asks whether there are significant experiential differences between ECAs of different character classifications with respect to character preference, user engagement, and user-agent relationships. The main contributions of this study are summarised as follows: (1) character classifications can be related to human affective factors; and (2) there may be significant relations among character preference, engagement, and user-agent relationships.
2004
The ability of artificial characters to express emotions is essential for natural interaction with humans. The absence of emotional expressions could be interpreted as coldness towards the user. Artificial characters can have different embodiments; screen characters and robotic characters are currently among the most widely used. This study investigates the influence of a character's embodiment on how users perceive the character's emotional expressions.
ACM Transactions on Interactive Intelligent Systems, 2012
Complex and natural social interaction between artificial agents (computer generated or robotic) and humans necessitates the display of rich emotions in order to be believable, socially relevant and accepted, and to generate the natural emotional responses that humans show in the context of social interaction, such as engagement or empathy. Whereas some robots use faces to display (simplified) emotional expressions, for other robots such as Nao, body language is the best medium available given their inability to convey facial expressions. Displaying emotional body language that can be interpreted whilst interacting with the robot should significantly improve naturalness. This research investigates the creation of an Affect Space for the generation of emotional body language to be displayed by humanoid robots. To do so, three experiments investigating how emotional body language displayed by agents is interpreted were conducted. The first experiment compared the interpretation of emotional body language displayed by humans and agents. The results showed that emotional body language displayed by an agent or a human is interpreted in a similar way in terms of recognition. Following these results, emotional key poses were extracted from an actor's performances and implemented in a Nao robot. The interpretation of these key poses was validated in a second study where it was found that participants were better than chance at interpreting the key poses displayed. Finally, an Affect Space was generated by blending key poses and validated in a third study. Overall, these experiments confirmed that body language is an appropriate medium for robots to display emotions and suggest that an Affect Space for body expressions can be used to improve the expressiveness of humanoid robots.
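To make the blending idea concrete, the following is a hypothetical Python sketch of how emotional key poses stored as joint-angle vectors could be interpolated into intermediate body expressions; the joint names, poses, and weighting scheme are illustrative assumptions rather than the study's actual Affect Space.

```python
# Minimal sketch of pose blending: key poses are joint-angle vectors
# and new expressions are weighted interpolations between them.
# Joint names and angle values are invented for illustration.
from typing import Dict

Pose = Dict[str, float]  # joint name -> angle in degrees

KEY_POSES: Dict[str, Pose] = {
    "sadness": {"head_pitch": -25.0, "shoulder_roll": 10.0, "elbow_flex": 15.0},
    "pride":   {"head_pitch": 15.0,  "shoulder_roll": -20.0, "elbow_flex": 5.0},
}

def blend_poses(weights: Dict[str, float]) -> Pose:
    """Blend key poses by normalized weights into one intermediate pose."""
    total = sum(weights.values())
    joints = next(iter(KEY_POSES.values())).keys()
    return {
        j: sum(w * KEY_POSES[p][j] for p, w in weights.items()) / total
        for j in joints
    }

# A pose two-thirds of the way from sadness toward pride
print(blend_poses({"sadness": 1.0, "pride": 2.0}))
```

On a physical robot such as Nao, each blended angle would additionally be clamped to the joint's mechanical limits before being sent to the motors.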
In this paper, we motivate an approach to evaluating the utility of animated interface agents that is based on human eye movements rather than questionnaires. An eye tracker is employed to obtain quantitative evidence of a user's focus of attention. The salient feature of our evaluation strategy is that it allows us to measure important properties of the user's interaction experience on a moment-by-moment basis. We describe an empirical study in which we compare attending behavior of participants watching the presentation of an apartment by three types of media: an animated agent, a text box, and speech only. Users' eye movements may also shed light on their involvement in following a presentation.
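As a rough illustration of such a moment-by-moment measure, the sketch below bins hypothetical fixations by area of interest (AOI) and reports each AOI's share of total fixation time; the AOI labels and fixation format are assumptions, not the study's actual data pipeline.

```python
# Hedged sketch: summarize eye-tracker fixations as the share of
# viewing time spent on each area of interest (AOI).
from collections import defaultdict

# (aoi_label, fixation_duration_ms) pairs, as a tracker might emit them
fixations = [("agent", 420), ("apartment_view", 900),
             ("agent", 310), ("background", 150)]

def attention_shares(fixations):
    """Return each AOI's proportion of total fixation time."""
    totals = defaultdict(int)
    for aoi, duration in fixations:
        totals[aoi] += duration
    grand_total = sum(totals.values())
    return {aoi: t / grand_total for aoi, t in totals.items()}

print(attention_shares(fixations))
# e.g. agent ~0.41, apartment_view ~0.51, background ~0.08
```

Comparing these shares across the three media conditions (agent, text box, speech only) would yield the kind of quantitative attention evidence the abstract describes.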
2003
The aim of the experimental study described in this paper is to investigate the effect of an Embodied Conversational Agent (ECA) on the affective state of users. The agent expresses affect by verbal and nonverbal behaviors such as linguistic style and gestures. In this study, we focus on users' emotional state that is derived from physiological signals of the user. Our results suggest that an agent with appropriate verbal and nonverbal behaviors may decrease the intensity of users' negative emotions.
2005
Human-computer relationships are social and emotional in nature. We willingly (often unconsciously) interact with computers as though they were social entities, and as such our interactions with computers can have a strong impact upon our emotional states. This social-emotional interaction with computers forms the basis of my research interests and the work I have investigated to date.
Intelligent Virtual Agents, 2009
This study investigated whether the emotional expressions of an ECA influence participants' nonverbal and verbal behavior. 70 participants took part in a small-talk situation (10 min.) with the ECA MAX, who was presented with two different types of feedback: emotional feedback (EMO), which provided feedback about the emotional state of MAX (including smiles and compliments), and envelope feedback (ENV), which provided feedback about the comprehension of the participants' contributions. In general, we found that participants showed frequent behavior known from human-human communication, such as proactive greetings or waving goodbye. Additionally, with regard to some behaviors, the agent's behavior had an impact: participants in EMO gave significantly more compliments and exhibited more phrases of politeness ("Thank you!") than in ENV.
Journal of Pragmatics, 2010
In the majority of existing applications, users interact with embodied conversational agents (ECAs) via keyboard and mouse. However, while using a keyboard to communicate with an agent that talks and simulates human-like expressions is quite unnatural, speech-based user input is a more natural way to interact with a human-like interlocutor. In addition, spoken interaction is likely to become more common in the near future. Humans have been shown to align themselves in conversations by matching their nonverbal behavior and word use: they instinctively converge in the number of words used per turn and in the selection of terms belonging to 'social/affect' or 'cognitive' categories (Niederhoffer and Pennebaker, 2002). ECAs should be able to emulate this ability; understanding whether and how the interaction mode influences the user's behavior, and defining a method to recognize relevant aspects of this behavior, is therefore important in establishing how the ECA should adapt to the situation.
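As one concrete example of such a measure, the hypothetical Python sketch below checks whether two interlocutors converge in words per turn, one of the alignment dimensions mentioned above; the turn data and the convergence criterion (a shrinking absolute gap) are illustrative assumptions.

```python
# Simple sketch of lexical alignment as convergence in words per turn.

user_turns  = ["hello there", "I would like to know more about that topic",
               "yes please", "thanks a lot"]
agent_turns = ["hello, nice to meet you", "of course, here is an overview",
               "sure", "you are welcome"]

def words_per_turn(turns):
    return [len(t.split()) for t in turns]

def converging(user, agent):
    """True if the word-count gap narrows from the first to the last turn pair."""
    u, a = words_per_turn(user), words_per_turn(agent)
    gaps = [abs(x - y) for x, y in zip(u, a)]
    return gaps[-1] < gaps[0]

print(converging(user_turns, agent_turns))  # True: gaps shrink from 3 to 0
```

A fuller analysis along the lines of Niederhoffer and Pennebaker would also track convergence in word-category use (social/affect vs. cognitive terms), not just turn length.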
2005
Virtual agents are used to interact with humans in a myriad of applications. However, the agents often lack the believability necessary to maximize their effectiveness. These agents, or characters, lack personality and emotions, and therefore the capacity to emotionally connect and interact with the human. This deficiency prevents the viewer from identifying with the characters on a personal level. This research explores the possibility of automating the expression of a character's mental state through its body language.
2000
This paper addresses fundamental research problems concerning the implementation of nonverbal communication channels in so-called conversational interfaces. As a solution to these problems, a new methodology is introduced that makes possible the generation, systematic variation, and experimental evaluation of nonverbally interacting interface agents in different task contexts. A taxonomy of research designs is presented, and first applications of the new methodology are summarized and discussed.
Affective Computing, 2008
Human-computer intelligent interaction is an emerging field aimed at providing natural ways for humans to use computers as aids. It is argued that for a computer to be able to interact with humans, it needs to have the communication skills of humans. One of these skills is the affective aspect of communication, which is recognized to be a crucial part of human intelligence and has been argued to be more fundamental to human behaviour and success in social life than intellect (Vesterinen, 2001; Pantic, 2005). Embodied conversational agents, ECAs (Cassell et al., 2000), are graphical interfaces capable of using verbal and non-verbal modes of communication to interact with users in computer-based environments. These agents sometimes appear as just an animated talking face, possibly displaying simple facial expressions and, when using speech synthesis, some kind of lip synchronization; at other times they have a sophisticated 3D graphical representation, with complex body movements and facial expressions.

An important strand of emotion-related research in human-computer interaction is the simulation of emotional expressions made by embodied computer agents (Creed & Beale, 2005). The basic requirement for a computer to express emotions is to have channels of communication such as voice and image, and an ability to communicate affect over those channels. Therefore, interface designers often emulate multimodal human-human communication by including emotional expressions and statements in their interfaces through the use of textual content, speech (synthetic and recorded), and synthetic facial expressions, making the agents truly "social actors" (Reeves & Nass, 1996). Several studies have illustrated that our ability to recognise the emotional facial expressions of embodied computer agents is very similar to our ability to identify human facial expressions (Bartneck, 2001). Regarding an agent's voice, experiments have demonstrated that subjects can recognize the emotional expressions of an agent (Creed & Beale, 2006) whose voice varies widely in pitch, tempo, and loudness and whose facial expressions match the emotion it is expressing. But what about the impact of these social actors? Recent research focuses on the psychological impact of affective agents endowed with the ability to behave empathically with the user.
Speech Communication, 2009
Our aim is to create an affective Embodied Conversational Agent (ECA), that is, an ECA able to display communicative and emotional signals. Nonverbal communication is conveyed through certain facial expressions, gesture shapes, gaze direction, etc. But it can also carry a qualitative aspect through behavior expressivity: how a facial expression or gesture is executed. In this paper we describe some of the work we have conducted on behavior expressivity, more particularly on gesture expressivity. We have developed a model of behavior expressivity using a set of six parameters that act as modulations of behavior animation. Expressivity may act at different levels of behavior: on a particular phase of the behavior, on the whole behavior, and on a sequence of behaviors. When applied at these different levels, expressivity may convey different functions.
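To illustrate how such parameters might modulate an animation, here is a hedged Python sketch that scales a gesture's amplitude, duration, and stroke accent; the parameter names (spatial_extent, temporal_extent, power) and the keyframe format are assumptions, and the actual six-parameter model is considerably richer than this.

```python
# Hypothetical sketch of expressivity parameters modulating a gesture,
# in the spirit of the six-parameter model the abstract describes.
from dataclasses import dataclass

@dataclass
class Keyframe:
    time: float       # seconds from gesture start
    amplitude: float  # normalized displacement from rest pose

def apply_expressivity(frames, spatial_extent=1.0, temporal_extent=1.0,
                       power=1.0):
    """Scale a gesture's amplitude, duration, and stroke sharpness."""
    return [
        Keyframe(
            time=f.time * temporal_extent,           # slower/faster execution
            amplitude=f.amplitude * spatial_extent   # wider/narrower movement
            * (power if f.amplitude > 0.8 else 1.0)  # accent the stroke peak
        )
        for f in frames
    ]

beat = [Keyframe(0.0, 0.0), Keyframe(0.3, 1.0), Keyframe(0.6, 0.2)]
# An energetic variant: larger, quicker, with a sharper stroke
print(apply_expressivity(beat, spatial_extent=1.4, temporal_extent=0.8, power=1.2))
```

Applying the same scaling per phase, per behavior, or across a sequence of behaviors corresponds to the three levels of modulation the abstract distinguishes.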
2009
Embodied agents have received a great deal of interest in recent years. They are often equipped with the ability to express emotion, but without an understanding of the impact this can have on the user. Given the number of research studies that are utilising agent technology with affective capabilities, now is an important time to review the influence of synthetic agent emotion on user attitudes, perceptions, and behaviour.