
Bartlett School of Architecture, UCL

@heyhexx The Emotive Bot – Towards Emotionally Expressive Robots

Computer-mediated communication lacks the deep meaning carried by non-verbal cues. My question is: what techniques can be used to design a non-humanoid agent whose goal is to represent and project a user's emotion to others in behavioural language during online communication? @heyhexx is an interactive robot puppet theatre installation. It lives on Twitter, and its aim is to convert the abstract concept of emotion into physical behaviour. Hexx (the main character) and the environment it lives in display physical behaviour in real time in response to the emotion within the text a user sends on Twitter. I designed the behaviour based on narratives, the rules for creating the illusion of life in animated characters, the rules of puppeteering non-humanoid abstract robots, and the principles of affective objects. In addition, I implemented the hardware and simulated the behaviour in real time, and the physical design was evaluated. Based on my findings, with the techniques I used, people found Hexx the robot and its environment expressive and tended to interact with them more. However, the time used to highlight the expressed emotion had to be considered carefully, to give observers the chance to take in a character they were encountering for the first time. Moreover, the design of texture, shape, light, and subtle movement is essential to the way abstract objects express emotions.

1. Introduction

In the current era, in which online communication is extremely prevalent, humans depend on it more than ever as a medium for connecting with others halfway around the world. Online communication, which is mostly computer-mediated communication (CMC), is very convenient for connecting with people beyond the boundaries of time and location; its efficiency, however, is questionable (Tettegah, Noble, 2016). We miss the layers of conversation in which emotions are embedded, and we lose the perceptual depth we find in face-to-face communication. Even video chat is not considered a full alternative to face-to-face communication (Nguyen, 2008). Moreover, we communicate in different languages, and even through translation our text loses conversational depth such as sarcasm and humour. Studies have shown the disruptions CMC causes due to its lack of non-verbal cues (Kiesler, Siegel, & McGuire, 1984; Walther, 1992; Walther, Slovacek, & Tidwell, 2001).

Nonverbal cues offer valuable information such as tone and context (Loglia, Bowers, 2016). They help us differentiate a happy thanks from a stiff, sarcastic one. We can perceive meanings beyond whatever is said to us through speech or text. We use facial expressions, tone of voice, hand gestures, body language, and other social cues to understand the deeper meaning of F2F (face-to-face) communication. The main function of nonverbal language is to convey emotion from person to person. Moreover, according to Wei (2012), nonverbal cues serve to "emphasize, contradict, substitute, or regulate verbal communication". We seek out these nonverbal cues for extra emotional information in a conversation. It is obvious how we do this in F2F communication; facial expressions are the prime example. But how do we seek out this information in computer-mediated conversation?

In Social Information Processing theory, Walther (1992, 1996) suggests that people try to find substitutes for non-verbal cues in CMC through alternative cues. For the last decade, emoticons have been filling the gap left by nonverbal cues in online communication by simulating nonverbal emotional behaviours (Loglia, Bowers, 2016).

Lo (2008), drawing on channel expansion theory, suggests how people come to learn to use emoticons in their online conversations: after a while, people come to recognize the nonverbal cues that emoticons carry within a CMC interface. So it is not surprising that emoticons have found their way into our daily online communication. They enrich our online conversations with nonverbal cues, and they have become part of both Western and Eastern online cultures (Katsuno & Yano, 2007). In recent years, GIFs, memes, and stickers have joined them as animated emotion communicators.

Research Question:

But these existing tactics for making the abstract concept of emotion perceivable to both sides of a conversation do not represent the sender's self and character. Throughout this thesis, I take one step further and search for ways to make agents that can visualize and communicate human emotions by analyzing the input fed to them. My question is: what techniques can be used to design a non-humanoid agent whose goal is to represent and project a user's emotion to others in behavioural language during online communication?

This thesis focuses on building a character with emotions and a narrative, whose behaviour can be manipulated and changed by input from the people interacting with it. The goal of this character (Hexx the robot) is to represent abstract feelings and emotions through a tangible behavioural language that the majority of people can understand and perceive. In the first chapter, I go through the related research areas in emotive and empathic agents, spanning animation, data visualization, social robots, abstract robots in social encounters, et cetera, and derive the methods relevant to the project. I start by introducing the @heyhexx project and raising the questions and problems relevant to the thesis question; each question is then answered through the relevant research or methodology. The journey starts with the current methods of communicating emotion: emoticons, and a plug-in that uses data analysis to make a participant in a conversation aware of their emotional effect on the other person. Then I look at the role of emotion in believable agents; what creates the illusion of life in robots and animated characters; how people perceive an abstract object through subtle changes in behaviour and movement; the role of affective computing in human-robot interaction; techniques for designing affective objects; the role of narrative in self-representing animations; and finally, how to puppeteer robots.

The third chapter is dedicated to the methodology we used in the @heyhexx project to reach our goal of a physical agent representing the abstract meaning of emotion. It demonstrates the technical problems we encountered and their solutions, the design aspects of the project and the process we went through, and the implementation of behaviour in a character. The improvements made over the project's iterations, and the future approach to the problems we have yet to solve, are also shown. The system was put to the test with participants; the evaluation method and its outcomes are presented in the fourth chapter. In the last chapter, I focus on the discussion, limitations, and conclusion.

 

 

2. Communicating emotion (background and research)

"I'm done with smart machines, I want a machine that's attentive to my needs. Where are the sensitive machines?"

– Tweet by dig_nat, RT @tigoe via @ramonapringle (Sherry Turkle, 2010).

What makes humans want more than technology from technology? Why do we want sensitive machines we can relate to? New technologies not only bring humans together from halfway around the world; they can also pull them apart. Computer-mediated communication (CMC) is highly efficient at connecting us directly via text, chat, or even video conference, but it strips away the deep quality that lies within face-to-face (FTF) communication. Face to face, we can see facial expressions, tone, and context; we can witness body language and even touch the other person in the conversation. We interpret these perceptions in our minds, process them, and reply through all of these signs, including verbal language. These deep signals are embedded with emotion. And not only online: some people have difficulty expressing their emotions even in FTF communication, whether because of culture, language, mental disorder, et cetera. Virtual agents, emoticons, and social robots step in to reveal emotion for humans when they cannot communicate it themselves.

 

2.1. Introducing @heyhexx

@heyhexx is a project that sets out to visualize the abstract meaning of the emotions lying within the text people post on social media. It is an interactive puppetry theatre piece: a cross-section of theatre, puppetry, interactive art, data science, and emotion analysis. To investigate the possibility of projecting the abstract, deep meaning of emotion into behavioural language through a puppet made of cardboard and the paper-crafted world around it, we animated and translated words into behaviour and movement.

Since emotion in machines is modeled differently than in humans, and we can consider robots another species (Hoffman, Ju, 2014), a series of design strategies was developed based on studies of affective objects, the rules of puppetry, and existing interaction systems. We decided Hexx the robot (the main character of the piece) should be simple in design and mechanism: although a limited mechanism limits the range of movements, it broadens the possibilities of simple movements and their deep influence on the audience (Hoffman, Ju, 2014). The simple design of Hexx, a non-humanoid robot with semi-humanoid features, and the expressive papercraft environment follow from the fact that non-humanoid robots are mechanically simpler and present less design difficulty, since they need not resemble the human mechanism; they have fewer degrees of freedom, are more reliable, and are therefore easier to design, control, and manufacture. In interaction, they do not raise expectations of realism or human-like behaviour; they live in a world independent of the rules of physics, with the free life that exists only in fairy tales. On the other hand, the communication modalities available to non-humanoid robots are limited, while most human social interaction happens through body language. Hence, the viewers of this piece must be able to interpret non-verbal emotional cues in the non-humanoid and semi-humanoid robots that inhabit it.

The interaction system works as follows; each stage is fully demonstrated in the following sections:

Input:

The input of this interactive system is text. Users can interact with Hexx the robot through their Twitter account; the input ranges from the collective conversations of people and politicians to personal tweets addressed to Hexx.

Emotion Analysis:

The text goes through an API linking the Twitter account to an online AI service that uses natural language processing. This service returns the percentage of each of the 5 core emotions lying within the text; the core emotions are based on the model of human emotion proposed by Plutchik (2001), and the result is processed against the same model. If the strongest emotion scores more than fifty percent, it becomes the resultant emotion. Otherwise, if two dominant core emotions have comparable intensities, the final emotion is the combination of those two.
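
As a rough sketch of this decision step (the fifty-percent threshold follows the text; the ten-point margin for "comparable intensities" and the plain "joy+sadness"-style label for a combined emotion are my assumptions, not the project's actual code), the logic might look like:

```python
# Hedged sketch of the emotion-resolution step. The 50% dominance
# threshold comes from the text; the 10-point closeness margin and the
# "+"-joined label for a combined emotion are illustrative assumptions.
def resolve_emotion(scores: dict, dominance: float = 50.0,
                    closeness: float = 10.0) -> str:
    """scores: core-emotion name -> percentage, e.g. {"joy": 62.0, ...}"""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    (top, top_pct), (second, second_pct) = ranked[0], ranked[1]
    if top_pct > dominance:                 # one clearly dominant emotion
        return top
    if top_pct - second_pct <= closeness:   # two comparably strong emotions
        return f"{top}+{second}"            # combined (dyadic) emotion
    return top

print(resolve_emotion({"joy": 62.0, "sadness": 12.0, "anger": 8.0,
                       "fear": 10.0, "disgust": 8.0}))    # -> joy
print(resolve_emotion({"joy": 34.0, "sadness": 30.0, "anger": 16.0,
                       "fear": 12.0, "disgust": 8.0}))    # -> joy+sadness
```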

Translating the emotion into behaviour:

This section is the heart of this thesis, as it answers the research question I proposed. Here I pose the questions whose answers lead to methods for implementing behaviours in Hexx the robot and the other elements, described fully in the design-process chapter. To translate human emotion into robotic behaviour, we first have to understand the model of emotion in humans. What causes an emotion? How do humans express their emotions? And how can machines and abstract, non-humanoid objects express emotions in a way that is acceptable to humans? I demonstrate answers to these questions through puppetry for social robots, digital animation, and the rules of designing affective objects.

Output:

The output of the process is a 10-second video of the piece (the set and Hexx the robot), sent back on Twitter to the person who originally tweeted @heyhexx. The detected emotion drives behaviour in Hexx and in Hexx's reactive world, which we designed to display Hexx's emotions within a context by giving Hexx a narrative and an environment to live in. It is worth mentioning that a robotic arm puppeteers Hexx's behaviour, and that the environment is a semi-circular set of papercraft origami and folded paper animated by servos and motors. In the following chapters, I demonstrate the technical obstacles and solutions for this stage, along with the future approach.

2.2. Online communication

In online communication, emoticons have played the role of expressing emotion in conversation, and they have been described by many researchers over the last decade. Rezabek and Cochenour (1998) defined emoticons as "visual cues formed from ordinary typographical symbols that when read sideways represent feelings or emotions". Walther and D'Addario (2001) described them as "graphic representations of facial expressions". Although emoticons change and adapt with culture, and the people of each country can have their own emoticon vocabulary, they are still known as the facial expressions of CMC. Summing up, Jibril and Abdullah (2013) concluded that emoticons fill the gap left by non-verbal social cues in CMC: emoticons are representations of non-verbal cues, each one mimicking a facial expression to visualize emotion.

This research shows that emoticons communicate human emotion by mimicking or imitating facial expressions; "a face expressing a particular emotion" and "graphic representation of facial expressions" attest to this. But what if the agent communicating the emotion is an abstract or non-humanoid object, in our case Hexx the robot and its animated environment? The following examples speak to this question:

US+ (McCarthy, 2013), the brainchild of a collaboration between Lauren McCarthy and Kyle McDonald, is a plug-in for Google Hangouts, a video chat service. This project was one of our inspirations because of its similarity to our system. The plug-in shows two emoticons in different colours that indicate the emotion of the two participants in the conversation, along with a bar chart tracking a series of qualities for both participants: positivity, self-absorption, femininity, aggression, honesty. The system uses audio, facial recognition, and linguistic analysis to analyse emotion and optimize the conversation. As the conversation starts, the emoticons are neutral and the emotion bars are grey; as soon as a participant starts talking, the system begins analysing, visualizing the emotion, and making the participants aware of their emotional role in the conversation. The plug-in also has an auto-mute option that activates when a participant talks too much.

This project inspired us in terms of its input, its successful analysis, and the way it makes users aware of their own emotional state, even though its emotion visualization is limited to sliders and smiley faces.

 

2.3. Digital animations

Oz

In a study by Bates et al. (1994), the Oz Project investigated artificial intelligence and animated non-humanoid agents to establish the possibility of making interactive environments. The main characters of this world were called Woggles: jelly-like spheres driven by personality, reactivity, goal-directed behaviour, and emotion. The four characters were distinguished by colour and size, and one of them was controlled by the user. The Woggles' emotions arose from their interactions with the other Woggles and the user: goal success, failure, prospective failure, and the decisions other Woggles made gave rise to happiness, sadness, fear, gratitude, and anger in varying intensities (Studio for Creative Inquiry, 2012). Bates drew inspiration from the OCC model of emotion (Ortony, Clore & Collins, 1988): goal-driven agents whose emotions arise from sequences and repetitions of events, with intensity based on the desirability of each event.

Bates et al. identify three requirements for believable agents:

1- The emotional state of the character must be clearly defined: the designer must know the character's emotional state at every moment and demonstrate it well in the model, so that the user is able to attribute the emotional state.

2- The character's thinking process must reveal the emotion: this work drew on the 12 principles of animation by Thomas and Johnston (1981), so that the viewer sees the emotions in the character while it thinks, its thinking being influenced and defined by its emotional state.

3- Highlight the emotion: the designer must take the time to establish the emotion in the character. Emotional states are not usually grasped immediately by the viewer, so the character should rely on various mechanisms and movements to convey its emotional state.

These three rules of believable agents derive from the original study of animation at Walt Disney (Thomas, Johnston, 1981). The 12 principles of animation consist of:

1- Squash and Stretch

2- Anticipation

3- Staging

4- Straight ahead action and pose to pose

5- Follow through and overlapping action

6- Slow in and slow out

7- Arcs

8- Secondary action

9- Timing

10- Exaggeration

11- Solid drawing

12- Appeal

We should not forget that these principles were written for flat, drawn animation, not physical real-time interactive pieces. The issue with real-time machinery is the real physics it has to face: in flat, 2-dimensional animation, physics and nature are simulated, whereas in a real-time machine these forces genuinely affect the robot's movement and make soft, humanoid motion harder to achieve. The study of making smooth movements in robots is beyond the scope of this research. Hence, the first, fourth, fifth, sixth, seventh, eleventh, and twelfth principles drop out of the list of rules used in this project, leaving anticipation, staging, secondary action, timing, and exaggeration.

Trip To The Moon

Another case study in emotive animation is the recent work "A Trip to the Moon" by Peng et al. (2018): a self-reflective animation, created in Unity (software for modeling, animation, physical simulation, and virtual reality), in which a dog is the main character with a specific goal (reaching the moon), generated from the self-reported mood and behaviour data collected from each user.

This research emphasizes the power of storytelling and narrative in interpreting self-tracking data. It, together with related HCI research, proposes principles for communicating affective states in animation as "illusions of life" that bring lifelike qualities to animated characters. According to this research, the core character should reflect intelligence, emotion, and personality, and narrative plays a big role in conveying these qualities: the agent has a specific goal, overcomes obstacles, and achieves the goal at the end of its journey. The user's mental health during the week shapes the personality, the emotions, and the obstacles the character encounters within the narrative. The study concludes that personalized animated videos are more emotionally engaging when based on a solid, powerful narrative. Another reference is Pixar's animation Inside Out, in which each emotion is represented as a character embodied within a person (Pixar, 2015).

Humans instinctively ascribe life to moving things and relate them to living creatures. How strongly an animated object provokes a feeling of aliveness depends on the extremity of the behaviour, the stimuli that appear to cause it, and the emotional context of the person encountering the object (Heider, Simmel, 1944). It is worth mentioning their psychological experiment, in which a simple animation of geometric objects with simple behaviours was shown to people: in most cases, people interpreted the animation's characters (a big triangle, a small triangle, a small circle, and a rectangle) as living characters with personalities, and only one person described the animation entirely in geometrical terms. This study shows that no matter how abstract or geometric the object, as long as behaviour, and subtle changes in behaviour, are implemented in the agent, it can be perceived as a living creature, and observers will still attribute life, emotion, personality, and character to it.

 

2.4. Affective Objects

Even the subtlest movements can provoke a variety of feelings in humans. Affective objects are objects designed to provoke particular emotions in an observer or interactant; affective design strategies can be seen in performance, light design, theatre, and painting. Scheirer and Picard (1999) proposed a 2D diagram for designing affective objects and behaviour, grounding this field of study. Their model builds on the dimensional model of emotion used by Schlosberg and by Lang et al., in which the horizontal axis represents valence and the vertical axis arousal. The authors do not include time in this model, and most of the examples they cite are long-duration performance pieces, or artworks such as paintings, for which duration is not a factor. In the Hexx installation, however, the video captured and sent to Twitter is limited to 10 seconds, so the affective space model the project needs is a 3D model that includes time. I also drew on Russell's Circumplex Model of Affect (Russell, 1980), which is grounded in people's experiences and sequences of events, and whose levels may not be the same for everyone; it offers a good comparison between the design of affective objects and the emotion actually provoked in viewers. The resulting diagram of Abstract Expressive Space informed stage 3 of the design, translating the emotion into behaviour, and helped me choreograph, and better understand, movement, colour, speed, and light intensity when designing affective objects.
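
As an illustrative sketch only (none of these names or coefficients come from the thesis), a point in that extended, time-aware affective space, and a coarse mapping from it to motion qualities, could be modelled as:

```python
# Illustrative sketch of the extended affective space: a valence/arousal
# point plus a time budget, mapped to rough movement parameters.
# All field names and mapping coefficients are assumptions.
from dataclasses import dataclass

@dataclass
class AffectivePoint:
    valence: float   # -1.0 (negative) .. 1.0 (positive)
    arousal: float   #  0.0 (calm)     .. 1.0 (excited)
    duration: float  # seconds available to express the emotion

def movement_params(p: AffectivePoint) -> dict:
    """Map an affective point to coarse motion qualities."""
    return {
        "speed": 0.2 + 0.8 * p.arousal,                # excited -> faster
        "amplitude": 0.3 + 0.7 * abs(p.valence),       # stronger -> bigger
        "smoothness": 1.0 if p.valence >= 0 else 0.4,  # negative -> jerkier
        "repetitions": max(1, int(p.duration // 2)),   # fit the 10 s clip
    }

print(movement_params(AffectivePoint(valence=0.8, arousal=0.9, duration=10)))
```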

 

 

2.5. Social Robots

“Robots could be thought of as one of those ‘other species’, not trying to copy what we do but interacting with us with their own language, tapping into our own instincts.”

– Guy Hoffman, retrieved from the online Dezeen magazine (Aouf, 2018)

Thinking of robots as another species opens up the possibility of implementing behaviours in them not by imitating human instinct, but by designing behaviours inspired by the expressive behaviours of other kinds of creatures. Guy Hoffman has been working on abstract objects in social encounters, as human companions and co-workers, for more than a decade. By studying his work, we can learn how to animate non-humanoid robots for social encounters and how to express emotion in ways humans can perceive.

Bashan et al. (2018) presented a work called the Greeting Machine, an abstract non-humanoid robot for opening social encounters. They designed and generated gestures and evaluated the robot in an opening social encounter; its success lies in the fact that, when put to the test, people found the robot welcoming.

Hoffman regards robots as another species that humans should try to understand, and suggests that robots should not mimic or imitate human gestures, social cues, and behaviours (Hoffman, 2014). They possess their own language and should interact with humans through social cues of their own that humans can perceive. Hu et al. (2018) recently designed a social robot that mimics animals' physical reactions to emotion. The robot has an expressive skin made of multiple sheets with different patterns; when different emotions are triggered, different patterns are actuated, creating varied textures to match the emotion. The work was inspired by animals' bodily responses to emotion, such as goosebumps and spikes (Aouf, 2018).

 

 

3. Design Process

3.1. Structure

The design process of @heyhexx, an interactive puppetry theatre physically visualizing the emotion within text on Twitter, was inspired by the work of Lauren McCarthy (US+). We also drew inspiration from the studies by Heider and Simmel and from physical character animation by Bashan et al. (2018), Peng et al. (2018), and Bates et al. (1994), especially in reducing emotional expression to non-humanoid objects (the set) and semi-humanoid objects (Hexx the robot). From a variety of geometric shapes, we chose a cubic structure for the main puppet character; I focused on building a character of simple geometry with the potential for expressiveness. Through a series of initial sketches and prototypes, Hexx turned out to be a cardboard cube with two ping-pong balls as eyes and two smaller cubic boxes as feet, attached to the main body by two strings. After several prototypes shown at work-in-progress shows, the mechanical structure evolved from being fully servo-driven (micro servos for the eyes, one for the feet, and two for the tilt motion of the body) to having a robot arm (UR10) control the body motion (jumping, walking, gazing, et cetera). Two further design changes, adding two servos to each foot for a tapping movement, are not implemented in the physical structure and are the aim of future tests. Beyond that, Hexx itself went through few design changes, because the team intended to keep the character as simple as possible, with subtle movements in the eyes and feet, inspired by Bashan et al.'s (2018) demonstration that simple geometry can yield varied expression.

The environment of Hexx was inspired by the metaphor that Hexx is a human being and the world around it is only an illusion: a projection, through the body, of its inner mind's conceptions, its perception of the world, and its emotions. I studied the dynamic geometries of origami tessellations by Eric Gjerde (2018) and folding structures. The principles of movement, affective design (Scheirer, Picard, 1999), and narrative in self-reflection (Peng et al., 2018) shed light on the design of Hexx's interactive environment. First, I made dynamic origami and folded-paper pieces and studied their dynamics, looking for dynamic modules from which I could generate the elements of the environment. My first prototype was a study of geometry based on origami tessellation.

 

The set structure was a circular stage 80 centimeters high with a radius of 240 centimeters. We chose a circular set for simple physical transitions between emotions, a sweet spot for the camera's perspective, and a workable position for the robot arm within the set. For the 5 core emotions and one neutral (indifferent) state, the set was divided into 6 locations, with 60 degrees allocated to each.
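
As a small sketch of the resulting stage geometry (the ordering of emotions around the circle is my assumption, not documented in the thesis), each emotion maps to the centre of its 60-degree sector:

```python
# Sketch of mapping each emotion to its 60-degree sector on the circular
# set. The order of LOCATIONS around the stage is an assumption.
LOCATIONS = ["joy", "sadness", "anger", "fear", "disgust", "neutral"]

def sector_centre(emotion: str, sector_deg: float = 60.0) -> float:
    """Centre angle, in degrees, of the emotion's sector on the stage."""
    return LOCATIONS.index(emotion) * sector_deg + sector_deg / 2

for e in LOCATIONS:
    print(f"{e:8s} -> {sector_centre(e):5.1f} degrees")
```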

3.2. Behaviour Design

Designing behaviours for Hexx and the environment as a physical translation of emotion drew on research into emotional systems in machines and agents: the OCC model (1988), the principles of animation (1981), and the Oz Project (1994), as well as the self-reflective animation by Peng et al. (2018), a study of an expressive personal animation projecting the mental health of the self. Storyboarding, and analyzing the emotion in constructed situations, were the first steps toward translating emotion into behaviour; I started by storyboarding the five core emotions into behaviours. The OCC model helped us build a goal-driven agent whose emotions arise from sequences of events, and I built up events and short narratives to make the behaviours more believable across time and events. Because of the limited time available to convey an emotion to the viewer and highlight it, the designed events had to be exaggerated, convincing, and fast enough to be captured in the 10-second video uploaded to Twitter. The starting point was asking what events and situations would provoke feelings in Hexx, since emotion is recognized better when the situation and scenario are fully provided (Peng et al., 2018).

This chart led to initially designing separate behaviours for Hexx and storyboards for the environment. The video stills of Hexx for each emotion show the initial behaviour tests.

For the environment, the first task was initial storyboarding of each scenario to build the atmosphere of each scene.

The primary sketches of the environment show that I intended to build an abstract origami world, both to avoid the uncanny valley (the unsettling feeling observers experience when an artificial environment or agent resembles the human-like closely, but not quite) and to keep the freedom of abstract-object behaviour. As we went further, we found it harder and harder to make the whole world abstract and intuitive. Ad hoc decisions led to a polygonal (papercraft) world resembling the real world of humans, animated by mechanisms to resemble a living world in which everything is empathic and expressive.

In the last stage of designing the environment's behaviour, I generated detailed storyboards for every moment of each 10-second emotion-expression scene. The final set is the result of this storyboarding and set design.

3.3. Robot Arm as Puppet Master

The second iteration of Hexx's behaviour introduced a robot arm (UR10) as puppet master. Hexx's body was attached to the robot arm's flange by wires, and the arm drove the tilt motion, jumping, walking, swinging, gazing, et cetera; the mechanisms for the eyes and feet remained as they were before. Universal Robots arms have 6 joints (base, shoulder, elbow, wrist 1, wrist 2, wrist 3), each with a 360-degree range of movement, which makes 3-dimensional movement easy to program. Using a robot arm as puppet master let us generate more complicated behaviours for Hexx: before this iteration, Hexx was static, unable to walk like a creature, jump, lie down, fly, or gaze. The arm gave us movement in 3-dimensional space and broadened Hexx's capacity to interact physically with the elements of the environment.
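
As a rough illustration of this kind of joint-space control (the project itself drove the arm from a Grasshopper definition, as described in section 3.4), a UR-style arm accepts URScript strings over its controller's TCP interface; the address, the poses, and the small "bow" gesture below are assumptions for the sketch, not the project's actual values:

```python
# Hedged sketch: streaming a joint-space move to a UR-style arm over the
# controller's URScript socket interface. Address, poses, speeds, and
# the "bow" gesture are illustrative, not the project's settings.
import math
import socket
import time

UR_HOST = "192.168.0.10"   # assumed controller address
UR_PORT = 30002            # UR "secondary" interface accepts URScript text

def movej(sock, joints_deg, a=1.4, v=1.05):
    """Send one joint-space move; joints are base..wrist3, in degrees."""
    joints_rad = ", ".join(f"{math.radians(j):.4f}" for j in joints_deg)
    sock.sendall(f"movej([{joints_rad}], a={a}, v={v})\n".encode("utf-8"))

with socket.create_connection((UR_HOST, UR_PORT)) as s:
    movej(s, [0, -90, 90, -90, -90, 0])    # neutral "home" pose
    time.sleep(3)                          # crude wait for the move to end
    movej(s, [0, -70, 110, -130, -90, 0])  # lean forward: a small bow
```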

 

One disadvantage that made the robot arm's behaviours look mechanical was the transition from one target to the next. When the UR10 is programmed from software other than its own teach panel, it cannot blend the transitions between targets, and its speed along a single path is constant. Movement, in humans and other species alike, never happens at constant speed: according to the principles of animation, actions have a beginning and an end (Thomas and Johnston, 1981), and the transition between them goes from slow to fast and back to slow, with the most detail both at the beginning and close to the end.
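
A minimal sketch of restoring that slow-in, slow-out quality, assuming we may generate our own intermediate waypoints between two joint targets: sampling the pose along a smoothstep curve makes the motion start and end gently instead of running at constant speed. The joint values are illustrative.

```python
# Sketch of "slow in, slow out": instead of one constant-speed move,
# sample poses along a smoothstep curve so motion eases in and out.
def smoothstep(t: float) -> float:
    """0 -> 0 and 1 -> 1, with zero slope at both ends."""
    return t * t * (3.0 - 2.0 * t)

def eased_waypoints(start, end, steps=20):
    """Yield joint vectors interpolated from start to end with easing."""
    for i in range(steps + 1):
        s = smoothstep(i / steps)
        yield [a + (b - a) * s for a, b in zip(start, end)]

for pose in eased_waypoints([0, -90, 90, -90, -90, 0],
                            [0, -70, 110, -130, -90, 0], steps=5):
    print([round(j, 1) for j in pose])
```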

3.4. Technical and Software Implementation

Hexx's mechanisms (eyes and feet) and the environment together used 32 servos: 4 directional servos and 28 standard servos. We controlled them with an SSC-32U, a 32-channel servo driver board, over serial communication from Processing (an open-source coding platform, www.processing.org). The behaviour system was triggered via an OSC message from JavaScript (www.javascript.com) carrying the emotion and the intensity of the emotion; the system then selected the behaviour rated for minimum, average, or maximum intensity, and the emotion triggered the set location specified for visualizing it. At the same time, a message containing the emotion was sent to the Grasshopper file controlling the robot arm.
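
A minimal sketch of this trigger-to-servo bridge, with Python (python-osc and pyserial) standing in for the project's Processing and JavaScript code: an OSC handler receives the emotion and intensity, then writes SSC-32U-style commands (`#<channel>P<pulse>T<milliseconds>`) over serial. The serial port, channel numbers, pulse widths, and OSC address are all assumptions.

```python
# Hedged sketch of the trigger-to-servo bridge described above, using
# python-osc and pyserial in place of the project's Processing/JavaScript.
# Serial port, channels, pulse widths, and the OSC address are assumed.
import serial
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

board = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1)

BEHAVIOURS = {  # emotion -> [(servo channel, pulse width in microseconds)]
    "joy":     [(0, 2000), (1, 2000)],   # e.g. eyes up, feet swing
    "sadness": [(0, 1000), (1, 1200)],
}

def on_emotion(address, emotion, intensity):
    """Handle an OSC message like: /emotion "joy" 0.8"""
    ms = int(500 + 1500 * (1.0 - float(intensity)))  # stronger -> faster
    for channel, pulse in BEHAVIOURS.get(emotion, []):
        # SSC-32U move command format: "#<ch>P<pulse>T<time>\r"
        board.write(f"#{channel}P{pulse}T{ms}\r".encode("ascii"))

dispatcher = Dispatcher()
dispatcher.map("/emotion", on_emotion)
BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()
```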

 

 

 

4. Evaluation

One of the challenges in designing an emotionally expressive agent was how to evaluate the system and how people would actually react to the behaviours generated by @heyhexx. We put the system to the test for 5 days at the Ars Electronica exhibition, presenting the piece as a physical interactive puppetry installation in front of an audience who could tweet at it, watch the piece being triggered and the behaviours generated, and finally receive the recorded video via Twitter. We placed the setup in a dim space lit with projectors to make the scene more dramatic and to draw attention to Hexx the robot and the environment. Over 100 people interacted with @heyhexx, but only 42 of them filled out the evaluation questionnaire. We also had open discussions about how the audience felt about the character and the environment; people showed more interest in talking and having open discussions than in filling out the form.

 

4.1. Method

Participants: 40 participants agreed to fill out the questionnaire. Since the evaluation form was anonymous, consent was not a concern. I did not categorize participants by gender or age, since the project is intended to be self-representing on Twitter for any Twitter user. The questionnaire was conducted in English.

Procedure: the experiment was conducted at an exhibition visited by a wide variety of people. Visitors were first told about the piece, the system, and how to interact with it. We then asked them to tweet @heyhexx from their own Twitter accounts; since Twitter is not as widely used as some other social media applications, one of our limitations, we let some of the audience use our own account. The only conditions for a tweet were that it be in English and longer than 50 characters. After the tweet, the robot arm, the set, and Hexx were triggered into the behaviour set for the detected emotion; meanwhile a video was captured from the piece, and 2 to 3 minutes later the participant could see the video on the @heyhexx Twitter page. It is worth mentioning that the video carried a caption with the resultant emotion, so that observers could see what emotion their text had been carrying.

For the stability of the system, we had to reduce the variety of behaviours to one behaviour per emotion. The intensity factor was omitted because it had proved neither very accurate nor visible to the observers we tested with before the exhibition.

After the interaction, we asked participants to fill out the anonymous evaluation form. They were asked whether Hexx's behaviour was emotionally expressive; whether the environment was expressive; whether the interaction between Hexx and the environment contributed to the expressiveness; whether the location and narrative suited the emotion being expressed; whether the videos were acceptable in length and quality, and whether they found other participants' videos expressive and efficient; and finally, whether they would tend to send more tweets to @heyhexx in the future. An open-ended question also let them say anything they wished about the piece and about Hexx, the main character.

4.2. Findings

42 questionnaires were collected and analyzed, leading to the following categories:

Expressiveness:

Efficiency in the design of the environment:

Tendency to make new interactions:

 

4.3. Discussion

Virality:

 

Analyzing the evaluation forms, I came to the conclusion that, like emoticons, emojis, and GIFs, @heyhexx could be a good emotion communicator on Twitter. The positive results on the design of the set, the environmental elements, and Hexx's behaviours, and more importantly participants' stated desire to tweet @heyhexx again, point to the project's success.

 

Beyond that, participants suggested more interfaces for @heyhexx: a website, a Snapchat feature, and filters and interfaces through which they could see the world as Hexx sees it. They saw the movements, the colours, the papercraft, and the metaphor of the relationship between Hexx and its world as a unique way to communicate emotion on social media: "it's amazing how it expresses itself. I can say anything to them and they will express to my text no matter what it is. I can feed it with a celebrity quote." Some saw it as an emotive companion: "I want it as a companion, I want it personalized, just for me." These suggestions raise the possibility of @heyhexx growing into an emotive companion for people in the online world, to which they can easily express themselves and get feedback, or a way to communicate their emotions to other people. One participant sent Hexx a tweet expressing hatred toward a person, and when he received the video he said: "Oh! This is awesome. Now I can show this video to that person." When I specifically asked participants whether they would tweet @heyhexx more often, they said they could even talk to Hexx, and wished there were more elements, such as sound, or other characters for Hexx to interact with; they specifically mentioned neighbours.

Another aspect of @heyhexx was that participants did not find Hexx judgmental: "It's not judging me, it's simply expressing me as I am." We received messages to Hexx showing participants' freedom to express themselves without shame or the feeling of being judged.

Design aspects:

Participants described Hexx as a simplistic and yet very expressive creature. It is worth mentioning that they could not assign a gender to the character, although one participant told me: "I find this design feminine, a male character (Hexx) in a very colourful papercraft world." They mentioned form, colour, papercraft, and visual aesthetics, and appreciated the animated objects in the scene, noting that the moving elements make the environment an independent expressive piece in itself. Some compared the set design to scenes from Wes Anderson's movies, like Isle of Dogs (Anderson, 2018). Several participants described the environment as humanistic but not entirely human-like: "These folding buildings, jumping flowers and monsters hiding under flowers are amazing! How are they moving? I wish they were real". However, some found the environment less affective in design and movement, a result of the limited movement of the papercraft elements. Interestingly, participants showed interest in the texture of the set: they simply liked the papercraft, and some were drawn to the origami textures. Some mentioned that the texture made the movements and expressions more effective: "I really like the origami, the texture on rainbows are stunning. The dusty ugly monster is really disgusting."

5. Conclusion

 

In the work presented here, an interactive digital puppetry theatre designed to convert the abstract meaning of emotion into physical behaviour, we designed an interaction system between users on Twitter and the physical piece. A set of behaviours was generated for each emotion based on its intensity, along with a hardware and software system, triggered by the emotion and its intensity, that actuates the servos, the dynamic origami, and the main character. The piece was put to the test at an exhibition and evaluated by participants. I wrote precise narratives and scenarios for each emotion; in each, I considered the relationship between the environment and the main character so that the environment's behaviour nourished Hexx's expressiveness. Metaphors, such as treating Hexx's world as a symbol of its inner mind or mental state, helped shape the narratives. Assigning themed locations (a park for joy and love, the city for sadness and despair, a flower shop for feelings like disgust and fear) helped narrow multi-mood themes down to positive and negative ones. Bates et al.'s (1994) three rules for creating the illusion of life in a character guided the building of Hexx's character, and the rules of designing affective objects showed us how to design movements and choreograph Hexx's eyes and feet. The design of the set and the setting of themes, moods, and colours were likewise shaped by the aforementioned research.

In the evaluation, people regarded the piece as an expressive interactive system through which they could communicate their emotions. Although some participants found Hexx and the environment emotionally inexpressive, the majority agreed on the expressiveness and tended to want to send more tweets to @heyhexx in the future. However, some participants also agreed that the system was complex. When encountering Hexx the robot and the environment for the first time, participants wanted to get more familiar with Hexx's personality and the daily narrative I designed for it. What they found difficult was working out how to make particular objects in the environment move; they ended up asking questions like: "How can I make it dance?" or "What is the flower shop supposed to do? How can I see the monster?".

What we did not consider was the time aspect mentioned in the principles of animation. We realized that during human-robot interaction, time must be used carefully not only to convince the user of the agent's aliveness and to invoke a sense of empathy, but also to let the user take in all aspects of the agent, become familiar with the character's personality, and learn how to manipulate the behaviours implemented in it. This factor was essential to our project, whose aim was to have users communicate their emotions through Hexx the robot and the environment together. In their first encounter with the project, users were usually confused about what to tell Hexx (the phrase most visitors used), nor were they yet acquainted with the character or Hexx's personality. As a believable agent, Hexx must be able to introduce itself as a character, show users its routine, and tell them how they can manipulate its behaviour through the text they feed the system. This matters because animations from Pixar and Walt Disney begin by introducing the character's life, building empathy in the viewer and taking the time for the character to express itself as a person with a personality.

In a future approach, I aim to design an automated mode for the system, in which users can select among locations, Hexx's multiple behaviours, and its mental states, and manipulate them simply by pushing buttons. Additionally, another mode should be designed in which Hexx performs a play explaining its life and daily routine to people (a self-presenting piece): for instance, what makes it sad, joyous, angry, or fearful, and how it reacts to each of these emotions embedded in its character.

Limitations:

The present project has several limitations. First, the variety of behaviours was limited to only 7 behaviours representing 7 emotions, instead of 45 behaviours covering 15 emotions by intensity. Second, the lack of a designed introduction of the character to participants resulted in unfamiliarity, confusion, and, for most, the impression of facing a complicated system.

 

References

 

Bates, J. et al. (1994). 'The role of emotion in believable agents'. Communications of the ACM, vol. 37, no. 7, pp. 122—125.

Danesi, M. (2009). Dictionary of media and communications. New York & London: M. E. Sharpe, Inc.

Farahzadeh, P., Liewatanakorn, P., Yamaguchi, S. Behaviour for 5 Moods. May 2018. [online] Available from: vimeo.com/269231262 [Accessed 2nd September 2018]

Farahzadeh, P., Liewatanakorn, P., Yamaguchi, S. '@heyhexx – Work in Progress Show'. July 2018. [online] Available from: http://two.wordpress.test/heyhexx.html [Accessed 2nd September 2018]

Gjerde, E. Bauhaus Foundation Course Instructional Booklet. January 2018. [online] Available from: http://www.origamitessellations.com/category/diagrams/ [Accessed 2nd September 2018]

Heider F, Simmel M. (1944), An Experimental Study of Apparent Behaviour, The American Journal of Psychology, Vol. 57, No. 2, pp. 243-259

Hoffman, G. Ju, W. (2014) ‘Designing robots with movement in mind’.  Journal of Human-Robot Interaction, vol. 3, no. 1, pp. 89—122

Aouf, R. A. 'Texture-changing Skin Lets Robot Express its Feelings'. July 2018. [online] Available from: https://www.dezeen.com/2018/07/30/feeling-robot-cornell-university-texture-changing-skin-technology/ [Accessed 2nd September 2018]

Inside Out, 2015. [animation] Directed by Pete Docter. USA: Pixar.

Jibril, T. A.  Abdullah, M. H. (2013). Relevance of emoticons in computer-mediated communication contexts: An overview. Asian Social Science, 9(4), 201—208.

Katsuno, H., & Yano, C. (2007). Kaomoji and expressivity in a Japanese housewives’ chat room. In B. Danet & S. C. Herring (Eds.), The multilingual Internet: Language, culture, and communication online (pp. 278—301). New York: Oxford University Press.

Kiesler, S., Siegel, J., & McGuire, T. W. (1984). Social psychological aspects of computer-mediated communication. American Psychologist, No 39, 1123—1134.

Loglia, J. M., Bowers, C. (2015). 'Emoticons in Business Communication: Is the 🙂 Worth It?', in Ekle, E. (Ed.), Emotion, Technology and Design. Cambridge, pp. 37—51.

Lo, S. (2008). The nonverbal communication functions of emoticons in computer-mediated communication. CyberPsychology & Behavior, No 11(5), 595—597.

McCarthy, L, December 2013, US+, [online] Available from: http://lauren-mccarthy.com/us [Accessed 2nd September 2018]

Nguyen, D. T. (2008). Visually dependent nonverbal cues and video communication. Unpublished doctoral dissertation, Berkeley: University of California.

Ortony, A., Clore, G. L., & Collins, A. (1988). The cognitive structure of emotions. Cambridge: Cambridge University Press.

Peng F et al, (2018), A Trip to the Moon: Personalized Animated Movies for Self-reflection, Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, No. 253.

Plutchik, R. (2001). Integration, differentiation, and derivatives of emotion. Evolution and Cognition, 7(2), pp 114—125.

Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, No 39(6), pp 1161—1178.

Rezabek, L. L., & Cochenour, J. J. (1998). ‘Visual cues in computer-mediated communication: Supplementing text with emoticons’. Journal of Visual Literacy, No 18, pp 201—215.

Studio for Creative Inquiry. The Oz Project. June 2012. [online] Available from: https://vimeo.com/47883775 [Accessed 2nd September 2018]

Scheirer J, Picard R. (1999), Affective Objects, MIT Media Laboratory Perceptual Computing Section Technical Report, No. 524.

Thomas, F. Johnston, O. (1981). Disney Animation: The Illusion of Life. Abbeville Press, New York.

Turkle, S. (2010). 'Nearest Neighbours', in Alone Together: Why We Expect More from Technology and Less from Each Other. New York, pp. 31—42.

Isle of Dogs, 2018. [animation] Directed by Wes Anderson. USA: Studio Babelsberg, Indian Paintbrush & American Empirical Pictures.

Walther, J. B. (1992). Interpersonal effects in computer-mediated interaction. Communication Research, No 19, pp 52—90.

Walther, J. B. (1996). Computer-mediated communication: Impersonal, interpersonal, and hyper-personal interaction. Communication Research, No 23, pp 3—43.

Walther, J. B., & D’Addario, K. P. (2001). The impacts of emoticons on message interpretation in computer-mediated communication. Social Science Computer Review, No 19(3), pp 324—347.

Walther, J. B., Slovacek, C. L., & Tidwell, L. C. (2001). Is a picture worth a thousand words?: Photographic images in long-term and short-term computer-mediated communication. Communication Research, No 28(1), pp 105—134.

Wei, A. C. Y. (2012). Emoticons and the non-verbal communication: With reference to Facebook. Unpublished master’s thesis, Bangalore, India: Department of Media Studies, Christ University.
