1. Workshop on Applied Multimodal Affect Recognition [Website]

 

Contact Person: 

Shaun Canavan

Organizers: 

Shaun Canavan, Tempestt Neal, Marvin Andujar, and Lijun Yin

Abstract: 

Novel applications of affective computing have emerged in recent years in domains ranging from health care to 5th generation mobile networks. Many of these applications have achieved improved emotion classification performance by fusing multiple sources of data (e.g., audio, video, brain, face, thermal, positional). Multimodal affect recognition has the potential to revolutionize the way various industries and sectors utilize information gained from recognizing a person’s emotional state, particularly given the flexibility in the choice of modalities and measurement tools (e.g., surveillance versus mobile device cameras). Multimodal classification methods have proven highly effective at minimizing misclassification error in practice and under dynamic conditions. Further, multimodal classification models tend to be more stable over time than those relying on a single modality, increasing their reliability in sensitive applications such as mental health monitoring and automobile driver state recognition. To continue the field’s trend of moving from lab to practice and to encourage new applications of affective computing, this workshop will provide a forum for researchers to exchange ideas on future directions, including novel fusion methods and databases, innovations through interdisciplinary research, and emerging emotion sensing devices. The workshop will also place particular focus on the ethical use of novel applications of affective computing in real-world scenarios, discussing topics including, but not limited to, privacy, manipulation of users, and public fears and misconceptions regarding affective computing.

 

2. Functions of Emotions for Socially Interactive Agents (Func-E) [Website]

 

Contact Person:

Patrick Gebhard, DFKI

Organizers:

Elisabeth André, Augsburg University

Ruth Aylett, Heriot Watt University

Patrick Gebhard, DFKI

Dimitra Tsovaltzi, DFKI

Tanja Schneeberger, DFKI

Abstract: 

The interdisciplinary ACII 2021 workshop “Functions of Emotions for Socially Interactive Agents (Func-E)” explores the challenges and possibilities that come with the view that emotions are not universally unique patterns (internally and externally) but are always connected to individual experiences. The focus is on exploring the concept of functions of emotions, because we believe that this approach holds great potential for empathic systems.

Emotions have intrapersonal functions (e.g., motivational purposes, guidance of perception, and decision making), interpersonal functions (e.g., signaling the nature of a relationship or a topic, providing incentives for specific behaviors, underlining meanings, and illustrating the communicated topic), and socio-cultural functions (e.g., coordinating social situations through the connection of norms to values, beliefs, and behavior).

Empathic and socially interactive agents would benefit from such a view with regard to understanding the dialog partner on an emotional level and showing appropriate behavior. Such agents can play into each of these functions and purposes by, for example, complying with suitable situation-dependent behavioral norms or mirroring behavior. Relying only on the interpretation of social signals might not be enough for every social context; contextual information, individual differences among users, and group characteristics are crucial for this process. A computational representation of functions of emotions might be the key to next-generation empathic, socially interactive agents. Foremost, a representation of emotions that integrates individual subjective experience is required. Based on such a representation, individual and social functions of emotions can be modeled. Within that context, it is of particular interest how these functions relate to observable social signals (e.g., voice, gaze, gesture, body movements) and to emotions displayed between, at least, two individuals. Moreover, empathic systems in various fields of application (e.g., therapeutic assistance, autonomous driving cars, learning social skills, learning and working in groups) could be improved by incorporating computational models of functions of emotions.

 

3. 5th International Workshop on Multimodal Analyses enabling Artificial Agents in Human-Machine Interaction [Website]

Organisers:
Ronald Böck, Otto von Guericke University Magdeburg, Germany
Francesca Bonin, IBM Research Europe
Ronald Poppe, Utrecht University, The Netherlands

Abstract:

One of the aims of building multimodal user interfaces is to make the interaction between user and system as natural as possible, in situations that are themselves as natural as possible. The most natural form of interaction can be considered to be how we interact with other humans. While the analysis of human-human communication has resulted in many insights, transferring these to human-machine interactions remains challenging. The automated analysis of the interaction has to consider both semantic and affective aspects, including the personality, mood, or intentions of the user, while anticipating the counterpart. These processes have to be performed in real time, in a natural environment, so that the system can respond without delays. The MA3HMI workshop series, now at its 5th edition, brings together researchers working on the analysis of multimodal data as a means to develop technical devices that can interact with humans and react to human affect. Artificial agents are regarded here in their broadest sense, including virtual chat agents, empathic speech interfaces, and lifestyle coaches on a smartphone. In line with the main conference’s theme, we focus on ethical aspects, including those arising in data collection, biases in model development, and the deployment of systems.

 

4. Affective Movement Recognition Challenge and Workshop [Website]

Contact person:

Temitayo Olugbade, University College London, UK

Organisers:

Nadia Bianchi-Berthouze, University College London, UK

Amanda Williams, University College London, UK

Nicolas Gold, University College London, UK

Gualtiero Volpe, Università di Genova, Italy

Antonio Camurri, Università di Genova, Italy

Roberto Sagoleo, Università di Genova, Italy

Simone Ghisio, Università di Genova, Italy

Beatrice de Gelder, Maastricht University, The Netherlands

Abstract:

We bring to you both a challenge on affect recognition with naturalistic data based on body movement sensors and a workshop that will include discussion on how to translate strides made in bodily affective expression recognition research into real-world applications, particularly in use cases where bodily expressions are more critical than other modalities. The AffectMove challenge is an interesting machine learning problem because: i) the challenge data are representative of unconstrained everyday settings; ii) the affective/cognitive experiences that the data capture are application-specific states rather than the so-called basic emotions that have traditionally been explored; iii) the data are relatively limited in size and reflect real-world difficulties where affect data capture is not trivial; and iv) altogether, the data come from multiple contexts and population groups and were captured using different sets of movement sensors. The AffectMove workshop complements the challenge with a discussion that aims to initiate a timely debate, among researchers in the area, on what barriers remain to the deployment of affect-recognition-from-bodily-expressions technology in the real world and how these could be addressed.

 

5. Broadening Participation in Affective Computing – Undergraduate Research & Teaching Strategies [Website]

Organisers:

Abstract: 

Affective computing courses are taught at institutions such as the University of Southern California, MIT, and Georgia Tech. These courses are mainly introductory and usually geared towards educating graduate students on the relevant and current psychological theories and artificial intelligence techniques that inform how technologies can begin to synthesize, sense, and interpret human phenomena related to emotions. Institutions are beginning to craft affective computing courses that allow undergraduate students to apply these techniques to software projects without being bogged down in the theories that permeate graduate courses. Additionally, computer science programs are pursuing research related to technology, race, and society, as well as social-impact innovations that can influence the emotions of a nation, culture, or company. This workshop examines how researchers can increase the number of underserved and marginalized persons who study how emotions and affect can be mimicked, sensed, or interpreted by technologies. Techniques learned at the workshop can be applied to broadening participation efforts at universities globally. Undergraduate students will also be able to showcase their work at the workshop through this effort.

 

6. What’s Next in Affect Modeling? [Website]

Contact Person:

Konstantinos Makantasis

Organisers:

Konstantinos Makantasis, Institute of Digital Games, University of Malta, Malta

Georgios N. Yannakakis, Institute of Digital Games, University of Malta, Malta and modl.ai, Malta

Bjoern Schuller, Imperial College London, UK

Abstract:

The valid and reliable evaluation of affect and affective interaction is key for the advancement of affective computing (AC). Recent breakthroughs in deep (machine) learning and generative AI have boosted the efficiency and generality of affect models by discovering novel representations of users and their context, operating on high-resolution multimodal signals. Such representations, however, are data-hungry and in need of large datasets that AC is not able to offer. Moreover, as affect models gradually become larger and more complex, they become increasingly opaque, compromising their expressivity, explainability, and transparency.

This workshop puts an emphasis on state of the art methods in machine learning and their suitability for advancing the reliability, validity, and generality of affective models. We will be investigating entirely new methods, untried in AC, but also methods that can be coupled with traditional and dominant practices in affective modeling. In particular, we encourage submissions that offer visions of particular algorithmic advancements for affect modeling and proof-of-concept case studies showcasing the potential of new sophisticated machine learning methods.