DEEP (DFG, 2018-2021)

The DEEP project (funded by the DFG since 2018) tackles the challenge of connecting the outside world with internal symbolic situational representations that relate to individual human emotions. To this end, the project creates and evaluates a unique combination of an ML-based real-time interpretation of human social signals and a real-time computational model of emotions in a dyadic communication setup between a human and a Social Agent.

The combination relies on a sophisticated representation of communicative emotions, internal (situational and structural) emotions, possible emotion elicitors and emotion targets, suitable emotion regulation strategies, and related sequences of social signals and their directions. At runtime, based on the interpretation of the social signals of a human dialog partner, a dynamic theory-of-mind representation of user emotions is created. It holds all possible internal user emotions with related mental states and cognitive strategies. As a result, this approach allows, for the first time, the real-time computational disambiguation of emotion elicitors and emotion targets, and the recognition of possible emotion regulation strategies, based on the interpretation of social signals.
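The disambiguation idea described above can be illustrated with a minimal sketch. All class and field names here are hypothetical illustrations, not the DEEP implementation: a theory-of-mind store keeps every emotion hypothesis (emotion plus its assumed elicitor, target, and regulation strategy) that is still consistent with the social signals observed so far, and prunes the rest as new cues arrive.

```python
from dataclasses import dataclass, field

@dataclass
class EmotionHypothesis:
    emotion: str      # e.g. "anger", "shame" (illustrative labels)
    elicitor: str     # what presumably triggered the emotion
    target: str       # whom or what the emotion is directed at
    regulation: str   # assumed regulation strategy, e.g. "suppression"
    cues: set = field(default_factory=set)  # social cues consistent with this hypothesis

class TheoryOfMind:
    """Holds all emotion hypotheses still consistent with the observed cues."""

    def __init__(self, hypotheses):
        self.hypotheses = list(hypotheses)

    def observe(self, cue):
        # Keep only hypotheses that predict the newly observed social cue.
        self.hypotheses = [h for h in self.hypotheses if cue in h.cues]
        return self.hypotheses

tom = TheoryOfMind([
    EmotionHypothesis("anger", "criticism", "agent", "suppression",
                      {"frown", "gaze_aversion"}),
    EmotionHypothesis("shame", "own_mistake", "self", "withdrawal",
                      {"gaze_aversion", "smile"}),
])

tom.observe("gaze_aversion")        # both hypotheses remain ambiguous
remaining = tom.observe("frown")    # only the "anger" hypothesis survives
print([h.emotion for h in remaining])  # -> ['anger']
```

The point of the sketch is the ambiguity handling: a single cue such as gaze aversion matches several internal emotions, and only the accumulating sequence of signals narrows the hypotheses down to one elicitor, target, and regulation strategy.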

Overall, the DEEP project realizes a real-time computational model that describes, on a symbolic level, how social cues can be linked to emotional appraisal contexts and internal emotional states. The model takes the user’s personality, role, status, relations, and other individual values into account.
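How individual values can modulate such an appraisal may be sketched as follows. The mapping below is purely illustrative (the cue labels, profile fields, and emotion outcomes are assumptions, not the DEEP model): the same social cue is appraised differently depending on the user's profile.

```python
def appraise(cue, profile):
    """Map an observed social cue to a candidate internal emotion,
    modulated by the user's individual profile (illustrative rules only)."""
    if cue == "frown":
        # Assumption for illustration: a high-status user facing criticism is
        # taken to feel anger, a low-status user anxiety.
        return "anger" if profile.get("status") == "high" else "anxiety"
    if cue == "smile":
        return "joy"
    return "neutral"

profile = {"personality": "extravert", "role": "customer", "status": "high"}
print(appraise("frown", profile))  # -> anger
print(appraise("frown", {"status": "low"}))  # -> anxiety
```

The design point is that the cue alone does not determine the internal state; the symbolic appraisal context (here reduced to a status field) is what resolves it.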

The DEEP model will be evaluated in dyadic dialogs between a human and a Social Agent. In the future, it can be exploited for the creation and investigation of next-generation Social Agent applications by extending the user model of such systems with a real-time model of the user’s internal feelings and emotion regulation strategies. These extensions allow for a more empathic adaptation to a human user’s current situation.

Please check the project web page for more information.

Partners: Prof. Dr. Elisabeth André, Human Centered Multimedia, University of Augsburg

News:

On 1 October 2018, a first project seminar with external experts took place at the chair of Prof. Dr. André in Augsburg. Prof. Dr. Eva Bänninger-Huber and Prof. Dr. Svenja Taubner gave talks on the topics “Affect Regulation Processes and Mimic Behavior” and “Interpersonal Understanding – Perspectives of Mentalization Theory”.