The human body generates social signals through many kinds of movement. The most prominent, and the most widely analysed, are the gestures produced by the limbs and the facial muscle movements that generate expressions. Research studies have confirmed that such body movements and postures carry emotional information. To extract this information, our team has acquired state-of-the-art, custom-built sensor systems, namely the StretchSense gloves and the Rokoko Smartsuit. These systems expand our capability to capture the full skeletal motion of the human body and will enable us to recover the emotional information it contains.
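As a rough illustration of what extracting emotional information from skeletal motion might look like, the sketch below turns a window of joint-rotation frames into simple per-joint motion statistics (overall energy, variability) that a downstream emotion classifier could consume. The joint count, frame rate, and feature set here are illustrative assumptions, not our actual pipeline.

```python
# A minimal sketch, assuming motion-capture frames arrive as per-joint
# rotation arrays. Joint count, frame rate, and features are placeholders.
import numpy as np

N_JOINTS = 19          # assumed skeleton size for a full-body suit
FPS = 100              # assumed capture rate

def motion_features(frames: np.ndarray) -> np.ndarray:
    """frames: (T, N_JOINTS, 3) array of joint rotations in degrees.
    Returns a fixed-length feature vector of per-joint motion statistics."""
    velocity = np.diff(frames, axis=0) * FPS        # deg/s per joint channel
    return np.concatenate([
        velocity.mean(axis=0).ravel(),              # average motion direction
        np.abs(velocity).mean(axis=0).ravel(),      # overall motion energy
        velocity.std(axis=0).ravel(),               # motion variability
    ])

# Usage with synthetic data standing in for a captured clip:
clip = np.random.randn(3 * FPS, N_JOINTS, 3)        # 3 seconds of fake frames
print(motion_features(clip).shape)                  # (171,)
```

Statistics of this kind are only one possible representation; the same windowed frames could equally feed a learned sequence model once real labelled recordings are available.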
Members of the Deaf community conversing in sign language convey meaningful information not only through their hands but also through their body motion and facial expressions. With the goal of making use of all of this information, we are starting our journey to develop an end-to-end sign language interpreter built on detailed motion capture data. This will allow us not only to understand the fine details of the social signals encompassed in signing but also to study the communicative components of emotion that may accompany them. In the larger spectrum of the social signal domain, we hope our journey developing a sign language interpreter will also broaden the understanding of social signal patterns.
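To make the multi-channel idea concrete, here is a minimal sketch of a sequence model that fuses hand, body, and face feature streams per frame before classifying a sign. The feature dimensions, hidden size, and sign vocabulary are placeholder assumptions, and the architecture is just one plausible starting point, not our project's actual design.

```python
# A minimal sketch, assuming per-frame feature vectors for hands (gloves),
# body (suit), and face. All dimensions below are illustrative placeholders.
import torch
import torch.nn as nn

class SignClassifier(nn.Module):
    def __init__(self, hand_dim=32, body_dim=57, face_dim=20,
                 hidden=128, n_signs=100):
        super().__init__()
        in_dim = hand_dim + body_dim + face_dim    # concatenated per-frame features
        self.encoder = nn.LSTM(in_dim, hidden,
                               batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_signs)

    def forward(self, hands, body, face):
        # Each input: (batch, frames, dim); fuse the three channels per frame.
        x = torch.cat([hands, body, face], dim=-1)
        encoded, _ = self.encoder(x)
        return self.head(encoded.mean(dim=1))      # pool over time, score signs

# Usage with random tensors standing in for a 150-frame signing clip:
model = SignClassifier()
logits = model(torch.randn(1, 150, 32),
               torch.randn(1, 150, 57),
               torch.randn(1, 150, 20))
print(logits.shape)  # torch.Size([1, 100])
```

Concatenating the channels per frame is the simplest fusion strategy; it keeps hand shape, posture, and expression aligned in time, which matters because the meaning of a sign can change with the accompanying facial expression.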