This paper introduces a new dataset of 2,780 utterances from 12 participants, capturing facial and neck surface EMG during phonated and silent speech, to investigate the link between affect and articulatory muscle activity. The authors evaluate intra- and inter-subject affect decoding using a range of features and model embeddings, achieving up to 0.845 AUC for discriminating frustration. An ablation study shows that affective signatures are embedded in facial motor activity and persist without phonation, suggesting potential for affect-aware silent speech interfaces.
You can reliably decode frustration from facial muscle activity, even when people aren't speaking aloud.
The expression of affect is integral to spoken communication, yet its link to underlying articulatory execution remains unclear. Alongside acoustic speech analyses, measures of articulatory muscle activity such as EMG could reveal how speech production is modulated by emotion. We investigate affect decoding from facial and neck surface electromyography (sEMG) during phonated and silent speech production. For this purpose, we introduce a dataset comprising 2,780 utterances from 12 participants across 3 tasks, on which we evaluate both intra- and inter-subject decoding using a range of features and model embeddings. Our results reveal that EMG representations reliably discriminate frustration with up to 0.845 AUC and generalize well across articulation modes. Our ablation study further demonstrates that affective signatures are embedded in facial motor activity and persist in the absence of phonation, highlighting the potential of EMG sensing for affect-aware silent speech interfaces.
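To make the inter-subject evaluation concrete, below is a minimal sketch of a leave-one-subject-out AUC protocol for a binary affect label such as frustration. The RMS features, logistic-regression classifier, frame length, and function names are illustrative assumptions, not the paper's actual features or models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def rms_features(emg, frame_len=200):
    """Frame-wise RMS per sEMG channel, pooled as mean/std over frames.

    emg: array of shape (n_samples, n_channels) for one utterance.
    (Hypothetical feature set; the paper may use different features.)
    """
    n_frames = emg.shape[0] // frame_len
    frames = emg[: n_frames * frame_len].reshape(n_frames, frame_len, -1)
    rms = np.sqrt((frames ** 2).mean(axis=1))  # (n_frames, n_channels)
    return np.concatenate([rms.mean(axis=0), rms.std(axis=0)])


def inter_subject_auc(utterances, labels, subjects):
    """Leave-one-subject-out AUC for a binary label (e.g. frustrated vs. not).

    Assumes each held-out subject contributes both classes, so AUC is defined.
    """
    X = np.stack([rms_features(u) for u in utterances])
    y = np.asarray(labels)
    groups = np.asarray(subjects)
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    aucs = []
    for train, test in LeaveOneGroupOut().split(X, y, groups):
        clf.fit(X[train], y[train])
        scores = clf.predict_proba(X[test])[:, 1]
        aucs.append(roc_auc_score(y[test], scores))
    return float(np.mean(aucs))
```

Intra-subject decoding would follow the same pattern but cross-validate within each participant's utterances rather than holding out whole subjects.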