Acoustically-Driven Phoneme Removal That Preserves Vocal Affect Cues

Oct 26, 2022
Camille Noufi, Jonathan Berger, Michael Frank, Karen Parker, Daniel L. Bowling

In this paper, we propose a method for removing linguistic information from speech for the purpose of isolating paralinguistic indicators of affect. The immediate utility of this method lies in clinical tests of sensitivity to vocal affect that are not confounded by language, which is impaired in a variety of clinical populations. The method is based on simultaneous recordings of speech audio and electroglottographic (EGG) signals. The speech audio signal is used to estimate the average vocal tract filter response and amplitude envelope. The EGG signal supplies a direct correlate of voice source activity that is mostly independent of phonetic articulation. These signals are used to create a third signal designed to capture as much paralinguistic information from the vocal production system as possible -- maximizing the retention of bioacoustic cues to affect -- while eliminating phonetic cues to verbal meaning. To evaluate the success of this method, we studied the perception of corresponding speech audio and transformed EGG signals in an affect rating experiment with online listeners. The results show a high degree of similarity in the perceived affect of matched signals, indicating that our method is effective.
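To make the pipeline described above concrete, the sketch below shows one plausible way to combine the three ingredients named in the abstract. It is not the authors' implementation: it assumes the "average vocal tract filter response" can be approximated by LPC coefficients averaged over non-silent frames of the speech audio, and that the "amplitude envelope" is a smoothed Hilbert envelope; the function names and parameter values are hypothetical.

```python
"""Illustrative sketch (not the paper's code) of filtering an EGG voice-source
signal through an average vocal tract filter and imposing the speech amplitude
envelope. Assumes speech and EGG are simultaneous recordings at the same rate."""
import numpy as np
import librosa
import scipy.signal as sps


def average_vocal_tract_filter(speech, order=16, frame_len=1024, hop=512):
    """Average LPC coefficients across non-silent frames -> one static filter."""
    frames = librosa.util.frame(speech, frame_length=frame_len, hop_length=hop)
    window = np.hanning(frame_len)
    coeffs = []
    for frame in frames.T:
        if np.sqrt(np.mean(frame ** 2)) < 1e-4:   # skip near-silent frames
            continue
        coeffs.append(librosa.lpc(frame * window, order=order))
    return np.mean(coeffs, axis=0)                # shape: (order + 1,)


def amplitude_envelope(speech, sr, cutoff_hz=20.0):
    """Smoothed Hilbert envelope of the speech signal."""
    env = np.abs(sps.hilbert(speech))
    b, a = sps.butter(2, cutoff_hz / (sr / 2), btype="low")
    return sps.filtfilt(b, a, env)


def transform_egg(egg, speech, sr):
    """Shape the EGG source with the average filter and the speech envelope."""
    n = min(len(egg), len(speech))                # align simultaneous recordings
    egg, speech = egg[:n], speech[:n]
    a = average_vocal_tract_filter(speech)
    shaped = sps.lfilter([1.0], a, egg)           # static all-pole shaping
    shaped /= np.abs(shaped).max() + 1e-9
    out = shaped * amplitude_envelope(speech, sr)
    return out / (np.abs(out).max() + 1e-9)       # normalize to [-1, 1]
```

Because the filter is a single time-averaged response rather than a frame-by-frame one, the output carries the speaker's source characteristics and loudness contour while the time-varying articulation that encodes phonemes is discarded, which is the stated goal of the transformation.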

* Submitted to the 2023 IEEE International Conference on Acoustics, Speech and Signal Processing 