Neural dynamics underlying interactive language use

Gregg Castellucci

Vocal communication is a central feature of social behavior in numerous species. While the neural systems underlying vocalization in humans and animals are well studied, less is known about how these circuits enable naturalistic vocal interaction. We investigate how the human brain gives rise to ethologically relevant spoken communication (i.e., conversational turn-taking) by performing a series of intracranial recording and perturbation experiments with neurosurgical patients engaged in both task-based and unconstrained turn-taking. In these interactive contexts, we find that spatially and functionally distinct networks are critical for a speaker’s ability to comprehend their partner’s turns, plan their own turns, and articulate the speech comprising those turns. Furthermore, while the neural substrates of language planning have remained elusive, our results demonstrate that restricted regions of the inferior frontal gyrus (classical “Broca’s area”) and middle frontal gyrus are critical for speech-selective planning. Finally, to better understand the specific computations underlying vocal communication, we construct a theoretical framework consisting of the cognitive modules required for generating motor behavior during interaction (e.g., vocalization, co-speech gesture). This model is designed to account for the behavioral and neurobiological features of both naturalistic human language and animal communication; we therefore intend this framework to facilitate the systematic identification of cognitive analogues across species, which may in turn rely on convergent neural mechanisms.