The Dynamic and Task-dependent Representational Transformation Between the Motor and Sensory Systems

Xing Tian

The motor and sensory systems need to work collaboratively in various action and perceptual tasks, such as speech. For example, it has been hypothesized that neural signals generated in the motor system can transfer directly to the sensory system along a neural pathway (termed motor-to-sensory transformation). Previous studies have demonstrated that the motor-to-sensory transformation is a key mechanism for speech production and online control. However, it remains unclear how the neural representation dynamically evolves among distinct neural systems, and how such representational transformation depends on task demands and the degree of motor involvement. The present fMRI study combined univariate analysis with representational similarity analysis (RSA) and used three speech tasks (articulation, silent articulation, and imagined articulation) to systematically investigate the representational formats and their dynamic evolution in the motor-to-sensory transformation. The univariate analyses revealed a frontal-parietal-temporal neural pathway in all three speech tasks, but the extent of this motor-to-sensory transformation network varied with the degree of motor engagement across tasks. More importantly, the RSA results showed that, across all three tasks, articulatory information was primarily represented in motor regions, whereas acoustic information was represented in somatosensory and auditory regions. However, articulatory information was also cross-represented in the somatosensory and auditory regions in the articulation and silent articulation tasks. These consistent results provide evidence for the dynamic evolution of, and task-dependent transformation between, representational formats in the motor-to-sensory transformation.
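
For readers unfamiliar with RSA, the sketch below illustrates its general logic: condition-by-voxel activity patterns from a region of interest are converted into a representational dissimilarity matrix (RDM), which is then compared against a model RDM (e.g., one derived from articulatory or acoustic features). This is a minimal, hypothetical example with random data standing in for the study's patterns and models; the function names, distance metric, and dimensions are illustrative assumptions, not the authors' pipeline.

```python
# Minimal RSA sketch: build RDMs from condition-by-voxel patterns and compare
# a neural RDM against a model RDM. Data and names are illustrative only.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

def compute_rdm(patterns: np.ndarray) -> np.ndarray:
    """patterns: (n_conditions, n_features) activity or feature estimates.
    Returns an (n_conditions, n_conditions) RDM using correlation distance."""
    return squareform(pdist(patterns, metric="correlation"))

def compare_rdms(neural_rdm: np.ndarray, model_rdm: np.ndarray) -> float:
    """Spearman rank correlation between the upper triangles of two RDMs."""
    iu = np.triu_indices_from(neural_rdm, k=1)
    rho, _ = spearmanr(neural_rdm[iu], model_rdm[iu])
    return rho

# Illustrative use: random data stand in for ROI activity patterns and a
# hypothetical articulatory-feature model.
rng = np.random.default_rng(0)
roi_patterns = rng.standard_normal((12, 200))   # 12 speech items x 200 voxels
model_features = rng.standard_normal((12, 5))   # 12 speech items x 5 articulatory features
neural_rdm = compute_rdm(roi_patterns)
model_rdm = compute_rdm(model_features)
print(f"model-neural RDM correlation: {compare_rdms(neural_rdm, model_rdm):.3f}")
```

In an analysis of this kind, a higher correlation between a region's neural RDM and, say, an articulatory model RDM than an acoustic model RDM would be taken as evidence that the region predominantly carries articulatory information.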