Speech is a highly variable signal. For example, the acoustics of the same word spoken by different speakers are highly distinct; nevertheless, listeners easily recognize these sounds as carrying the same linguistic message. Human listeners, and even infants within their first year of life, can perceive the speech categories of their native language despite the remarkable acoustic variability present in speech signals. But how does the human brain naturally learn to group physically distinct sounds into native speech categories without directed training or explicit feedback?
One candidate neural substrate is the striatum and its interaction with the cortex (i.e., cortico-striatal loops), which has been implicated mostly in visual category learning. However, that evidence comes from explicit, feedback-based categorization tasks. Likewise, most speech learning studies use standard feedback-based tasks involving explicit categorization of sounds, and so do not model how learning occurs in the real world. As a result, our understanding of how the brain learns sound categories incidentally, from complex naturalistic environments, has been limited. Does the striatum play any role in more ecologically valid category learning?
To investigate the neural basis of incidental sound category learning, Sung-Joo Lim (CMU Psychology/CNBC alumna, currently a Research Assistant Professor at Boston University), working with Julie Fiez (Pitt Psychology and CNBC) and Lori Holt (CMU Psychology and CNBC), used fMRI to scan participants as they actively played a rich, multimodal, first-person videogame. The videogame provides a unique way to examine how sound category learning occurs incidentally, without overt categorization: it does not require explicit attention to sounds, yet sound category exemplars have functional utility in predicting upcoming game events and come to incidentally guide players' successful game actions. When participants were exposed to statistically coherent sound exemplars associated with a specific game event (i.e., the appearance of a specific alien character), they learned functionally relevant sound categories simply by playing the game, without explicitly categorizing the sounds.
“Critically, we found that the posterior striatum was engaged and functionally connected to the auditory cortex (i.e., the left posterior superior temporal sulcus) during game play,” Lim reports. “The magnitudes of its activation and connectivity predicted the learning outcome assessed after the game training.” In contrast, the authors did not observe the same pattern of results in participants who played the same game with less statistically coherent sound category exemplars, even though these participants exhibited game performance in the scanner similar to those who consistently heard coherent sounds. These results provide evidence that the striatum is sensitive to the presence of statistical regularities in behaviorally relevant categories even when participants are not directed to categorize per se. “Through its interactions with the auditory cortex, the striatum contributes to the incidental acquisition of sound category representations emerging from naturalistic learning environments,” Lim concludes.
Their paper, “Role of the striatum in incidental learning of sound categories”, was published in the Proceedings of the National Academy of Sciences.