The neostriatum, the main input structure of the basal ganglia, has been widely implicated in habitual and social behaviors 1, 2, 3, decision-making 4, 5, 6, and reinforcement learning 7. Striatal neurons can respond to visual, auditory, or somatosensory stimulation 8, consistent with anatomically demonstrated sensory inputs. The dorsal striatum receives topographically organized projections from nearly the entire neocortex 9, and from most thalamic nuclei 10, 11. The axon terminals of cortical and thalamic projections converge with comparable densities onto individual striatal neurons, forming functional glutamatergic synapses (i.e., thalamostriatal and corticostriatal synapses) 12, 13, 14. Previous studies suggest that thalamostriatal projections may be important for alertness and behavioral switching 15. However, the majority of previous studies have focused on the motor subdivision of the dorsal striatum 16, 17, 18, 19. The sensory subdivision, especially the auditory striatal region which receives projections from the medial geniculate body (MGB, the main auditory thalamus) and the auditory cortex (ACx), remains largely elusive. Two recent studies have found that projections from the ACx to the auditory striatum drive decision-making in rodents, and that selective plasticity of these synapses may underlie the establishment of this behavioral association 20, 21. However, the mechanisms underlying this function remain unclear, and how these two projections coordinate to regulate striatal activities and striatal-dependent behaviors remains largely unknown.

The dorsal striatum has emerged as a key region in sensory-guided, reward-driven decision making. A posterior sub-region of the dorsal striatum, the auditory striatum, receives convergent projections from both the auditory thalamus and the auditory cortex. How these pathways contribute to auditory striatal activity and function remains largely unknown. Here we show that chemogenetic inhibition of the projections from either the medial geniculate body (MGB) or the primary auditory cortex (ACx) to the auditory striatum in mice impairs performance in an auditory frequency discrimination task. While recording striatal sound responses, we find that transiently silencing the MGB projection reduced sound responses across a wide range of frequencies in striatal medium spiny neurons. In contrast, transiently silencing the primary ACx projection diminished sound responses preferentially at the best frequencies. Together, our findings reveal that the MGB projection mainly functions as a gain controller, whereas the primary ACx projection provides tuning information for striatal sound representations.

The primary auditory cortex (AI) is made up of highly interconnected populations of neurons that are responsible for integrating bottom-up auditory information from the lemniscal auditory pathway and top-down inputs from higher-order cortical areas. Despite that, most studies of information processing in AI focus on either single-unit spectro-temporal receptive field (STRF) estimation or paired neuronal correlation analyses, and assume that AI neurons filter auditory information either as individual entities or as pairs. Meanwhile, some recent studies have shown how populations of AI neurons can also encode auditory behavior. Determining how AI encodes information will hence require an integrated approach that combines receptive field and multi-neuronal ensemble analyses.

In this dissertation, I show that I can accurately detect coordinated neuronal ensembles (cNEs), which we define as groups of neurons that have reliable synchronous activity, in AI. These cNEs are meaningful constructs that are active in both spontaneous and evoked activity, and their synchronous evoked activity cannot be trivially explained by receptive field overlap. cNEs also come in two flavors: one enhances stimulus representation over single neurons or simultaneously recorded random groups of neurons of the same size, while the other does not represent spectro-temporal features at all and might reflect internally generated neuronal activity. Since single neurons can participate in multiple cNEs over the course of a recording, I also show that neurons can multiplex information, encoding slightly different spectro-temporal information (if they encode spectro-temporal information at all) when associated with different cNEs. Together, the enhancement of information processing and the reliability of stimulus representation by cNEs suggest that cNEs should be considered the principal unit of information processing in AI.
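The abstract does not spell out the ensemble-detection procedure, but a common way to identify co-active groups like cNEs is principal component analysis on binned, z-scored spike trains, counting the eigenvalues that exceed the Marchenko-Pastur bound expected for independent neurons. A minimal sketch under that assumption (the function name and synthetic data are illustrative, not from the dissertation):

```python
import numpy as np

def estimate_num_ensembles(spike_counts):
    """Estimate the number of coordinated ensembles in a binned
    spike-count matrix (neurons x time bins): count PCA eigenvalues
    above the Marchenko-Pastur upper bound, the asymptotic maximum
    eigenvalue expected if all neurons fired independently."""
    n_neurons, n_bins = spike_counts.shape
    # z-score each neuron so correlations, not firing rates, drive PCA
    mean = spike_counts.mean(axis=1, keepdims=True)
    std = spike_counts.std(axis=1, keepdims=True)
    std[std == 0] = 1.0  # guard against silent neurons
    z = (spike_counts - mean) / std
    corr = np.cov(z)  # covariance of z-scored data = correlation matrix
    eigvals = np.linalg.eigvalsh(corr)
    q = n_bins / n_neurons
    mp_upper = (1.0 + np.sqrt(1.0 / q)) ** 2  # Marchenko-Pastur edge
    return int(np.sum(eigvals > mp_upper))

# Synthetic check: 20 neurons, two groups of 5 sharing a common drive
rng = np.random.default_rng(0)
n_bins = 5000
latent1 = (rng.random(n_bins) < 0.5).astype(float)
latent2 = (rng.random(n_bins) < 0.5).astype(float)
counts = rng.poisson(1.0, size=(20, n_bins)).astype(float)
counts[:5] += 3.0 * latent1   # ensemble 1 co-activates with latent1
counts[5:10] += 3.0 * latent2  # ensemble 2 co-activates with latent2
print(estimate_num_ensembles(counts))
```

Full cNE pipelines typically refine the suprathreshold components (e.g., with ICA) to recover each ensemble's membership weights; this sketch only estimates how many ensembles are present.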
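The single-unit STRF estimation mentioned earlier is, in its simplest form, a spike-triggered average of the stimulus spectrogram. A minimal sketch, assuming a spectrogram and spike train binned on the same time base (names and data are illustrative):

```python
import numpy as np

def strf_sta(stimulus, spikes, n_lags):
    """Spike-triggered average STRF: average the stimulus spectrogram
    over the n_lags time bins preceding each spike.
    stimulus: (n_freq, n_time) spectrogram; spikes: (n_time,) counts."""
    n_freq, n_time = stimulus.shape
    sta = np.zeros((n_freq, n_lags))
    total = 0.0
    for t in range(n_lags, n_time):
        if spikes[t] > 0:
            # window columns are times t-n_lags .. t-1
            sta += spikes[t] * stimulus[:, t - n_lags:t]
            total += spikes[t]
    return sta / max(total, 1.0)

# Synthetic check: a neuron driven by frequency channel 3, two bins early
rng = np.random.default_rng(1)
stim = rng.standard_normal((8, 20000))
spikes = np.roll((stim[3, :] > 1.0).astype(float), 2)
spikes[:2] = 0.0
sta = strf_sta(stim, spikes, n_lags=5)
```

With this synthetic neuron, the recovered STRF peaks at frequency channel 3 and lag column 3 (i.e., two bins before the spike), as expected. For natural stimuli with temporal correlations, the STA is usually whitened by the stimulus autocorrelation to obtain an unbiased STRF.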