Bilateral widefield calcium imaging reveals circuit asymmetries and lateralized functional activation of the mouse auditory cortex

Edited by Tobias Bonhoeffer, Max Planck Institute for Biological Intelligence, Munich-Martinsried, Germany; received November 28, 2022; accepted May 29, 2023
July 17, 2023
120 (30) e2219340120

Significance

Coordinated functioning of the two cortical hemispheres is crucial for perception, and neurodevelopmental disorders are associated with altered functional connectivity. The human auditory cortex shows functional lateralization, but the origin of lateralization is unknown. We developed a widefield microscope enabling bilateral calcium imaging of large areas of both temporal lobes in mice. Our results show that while auditory cortex topography is highly symmetrical, the higher-order area A2 shows functional lateralization to high-frequency tones and adult vocalizations. Functional circuit analysis shows that A2 has lower and asymmetric hemispheric functional connections, which might underlie the functional lateralization. Lack of sound experience prevents the development of asymmetric connections. Therefore, altered interhemispheric connections should be considered when treating developmental sensory disorders such as congenital deafness.

Abstract

Coordinated functioning of the two cortical hemispheres is crucial for perception. The human auditory cortex (ACx) shows functional lateralization with the left hemisphere specialized for processing speech, whereas the right analyzes spectral content. In mice, virgin females demonstrate a left-hemisphere response bias to pup vocalizations that strengthens with motherhood. However, how this lateralized function is established is unclear. We developed a widefield imaging microscope to simultaneously image both hemispheres of mice to bilaterally monitor functional responses. We found that global ACx topography is symmetrical and stereotyped. In both male and virgin female mice, the secondary auditory cortex (A2) in the left hemisphere shows larger responses than right to high-frequency tones and adult vocalizations; however, only virgin female mice show a left-hemisphere bias in A2 in response to adult pain calls. These results indicate hemispheric bias with both sex-independent and -dependent aspects. Analyzing cross-hemispheric functional correlations showed that asymmetries exist in the strength of correlations between DM-AAF and A2-AAF, while other ACx areas showed smaller differences. We found that A2 showed lower cross-hemisphere correlation than other cortical areas, consistent with the lateralized functional activation of A2. Cross-hemispheric activity correlations are lower in deaf, otoferlin knockout (OTOF−/−) mice, indicating that the development of functional cross-hemispheric connections is experience dependent. Together, our results reveal that ACx is topographically symmetric at the macroscopic scale but that higher-order A2 shows sex-dependent and independent lateralized responses due to asymmetric intercortical functional connections. Moreover, our results suggest that sensory experience is required to establish functional cross-hemispheric connectivity.
Processing of sensory information is essential for animal behavior, procreation, and survival. The brain achieves efficiency through the coordinated activation of regions as well as through the specialization of regions to specific tasks. The two hemispheres of the brain demonstrate both functional connectivity and functional lateralization (1), which are essential for normal brain function. For example, the left hemisphere of the human auditory cortex (ACx) specializes in speech processing, whereas the right hemisphere analyzes spectral content (2–5). Irregularities such as reduced laterality and interhemispheric functional connectivity can result in auditory hallucinations (3, 6–8), and atypical lateralization in children with a unilateral cochlear implant leads to difficulty in speech processing and sound localization (9). Thus, key unresolved questions are how lateralized responses arise on the circuit level and how functional circuits develop.
Mouse ACx consists of two primary regions—primary auditory cortex (A1) and anterior auditory field (AAF)—surrounded by higher-order regions including the secondary auditory cortex (A2), ultrasonic field (UF), dorsoposterior (DP), and dorsomedial field (DM) (10–13). Primary auditory areas typically receive projections from the ventral medial geniculate body (MGBv), whereas higher-order areas receive projections from the higher-order dorsal MGB (MGBd) and cortical regions outside of ACx (14–16). While higher-order areas process more complex stimuli, the level of the ACx hierarchy at which lateralization emerges is unknown.
Similar to humans, mouse ACx shows a functional hemispheric bias. Presenting pup calls to the right but not the left ear of mothers elicits maternal behavior (17), indicating a hemispheric difference in sound perception. In response to adult vocalizations, c-Fos labeling in mice shows more active neurons in the left ACx than in the right ACx (18, 19) and a larger total area of left versus right ACx (18, 20). Moreover, inactivating the left ACx in mothers hinders pup retrieval (21), indicating a specialized functional lateralization. However, comparison of the hemispheres utilizing group averages overlooks individual differences in lateralization that arise from unique sensory experiences. In humans, fMRI can compare the functional responses between hemispheres in individual subjects. However, investigations of bilateral responses in mouse ACx have not been done, preventing the identification of individual differences in the lateralization of ACx subfields. In addition, the use of c-Fos labeling eliminates nuances in neural activity that occur in time and does not allow investigation of functional circuits underlying functional lateralization. To overcome these limitations, we constructed a bilateral in vivo widefield microscope to simultaneously image both hemispheres of awake adult mice. We found that the size and topographic organization of ACx subfields are stereotypical and symmetric across mice. However, sex-specific hemispheric differences were present in the amplitude of sound-evoked responses of A2. Deriving functional connectivity maps showed that A2 had lower interhemispheric functional connectivity in comparison to other ACx regions. We also identified asymmetrical interhemispheric functional connectivity between A1, A2, AAF, and DM, but interanimal variability was high. Last, using mice with congenital sensorineural hearing loss, we show that deafness prevents the normal development of cross-hemispheric connections. Together, our results show that asymmetric functional connections contribute to functional lateralization and that those connections are sculpted by sensory experience.

Results

In order to measure functional lateralization in mouse ACx and identify individual differences in lateralization, we constructed a bilateral widefield microscope. The microscope uses two CMOS cameras (20 Hz frame rate) and LED lights mounted at a ~90-degree angle to capture images from both ACxs (Fig. 1A). We imaged awake, head-fixed male and virgin female Thy1-GCaMP6s × CBA F1 mice (virgin female n = 5, male n = 5), which retain good hearing into adulthood (12, 22) (Fig. 1 A and B). We visualized neural activity in both hemispheres simultaneously during spontaneous activity as well as in response to 4 to 95 kHz tones, nine vocalizations with spectral content between 64 and 95 kHz, and adult pain calls containing low-frequency spectral power (23), all presented at 35, 50, and 65 dB (Fig. 1C and SI Appendix, Fig. S1A).
Fig. 1.
Schematics of the experimental setup. (A) Neural activity in the auditory cortices of awake head-fixed mice was captured with the bilateral widefield microscope. (B) Craniotomy on the temporal lobes (Top) allows optical access to a circular area covering the auditory cortex in each hemisphere (Middle). In addition to multiple fields of the auditory cortex (A1 = primary auditory cortex, A2 = secondary auditory cortex, AAF = anterior auditory field, DM = dorsomedial field, and UF = ultrasonic field), parts of the somatosensory cortex (S2 = secondary somatosensory cortex and PV = somatosensory parietal–ventral areas) were also in the field of view (Bottom). Horizontal scale bar represents 600 µm. (C) During the imaging session, mice were exposed to one of four types of audio environments: silence (spontaneous activity), pure tones, vocalizations, or pain calls.

Mouse ACx Shows Stereotypical and Highly Symmetrical Organization.

Many studies comparing ACx topography across animals show some degree of stereotypic organization; however, individual differences are also prominently observed (12, 13). To quantify individual differences in topographic organization, we created a quantitative method to compare topography and then utilized this method to investigate the similarity of topographic organization within an individual animal and across mice.
We captured 10 min of spontaneous activity as well as responses to tone stimuli on both cameras simultaneously (Fig. 1 A–C). We then calculated the correlation between each pixel in the left hemisphere and each pixel in the right hemisphere during spontaneous activity (Fig. 2A). We found high interhemispheric correlations (R > 0.95) in areas corresponding to the secondary somatosensory cortices (S2) (24–26) and somatosensory parietal–ventral areas (PV) (27–30). We used these two landmarks to create a Cartesian coordinate system and mapped the locations of peak responses to sound stimulation from both hemispheres onto this coordinate system. For each cortical area (n = 5 areas), we then measured the spatial distance (in μm) between the locations of peak response in the left and right hemispheres within mice (Fig. 2 A and B). Using this analysis, we found that the average distance between locations of peak response in the left and right hemispheres within an individual mouse across all frequencies and cortical areas, denoted as LvR, was 169 μm (SD: 104.9 μm) (Fig. 2 C and D). Thus, peak areas of activation in each hemisphere were highly similar.
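As an illustration of this alignment step, the following is a minimal MATLAB sketch (not the original analysis code); the inputs dffL, dffR, and peakXY are hypothetical, the assignment of the two correlation hotspots to S2 and PV would need to be checked manually, and the full pixel-by-pixel correlation is memory heavy, so in practice the movies would be spatially binned first.

```matlab
% Hypothetical inputs: dffL, dffR are [H x W x T] dF/F movies of the left and
% right hemispheres; peakXY is the (x, y) pixel location of one peak response.
[H, W, T] = size(dffL);
pixL = reshape(dffL, [], T)';            % T x (H*W), one column per left pixel
pixR = reshape(dffR, [], T)';            % T x (H*W), one column per right pixel
C = corr(pixL, pixR);                    % cross-hemisphere pixel correlations

% Pixels involved in very high cross-hemisphere correlations (R > 0.95) mark
% the S2/PV landmarks; cluster them and take the blob centroids.
hotL  = reshape(max(C, [], 2) > 0.95, H, W);
cc    = bwconncomp(hotL);
props = regionprops(cc, 'Centroid');     % expect two blobs (S2 and PV)
s2 = props(1).Centroid;  pv = props(2).Centroid;

% Coordinate system with S2 at the origin and PV on the y axis; express a
% peak-response location in this frame so hemispheres and mice can be compared.
yAxis = (pv - s2) / norm(pv - s2);
xAxis = [yAxis(2), -yAxis(1)];           % orthogonal in-plane axis
R = [xAxis; yAxis];                      % rotation into the S2-PV frame
peakGlobal = R * (peakXY(:) - s2(:));    % peak location in global coordinates
```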
Fig. 2.
Geometric comparison of topography within and across mice. (A) Correlation of pixel traces was calculated for all pixels across both hemispheres (Left). Correlation hotspots (labeled S2 and PV) were located (Middle). The centroids of S2 and PV were used to define the orientation of the y axis of a coordinate system with S2 located at the origin and PV located on the y axis. Images were rotated to the new global axis to compare locations of peak activation of each cortical area (Right). (B) Left and right hemisphere axes were aligned so that location of peak activation could be compared. Red asterisks on axes designate left S2 and PV, and green asterisks on axes designate right S2 and PV. (C) Example of peak activation geometries across different tone frequencies in three mice (M1–M3). All three mice show symmetry between the hemispheres, and a stereotypical organization of peak activation is observed. (D) The difference between the (x, y) coordinates of the peak activation in the left hemisphere and the right hemisphere within each mouse (LvR, n = 111), the difference between the (x, y) coordinates of the peak activation between corresponding cortical areas in the left hemisphere across mice (LvL*, n = 495), the difference between the (x, y) coordinates of the peak activation between corresponding cortical areas in the right hemisphere across mice (RvR*, n = 495), and the difference between the (x, y) coordinates of the peak activation between the left hemisphere and the corresponding cortical area in the right hemisphere across mice (LvR*, n = 495) were calculated. Differences within mice are significantly smaller than across mice (***P < 0.001, two-sample t test, Bonferroni corrected). (E) Difference between left and right hemisphere peak location does not depend on stimulus frequency. (F) Difference between left and right hemisphere peak location does not significantly depend on cortical area. (G) Average activation field across seven mice is rotated to the global axis and overlaid to give a generalized response field to tones.
The distance between locations of peak response across the hemispheres did not depend on sound frequency (Fig. 2E). Within ACx, most areas show similar hemispheric differences and variability [A1: 149.5 ± 73.2 μm; A2: 189.6 ± 84.9 μm; AAF: 158.4 ± 66.0 μm; DM: 128.8 ± 159.8 μm] (Fig. 2F). UF showed the largest difference across hemispheres (247.0 ± 159.8 μm), possibly due to weaker fluorescence levels emitted from this area than from all other cortical areas (Fig. 2F).
Next, we compared the location of peak responses across mice. We grouped the data points representing difference in location of peak response between the left hemispheres across mice (LvL*), between the right hemispheres across mice (RvR*), and between the left and right hemispheres across mice (LvR*) (Materials and Methods). The average difference in the location of peak responses across all mice was 306 ± 166.7 μm for the left and 295.5 ± 168.1 μm for the right hemisphere (Fig. 2D). Comparing left versus right hemispheres across mice results in a difference of 309.9 ± 179.2 μm. Our findings show that across mice, the locations of peak response only differ by ~300 μm and that relative positioning of vectors is similar across all frequency levels (Fig. 2 C and D). Thus, there is a stereotypical geometry of peak response across mice (Fig. 2 B and C). For all cases, the difference in peak location across mice is significantly higher than differences within individuals (L Across vs. LR Within: P = 6.26e-16, R Across vs. LR Within: P = 1.12e-13, LR Across vs. LR Within: P = 2.10e-13, two-sample t test, Bonferroni corrected). This finding highlights that while ACx organization is stereotyped, the topographic organization between hemispheres is more similar within mice than across mice.
Our results indicate that the mouse auditory cortex demonstrates stereotypical subfield organization across mice and that topography between the hemispheres within an individual mouse is symmetrical regardless of stimulus frequency or cortical region (Fig. 2G). Higher similarity within an individual suggests that individual sensory experience might impact cortical organization, while stereotypy across animals also suggests a hard-wired, stereotypical mechanism underlying organization.

Area of Activation to Sound Stimuli Shows Lateralization.

PET and fMRI imaging studies in humans demonstrate that the left hemisphere has a larger area of activation in response to speech stimuli during both passive listening experiments (3–5, 31) and behavioral tasks (2). In mice, electrophysiological mapping also shows a larger overall active area in the left hemisphere than the right; however, these experiments did not compare hemispheres within individual mice and solely measured the area of responses to pure tone stimuli (20). c-Fos labeling studies found a higher number of labeled neurons in the left hemisphere in response to vocalizations, but the use of c-Fos labeling prevents the comparative analysis of pure tone and vocalization stimuli and precludes collection of multiple trials within an individual animal (19).
We thus set out to confirm whether the left hemisphere has a larger total area of activation by comparing the activity in both hemispheres in response to pure tone and adult vocalization stimuli within individual mice. We calculated the total area of activation—using 35% and 50% of the maximum image value as thresholds (Materials and Methods)—as well as the fractional size for each ACx area (SI Appendix, Figs. S3 and S5 C and F). While tones activated A1, AAF, DM, A2, and UF, vocalization did not strongly activate A1 (Fig. 3A).
Fig. 3.
Area of activation to sound stimuli shows lateralization. (A) Images illustrating the area of activation for each cortical area across all stimulus types. Active areas are defined as pixels with a normalized fluorescence level [ΔF/F(%)] greater than 35% of the maximum value in the image. Vocalizations evoke little to no activation in A1. (B) Hemispheric Area Bias (HAB) [(% total areaLeft − % total areaRight)/(% total areaLeft + % total areaRight)] of A1 in response to each stimulus type is shown for 10 mice in orange. Percent total area is calculated by summing the number of active pixels of a region and dividing by the total number of active pixels in the entire ACx. Left hemisphere preference is designated with opaque shading, while right hemisphere preference is shown with transparent shading. The average HAB across males (blue) and females (red) is also shown. (C) HAB of A2 (purple) shows a left hemisphere preference to 4 to 64 kHz tones (P = 0.0097, n = 10, one-sample t test) and 70 to 95 kHz tones (P = 0.0260, n = 10, one-sample t test) across all mice. Comparing male and female HAB of A2, only males show a significant preference to 4 to 64 kHz tones (P = 0.0072, n = 5, one-sample t test). (D) HAB of DM (blue) shows a right hemisphere preference to 4 to 64 kHz tones (P = 0.0077, n = 10, one-sample t test). Comparing male and female HAB of DM, only males show a significant preference to 4 to 64 kHz tones (P = 0.0251, n = 5, one-sample t test). Horizontal scale bars represent 600 µm.
For males and females grouped together, the activation area was similar between the left and right hemispheres for all three stimulus categories at both the 50% and 35% thresholds (SI Appendix, Fig. S5 A and D). Separated by sex, males and females also had nearly identical responsive areas for ultrasonic tones and vocalizations (SI Appendix, Fig. S5 B and E).
Comparing population averages of the fractional size of each ACx area showed that relative sizes were similar across hemispheres for low-frequency tones, high-frequency tones, and vocalizations (SI Appendix, Fig. S5 C and F). However, high variability in measurements was observed between mice. To minimize the effect of individual animal differences, we computed a Hemispheric Area Bias Index (HAB): (% total areaLeft − % total areaRight)/(% total areaLeft + % total areaRight) for A1, A2, and DM for each mouse (Fig. 3 B–D and SI Appendix, Fig. S4 B–D). When comparing the HAB, we found a left hemisphere bias in A2 to 4 to 64 kHz tones at both thresholds and to 70 to 95 kHz tones at the 35% threshold and a right hemisphere bias in DM in response to 4 to 64 kHz tones at both thresholds (Fig. 3D and SI Appendix, Fig. S4D). Together, these findings suggest that the hemispheres show differences in the way in which the activity is divided across ACx areas but that across-animal variability is high.
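A minimal sketch of the HAB computation for one cortical area (MATLAB; pctAreaLeft and pctAreaRight are hypothetical vectors holding the percent of total active ACx area occupied by that region in each mouse; not the original analysis code):

```matlab
% Hemispheric Area Bias per mouse for one region (hypothetical inputs).
% pctAreaLeft(m), pctAreaRight(m): percent of total active area in mouse m.
HAB = (pctAreaLeft - pctAreaRight) ./ (pctAreaLeft + pctAreaRight);

% Positive values indicate a left-hemisphere bias; a one-sample t test asks
% whether the mean bias across mice differs from zero.
[~, p] = ttest(HAB);
fprintf('mean HAB = %.3f, one-sample t test P = %.4f\n', mean(HAB), p);
```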

Vocalizations Evoke Higher Response Amplitude in Left A2 than Right A2.

Our results indicate that the activated areas in ACx of each hemisphere differ. In addition to differences in the area of activation, the amplitude of neural responses could show hemispheric differences. Indeed, in the human ACx, activation of left-hemisphere regions such as the planum temporale (PT) is higher than that of the corresponding right-hemisphere regions (4). While behavioral studies suggest that there is a functional specialization of the left hemisphere for mouse vocalizations and pup calls (18, 21, 32), no direct comparison of the amplitude of activation between the hemispheres exists in mice. Given that each cortical area within the mouse ACx has a different peak response to tones and vocalization stimuli, with the strongest responses to vocalizations occurring outside of A1 (Fig. 3A), conflicting reports exist on which cortical area underlies vocalization processing, e.g., A2 for pup calls (18), DM of the left hemisphere of females for courtship calls (11), or DP or VPAF for general higher-order processing (13).
To resolve these issues, we calculated the peak response for each ACx area and stimulus by finding the maximum normalized fluorescence in each cortical area in response to pure tone and adult vocalization stimuli and compared this measurement across areas and hemispheres. We analyzed adult pain calls separately because these vocalizations contain a higher power of low frequencies. Adult pain calls are similar to pup wriggling calls but span a larger overall frequency spectrum, allowing us not only to identify hemispheric bias more independently of frequency bias but also to uncover potential sex-specific biases (18, 23, 33, 34). First, we calculated the Hemispheric Response Bias [HRB: (ResponseLeft − ResponseRight)/(ResponseLeft + ResponseRight)] for each mouse individually (Fig. 4A). In A2, the HRB was positive for 9/10 mice, indicating that the left hemisphere was more strongly activated than the right hemisphere (Fig. 4A). Of note, 1/10 mice had a starkly opposite trend in response to 70 to 95 kHz tones and vocalizations at 50 and 65 dB (outlined in red) (Fig. 4A). Human studies show that in rare cases, language lateralization can be mirrored, especially in individuals who are left-handed (5, 35, 36); therefore, perhaps a small percentage of mice also demonstrate a form of “handedness” in functional lateralization. We removed this mouse from the group averages for subsequent analysis.
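The HRB has the same normalized-difference form as the HAB; a short MATLAB sketch (peakLeft and peakRight are hypothetical matrices of peak ΔF/F per mouse and stimulus, not the original analysis code) is:

```matlab
% Hemispheric Response Bias per mouse and stimulus (hypothetical inputs).
% peakLeft, peakRight: [nMice x nStimuli] peak dF/F (%) in A2 for each hemisphere.
HRB = (peakLeft - peakRight) ./ (peakLeft + peakRight);

% Rows that are positive for most stimuli are left-biased; a consistently
% negative row would flag a "mirrored" animal like the one excluded here.
leftBiased = mean(HRB, 2) > 0;
```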
Fig. 4.
Vocalizations evoke higher response amplitude in A2. (A) The HRB [(ResponseLeft − ResponseRight)/(ResponseLeft + ResponseRight)], where the left and right responses are the normalized fluorescence responses, is shown. The normalized fluorescence response is calculated by dividing the change in fluorescence by the baseline fluorescence [ΔF/F(%)]. The HRB is shown for A2 of individual mice in response to 4 to 64 kHz tones, 70 to 95 kHz tones, adult vocalizations, and adult pain calls at 50 dB and 65 dB. Of note, 9/10 mice show a left hemisphere bias in response to high-frequency tones and vocalization stimuli at both sound levels. The animal with the opposite trend is highlighted in red and eliminated from subsequent averages. The group-averaged HRB is shown as an inset in (A). At 65 dB, female mice show a left hemisphere bias to 70 to 95 kHz tones (P = 0.0482, n = 4, one-sample t test), vocalizations (P = 0.0417, n = 4, one-sample t test), and adult pain calls (P = 0.007, n = 4, one-sample t test), whereas male mice only show a left hemisphere bias to 70 to 95 kHz tones (P = 0.0236, n = 5, one-sample t test). At 50 dB, male mice show a left hemisphere bias in response to 70 to 95 kHz tones (P = 0.0099, n = 5, one-sample t test) and vocalizations (P = 0.0098, n = 5, one-sample t test), whereas female mice show no significant left hemisphere bias at this sound level. No data from mouse 5 are included in the adult pain call trials due to damage to the window prior to recording. (B) The normalized fluorescence levels for each tone frequency and vocalization stimulus type are averaged across mice and plotted. Solid lines show the mean across mice, and shading indicates SEM. (C) Left hemisphere A2 has a significantly higher response to 32 to 82 kHz tones. Of note, 8/9 vocalization trials demonstrate a left hemisphere bias of A2. No other cortical areas show an HRB.
We plotted the average peak responses across trials for each hemisphere (SI Appendix, Fig. S6 A and B). Each line connects the left and right hemisphere response within one mouse. We found that left A2 showed larger responses to 70 to 95 kHz tones (P = 0.040, n = 9, Wilcoxon rank-sum) and adult vocalizations (P = 0.045, n = 9, Wilcoxon rank-sum) at 50 dB. Computing the HRB based on sex showed that for adult vocalizations, the HRB is similar across both males (HRB: 0.21, SEM: 0.045) and females (HRB: 0.15, SEM: 0.050) at 50 dB and at 65 dB (male HRB: 0.10, SEM: 0.050; female HRB: 0.12, SEM: 0.34) (Fig. 4A).
However, only females show significant left hemisphere response bias to adult pain calls at 65 dB (P = 0.007, n = 4, one-sample t test) with an HRB of 0.14 ± 0.021, which is similar to that of adult vocalizations. Female responses to vocalizations and adult pain calls only show a left hemisphere bias in A2 at 65 dB, whereas male responses show bias to vocalizations at 50 dB (Fig. 4A). Thus, A2 shows a sex-specific HRB for vocalizations.
Bias to vocalizations could be due to a bias in frequency responses. We thus investigated how the response bias depends on tone frequency. We find that A2 shows a left hemisphere bias only for high-frequency tones (32 kHz to 82 kHz) (Fig. 4B). When we averaged peak activation in A2 across all mice for each vocalization, we found a left hemisphere response bias (P < 0.05, n = 9, paired-sample t test) in 8/9 vocalization stimuli (Fig. 4C). In addition, the left hemisphere A2 showed higher activation than all other cortical areas regardless of the hemisphere (SI Appendix, Fig. S6 A and B). None of the other cortical areas analyzed (A1, AAF, DM, and UF) showed a hemispheric difference to tones (at any frequency) or vocalizations (SI Appendix, Fig. S6 A and B). Given that we observe a response bias in left A2 to pain calls containing low frequencies, the bias of left A2 toward vocalization stimuli is not simply due to higher responsiveness to frequencies within the vocalization range. Together, these results show that ACx and, in particular, A2 show functional lateralization to different stimuli in a sex-dependent manner.

Activity Correlations Reveal Lower Functional Connectivity between A2s and Specific Circuit Asymmetry.

So far, we have shown that while the functional areas are similar across hemispheres, there exists lateralization of the strength of functional activation. We next sought to understand the functional connectivity giving rise to this lateralization. Bilateral imaging allows us to calculate cross-hemisphere correlations to derive functional connectivity maps. In particular, correlations in spontaneous activity (37) and noise correlations (38) are reflective of connectivity between neurons (8, 39, 40) both within and across hemispheres. Noise correlations are a measure of shared trial-to-trial response fluctuations between two neurons or two cortical areas (38, 41). Correlated fluctuations in spontaneous activity, recorded at resting state, are an alternative way of inferring functional connectivity (39, 42). There are strengths to each method, so we used both to investigate the functional connectivity of ACx within and across hemispheres.
We first set out to investigate the pattern of interhemispheric correlations in spontaneous activity. We used nine pure-tone stimuli to identify sound-responsive regions and identified ROIs (n = 10 mice). Subsequently, we recorded 10 min of spontaneous activity (12,000 frames) in silence without adjusting the head position or any imaging settings. We then collected all the ROIs identified from each tone for each animal and calculated the interhemispheric correlations in the spontaneous activity of the identified regions. In general, matching auditory fields (Fig. 5A) showed higher mean correlations than “nonmatching fields” (Fig. 5B). Moreover, among matched fields, left DM paired with right DM (DM-DM) showed a high correlation coefficient of 0.93, second to UF-UF (norm. correlation coefficient = 0.99), and A2-A2 had the lowest correlation coefficient [normalized (for each animal) correlation coefficient = 0.89] among the five pairings (all P < 0.05, n = 10, bootstrapping hypothesis test, boots = 100).
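To illustrate the comparison of matched-field correlations, a simplified MATLAB sketch is shown below (hypothetical inputs; rAll, colA2, and colDM are assumed names, and the resampling scheme is only one simple variant of a bootstrap hypothesis test, not necessarily the one used in the study):

```matlab
% Interhemispheric correlation of matched fields from spontaneous activity
% (hypothetical inputs). traceL.A2, traceR.A2, etc. are [T x 1] mean dF/F
% traces of the corresponding ROIs in each hemisphere of one mouse.
rA2 = corr(traceL.A2, traceR.A2);
rDM = corr(traceL.DM, traceR.DM);

% Across mice, normalize each animal's coefficients to its own maximum before
% pooling. rAll: [nMice x nPairs] matched-pair correlations; colA2/colDM index
% the A2-A2 and DM-DM columns.
rNorm = rAll ./ max(rAll, [], 2);

% Simple bootstrap (boots = 100): is the A2-A2 pairing lower than DM-DM?
boots = 100;  n = size(rNorm, 1);  d = zeros(boots, 1);
for b = 1:boots
    idx  = randi(n, n, 1);                      % resample mice with replacement
    d(b) = mean(rNorm(idx, colDM) - rNorm(idx, colA2));
end
p = mean(d <= 0);                               % fraction of resamples showing no difference
```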
Fig. 5.
Interhemispheric functional connectivity revealed by spontaneous activity. (A) Normalized correlation coefficients of spontaneous activity between matched fields (*P < 0.05, **P < 0.05, n = 10, bootstrapping hypothesis test, boots = 100) and (B) nonmatched fields in the left/right hemisphere vs. the same pair of fields in the right/left hemisphere, e.g., left A1-right DM vs. right A1-left DM. (C) Hemispheric bias between nonmatched fields. The first 10 columns of each panel show the hemispheric bias of individual mice for the corresponding field combinations, e.g., [Corr(left A1, right DM) − Corr(right A1, left DM)]/[Corr(left A1, right DM) + Corr(right A1, left DM)]. The 11th column (T) shows the pooled data from both sexes. Note that the result in column 11 does not include mouse 4 (male) as it is an outlier. The test statistic P values for the pooled data are shown next to the field labels (*P < 0.05, n = 9, Wilcoxon signed-rank test). (D and E) Similar to A and B, but with samples drawn from ROIs elicited by 4 kHz, 16 kHz, and 64 kHz to spatially separate the ROI locations: normalized correlation coefficients of spontaneous activity between (D) matched fields (*P < 0.05, **P < 0.05, ***P < 0.05, and ****P < 0.05, Tukey–Kramer test) and (E) nonmatched fields (*P < 0.05, t test, Bonferroni corrected). Sample number N for each pair: N(A1–A1) = 51, N(DM–DM) = 29, N(A2–A2) = 69, N(AAF–AAF) = 75, N(UF–UF) = 7, N(A1–DM) = 34, N(DM–A1) = 41, N(A1–A2) = 58, N(A2–A1) = 60, N(A1–AAF) = 58, N(AAF–A1) = 63, N(A1–UF) = 19, N(UF–A1) = 18, N(DM–A2) = 46, N(A2–DM) = 43, N(DM–AAF) = 49, N(AAF–DM) = 45, N(DM–UF) = 16, N(UF–DM) = 13, N(A2–AAF) = 72, N(AAF–A2) = 72, N(A2–UF) = 23, N(UF–A2) = 21, N(AAF–UF) = 24, and N(UF–AAF) = 22. (F and G) Functional connectivity between fields across hemispheres, shown in C and E, respectively, is summarized by linking preferred connections in purple lines.
To investigate whether the correlations between nonmatching fields were similar across interhemispheric pairings, e.g., whether the correlation between left DM and right AAF is the same as that between left AAF and right DM, each nonmatching field pair was placed side by side. This comparison showed that the correlation of left DM to right AAF was not significantly weaker than that of right DM and left AAF (Fig. 5B); however, when we compare the correlation difference within individual animals (Fig. 5C), e.g., [Corr(left A1, right DM) − Corr(right A1, left DM)]/[Corr(left A1, right DM) + Corr(right A1, left DM)], it shows a bias to the right in DM-AAF (P = 0.012, n = 9, Wilcoxon signed-rank) and a marginal bias in A2-AAF (P = 0.055, n = 9, Wilcoxon signed-rank). This indicates that while asymmetries exist, the individual variability is high. Individual mice showed varying degrees of asymmetry, but asymmetries in DM-AAF were consistently present (Fig. 5C). These results indicate that a number of auditory regions show asymmetries in their functional correlations and that most asymmetries include DM and AAF. Lumping ROIs from each subregion (e.g., A1) for each animal ignores tonotopic differences across subareas, and such lumping might obscure more specific effects. We thus analyzed ROIs defined by responsiveness to different frequencies separately. Given the tuning properties of ACx neurons, some pixels could be activated by tones with nearby frequencies and thus contribute to multiple ROIs. To minimize this potential confound, we only used ROIs corresponding to 4 kHz, 16 kHz, and 64 kHz, which reduces the coactivation to <30% (SI Appendix, Fig. S7). By separating subareas, we still find that A2 shows the lowest interhemispheric correlations and that the interhemispheric bias exists in DM-AAF, A2-AAF, and A1-DM (Fig. 5 D and E).
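A minimal sketch of this within-animal asymmetry measure for one nonmatched pair (MATLAB; rLtoR and rRtoL are hypothetical vectors over mice, not the original analysis code):

```matlab
% Asymmetry of one nonmatched interhemispheric pairing, e.g., DM-AAF
% (hypothetical inputs).
% rLtoR(m): correlation of left DM with right AAF in mouse m
% rRtoL(m): correlation of right DM with left AAF in mouse m
bias = (rLtoR - rRtoL) ./ (rLtoR + rRtoL);

% A Wilcoxon signed-rank test against zero asks whether the pairing is
% consistently stronger in one direction across animals.
p = signrank(bias);
```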
We next used all the ROIs identified and paired them across the brain. As a consequence, each animal has multiple ROIs for each field to pair with the other hemisphere (e.g., the low- and high-frequency areas of A1), and ROIs determined by nearby frequencies can overlap. In this analysis, spontaneous activity correlations between nonmatched fields show significant asymmetries between A1-A2, A1-AAF, DM-A2, DM-AAF, and A2-AAF (SI Appendix, Fig. S8).
Sound stimuli increase overall activity levels in the cortex; therefore, noise correlation analysis might allow the detection of interactions between cells having low spontaneous activity rates and thus reveal additional asymmetries. We measured noise correlations during the 1 s tone presentation. In general, noise correlations matched results from spontaneous activity, with matching auditory fields showing higher correlations than “nonmatching fields” (SI Appendix, Fig. S9). Moreover, as with correlations determined from spontaneous activity, DM-DM had the highest correlation, whereas A2-A2 had the lowest (SI Appendix, Fig. S9). The interhemispheric functional connectivity between nonmatched fields revealed by noise correlations was similar (SI Appendix, Fig. S9B), possibly due to contamination of the stimulus responses into the noise correlations (43) or the higher sensitivity of our spontaneous activity recordings due to their longer duration (10 min vs. 1 s, thus 12,000 vs. 20 frames). The noise correlation between A1 and DM (SI Appendix, Fig. S9B, DM-A1: 0.73, A1-DM: 0.72) and between DM and UF (SI Appendix, Fig. S9B, UF-DM: 0.77, DM-UF: 0.78) was equivalent to or higher than the lowest value found in matched fields, A2-A2 (SI Appendix, Fig. S9A, r = 0.72), which suggests that DM and UF might be parts of A1 instead of separate fields (11).
Together, our results show that the matched-field functional connectivity was higher than that of nonmatched fields and that A2-A2 shows the lowest interhemispheric functional connectivity. Our results also show that asymmetrical interhemispheric functional connectivity exists between nonmatching fields of the auditory cortex.

Interhemispheric Functional Connectivity Is Reduced in Congenitally Deaf Mice.

We showed that ACx fields are functionally connected across the hemispheres and that circuit asymmetries exist. Sound experience can shape functional circuits in ACx and tonotopic maps (44–47). We thus investigated whether the development of interhemispheric functional connectivity is influenced by sensory experience. To answer this question, we imaged mice with congenital sensorineural hearing loss. Otoferlin is required for synaptic release from cochlear inner hair cells, and mutations in the gene encoding otoferlin (OTOF) cause sensorineural hearing loss (48–51). OTOF knockout (OTOF−/−) mice show reduced auditory cortical responses in early development (47), diminished auditory brainstem responses (ABRs) 3 wk after birth, and are profoundly deaf as adults (52). We compared the interhemispheric functional connectivity in normally hearing F1 mice (female n = 5, male n = 5) with that in otoferlin knockout mice (OTOF−/−, female n = 3, male n = 2) by recording spontaneous activity from both hemispheres for 10 min. However, since sound-evoked responses cannot be utilized to locate ACx in OTOF−/− mice, we developed two different approaches to recognize ROIs falling within the putative ACxs. The first method relies on fitting an average ACx geometry model derived from normal hearing control (OTOF+/+) mice (Fig. 2), and the second method relies on clustering neighboring pixels that were intrahemispherically correlated. Compared with normal hearing mice (Fig. 6 A, Left), interhemispheric functional connectivity was lower in OTOF−/− mice (Fig. 6 A, Right and Fig. 6B and SI Appendix, Fig. S12). Despite the overall reduction in correlation, the matched fields remained relatively higher in correlation than the nonmatching fields in OTOF−/− mice. These results suggest that sensory experience is required to establish interhemispheric functional correlations.
Fig. 6.
Development of inter- and intrahemispheric functional connectivity is abnormal in OTOF−/− mice. (A) Mean value of interhemispheric correlation coefficients across ACx areas in control (OTOF+/+) mice and OTOF−/− mice. (B) Mean correlation coefficient difference between control and OTOF−/− mice. All P < 0.05 except A2-AAF and A2-UF (white dashed lines). (C) Interhemispheric functional connectivity between ACx fields in OTOF+/+ mice. The yellow lines connect pairs of fields that exceed a correlation coefficient threshold of 0.7. (D) Interhemispheric functional connectivity between projected ACx fields in OTOF−/− mice. (E) Intrahemispheric functional connectivity (Upper) and identified ROIs (Lower) in the left hemisphere of an example control mouse. (F) Same format as in E but for an example OTOF−/− mouse. (G) Comparison of correlated area ratio (Upper) and ROI number density (Lower) between control and OTOF−/− mice in both hemispheres (n = 10 for OTOF+/+, n = 5 for OTOF−/−, *P = 0.03, **P = 0.03, bootstrapping hypothesis test, boots = 100). Horizontal scale bars represent 600 µm.
To visualize how functional connectivity differs between the OTOF−/− and OTOF+/+ groups, we thresholded the interhemispheric correlations at r = 0.7 (Fig. 6 C and D) and found that the only ROI pair that stayed above this threshold was between projected left UF and right UF. UF is located near S2, which is dominated by head representation, and we speculate that sensory deprivation of auditory inputs may have induced cross-modal plasticity and that S2 may have taken over putative UF territory (53–56). Studies on congenitally deaf mice (57) and on other deaf animals (58, 59) show that regions of ACx, particularly A1 and AAF, can respond to somatosensory input primarily from the head region. Therefore, hearing is required to establish interhemispheric functional correlations between ACxs, and the presence of strong functional interhemispheric correlations between putative UF might be evidence of cross-modal reorganization.

Intrahemispheric Functional Connectivity Is Lower in Congenitally Deaf Mice.

We next investigated the effects of sound experience on intrahemispheric functional connectivity. Fig. 6 E and F shows intrahemispherically correlated pixels above a threshold of 0.95 for one example control mouse and one example OTOF−/− mouse, respectively. To visualize the functional connectivity, each pair of correlated pixels was linked with yellow lines with red terminal endpoints. Therefore, the regions covered with appreciable yellow shading indicate the clustering of functionally connected neuronal populations. As seen from the decrease in both the total area of correlated neighboring pixels (compare Upper panels of Fig. 6 E and F) and the total number of ROIs identified by clustering the correlated pixels (Lower panels of Fig. 6 E and F), the intrahemispheric functional connectivity of OTOF−/− mice within the projected ACx area appeared reduced. To compare the observations between animals, the number of ROIs and the correlated areas were normalized against the effective areas before comparing the two mouse groups. We defined the effective area for each hemisphere as the total number of pixels that were not blocked by the spatial mask, which is used to exclude the somatosensory cortex. Because the effective area may affect the number of ROIs and the correlated neighboring pixels identified, we divided the two values by the effective area for each animal, resulting in measures of effective ROI number and correlated area ratio, respectively. The effective ROI number was similar between control and OTOF−/− mice for both hemispheres (Fig. 6 G, Lower), but the range was larger for the left hemisphere of the OTOF−/− mice. Conversely, the mean correlated area ratios (Fig. 6 G, Upper) for OTOF−/− mice were lower compared to control mice (P = 0.03 left hemisphere; P = 0.03 right hemisphere, n = 10 for OTOF+/+, n = 5 for OTOF−/−, bootstrapping hypothesis test, boots = 100), which implies that intrahemispheric functional connectivity within the projected ACx region was weaker and that the development of these connections was affected by the absence of hearing experience.
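A minimal MATLAB sketch of the normalization described above (hypothetical inputs corrMask, validMask, and nROIs; not the original analysis code):

```matlab
% Normalize intrahemispheric connectivity measures by the effective imaged area.
% corrMask : [H x W] logical map of pixels whose correlation with neighboring
%            pixels exceeds the 0.95 threshold within one hemisphere.
% validMask: [H x W] logical map of pixels not excluded by the somatosensory mask.
% nROIs    : number of ROIs found by clustering the correlated pixels.
effArea       = nnz(validMask);                     % effective area in pixels
corrAreaRatio = nnz(corrMask & validMask) / effArea;
roiDensity    = nROIs / effArea;                    % effective ROI number
```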

Discussion

Using a bilateral widefield microscope, we visualized neural activity in both hemispheres of mouse ACx simultaneously during spontaneous activity as well as in response to tones and vocalizations. Our results show that ACx has a highly stereotypical and symmetric functional organization across hemispheres and that the functional activation of A2 is lateralized. Our results suggest that the functional lateralization of ACx that emerges in A2 is due to functionally asymmetric connectivity between left and right A2, DM, and AAF and that the development of these connections requires sensory experience. We also find that these three regions are strongly driven by vocalizations.
Traditionally, blood vessels or other markers are used to align cortical areas across animals (20, 60). Here, we show that functional measures, such as identifying the highly correlated activity between the secondary somatosensory cortices (S2), enable us to reliably pinpoint ACx location. Using the global coordinate system defined by S2 and PV, we revealed high symmetry within animals as well as stereotypical organization across animals. This method allows for more precise comparison of topography across animals and suggests that variability in prior studies might have resulted from variability of the blood vessel pattern or qualitative alignment of cortical areas (13, 61). Moreover, this method allows investigation of the impact of genetics or experience on ACx organization.
When testing both tonal stimuli and vocalizations and averaging over mice, we found that the total area of activation of ACx was similar between the hemispheres and that the relative sizes of each cortical field were similar between hemispheres. However, when comparing the hemispheres in individual mice, we found a consistently larger relative size of A2 in the left hemisphere, highlighting the importance of considering individual differences in lateralization analysis. Thus, our findings are consistent with prior results showing that the left ACx contains a larger area of activation and more active neurons (18, 20).
We found that the largest sound-evoked activation occurs in A2 in both hemispheres and that A2 in the left hemisphere has higher evoked amplitude than the right A2 in response to high-frequency tones and vocalizations. In particular, we show a left-hemisphere A2 bias for vocalizations that contain lower frequencies, such as in the adult pain call. Thus, we speculate that A2 is not only attuned to high frequencies but is also specialized in the processing of general vocalization stimuli. Unlike other higher-order areas, A2 receives little input from MGBd and instead receives input from the MGBv (15, 16, 62–64). However, the pathway from MGBv to A1 differs from the pathway from MGBv to A2. MGBv projections to A2 are broader, resulting in more broadly tuned individual neurons and more heterogeneous tuning between neighboring neurons (20, 60, 65, 66). The heterogeneity of tuning in A2 may better equip this region to process stimuli with varying temporal components, such as vocalizations. High tuning heterogeneity of neurons can result in more efficient and reliable processing, which could be useful in representing complex stimuli (67, 68).
High correlations suggest high functional connectivity between the two regions. Our results show that cross-hemisphere correlations between left and right A2 are significantly lower than all other cortical areas, implying lower cross-hemisphere functional connectivity in A2. Interhemispheric functional connectivity could impact functional lateralization: e.g., the dominant hemisphere could inhibit the nondominant hemisphere, thus increasing functional lateralization. In contrast, reciprocal excitatory connections between the dominant and nondominant hemisphere will reduce functional lateralization (1). In other words, higher interhemispheric functional connectivity between homotopic areas indicates higher functional similarity, or reduced functional lateralization, since both hemispheres readily share information (69). Therefore, the lower functional connectivity we observed between A2s signifies that there is hemisphere-specific specialization occurring in A2 that is not occurring in the other cortical areas in response to high-frequency tones and vocalizations. Despite the functional influence of interhemispheric projections (70), we cannot rule out the possibility that common ascending input from MGBs might contribute to the observed functional correlations. However, this would require tight synchronization of the MGBs.
We found that the functional connectivity between the hemispheres is not symmetrical but varies across individuals. For instance, left AAF has higher functional connectivity to right DM and right A2 than the reverse pairings, but this hemispheric difference does not emerge when individual variability is not accounted for. This is consistent with results in humans demonstrating that interhemispheric functional connectivity has a larger impact on functional lateralization than structural asymmetries (71).
Last, our results show that congenital deafness alters the formation of interhemispheric connections in ACx, suggesting that sensory experience shapes these connections. This is consistent with findings that auditory experience has a profound influence on the topographic organization and circuitry within ACx (44–47), but early spontaneous activity might also play a role (72). Sensory deprivation in one sensory modality at birth can lead to enhanced sensitivity of other spared sensory modalities through cortical region expansion (54, 59). We speculate that the interhemispheric functional connectivity in the OTOF−/− UF field, which is close to S2, could be evidence of cross-modal plasticity.
Using a bilateral widefield microscope, we were able to demonstrate hemispheric specialization and asymmetric functional connectivity in mice. Moreover, our data suggest that sensory experience is required to establish the asymmetric functional connectivity we observed.

Materials and Methods

Animals.

All procedures were approved by the Johns Hopkins University Animal Care and Use Committee. For hearing mice, a total of 10 mice were used (5 female and 5 male, imaged between 6 and 34 wk of age). Thy1-GCaMP6s mice (Jackson Laboratory Stock #024275) were crossed with CBA/CaJ mice (Jackson Laboratory Stock #000654) to prevent early-onset hearing loss (73). C57BL/6 mice are homozygous for the recessive Cdh23 allele, which causes this hearing loss. The resulting offspring are heterozygous Ahl+/Cdh23, ensuring that they have healthy hearing and express GCaMP6s under the Thy1 promoter, which is present in excitatory neurons. For congenitally deaf mice, a total of five mice were used (three female and two male, imaged between 8 and 14 wk of age). OTOF−/− mice expressing GCaMP6s were created by crossing offspring of OTOF−/− and Thy1-GCaMP6s breedings. Founders for our OTOF−/− breeding colony were generously provided by Tobias Moser (Göttingen). All mice were housed in a 12-h reverse light/dark cycle room in the institutional animal colony before and after experiments.

Bilateral Widefield Microscope.

The bilateral widefield microscope (Fig. 1A) consisted of a pair of identical widefield calcium imaging units synchronized by a shared trigger, each collecting images from one side of the temporal lobe. Blue LEDs (M470L4, Thorlabs), filtered between 450 and 490 nm (ET470/40x, Chroma), illuminated each side of the brain through the cranial windows. Emission filters (AT535/40m, Chroma) with a 515 to 555 nm bandpass collected the green fluorescence emitted from the calcium indicator GCaMP6s. A long-pass dichroic mirror with a cutoff at 499 nm (MD499-FITC, Thorlabs) was used to reflect excitation light while allowing fluorescent emission to propagate to the CMOS camera (CS235MU, Thorlabs). To ensure identical illumination power of the two units, we modulated the power between 2.9 and 3.1 mW with a DC voltage. Identical dry objectives (UplanSApo 4×/0.16, Olympus) and tube lenses (AC508-150A, Thorlabs) were used on both units to ensure an equivalent lateral magnification of 3.3×.

Craniotomy.

Mice were injected with 0.1 cc dexamethasone (2 mg/mL, VetOne) 2 to 3 h before surgery to reduce swelling during surgery. Mice were initially anesthetized with 4% isoflurane (Fluriso, VetOne) with continued exposure at 1.5 to 2.5% throughout surgery. Rectal body temperature of the mice was maintained at 36 °C using a heating pad. Hair was removed with scissors and a hair removal agent (Nair). The skin was sterilized by applying Betadine and then 70% ethanol three times. Skin on top of the head was then removed, exposing the skull. Connective tissue was scraped away from the skull using a scalpel in order to prepare the surface for application of the headpost, and the temporal muscle on both sides was resected to create space for two cranial window implants. Cyanoacrylate (Vetbond) was applied to open incisions to prevent bleeding and infection. The headpost was then secured to the top of the head using both super glue (Loctite) and dental acrylic (C&B Metabond, Parkell). A craniotomy with a diameter of approximately 3 mm was performed over the left auditory cortex first. The cranial window consisted of two circular 3-mm glass coverslips and one circular 4-mm glass coverslip secured together using optic glue (NOA71, Norland Products). Silicone elastomer (Kwik-Sil, World Precision Instruments) was carefully placed around the edge of the 4-mm coverslip, and the window was implanted with the 3-mm coverslips facing toward the cortex. Dental acrylic was carefully applied around the 4-mm coverslip to further secure the window in place. The same process was then repeated on the right side of the skull: Anatomical landmarks such as the temporal ridge, lambdoid suture, and the medial cerebral artery were used to ensure window symmetry across hemispheres. Immediately following surgery, mice were injected with 0.1 cc dexamethasone, 0.05 cc cefazolin (1 g/vial, West Ward Pharmaceuticals), and 5 mg/kg of carprofen (0.5 mg/mL, Zoetis US) and placed under a heat lamp for recovery. For 3 d following surgery, mice were injected with 0.1 cc dexamethasone and carprofen once a day to ensure proper healing from surgery. In addition, antibiotic water (Sulfamethoxazole and Trimethoprim Oral Suspension, USP 200 mg/40 mg per 5 mL, Aurobindo Pharms USA; 6 mL solution diluted in 100 mL water) replaced normal drinking water for 7 d following the procedure.

Widefield Imaging.

All data collection was completed at least 1 wk after the craniotomy surgery. Throughout the experiment, the mice were awake with the headplate fixed to a 3-axis translational stage, allowing position adjustments. A dark chamber padded with sound-absorbing foam enclosed the mice, the bilateral widefield microscope, and the speaker to minimize outside noise and light. The imaging system and the sound delivery system were controlled by a custom-made MATLAB graphical user interface (GUI). We set the imaging parameters of both imaging subsystems identically: frame rate 20 Hz, 4-by-4 binning with additional 10.9 dB gain. Binning was required to enhance the signal-to-noise ratio (SNR), and the additional gain was used to remove readout noise under low light.

Sound-Evoked and Spontaneous Responses.

Stimuli used in sound-evoked response experiments include two sets of pure tones and two sets of vocalizations. The first set of tones falls within the typical mouse hearing range, between 4 kHz and 64 kHz, with logarithmic spacing and two tones per octave, for a total of nine frequencies. The other set consists of five ultrasonic tones, ranging between 70 kHz and 94 kHz with an even spacing of 6 kHz, which coincides with the mouse vocalization spectrum. As for vocalization stimuli, the first set consists of nine different vocalization stimuli created using the “virtual mouse organ” (23), in which syllables recorded from mice were combined into bouts using a third-order Markov model. Each bout fell between 64 kHz and 94 kHz and contained 10 syllables. Vocalizations of P100 mice and older were used. The tenth vocalization stimulus consists of a bout of 10 adult pain calls, which contain abundant sound energy at low frequencies (Fig. 1C and SI Appendix, Fig. S1A). Both tone and vocalization stimuli were played for 1 s with a 3 to 3.5-s pause of silence in between. Each tone and vocalization stimulus was played at 35 dB, 50 dB, and 65 dB. Stimuli were played in a random order to prevent prediction of the stimulus. For both tones and vocalizations, each stimulus was repeated 10 times. Sound waveforms were loaded into an RX6 multifunction processor (Tucker-Davis Technologies) and fed through a PA5 attenuator (Tucker-Davis Technologies) to output the intended sound level. Output from the PA5 was fed to an ED1 speaker driver and, subsequently, an ES1 electrostatic speaker (Tucker-Davis Technologies). Because auditory stimuli presented to one ear preferentially activate the contralateral hemisphere and since we intended to image neural activity in both hemispheres, the speaker was placed directly in front of the mouse at a distance of 10 cm to prevent bias in the activity of one hemisphere. Pure tone stimuli were generated and calibrated by a custom MATLAB script; for details, see ref. 12. The peak amplitude of the tones was calibrated to 70 dB SPL using a Brüel & Kjaer 49440A microphone (SI Appendix, Fig. S1B). Vocalization stimuli were fed through a whitening filter from 4 to 95 kHz and calibrated so that the average of the peak sound level was equivalent to 70 dB (SI Appendix, Fig. S1C). There were no systematic differences in sound levels (<0.05 dB between 4 and 64 kHz) measured at the position of each ear.
Spontaneous activity was recorded for 10 min in silence without presenting any sensory stimulus to the mice. For all hearing mice, sound-evoked response experiments were completed first followed by the spontaneous activity experiment without releasing the mouse to ensure identical field of view throughout all experiments. In total, all experiments together lasted ~67 min. For congenitally deaf mice, only the spontaneous activity experiment was conducted.

Image Preprocessing.

All data analysis was done with custom MATLAB scripts. Because the nature of spontaneous activity differs from that of sound-evoked activity, we used two protocols to calculate the fractional difference in fluorescence signals, ΔF/F0, with respect to the baseline F0. In sound-evoked response experiments, the baseline was defined as the average fluorescence intensity over the four frames preceding stimulus onset. In spontaneous response experiments, the baseline (F0) of ΔF/F0 was defined as the average pixel value of the first 550 image frames (~30 s), with a linear adjustment according to the drifting trend. The overall fluorescence drift at each time point was calculated by taking the moving average, with a window size of 550 frames, of the original fluctuation trace.
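A minimal MATLAB sketch of the two baselines (hypothetical inputs; the drift adjustment shown is one plausible reading of the procedure, not the original analysis code):

```matlab
% Evoked trials (hypothetical input F: [H x W x T] movie, stimulus onset at frame t0).
F0evoked  = mean(F(:, :, t0-4:t0-1), 3);            % average of the 4 pre-stimulus frames
dffEvoked = (F - F0evoked) ./ F0evoked * 100;       % dF/F0 in percent

% Spontaneous recordings: baseline from the first 550 frames (~30 s), adjusted
% for slow drift estimated with a 550-frame moving average (a single pixel
% trace is shown for simplicity).
trace    = squeeze(F(100, 100, :));                 % example pixel time course
drift    = movmean(trace, 550);
F0spont  = mean(trace(1:550)) + (drift - drift(1)); % drift-adjusted baseline
dffSpont = (trace - F0spont) ./ F0spont * 100;
```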

ROI Definitions.

In the present study, a variety of approaches are used to define ROIs, and these can be classified into three categories: the sound-stimulation method, the cofluctuation correlation method, and the map projection method. When presenting pure tones and vocalizations, the sound-stimulation method is used to reveal the topography of ACx, map out the tonotopic gradient, and contrast the response strengths of different cortical areas in both hemispheres. In addition, the sound-stimulation method is also used in the functional connectivity analysis of OTOF+/+ mice, where the noise correlation and the synchronicity of spontaneous activity within the sound-sensitive areas are calculated. Alternatively, the cofluctuation method can be used to identify ROIs by analyzing the cofluctuating regions within or across the hemispheres. Because this method does not require neurons to respond to a sound stimulus, it can be applied to both OTOF+/+ and OTOF−/− mice. The map projection method is specifically used on OTOF−/− mice. We determined the ACx topography map from sound stimulation in multiple OTOF+/+ mice and projected this map onto the cortical surface of the OTOF−/− mice. Both the cofluctuation and map projection methods are used for the functional connectivity analysis.
For the sound-stimulation method, peaks of activation were calculated by averaging across the 20 playout frames and then averaging across the 10 trials to obtain a single “activation image” that characterizes the response to each stimulus. Then, a Gaussian filter with sigma 1.5 was applied to the activation images corresponding to each stimulus. ROI masks were overlaid on the original activation image to isolate the active regions. Each area of activation was assigned to a particular cortical area (A1, A2, AAF, DM, or UF) using the tonotopic gradients as a reference (SI Appendix, Fig. S3).
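A minimal MATLAB sketch of building one activation image (hypothetical inputs dff, playFrames, and roiMask; not the original analysis code):

```matlab
% dff: [H x W x nFrames x nTrials] dF/F movie for one stimulus;
% playFrames: indices of the 20 playout frames; roiMask: logical ROI mask
% drawn from the tonotopic gradients.
actImg = squeeze(mean(mean(dff(:, :, playFrames, :), 3), 4));  % frames, then trials
actImg = imgaussfilt(actImg, 1.5);                             % Gaussian filter, sigma 1.5
activeRegion = actImg .* roiMask;                              % isolate the active region
```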
Functional connectivity measurements evaluate the degree of correlation between the spontaneous activity of different areas. Hence, by definition, “functionally connected neurons” fluctuate together (74), which means that they either share the same source of fluctuation or transmit fluctuations to each other through direct or indirect connections. To find intrahemispherically connected areas, the pixel-wise correlation was calculated. To compute pixel-wise correlations, each image frame from the stitched ΔF/F image series was first reshaped into a column vector. Each column vector was then arranged in time order to form a 2D matrix, where each row represents the fluctuation of ΔF/F over the time course for the corresponding pixel. Pearson’s correlation coefficient was next calculated between all possible combinations of rows. By applying a given correlation coefficient threshold and restoring the pixel locations to their 2D form, the functional connectivity between pixels within and across hemispheres was visualized. The red-shaded regions shown in SI Appendix, Fig. S10B demonstrate the characteristic pattern when applying a relatively high correlation threshold across hemispheres, 0.85 in this case, and correspond to the secondary somatosensory cortex (S2) and the somatosensory parietal–ventral area (PV) in both hemispheres (75). We used S2 and PV as the dorsorostral boundary of the ACx and applied a correlation threshold of 0.95 within each hemisphere to identify the potential ACx (yellow regions in SI Appendix, Fig. S10B). After clustering the neighboring correlated pixels, ROIs can be defined, as shown in SI Appendix, Fig. S10C; these demonstrated a pattern similar to that obtained with the sound-stimulation method, shown in SI Appendix, Fig. S10A.
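The pixel-wise correlation step can be sketched in a few lines of MATLAB (hypothetical input dff; memory use grows quadratically with pixel count, so the real analysis would likely operate on binned images):

```matlab
% dff: [H x W x T] stitched dF/F image series (hypothetical input).
[H, W, T] = size(dff);
X = reshape(dff, [], T);                 % each row = one pixel's dF/F time course
C = corr(X');                            % Pearson correlation between all pixel pairs
C(1:size(C, 1)+1:end) = 0;               % zero the diagonal (self-correlations)

% Threshold (0.85 across hemispheres for S2/PV, 0.95 within a hemisphere for ACx)
% and map back to 2-D, then cluster neighboring correlated pixels into ROIs.
thr = 0.95;
connMap = reshape(any(C > thr, 2), H, W);
rois = bwconncomp(connMap);
```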
The map projection method was developed because OTOF−/− mice do not respond to sound stimulation. The method is based on the assumption that the position of the ACx relative to the axis formed by S2 and PV is fixed. ROIs were defined by projecting an ACx topography model derived from OTOF+/+ mice onto OTOF−/− mice. The ACx topography model is an averaged map of the sound-responsive ROIs identified from 10 OTOF+/+ mice. To average the model across animals, we transferred the ROIs from local to global coordinates: the ROIs were shifted so that the S2 centroid became the origin and then rotated about it until the line connecting the S2 and PV centroids was vertically oriented (SI Appendix, Fig. S11B). The ROIs were then projected by aligning the orientation and position of S2 and PV in the OTOF−/− mice (SI Appendix, Fig. S11A) with those of the model (SI Appendix, Fig. S11 B and C).
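A minimal MATLAB sketch of the S2–PV alignment underlying the map projection (variable names and the sign convention of the image coordinates are assumptions):

    % Inputs (assumed): cS2, cPV - [x y] centroids of the S2 and PV hotspots; roiXY - nPoints x 2 ROI coordinates
    v     = cPV - cS2;                                        % vector from the S2 centroid to the PV centroid
    theta = atan2(v(1), v(2));                                % rotation that brings the S2-PV line onto the +y axis
    Rm    = [cos(theta), -sin(theta); sin(theta), cos(theta)];
    roiAligned = (Rm * (roiXY - cS2).').';                    % shift so S2 is the origin, then rotate into global coordinates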

Calculating Location of Peak Response.

The peak fluorescence response within each area was calculated by finding the maximum pixel value within the ROI and then averaging a 3 × 3 pixel square around the maximum pixel. The accuracy of each peak location was confirmed by manually verifying that the location fell on the cortex rather than on blood vessels or window edges. Only activity above the 90th percentile was considered a peak for geometric comparisons. For comparisons of activation strength across cortical areas, if the activity within a particular cortical area did not exceed the 90th percentile, a 3 × 3 pixel square around the maximum pixel value within the full extent of that area, defined from the tonotopic gradients, was used instead (SI Appendix, Fig. S3).
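A minimal MATLAB sketch of the peak-location step (variable names and the reference distribution for the percentile threshold are assumptions; prctile requires the Statistics and Machine Learning Toolbox):

    % Inputs (assumed): actImg - 2D activation image; roiMask - binary mask of one cortical area
    roiImg   = actImg .* roiMask;                              % restrict the activation image to the ROI
    [~, idx] = max(roiImg(:));
    [py, px] = ind2sub(size(roiImg), idx);                     % row/column of the maximum pixel
    nbhd     = actImg(py-1:py+1, px-1:px+1);                   % 3 x 3 square around the maximum (assumes it is not on the border)
    peakVal  = mean(nbhd(:));
    isPeak   = peakVal >= prctile(actImg(roiMask > 0), 90);    % keep peaks above the 90th percentile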

Area of Activation Calculation.

The area of activation for each stimulus was calculated by averaging the activation images across all sound levels (35 dB, 50 dB, and 65 dB) and then applying a Gaussian filter ("imgaussfilt", sigma 1.5) to the averaged image to reduce noise (SI Appendix, Fig. S2). Singular value decomposition was performed on the Gaussian-filtered images, and the first 10 components were retained. The images were binarized using "imbinarize" at two thresholds: one at 50% of the maximum value in the image and one at 35% of the maximum value. The binarized images for each stimulus were summed together, and any pixel with a value >1 was considered active. The number of active pixels was calculated using the "bwarea" function and multiplied by a conversion factor to obtain the area in mm². We used a 0.01-mm stage calibration slide (MUHWA) to measure the effective pixel resolution of our setup, which was 1.86 μm. The conversion factor was calculated as:
CF = \frac{\left(d \times b \times 1.86\ \mu\mathrm{m}\right)^{2} / \mathrm{pixel}}{1 \times 10^{6}\ \mu\mathrm{m}^{2}/\mathrm{mm}^{2}},
where d corresponds to the downsampling factor (2) and b corresponds to the binning factor (4). The conversion factor was therefore 2.21 × 10⁻⁴ mm²/pixel. Division of the auditory cortex into subregions was achieved by overlaying the tonotopic gradients and manually tracing which regions of the binarized image correspond to which cortical areas (SI Appendix, Fig. S2). Percent of total area was calculated as follows:
\frac{\text{area of subdivision}}{\text{total voc. area}} = \% \text{ of voc. area}, \qquad \frac{\text{area of subdivision}}{\text{total tone area}} = \% \text{ of tone area}.
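A minimal MATLAB sketch of the area-of-activation calculation; the imgaussfilt, imbinarize, and bwarea calls follow the functions named above, whereas the array names, the normalization step, and the omission of the SVD denoising are assumptions:

    % Input (assumed): actImgs - cell array of activation images for one stimulus at 35, 50, and 65 dB
    avgImg  = mean(cat(3, actImgs{:}), 3);          % average across sound levels
    avgImg  = imgaussfilt(avgImg, 1.5);             % suppress noise (SVD denoising with 10 components omitted here)
    imgNorm = avgImg ./ max(avgImg(:));             % normalize so thresholds are fractions of the image maximum
    bwSum   = imbinarize(imgNorm, 0.50) + imbinarize(imgNorm, 0.35);
    active  = bwSum > 1;                            % pixels flagged by more than one binarized image
    areaPix = bwarea(active);                       % pixel count of the active region
    CF      = (2 * 4 * 1.86)^2 / 1e6;               % mm^2 per pixel: downsampling 2, binning 4, 1.86-um raw pixels (~2.21e-4)
    areaMM2 = areaPix * CF;                         % activation area in mm^2
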
Hemispheric area bias (HAB) was calculated as HAB = (% total area_Left − % total area_Right)/(% total area_Left + % total area_Right). The HAB was compared across hemispheres and across mice using either rank-sum or paired-sample t tests, depending on the distribution, to determine differences in the distribution of activation across hemispheres.
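A minimal MATLAB sketch of the HAB calculation and the statistical comparisons (vector names are assumptions; ttest and ranksum require the Statistics and Machine Learning Toolbox):

    % Inputs (assumed): pctL, pctR - percent-of-total-area values per mouse for the left and right hemispheres
    HAB = (pctL - pctR) ./ (pctL + pctR);           % hemispheric area bias per mouse
    [~, pT] = ttest(pctL, pctR);                    % paired-sample t test
    pRS     = ranksum(pctL, pctR);                  % rank-sum test, used when data are not normally distributed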

Geometric Comparison of Tonotopy.

The geometry between cortical areas was calculated in a global coordinate system with its origin at hotspots of correlated activity (R > 0.95) located in the somatosensory cortex. The centroid of each hotspot was calculated for both hemispheres by averaging the pixel locations within that hotspot. The dorsal and rostral centroids were then used to define the y axis of the global coordinate system, and the existing coordinate system of each image was rotated accordingly (Fig. 2A). The locations of peak activation, in (x, y) coordinates, in response to each stimulus (4 to 64 kHz, played at 35 dB, 50 dB, and 65 dB) were determined. For each stimulus, the location of peak activation was determined for each ACx area (five areas), in each hemisphere (two), across all mice (10 mice). The (x, y) coordinates of peak activation for each stimulus were averaged over all sound levels. Coordinates for 4 kHz and 5.7 kHz were averaged together to form the "low-frequency" group, coordinates for 16 kHz and 22.6 kHz formed the "midfrequency" group, and coordinates for 45.3 kHz and 64 kHz formed the "high-frequency" group. Vectors between the new origin and each peak location (with coordinates rotated to the new axes) were determined (Fig. 2A). Right hemisphere axes were rotated about the y axis (mirrored), aligned, and overlaid onto the left hemisphere axes to visualize differences between left and right hemisphere locations (Fig. 2B). The following comparisons were made.
LvR: The difference between the (x, y) coordinates of the peak location in the left hemisphere and the peak location in the right hemisphere (as a distance) was calculated within each mouse. For each frequency group (low, mid, and high), there were one to five peak locations, depending on whether each of the five cortical areas had a peak activation above the 90th percentile. The peak locations were then compared within all mice (10 mice).
LvL*: The difference between the (x, y) coordinates of the peak location in the left hemisphere of one mouse and the corresponding peak location in the left hemisphere of every other mouse was calculated. For instance, the peak location in left A1 to low frequencies in mouse 1 was compared with the peak location in left A1 to low frequencies in mice 2 to 10. This was repeated for all cortical areas and frequencies.
RvR*: The same method as LvL*, using the right hemisphere locations.
LvR*: The difference between the (x, y) coordinates of the peak location in the left hemisphere of one mouse and the corresponding peak location in the right hemisphere of every other mouse was calculated. For instance, the peak location in left A1 to low frequencies in mouse 1 was compared with the peak location in right A1 to low frequencies in mice 2 to 10.
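A minimal MATLAB sketch of the mirrored left–right comparison (coordinate conventions and variable names are assumptions):

    % Inputs (assumed): peaksL, peaksR - n x 2 peak coordinates already expressed in each hemisphere's
    % global coordinate system (origin at the somatosensory hotspot, y axis through the dorsal and rostral centroids)
    peaksRmir = peaksR .* [-1, 1];                   % mirror the right hemisphere about the y axis
    dLvR = sqrt(sum((peaksL - peaksRmir).^2, 2));    % within-mouse left-vs-right peak distances (LvR)
    % LvL*, RvR*, and LvR* use the same distance computation, pairing corresponding peaks from different mice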

Functional Connectivity Derived from Spontaneous Activity and Noise Correlations.

Given an ROI (see the ROI Definitions section), a ΔF/F trace (nearly 10 min long for spontaneous-activity experiments and 1 s for sound-evoked experiments) was calculated by averaging the pixel values within the ROI. Pairs of these ROI-specific traces were then used to calculate Pearson's correlation coefficient, which indicates the functional connectivity between ROIs.
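A minimal MATLAB sketch of the ROI-trace correlation (variable names are assumptions):

    % Inputs (assumed): dffMovie - [h x w x nFrames]; maskA, maskB - logical ROI masks
    [h, w, nFrames] = size(dffMovie);
    pix    = reshape(dffMovie, h * w, nFrames);
    traceA = mean(pix(maskA(:), :), 1);              % mean ΔF/F trace within ROI A
    traceB = mean(pix(maskB(:), :), 1);              % mean ΔF/F trace within ROI B
    C  = corrcoef(traceA, traceB);                   % Pearson correlation between the two ROI traces
    fc = C(1, 2);                                    % functional-connectivity estimate for this ROI pair
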
The responses evoked by the nine pure tones were used to calculate noise correlations. The ROIs used here were defined by sound stimulation. To determine functional connectivity between ROIs, comparisons were made only between traces collected during the same sound-presentation trial. The noise correlation was calculated as (76):
\rho_{n} = \frac{\operatorname{cov}\!\left(v_{i}^{p} - \langle v_{i}^{p}\rangle_{i},\; v_{i}^{q} - \langle v_{i}^{q}\rangle_{i}\right)_{i}}{\sqrt{\operatorname{var}\!\left(v_{i}^{p} - \langle v_{i}^{p}\rangle_{i}\right)_{i}\, \operatorname{var}\!\left(v_{i}^{q} - \langle v_{i}^{q}\rangle_{i}\right)_{i}}},
where p and q denote two distinct ROIs, v_{i} is the ΔF/F trace of the corresponding ROI during sound-presentation trial i, ⟨·⟩_{i} denotes the average across trials, and the covariance and variances are taken across trials. In the present study, we calculated noise correlations between ROIs across hemispheres. Each trace was taken within an observation window of 1 s at a frame rate of 20 Hz.
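A minimal MATLAB sketch of the noise-correlation calculation following the equation above (variable names are assumptions; residuals are pooled over trials and time points):

    % Inputs (assumed): vp, vq - nTrials x nTimepoints ΔF/F traces of ROIs p and q (1-s window at 20 Hz)
    rp = vp - mean(vp, 1);                           % residuals after removing the across-trial mean response
    rq = vq - mean(vq, 1);
    C    = corrcoef(rp(:), rq(:));                   % Pearson correlation of the pooled residuals
    rhoN = C(1, 2);                                  % noise correlation between ROIs p and q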

Data, Materials, and Software Availability

Data and code have been deposited in JHU Data Repository (https://doi.org/10.7281/T1/ORP47N) (77).

Acknowledgments

Supported by NIH grants U19 NS107464 (P.O.K.) and R01 DC017785 (P.O.K.).

Author contributions

G.C., C.-T.C., and P.O.K. designed research; G.C. and C.-T.C. performed research; G.C. and C.-T.C. analyzed data; and G.C., C.-T.C., and P.O.K. wrote the paper.

Competing interests

The authors declare no competing interest.

Supporting Information

Appendix 01 (PDF)

References

1
O. Güntürkün, F. Ströckens, S. Ocklenburg, Brain lateralization: A comparative perspective. Physiol. Rev. 100, 1019–1063 (2020).
2
R. J. Zatorre, P. Belin, Spectral and temporal processing in human auditory cortex. Cereb. Cortex 11, 946–953 (2001).
3
T. J. Crow, Schizophrenia as failure of hemispheric dominance for language. Trends Neurosci. 20, 339–343 (1997).
4
M. Tervaniemi, K. Hugdahl, Lateralization of auditory-cortex functions. Brain Res. Brain Res. Rev. 43, 231–246 (2003).
5
H. Steinmetz, J. Volkmann, L. Jäncke, H. J. Freund, Anatomical left-right asymmetry of language-related temporal cortex is different in left- and right-handers. Ann. Neurol. 29, 315–319 (1991).
6
S. Steinmann, G. Leicht, C. Mulert, The interhemispheric miscommunication theory of auditory verbal hallucinations in schizophrenia. Intern. J. Psychophysiol. 145, 83–90 (2019).
7
V. Oertel et al., Reduced laterality as a trait marker of schizophrenia–evidence from structural and functional neuroimaging. J. Neurosci. 30, 2289–2299 (2010).
8
M. Gavrilescu et al., Reduced connectivity of the auditory cortex in patients with auditory hallucinations: A resting state functional magnetic resonance imaging study. Psychol. Med. 40, 1149–1158 (2010).
9
K. A. Gordon, D. D. Wong, B. C. Papsin, Bilateral input protects the cortex from unilaterally-driven reorganization in children who are deaf. Brain 136, 1609–1625 (2013).
10
I. Stiebler, R. Neulist, I. Fichtel, G. Ehret, The auditory cortex of the house mouse: Left-right differences, tonotopic organization and quantitative analysis of frequency representation. J. Comp. Physiol. A 181, 559–571 (1997).
11
H. Tsukano et al., Delineation of a frequency-organized region isolated from the mouse primary auditory cortex. J. Neurophysiol. 113, 2900–2920 (2015).
12
J. Liu et al., Parallel processing of sound dynamics across mouse auditory cortex via spatially patterned thalamic inputs and distinct areal intracortical circuits. Cell Rep. (Cambridge) 27, 872–885.e7 (2019).
13
S. Romero et al., Cellular and widefield imaging of sound frequency organization in primary and higher order fields of the mouse auditory cortex. Cereb. Cortex 30, 1603–1622 (2020).
14
R. A. Reale, T. J. Imig, Tonotopic organization in auditory cortex of the cat. J. Comp. Neurol. 192, 265–291 (1980).
15
J. A. Winer, L. M. Miller, C. C. Lee, C. E. Schreiner, Auditory thalamocortical transformation: Structure and function. Trends Neurosci. 28, 255–263 (2005).
16
T. A. Hackett, T. R. Barkat, B. M. J. Brien, T. K. Hensch, D. B. Polley, Linking topography to tonotopy in the mouse auditory thalamocortical circuit. J. Neurosci. 31, 2983–2995 (2011).
17
G. Ehret, Left hemisphere advantage in the mouse brain for recognizing ultrasonic communication calls. Nature 325, 249–251 (1987).
18
D. B. Geissler, G. Ehret, Auditory perception vs. recognition: Representation of complex communication sounds in the mouse auditory cortical fields. Eur. J. Neurosci. 19, 1027–1040 (2004).
19
R. B. Levy et al., Circuit asymmetries underlie functional lateralization in the mouse auditory cortex. Nat. Commun. 10, 2783 (2019).
20
I. Stiebler, R. Neulist, I. Fichtel, G. Ehret, The auditory cortex of the house mouse: Left-right differences, tonotopic organization and quantitative analysis of frequency representation. J. Comp Physiol. A 181, 559–571 (1997).
21
B. J. Marlin, M. Mitre, J. A. D’amour, M. V. Chao, R. C. Froemke, Oxytocin enables maternal behaviour by balancing cortical inhibition. Nature 520, 499–504 (2015).
22
Z. Bowen, D. E. Winkowski, P. O. Kanold, Functional organization of mouse primary auditory cortex in adult C57BL/6 and F1 (CBAxC57) mice. Sci. Rep. 10, 10905 (2020).
23
J. M. S. Grimsley, J. J. M. Monaghan, J. J. Wenstrup, Development of social vocalizations in mice. PLoS One 6, e17460 (2011).
24
R. Aronoff et al., Long-range connectivity of mouse primary somatosensory barrel cortex. Eur. J. Neurosci. 31, 2221–2233 (2010).
25
K. Kilteni, H. H. Ehrsson, Functional connectivity between the cerebellum and somatosensory areas implements the attenuation of self-generated touch. J. Neurosci. 40, 894–906 (2020).
26
L. Petreanu, D. Huber, A. Sobczyk, K. Svoboda, Channelrhodopsin-2-assisted circuit mapping of long-range callosal projections. Nat. Neurosci. 10, 663–668 (2007).
27
M. S. Remple, E. C. Henry, K. C. Catania, Organization of somatosensory cortex in the laboratory rat (Rattus norvegicus): Evidence for two lateral areas joined at the representation of the teeth. J. Comp. Neurol. 467, 105–118 (2003).
28
A. M. Benison, D. M. Rector, D. S. Barth, Hemispheric mapping of secondary somatosensory cortex in the rat. J. Neurophysiol. 97, 200–207 (2007).
29
M. Nishimura, H. Sawatari, M. Takemoto, W. J. Song, Identification of the somatosensory parietal ventral area and overlap of the somatosensory and auditory cortices in mice. Neurosci. Res. 99, 55–61 (2015).
30
C. C. Liao, C. T. Yen, Functional connectivity of the secondary somatosensory cortex of the rat. Anat. Rec. (Hoboken) 291, 960–973 (2008).
31
C. D. Good et al., Cerebral asymmetry and the effects of sex and handedness on brain structure: A voxel-based morphometric analysis of 465 normal adult human brains. Neuroimage 14, 685–700 (2001).
32
J. K. Schiavo et al., Innate and plastic mechanisms for maternal behaviour in auditory cortex. Nature 587, 426–431 (2020).
33
G. Ehret, C. Bernecker, Low-frequency sound communication by mouse pups (Mus musculus): Wriggling calls release maternal behaviour. Anim. Behav. 34, 821–830 (1986).
34
W. O. Williams, D. K. Riskin, A. K. Mott, Ultrasonic sound as an indicator of acute pain in laboratory mice. J. Am. Assoc. Lab. Anim. Sci. 47, 8–10 (2008).
35
S. Knecht et al., Handedness and hemispheric language dominance in healthy humans. Brain 123 Pt 12, 2512–2518 (2000).
36
R. Gerrits, H. Verhelst, G. Vingerhoets, Mirrored brain organization: Statistical anomaly or reversal of hemispheric functional segregation bias? Proc. Natl. Acad. Sci. U.S.A. 117, 14057–14065 (2020).
37
M. H. Lee, C. D. Smyser, J. S. Shimony, Resting-state fMRI: A review of methods and clinical applications. AJNR Am. J. Neuroradiol. 34, 1866–1872 (2013).
38
M. R. Cohen, A. Kohn, Measuring and interpreting neuronal correlations. Nat. Neurosci. 14, 811–819 (2011).
39
V. Bharmauria et al., High noise correlation between the functionally connected neurons in emergent V1 microcircuits. Exp. Brain Res. 234, 523–532 (2016).
40
T. R. Sato, N. W. Gray, Z. F. Mainen, K. Svoboda, The functional microarchitecture of the mouse barrel cortex. PLoS Biol. 5, e189 (2007).
41
D. R. Lyamzin et al., Nonlinear transfer of signal and noise correlations in cortical networks. J. Neurosci. 35, 8065–8080 (2015).
42
G. Rothschild, I. Nelken, A. Mizrahi, Functional organization and population dynamics in the mouse primary auditory cortex. Nat. Neurosci. 13, 353–360 (2010).
43
A. Rupasinghe et al., Direct extraction of signal and noise correlations from two-photon calcium imaging of ensemble neuronal activity. Elife 10, e68046 (2021).
44
L. I. Zhang, S. Bao, M. M. Merzenich, Persistent and specific influences of early acoustic environments on primary auditory cortex. Nat. Neurosci. 4, 1123–1130 (2001).
45
L. I. Zhang, S. Bao, M. M. Merzenich, Disruption of primary auditory cortex by synchronous auditory inputs during a critical period. Proc. Natl. Acad. Sci. U.S.A. 99, 2309–2314 (2002).
46
X. Meng, D. Mukherjee, J. P. Y. Kao, P. O. Kanold, Early peripheral activity alters nascent subplate circuits in the auditory cortex. Sci. Adv. 7, eabc9155 (2021).
47
D. Mukherjee, X. Meng, J. P. Y. Kao, P. O. Kanold, Impaired hearing and altered subplate circuits during the first and second postnatal weeks of otoferlin-deficient mice. Cereb. Cortex 32, 2816–2830 (2022).
48
Y. I. Iwasa et al., OTOF mutation analysis with massively parallel DNA sequencing in 2,265 Japanese sensorineural hearing loss patients. PLoS One 14, e0215932 (2019).
49
B. Vona, A. Rad, E. Reisinger, The Many faces of DFNB9: Relating OTOF variants to hearing impairment. Genes (Basel) 11, 1411 (2020).
50
A. Manchanda et al., Otoferlin depletion results in abnormal synaptic ribbons and altered intracellular calcium levels in zebrafish. Sci. Rep. 9, 14273 (2019).
51
T. Pangršič, E. Reisinger, T. Moser, Otoferlin: A multi-C2 domain protein essential for hearing. Trends Neurosci. 35, 671–680 (2012).
52
U. Stalmann, A. J. Franke, H. Al-Moyed, N. Strenzke, E. Reisinger, Otoferlin is required for proper synapse maturation and for maintenance of inner and outer hair cells in mouse models for DFNB9. Front Cell Neurosci. 15, 677543 (2021).
53
L. Bell et al., The cross-modal effects of sensory deprivation on spatial and temporal processes in vision and audition: A systematic review on behavioral and neuroimaging research since 2000. Neural Plast. 2019, 9603469 (2019).
54
C. Mezzera, G. López-Bendito, Cross-modal plasticity in sensory deprived animal models: From the thalamocortical development point of view. J. Chem. Neuroanat. 75, 32–40 (2016).
55
M. Pernia et al., Cross-modal reaction of auditory and visual cortices after long-term bilateral hearing deprivation in the rat. Brain Struct. Funct. 225, 129–148 (2020).
56
G. D. Scott, C. M. Karns, M. W. Dow, C. Stevens, H. J. Neville, Enhanced peripheral visual processing in congenitally deaf humans is supported by multiple brain regions, including primary auditory cortex. Front. Hum. Neurosci. 8, 177 (2014).
57
D. L. Hunt, E. N. Yamoah, L. Krubitzer, Multisensory plasticity in congenitally deaf mice: How are cortical areas functionally specified? Neuroscience 139, 1507–1524 (2006).
58
B. L. Allman, L. P. Keniston, M. A. Meredith, Adult deafness induces somatosensory conversion of ferret auditory cortex. Proc. Natl. Acad. Sci. U.S.A. 106, 5925–5930 (2009).
59
M. A. Meredith, S. G. Lomber, Somatosensory and visual crossmodal plasticity in the anterior auditory field of early-deaf cats. Hear Res. 280, 38–47 (2011).
60
J. B. Issa et al., Multiscale optical Ca2+ imaging of tonal organization in mouse auditory cortex. Neuron 83, 944–959 (2014).
61
W. Guo et al., Robustness of cortical topography across fields, laminae, anesthetic states, and neurophysiological signal types. J. Neurosci. 32, 9159–9172 (2012).
62
R. Anderson, P. Knight, M. Merzenich, The thalamocortical and corticothalamic connections of AI, AII, and the anterior auditory field (AAF) in the cat: Evidence for two largely segregated systems of connections. J. Comp. Neurol. 194, 664–701 (1980).
63
G. Schreiner, K. J. Dover, Lysias and the Corpus Lysiacum (Sather Classical Lectures, 39). Berkeley and Los Angeles, Univ. of Calif. Pr., 1968. X, 200 p. Pr. sh. 54. Mnemosyne (1973), vol. 26, pp. 67–69.
64
M. Saenz, D. R. M. Langers, Tonotopic mapping of human auditory cortex. Hearing Res. 307, 42–52 (2014).
65
S. Ohga et al., Direct relay pathways from Lemniscal auditory thalamus to secondary auditory field in mice. Cereb. Cortex 28, 4424–4439 (2018).
66
H. Tsukano et al., Reconsidering tonotopic maps in the auditory cortex and lemniscal auditory thalamus in mice. Front. Neural. Circ. 11, 14 (2017).
67
M. I. Chelaru, V. Dragoi, Efficient coding in heterogeneous neuronal populations. Proc. Natl. Acad. Sci. U.S.A. 105, 16344–16349 (2008).
68
J. Lengler, F. Jug, A. Steger, Reliable neuronal systems: The importance of heterogeneity. PLoS One 8, e80694 (2013).
69
X. Chang, G. Collin, R. C. W. Mandl, W. Cahn, R. S. Kahn, Interhemispheric connectivity and hemispheric specialization in schizophrenia patients and their unaffected siblings. Neuroimage Clin. 21, 101656 (2019).
70
B. J. Slater, J. S. Isaacson, Interhemispheric callosal projections sharpen frequency tuning and enforce response fidelity in primary auditory cortex. eNeuro 7, ENEURO.0256-20.2020 (2020).
71
K. E. Stephan, G. R. Fink, J. C. Marshall, Mechanisms of hemispheric specialization: Insights from analyses of connectivity. Neuropsychologia 45, 209–228 (2007).
72
N. X. Tritsch, E. Yi, J. E. Gale, E. Glowatzki, D. E. Bergles, The origin of spontaneous activity in the developing auditory system. Nature 450, 50–55 (2007).
73
R. D. Frisina et al., F1 (CBA×C57) mice show superior hearing in old age relative to their parental strains: Hybrid vigor or a new animal model for “Golden Ears”? Neurobiol. Aging 32, 1716–1724 (2011).
74
A. Goulas, H. B. Uylings, C. C. Hilgetag, Principles of ipsilateral and contralateral cortico-cortical connectivity in the mouse. Brain Struct. Funct. 222, 1281–1295 (2017).
75
D. Feldmeyer et al., Barrel cortex function. Prog. Neurobiol. 103, 3–27 (2013).
76
C. D. Hacker, A. Z. Snyder, M. Pahwa, M. Corbetta, E. C. Leuthardt, Frequency-specific electrophysiologic correlates of resting state fMRI networks. Neuroimage 149, 446–457 (2017).
77
C.-T. Chen, G. Calhoun, Data associated with the publication: Bilateral widefield calcium imaging reveals circuit asymmetries and lateralized functional activation of mouse auditory cortex. Johns Hopkins Research Data Repository. https://doi.org/10.7281/T1/ORP47N5. Accessed 29 June 2023.

Keywords

auditory cortex | hemisphere | bilateral | imaging | experience

Authors and Affiliations

Georgia Calhoun: Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD 21205; Kavli Neuroscience Discovery Institute, Johns Hopkins University, Baltimore, MD 21205.

Notes

This article is a PNAS Direct Submission.
G.C. and C.-T.C. contributed equally to this work.
To whom correspondence may be addressed. Email: [email protected].
