# Intrinsic neural diversity quenches the dynamic volatility of neural networks

Edited by Eve Marder, Brandeis University Department of Biology, Waltham, MA; received November 3, 2022; accepted May 19, 2023


July 24, 2023

## Significance

Contemporary research has identified widespread within-cell-type diversity in the brain, manifest through variations in biophysical features such as neuronal excitability. A natural question that arises from this phenomenon is what functional role, if any, this heterogeneity might serve. Combining computational and mathematical techniques, we show that cell-to-cell biophysical diversity, far from mere developmental noise, represents a homeostatic control mechanism, enriching and restraining population responses to modulatory input while reinforcing the resilience of neuronal circuits. These results highlight the importance of biophysical diversity in the persistence of brain function over time and in the face of change.

## Abstract

Heterogeneity is the norm in biology. The brain is no different: Neuronal cell types are myriad, reflected through their cellular morphology, type, excitability, connectivity motifs, and ion channel distributions. While this biophysical diversity enriches neural systems’ dynamical repertoire, it remains challenging to reconcile with the robustness and persistence of brain function over time (resilience). To better understand the relationship between excitability heterogeneity (variability in excitability within a population of neurons) and resilience, we analyzed both analytically and numerically a nonlinear sparse neural network with balanced excitatory and inhibitory connections evolving over long time scales. Homogeneous networks demonstrated increases in excitability and strong firing rate correlations—signs of instability—in response to a slowly varying modulatory fluctuation. Excitability heterogeneity tuned network stability in a context-dependent way by restraining responses to modulatory challenges and limiting firing rate correlations, while enriching dynamics during states of low modulatory drive. Excitability heterogeneity was found to implement a homeostatic control mechanism enhancing network resilience to changes in population size, connection probability, strength and variability of synaptic weights, by quenching the volatility of its dynamics (i.e., the susceptibility to critical transitions). Together, these results highlight the fundamental role played by cell-to-cell heterogeneity in the robustness of brain function in the face of change.


Neural systems exhibit surprisingly reliable behavior across a lifespan. Despite high phenotypic variability (1–4), learning-related plasticity changes (5), and constant alterations in neuromodulatory tone (6–9) and circuit topology (7, 10), neural dynamics remain qualitatively invariant in healthy brains over extended time scales. This is a signature of the brain’s manifest resilience, where its dynamics persist despite changes in intrinsic and/or extrinsic control parameters, preserving associated function (11–14). In contrast, the failure to regulate such perturbations may predispose neural systems to dynamic volatility: qualitatively distinct dynamics following changes in stability, resulting from critical transitions (15). Such volatile dynamics in neural systems often arise from disease states: For example, changes associated with epilepsy (16), stimuli (17), or modulatory fluctuations associated with circadian and/or multidien rhythms (18) may cause these systems to slip toward critical transitions, such as recurrent seizures (19–21).

The resilience of neural circuits has been thoroughly studied through pioneering experiments in the crab and lobster stomatogastric ganglia (STG) network (1, 22). These experiments revealed highly stable, robust and invariant rhythmic activity despite pervasive phenotype heterogeneity, even when exposed to severe environmental perturbations (1). These discoveries in neuroscience echo a long history, primarily in the field of macroecology, of experimental and theoretical studies examining the relationship between biodiversity, stability, and the resilience of ecosystems and food webs over time (see refs. (12, 14, 23–28) and references therein), which typify the well-known “stability-diversity” debate (12, 14, 27, 29). In this setting, resilience is a system’s propensity for invariance and ability to retain its (in)stability in response to changing control parameters. In contrast, volatile systems are associated with changes in stability and critical transitions, also called bifurcations (11, 13, 15, 27). A confluence of theoretical studies in macroecology have explored this question and shown (see ref. (29) and references therein) that diversity often renders a system volatile. Combined graph-theoretic and spectral approaches have shown that complex networks tend to lose stability when population sizes increase (12, 30), coupling weights are too strong and/or diverse (12, 23–25, 28), connection probability is too dense (12, 28, 30, 31), or when connectivity motifs become too heterogeneous (32).

These questions have been examined by neuroscientists as well: Numerous experimental (1, 4, 22, 33–37) and theoretical (37–43) studies have explored the influence of cellular heterogeneity, seemingly the norm in the brain (44–47), on neural dynamics and communication. The impact of heterogeneities on the stability of task-specific recurrent neural networks has also been extensively examined (48–58). In the context of disease states, excitability heterogeneity can stabilize neural dynamics away from pathological brain states (37, 59). Collectively, these studies have shown that cell-to-cell diversity stabilizes “healthy” dynamics to optimize responses, learning, information flow, and coding capacity by tuning neural networks toward criticality (60), a regime that balances quiescence and excitability while residing between stability and instability. Despite these advances, linking single-neuron attributes with emergent physiological activity that undergirds the persistence of brain function remains inaccessible to current experimental techniques.

Inspired by decades of theoretical work in macroecology, we extended spectral theory for large-scale random systems (30, 61) and applied it to neuroscience to study the impact of biophysical diversity on the brain’s resilience over extended time scales. We considered a generic large-scale nonlinear neural network with sparse balanced excitatory and inhibitory connections, over time scales spanning minutes, hours, and/or days to examine the persistence of its dynamics. We exposed this network to a slowly fluctuating modulatory input, a control parameter that is continuously interrogating the system’s stability. Over such time scales, slow modulation influences neural activity in a manner mimicking fluctuations during the resting state resulting from modulatory (6–9), environmental (1, 22, 62), and/or stimuli-induced perturbations, for instance. We explored how heterogeneity impacts the dynamics of neurons and neuronal networks during such modulatory challenges. To quantitatively determine a system’s resilience or volatility, we leveraged spectral theory for large random systems (30, 61), commonly used in macroecology to examine the stability of complex natural systems, such as food webs (12, 14, 23–28). Through this framework, we analyzed the statistical properties of eigenvalues resulting from changes in network size, synaptic weights, connectivity motifs, modulatory drive, and cell-to-cell biophysical diversity among neurons. In so doing, we looked beyond the stability of the system to how this stability responds to gradual intrinsic and/or extrinsic changes, in order to understand how biophysical diversity predisposes balanced neural systems to stability transitions.

We begin these explorations by showing that excitability heterogeneity, one of many types of intrinsic biophysical diversity (*Discussion*), transforms the dynamics of neural networks while rendering them less prone to sudden shifts in stability. Here, excitability heterogeneity refers to cell-to-cell variability in firing rate thresholds (*Materials and Methods*), a focus motivated by the reality that neuronal excitability is a primary mechanism targeted by intrinsic plasticity mechanisms in learning (63, 64) and is altered in pathological states like epilepsy (37) and neuropsychiatric conditions (65). Our joint computational and mathematical analysis shows that diversity in neuronal excitability restrains neural populations’ response to perturbations by promoting gradual and linear changes in network firing rates. This manifests through a heterogeneity-induced decrease in firing rate correlations, a result with important implications for stability. We hence leveraged the spectral theory for large-scale random systems to reveal that excitability heterogeneity implements a generic control mechanism promoting: 1) homeostasis, by tuning the distribution of eigenvalues in the complex plane in a context-dependent way, and 2) resilience, by anchoring this eigenvalue distribution and making it less dependent on modulatory influences. We explored how excitability heterogeneity can influence system resilience to “insults” like increases in network size, connection probability, strength and variability of synaptic weights, and modulatory fluctuations which promote stability transitions. We found that intrinsic excitability heterogeneity rendered the network more resilient to these insults, a generic feature that was further preserved across a wide range of network topologies. These findings are particularly relevant to learning, where synaptic plasticity, unless stabilized by homeostatic mechanisms, would lead to runaway (i.e., unstable) activity (5, 66, 67). Taken together, these results indicate that neuronal diversity, a fundamental organizing principle of neuronal circuits (44, 45), serves a role in generating the brain’s resilience.

## Results

Neural systems display activity that remains qualitatively invariant over extended time scales. This feature highlights their resilience in the face of changes in connectivity, development and aging, pathological insults, and exposure to perturbations such as stimuli and/or modulatory influences (6–9). To better understand the mechanisms underlying such resilience and how it is influenced by excitability heterogeneity, we developed a mathematical framework in which long-term stability can be analytically quantified (*Materials and Methods*). We built and analyzed a large-scale, balanced and sparse network with excitatory and inhibitory connections (Fig. 1) whose dynamics extend over time scales spanning minutes, hours, and/or days. This model is both flexible and general, encompassing a wide range of population-based models involving excitatory and inhibitory interactions. It relates network size, the mean activity of neurons, their mutual synaptic connectivity, their individual level of excitability, and the influence of slowly varying modulatory inputs. We required that neurons were exposed to balanced synaptic connectivity such as seen experimentally (68, 69), in which the net sum of excitatory and inhibitory synaptic weights is zero. We further selected connection probabilities reflecting those observed experimentally (70).

Fig. 1.
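The ingredients just described—sparse balanced excitatory/inhibitory weights, per-neuron excitability thresholds, and a slow modulatory drive—can be sketched as a minimal rate model. This is an illustrative reconstruction, not the authors’ code: the sigmoidal response, gain β, baseline B, and drive S_o follow the naming used in the text, but all parameter values and the exact balancing scheme are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, rho = 200, 0.1        # network size and connection probability
beta, B = 5.0, 0.0       # response gain and baseline input (illustrative)
sigma_H = 0.5            # excitability heterogeneity; set to 0 for a homogeneous network

# Sparse random weights with equal-strength excitatory/inhibitory entries,
# shifted row-wise so the net synaptic input is balanced (sums to ~zero).
mask = rng.random((N, N)) < rho
W = np.where(rng.random((N, N)) < 0.5, 1.0, -1.0) * mask / np.sqrt(rho * N)
W -= W.sum(axis=1, keepdims=True) / N

theta = rng.normal(0.0, sigma_H, N)   # per-neuron firing thresholds

def f(u):
    """Sigmoidal firing-rate response with heterogeneous thresholds."""
    return 1.0 / (1.0 + np.exp(-beta * (u - theta)))

def simulate(S_o, T=2000, dt=0.05):
    """Integrate du/dt = -u + W f(u) + B + S_o with forward Euler."""
    u = np.zeros(N)
    rates = np.empty((T, N))
    for t in range(T):
        u += dt * (-u + W @ f(u) + B + S_o)
        rates[t] = f(u)
    return rates

rates = simulate(S_o=0.2)
print(rates[-1].mean())   # steady-state mean population firing rate
```

Sweeping `S_o` slowly over time, or setting `sigma_H = 0`, reproduces the qualitative homogeneous-versus-heterogeneous comparison discussed below.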

Within this framework, we can tune the intrinsic excitability of each individual neuron, resulting in increasingly heterogeneous networks; without such variability, the network remains homogeneous. It is well known that balanced networks are prone to volatility, i.e., susceptible to stability transitions (71, 72). To confirm this, neurons in the network were collectively exposed to a random, slowly varying modulatory input, mimicking excitability changes in neural activity arising from endogenous and/or exogenous control parameter changes (i.e., neuromodulation, temperature, etc.) (6–9). Such a slowly varying modulatory input continuously interrogates network stability and is therefore an ideal tool to expose the system’s resilience. As expected from this context, the homogeneous network (i.e., where neurons possess an identical excitability profile; Fig. 1*A*) was predisposed to volatility through recurring changes in stability. Frequent sharp transitions between states of low- and high-frequency dynamics could be observed in the network’s mean activity (*Materials and Methods*), confirmed using power spectral analysis. Such transitions index states of instability, characterized by high-frequency and correlated activity, and are reminiscent of dynamics seen in electrophysiological recordings during seizures (73). However, heterogeneous networks (i.e., where neurons possess distinct excitability profiles; Fig. 1*B*) did not exhibit such transitions in response to an identical modulatory input. Instead, intrinsic excitability variability was found to suppress these transitions, and low-frequency uncorrelated activity persisted throughout. Intrinsic variability among neurons was implemented by varying the effective firing rate response functions, reflecting diverse degrees of cellular excitability. We randomized firing rate response thresholds, in which excitability is sampled from a normal distribution of mean 0 and variance *σ*_{H}^{2} (Fig. 1*C*; *Materials and Methods*). This way of characterizing heterogeneity is well aligned with experimental evidence that excitability is a key target for intrinsic plasticity mechanisms (3, 37, 63, 65, 74) and echoes numerous previous studies on heterogeneous networks (37, 41, 43, 75). This heterogeneity leads to differing response functions for individual neurons (Fig. 1*D*) as well as variable responses to perturbations such as those caused by modulatory input (Fig. 1*E*).

### Effect of Excitability Heterogeneity on Network Dynamics.

We further characterized the influence of heterogeneity on the dynamics and volatility of our network in Fig. 2. In the homogeneous case (i.e., *σ*_{H}^{2} = 0), neuronal activity was indeed characterized by alternating epochs of stability and instability, as portrayed in Fig. 2*A*. Transient stable dynamics index states in which neural activity is stable and relaxes back to equilibrium after perturbations. In contrast, other periods were characterized by unstable neural activity in which the activity of individual neurons diverges away from equilibrium. Such periods of instability result from critical transitions (15) in which neural activity departs from stability, and may diverge, become synchronous and/or chaotic.

Fig. 2.

To illustrate the corresponding activity of the neurons, we generated exemplar Poisson spike trains whose rates match those predicted by our network model (*Materials and Methods*). As illustrated in Fig. 2*A*, modulatory input triggered abrupt and frequent transitions between regimes of quiescence (i.e., low firing rates) and high-frequency firing in which burst-like activity could also be observed. Such periods of intense spiking repetitively followed the onset of instability, resulting in network-wide correlated fluctuations. The presence of such covariability is also apparent through the high variance of the mean network firing rate. These observations suggest that slowly fluctuating modulatory input may expose the volatility of homogeneous networks by revealing sudden stability transitions and dynamical regimes that are qualitatively distinct, exemplifying nonresilient behavior.

In contrast, when excitability heterogeneity is introduced (i.e., *σ*_{H}^{2} > 0) in Fig. 2*B*, significantly different dynamics are observed despite identical connectivity and modulatory drive. Indeed, the long-term dynamics of the network are resilient: Robust, invariant stability replaced the intermittent behavior seen in the homogeneous case. No transitions between stability and/or instability occurred. As a direct consequence of heterogeneity, redundancy in the neurons’ equilibria is broken: Neuron fixed points were now distributed with a mean *μ*_{uo} = *B* + *S*_{o} and variance *σ*_{uo}^{2} (*Materials and Methods*). As can be seen in the representative spiking activity of neurons, such differences in excitability yield increasingly moderate responses to modulation. Neurons with high excitability compensate for those with low excitability, and vice versa, thereby restraining network fluctuations by distributing them across the population. This translates into more gradual changes in the network mean firing rate, compared to the homogeneous case (cf. Fig. 2*A* and *B*).

We quantified such changes in mean network firing rate both numerically and mathematically, to characterize network-wide excitability (*Materials and Methods*). Fig. 2*C* shows that excitability heterogeneity promotes increasingly linear, moderate changes in mean firing rate resulting from modulatory input. In contrast, homogeneous networks exhibit steep, nonlinear responses. Such results confirm that differences in excitability restrain firing rate fluctuations across the population by limiting sudden increases in activity. Counterintuitively, a consequence of these results is that heterogeneity also enhances the net excitability of the network: Smaller modulatory amplitudes are required to trigger a (albeit weaker) response compared to homogeneous networks. Conversely, decreased heterogeneity can be seen as normalizing excitability across neurons. Such network changes echo those resulting from a transition between class I and class II excitability (76), in line with previous experimental observations comparing healthy and epileptogenic neurons (37) as well as other pathologies (77). Such linearization in mean firing rate response is known to have profound consequences on the stability of neural networks (37, 78, 79).

To better understand how this linearization arises, we next evaluated how neuron-to-neuron differences in excitability profiles impacted firing rate covariability. To do so, we considered regimes of stability (in both the homogeneous and heterogeneous cases) and quantified pairwise firing rate correlations between neurons jointly driven by a common modulatory input of varying amplitude. As shown in Fig. 2*D*, neuron-to-neuron differences in excitability threshold limit covariation between their individual firing rates. We validated these results analytically in the weak connectivity limit (*Materials and Methods*). These results provide context to those presented in Fig. 1*A* and *B*, in which the steepness of the response of heterogeneous networks is suppressed in favor of smooth uncorrelated fluctuations. The effect was also found to scale with modulatory amplitude. We note that this is a direct consequence of the effect portrayed in Fig. 2*C*. This analysis suggests that excitability heterogeneity also manifests through the decorrelation of neural responses—thereby limiting network-wide (i.e., correlated) firing rate fluctuations caused by modulatory inputs (59, 80). This decorrelation is complementary yet distinct from the synaptic mechanisms of excitatory–inhibitory balance (81), relying instead on cell-to-cell differences in excitability profiles.

The influence of heterogeneity on stability (and hence volatility) was also confirmed by numerically computing Lyapunov exponents across independent realizations of the network connectivity and independent trials, in which the system exhibits different configurations and is exposed to variable modulatory inputs. The network is unstable whenever Lyapunov exponents are positive, and stable otherwise. As shown in Fig. 2*E*, persistent positive mean Lyapunov exponents with large variance are observed in the homogeneous case. Such volatility is expected from the theory of nonlinear balanced networks (71, 72). In the heterogeneous case, Lyapunov exponents remained bounded below zero throughout. These results confirm persistent stability and suggest enhanced resilience in the presence of excitability heterogeneity.

### Volatility of Homogeneous Networks.

Figs. 1 and 2 illustrate the cell and circuit manifestations of the profound dynamical changes that can arise from excitability heterogeneity and its attendant loss. To better understand the mechanisms underlying the cell and circuit effects of excitability heterogeneity on dynamical stability, we harnessed spectral theory for random matrices (28, 61, 82), which is a way of characterizing the stability of large-scale networks and dynamical systems. By construction, our network model is subject to the circular law of random matrix theory (61, 82), in which the eigenvalues—defining stability—are constrained with high probability to a disk in the complex plane. The eigenvalues populating that disk are complex numbers. If they all possess negative real parts, the network is stable: Fluctuations are absorbed by the network, and neural activity returns to its equilibrium after being perturbed. In our network model, such stability manifests through low-intensity, asynchronous neuronal firing smoothly driven by modulatory inputs (e.g., Fig. 2*B*). However, if some eigenvalues have positive real parts, the network is instead unstable: Neural activity diverges once perturbed, exhibiting synchronous and/or chaotic dynamics (e.g., Fig. 2*A*). In the intermediate case, when eigenvalues lie close to the imaginary axis, the network is said to reside at a critical point sitting between stability and instability, commonly referred to as metastable. The circular law provides a powerful way to analyze the behavior of a large number of eigenvalues simultaneously, without the challenging task of considering them individually. We direct the reader toward ref. (83) for an excellent introduction and discussion on spectral analysis, the circular law, and its applications in neuroscience.

The circular law (61, 82) has been extensively used in neuroscience to characterize the dynamics of neural circuits (83–86) and in macroecology to examine the stability of complex natural systems, such as food webs (12, 14, 23–28). It provides a powerful and intuitive framework for characterizing resilience in our network model. Given that eigenvalues are constrained within a disk of radius *Γ* (called the spectral radius), one can thus quantify stability by considering how *Γ* changes with modulatory input, heterogeneity, and other control parameters. The spectral radius may be viewed as a measure of robustness of neuronal populations, indexing how much networks resist change and retain stability.

Geometrically, changes in the spectral radius *Γ* (reflecting changes in the size of the spectral disk) result either in the clustering or dispersion of eigenvalues centered around the local relaxation gain ℓ (*Materials and Methods*). As can be seen in Fig. 3*A*, whenever the spectral radius *Γ* becomes larger (resp. smaller) than |ℓ|, eigenvalues will cross the imaginary axis, and the network becomes unstable (resp. stable) with high probability (30, 31, 61, 87–89). If the spectral radius remains commensurate with |ℓ|, then the network is considered metastable and in the vicinity of a critical point.

Fig. 3.
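The circular-law picture can be checked numerically in a few lines. The sketch below is an illustration under assumed parameters, not the paper’s analysis: it builds a sparse, zero-mean random coupling matrix, places the relaxation gain ℓ = −1 on the diagonal, and compares the empirical spread of eigenvalues around ℓ with the predicted disk radius Γ = √(Nρσ_w²).

```python
import numpy as np

rng = np.random.default_rng(1)

N, rho = 500, 0.1                  # network size and connection probability
sigma_w = 1.0 / np.sqrt(N * rho)   # weight scale chosen so the predicted radius is ~1
ell = -1.0                         # local relaxation gain (center of the spectral disk)

# Sparse random coupling with zero-mean entries (balanced E/I weights)
mask = rng.random((N, N)) < rho
J = rng.normal(0.0, sigma_w, (N, N)) * mask

# Jacobian-like matrix: self-relaxation on the diagonal plus random coupling
A = ell * np.eye(N) + J
eig = np.linalg.eigvals(A)

radius_empirical = np.abs(eig - ell).max()     # empirical spread around the center
radius_theory = np.sqrt(N * rho * sigma_w**2)  # circular-law prediction
print(radius_empirical, radius_theory)
```

Stability then reduces to whether the disk centered at ℓ with radius Γ stays left of the imaginary axis, i.e., whether Γ < |ℓ| with high probability.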

While the net size of the spectral radius determines the system’s stability, how this spectral radius changes with respect to a control parameter (e.g., modulatory input amplitude *S*_{o}) reflects the system’s resilience or volatility. If eigenvalues (which are contained in the spectral disk) move significantly in the complex plane due to perturbations (e.g., modulatory input or other control parameters), the network is prone to sudden qualitative changes in dynamics. Indeed, as neural systems can reside in both stable (relaxation) and/or unstable (oscillations, synchrony, chaos) functionally meaningful dynamic regimes, the value of the eigenvalues (inferred from the spectral radius *Γ*) evaluated at a given moment in time conveys little information. It is instead how these eigenvalues change that reflects resilience or volatility. To examine this sensitivity, we subjected the homogeneous network to a thorough spectral analysis (*Materials and Methods*). By virtue of having homogeneous excitability, individual neurons’ steady states were found to be identical across the network and entirely dependent on the modulatory input amplitude (i.e., *u*_{j}^{o} = *B* + *S*_{o}), as expected. This is fully consistent with the dynamics observed in Fig. 2. Over short time scales, the modulatory input *S*(*t*) ≈ *S*_{o} can be considered constant: Its influence on the spectral radius *Γ* may thus be quantified. Indeed, as can be seen in Fig. 3*A*, both numerically and analytically, the spectral radius was found to be highly sensitive to modulatory input: Changes in *S*_{o} resulted in high-amplitude clustering and/or dispersion of the eigenvalues around the relaxation gain, causing frequent transitions between stability and instability. The spectral radius *Γ* was found to increase with the modulatory input amplitude (*S*_{o}), indicating that such fluctuations generally lead to instability.

We confirmed this volatility in Fig. 3*B*, alongside the alignment between our numerical and analytical calculations. Time-dependent changes in the amplitude of the modulatory input (such as those exemplified in Fig. 2) significantly contract and/or expand the spectral radius *Γ*, whose value intermittently crosses the stability threshold, leading to an alternation between stability and instability. As *S*_{o} fluctuates, the network undergoes epochs of instability, alternating with periods where neural activity is either suppressed (*S*_{o} strongly inhibiting) and/or saturated (*S*_{o} strongly exciting). We note that fast changes in *S*(*t*) might cause the network to cross the unstable regime briefly; instability is then difficult to observe since the system does not evolve sufficiently fast to exhibit unstable observable dynamics. Results plotted in Fig. 3*C* show a high dependence of the spectral radius on modulatory input amplitude (*S*_{o}). Stability (i.e., relaxing neural activity, small *Γ*) characterizes inhibitory and/or low-amplitude modulatory input, while higher amplitudes lead to instability (i.e., divergent, chaotic, and/or synchronous neural activity, large *Γ*) and eventually saturation (i.e., neural activity plateaus, small *Γ*). As shown in Fig. 3*D*, our analysis also revealed that the spectral radius *Γ*—and hence the dispersion of eigenvalues in the complex plane—increases with network size (*N*), connection probability (*ρ*), firing rate response gain (*β*), as well as net synaptic strength (*μ*); individually or collectively, all these network features diminish the system’s resilience. This is in line with previous results (25, 30), notably on balanced networks (71, 72), highlighting that homogeneous networks are generically prone to instability. Taken together, our analysis indicates that in sparse, balanced, and homogeneous networks, high sensitivity to modulatory input and other control parameters underlies the system’s volatility.

### Intrinsic Excitability Heterogeneity Tunes Stability and Resilience.

Numerous previous studies (39, 41, 60, 75) have shown that heterogeneous neural systems adapt and converge toward a regime of metastability to optimize responses and coding properties. Such metastability manifests itself through critical-like neural activity (90, 91) and/or dynamics residing in the vicinity of a state transition (92). From the perspective of the aforementioned circular law, such dynamical properties emerge whenever these networks are brought toward and operate in dynamical regimes resulting from an intermediate spectral radius, neither too small (i.e., strong stability leading to quiescence; eigenvalues all have negative real parts) nor too large (i.e., strong instability leading to divergence, chaos, and/or synchrony; some eigenvalues have positive real parts).
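One way to build intuition for how heterogeneity produces such an intermediate regime is through the response gain f′ at the operating point, since the spectral radius grows with the neurons’ local gains. The sketch below is a hedged illustration (a logistic rate function with gain β = 5 and unit-variance thresholds are assumptions for this example only): averaging f′ over a threshold distribution raises the population gain at low drive and lowers its peak, flattening the curve that the spectral radius tracks as *S*_{o} varies.

```python
import numpy as np

rng = np.random.default_rng(2)
beta = 5.0   # firing rate response gain (illustrative value)

def fprime(u):
    """Derivative of the logistic response: the neuron's local gain."""
    s = 1.0 / (1.0 + np.exp(-beta * u))
    return beta * s * (1.0 - s)

S = np.linspace(-2.0, 2.0, 201)        # modulatory drive amplitudes
theta = rng.normal(0.0, 1.0, 10_000)   # heterogeneous thresholds (sigma_H = 1)

gain_hom = fprime(S)                                        # shared gain, no diversity
gain_het = np.array([fprime(s - theta).mean() for s in S])  # population-averaged gain

# Heterogeneity raises the gain at low drive (enriching weak dynamics) and
# lowers its peak (stabilizing strong drive), flattening the gain curve.
print(gain_hom.max() - gain_hom.min(), gain_het.max() - gain_het.min())
```

The flattened population gain is the intuition behind the bidirectional, context-dependent control described next.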

Our previous findings (37) suggest that intrinsic excitability heterogeneity should improve network resilience, as portrayed in Figs. 1 and 2. To better understand the mechanism behind these dynamics, we adapted the spectral theory for large-scale random systems (28, 61, 82) to expose the influence of excitability heterogeneity on the distribution of eigenvalues. We specifically explored the susceptibility of the spectral radius *Γ*—and hence the dispersion of eigenvalues in the complex plane—to modulation across various degrees of heterogeneity (*σ*_{H}^{2} > 0) (*Materials and Methods*). Our analysis revealed two main roles played by diversity in network dynamics: a) homeostatic control of network stability and b) the promotion of its resilience.

Indeed, we found that intrinsic excitability heterogeneity is a homeostatic mechanism exerting bidirectional and context-dependent control on network stability: enriching the dynamics whenever they are too poor or, conversely, stabilizing network activity whenever it is too unstable. As shown in Fig. 4*A*, heterogeneity increased the spectral radius (*Γ*) for small modulatory input amplitudes (*S*_{o}). This is also illustrated at the cell and circuit level in Fig. 2*B*. Excitability heterogeneity maintains network activity during low-drive (*S*_{o}) states, and restrains activity in high-drive states. By doing so, heterogeneity prevents both quiescence and excessive firing, while decorrelating firing rates. For low amplitudes of modulation, lack of heterogeneity yields highly stable neural activity that invariably relaxes back to equilibrium whenever perturbed: The spectral radius is infinitesimal, and eigenvalues are clustered around the relaxation gain ℓ. Introducing excitability heterogeneity expands the spectral disk, enriching network dynamics toward instability. Surprisingly, heterogeneity has the opposite effect for higher modulatory input amplitudes, for which the system is highly unstable: Here, heterogeneity instead contracts the spectral disk and stabilizes network dynamics. This contextual control of excitability heterogeneity on stability, which depends on modulatory fluctuations, suggests that heterogeneity tunes the spectral disk—and hence eigenvalue dispersion—toward an optimal intermediate size.

Fig. 4.

Indeed, we found that intrinsic excitability heterogeneity is a homeostatic mechanism exerting bidirectional and context-dependent control on network stability: enriching the dynamics whenever they are too poor, or conversely, stabilizing network activity whenever it is too unstable. Indeed, as shown in Fig. 4

*A*, heterogeneity increased the spectral radius (*Γ*) for small modulatory input amplitudes (*S*_{o}). This is also illustrated at the cell and circuit level in Fig. 2*B*, where excitability heterogeneity maintains network activity during low-drive (*S*_{o}) states, while dampening the activity in high-drive states, preventing both quiescence and excessive firing, while decorrelating firing rates. For low amplitudes of modulation, lack of heterogeneity yields highly stable neural activity that invariably relaxes back to equilibrium whenever perturbed: The spectral radius is infinitesimal, and eigenvalues are clustered around the relaxation gain ℓ. Introducing excitability heterogeneity expands the spectral disk, enriching network dynamics toward instability. Surprisingly, higher modulatory input amplitudes, for which the system is highly unstable, led to the opposite. Indeed, heterogeneity was found to here instead contract the spectral disk and stabilize the dynamics. This contextual control of excitability heterogeneity on stability which depends on modulatory fluctuations suggests that heterogeneity tunes the spectral disk—and hence eigenvalue dispersion—toward an optimal intermediate size.The contextual influence of excitability heterogeneity on network stability described above stems from a damping of spectral radius sensitivity with respect to modulatory input (Fig. 4

*A*). Indeed, sharp changes in *Γ* caused by *S*_{o} (such as those seen in Fig. 3 *B* and *C*) were evened out by heterogeneity, resulting in an enrichment or stabilization of the dynamics as the spectral radius increased or decreased, respectively. Specifically, heterogeneity increased the spectral radius for low modulatory input amplitudes, while doing the opposite for high amplitudes. The homeostatic influence of heterogeneity on network stability is confirmed in Fig. 4*B*: Irrespective of modulatory input amplitude *S*_{o}, heterogeneity tuned the spectral radius—through either enrichment or stabilization—toward the same intermediate value.

To confirm the alignment of our mathematical analysis and numerical simulations, we computed the variance of the neurons' fixed point distribution (i.e.,

*σ*_{uo}^{2}), which was also found to depend on the degree of heterogeneity (Fig. 4*C*). Introducing heterogeneity consistently prevented stability transitions, rendering the system more resilient. Indeed, as can be seen in Fig. 4*D*, the transition rate—corresponding to the number of bifurcations observed in the network per unit time—decreased monotonically, confirming the trend seen in Figs. 1 and 2.

Another important conclusion stemming from our analysis is that excitability heterogeneity generically enhances network resilience. As can be seen from Fig. 4, increasing excitability heterogeneity significantly damped spectral radius changes resulting from modulatory input. This implies that excitability heterogeneity anchors eigenvalues in the complex plane, limiting how much they move in response to modulatory input. As discussed above, this limits the probability of changes in stability, preventing bifurcations and enhancing resilience. Indeed, excitability heterogeneity made

*Γ* less sensitive to changes in *S*_{o} and, by doing so, quenched volatility. This was confirmed in Fig. 5*A* by systematically varying the modulatory input amplitude and the degree of heterogeneity while measuring the spectral radius. We found that heterogeneity damped the sensitivity of network stability to *S*_{o}: Beyond a given degree of heterogeneity, the spectral radius became effectively independent of *S*_{o} (dashed box in Fig. 5*A*). To encapsulate the effect of excitability heterogeneity on the network's resilience, we computed both the spectral volatility (*κ*)—which measures the effective sensitivity of the spectral radius to a given control parameter—and the resilience parameter (ℛ)—its reciprocal—as a function of modulatory input amplitude *S*_{o}. These metrics quantify how invariant the eigenvalue distribution is to changes in a given control parameter, as gauged by variations of the spectral radius (see Materials and Methods). As shown in Fig. 5*B*, heterogeneity decreased the spectral volatility and optimized resilience to modulatory input. Collectively, these results demonstrate that excitability heterogeneity greatly enhances the resilience of sparse balanced networks by anchoring the eigenvalues in the complex plane and decreasing the sensitivity of their distribution to modulatory input.
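This bidirectional effect can be sketched numerically (our own illustration, not the authors' published code; the network size, gain, offset, and heterogeneity values below are assumptions, and the balanced excitatory–inhibitory graph is replaced by simple zero-mean sparse sampling). We compare the radius of the coupling eigenvalue disk of the linearized rate dynamics, with and without threshold heterogeneity, at low and high drive:

```python
import numpy as np

rng = np.random.default_rng(0)
N, rho, sigma_w, beta, B = 300, 0.2, 1.0, 2.0, -1.0

# Sparse zero-mean weights: a simplified stand-in for the balanced E/I graph.
mask = rng.random((N, N)) < rho
W = np.where(mask, rng.normal(0.0, sigma_w, (N, N)), 0.0)
np.fill_diagonal(W, 0.0)  # no self-connections

def spectral_radius(S_o, sigma_h2):
    """Radius of the coupling eigenvalue disk at the operating point
    u ~ S_o + B + h_i, using the gain of f(u) = (1 + erf(beta*u))/2."""
    h = rng.normal(0.0, np.sqrt(sigma_h2), N)  # heterogeneous thresholds
    gain = beta / np.sqrt(np.pi) * np.exp(-beta**2 * (S_o + B + h) ** 2)
    return np.abs(np.linalg.eigvals(W * gain)).max()

r_lo_homo, r_lo_het = spectral_radius(0.0, 0.0), spectral_radius(0.0, 4.0)  # low drive
r_hi_homo, r_hi_het = spectral_radius(1.0, 0.0), spectral_radius(1.0, 4.0)  # high drive
print(f"low drive:  {r_lo_homo:.2f} -> {r_lo_het:.2f}")
print(f"high drive: {r_hi_homo:.2f} -> {r_hi_het:.2f}")
```

With these assumed values, heterogeneity expands the disk when the population sits far below threshold (low drive) and contracts it when the population sits near maximal gain (high drive), mirroring the contextual control described above.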

### Heterogeneity May Stabilize Networks across Changes in Connectivity.

Our results so far indicate that excitability heterogeneity implements a long time-scale homeostatic control mechanism that promotes resilience in networks exposed to modulatory inputs. However, other control parameters might influence the neural systems’ stability over these time scales. Neural systems are subjected to perpetual change, even in the absence of modulatory fluctuations and/or stimuli. Synaptic plasticity is a salient example: During learning, the number of synapses and/or the effective synaptic weights change, as a consequence of processes such as long-term potentiation (LTP) and depression (LTD) (10). Networks undergoing such plasticity-induced structural modifications of their connectivity tend to be weakly resilient. Indeed, most forms of synaptic plasticity lead to the development of instability, in which run-away neural activity departs from baseline and needs to be compensated/stabilized through various homeostatic feedback processes (5, 66, 67), a few of which have found experimental support (93).

We asked whether excitability heterogeneity, on its own, could prevent stability transitions in neural systems undergoing plasticity-induced changes in connectivity. Our previous analysis demonstrates that the spectral radius

*Γ*—and hence the dispersion of eigenvalues in the complex plane—increases with connection probability (*ρ*) as well as net synaptic strength (*μ*). This suggests that long time-scale changes in these network features—prone to increase together or independently during learning—generically promote instability and volatility. As can be seen in Fig. 6, this is confirmed numerically: Increasing either the connection probability (Fig. 6*A*) or the synaptic strength (Fig. 6*B*) over a physiologically realistic range resulted in instability, as measured with the mean Lyapunov exponent. However, this occurred only in the homogeneous case. Indeed, increasing the heterogeneity suppressed this instability, with the mean Lyapunov exponent remaining negative over the range of values explored, i.e., the system did not experience any stability transitions.
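A minimal numerical sketch of this effect (an illustration under assumed parameter values, not the authors' simulation code, which assesses stability via Lyapunov exponents rather than eigenvalues): growing the connection probability pushes the largest real part of the linearized dynamics past zero in the homogeneous case, while threshold heterogeneity keeps it negative over the same range:

```python
import numpy as np

rng = np.random.default_rng(2)
N, sigma_w, beta, ell = 300, 1.0, 2.0, -5.0  # relaxation gain ell < 0 (assumed values)

def max_real_eig(rho, sigma_h2):
    """Largest real part of the eigenvalues of ell*I + W*diag(f'(h_i)),
    evaluated at a high-drive operating point u = 0 where the homogeneous
    gain of f(u) = (1 + erf(beta*u))/2 is maximal."""
    mask = rng.random((N, N)) < rho
    W = np.where(mask, rng.normal(0.0, sigma_w, (N, N)), 0.0)
    np.fill_diagonal(W, 0.0)
    h = rng.normal(0.0, np.sqrt(sigma_h2), N)  # heterogeneous thresholds
    gain = beta / np.sqrt(np.pi) * np.exp(-beta**2 * h**2)
    J = ell * np.eye(N) + W * gain  # column j scaled by the gain of neuron j
    return np.linalg.eigvals(J).real.max()

sparse_homo = max_real_eig(0.05, 0.0)  # sparse, homogeneous
dense_homo = max_real_eig(0.40, 0.0)   # dense, homogeneous: eigenvalues cross zero
dense_het = max_real_eig(0.40, 9.0)    # dense but heterogeneous
print(sparse_homo, dense_homo, dense_het)
```

In this sketch, densifying the homogeneous network destabilizes it, while the same densification with heterogeneous thresholds leaves the dominant eigenvalue in the stable half-plane, consistent with Fig. 6.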

Mathematical analysis affirms this finding. Fig. 6

*C* shows that the spectral radius *Γ* increases monotonically with both connection probability (*ρ*) and synaptic strength (*μ*), underscoring such systems' volatility and associated vulnerability to stability transitions. Slow time-scale changes in connectivity resulting from plasticity (illustrated by the gray curves linking points a and b in Fig. 6*C*) result in stability transitions. Introducing heterogeneity moved the effective stability threshold (i.e., *Γ*(*ρ*, *μ*) = |ℓ|) further out in parameter space, compensating for the destabilizing influence of increases in connection probability and synaptic strength (cf. Fig. 3*D*). In this case, slow time-scale changes in connectivity become increasingly unlikely to cause stability transitions as the net size of the stability region increases. In addition to this stabilizing influence, heterogeneity was also found to promote resilience by anchoring the eigenvalue distribution in the complex plane. We thus computed the resilience metric, now as a function of connection probability (*ρ*; ℛ_{ρ}) and synaptic strength (*μ*; ℛ_{μ}). As shown in Fig. 6*D*, increasing the degree of excitability heterogeneity enhanced resilience for both control parameters, promoting the persistence of stability by decreasing the spectral volatility, i.e., the susceptibility of the spectral radius to changes in connection probability (*ρ*) and synaptic strength (*μ*).

## Discussion

In the last several years, with continued advancements in high-throughput single-cell RNA sequencing (scRNAseq) (94), it has become abundantly clear that within cell types there is a transcriptomic continuum rather than discrete subtypes (44, 45, 47). This within-cell-type transcriptomic diversity is also reflected in functional diversity in excitability features in human (4, 37) and rodent neurons (45, 47) and is likely a direct manifestation of the observed transcriptomic variability, given the correlation between the transcriptome and electrophysiological properties of neurons (95, 96). In light of these technical advances in describing the properties of individual neurons at scale, a major challenge for neuroscience is to bridge the divide between individual neuronal properties and network function (97). While bridging this gap remains a significant challenge experimentally, although advances in imaging technologies, NeuroPixels (98), Ca

^{2+} imaging (99), and ultrasound (100) are continually closing it, computational and mathematical analyses hold the promise of simplifying the complexity of the brain while addressing this critical divide between brain structure and function (101).

It is within this context of bridging scales that we here connect neuronal diversity—a seemingly fundamental design principle of the brain—with the stability of cortical dynamics. Collectively, our results suggest that excitability heterogeneity makes balanced sparse neural networks insensitive to changes in many key control parameters, quenching volatility. Our analyses and results can be generalized across a wide range of network sizes, connectivity profiles, topologies, types of heterogeneity, dynamics (e.g., asynchronous, rhythmic), and individual neuron response properties. Indeed, while we have focused here on Erdős–Rényi-type topology, our results may be easily extended to other graph structures (e.g., multimodal, scale-free, cascade models) through a proper rescaling of the spectral radius (29, 32) and can also be modified to study time-delayed systems (88, 102). In particular, the distribution of eigenvalues reported here might adopt a different shape whenever predator–prey (i.e., excitation–inhibition), competition, and/or mutualistic interactions are introduced, yet such systems remain fully amenable to the study of heterogeneous neuronal properties (62, 84–86, 103, 104).

Our perspective has been shaped in part by our initial work in the context of epilepsy, a pathological condition in which individuals slip in and out of pathological dynamical brain states called seizures (21, 105), and by the finding that excitability homogenization renders circuits more prone to such seizure-like states (37). By drawing parallels to this previous work (37), we obtain a reasonable approximation of what the heterogeneities studied here might translate to in an in vitro or in vivo setting. Indeed, the model neurons implemented in this study are analogous to those used in this previous work and allow us to conclude that changes in

*σ*_{H}^{2} from values on the order of 10^{−2} to values on the order of 10^{1} roughly correspond to increasing the heterogeneity from pathological levels of approximately 1 mV (SD in the distance to threshold) to physiological levels of approximately 10 mV. While this correspondence is by no means exact given the intricacies of the respective neuron models, it supports the range of relevant heterogeneity that has been (37) and could be observed experimentally.

Our choice to explore excitability heterogeneity is not haphazard, and for four reasons, it is not surprising that we find it has profound effects on the resilience of brain circuits. First, as discussed above, cellular diversity is the norm in the brain and thus appears to be a clear “design principle” of neuronal circuits, which we accept at face value to be beneficial to the brain and for which the biological machinery clearly exists (106). Second, there is ample evidence, both experimental and computational, that excitability heterogeneity aids information coding in the brain, decorrelating brain networks while expanding their informational content (3, 74). Third, we have shown that among a number of experimentally determined electrophysiological features of human neurons, it is the loss of excitability heterogeneity that accompanies epilepsy (37), echoing the normalization in excitability that accompanies other disease states (77); our mathematical and computational work here shows that excitability heterogeneity prevents sudden transitions to highly correlated, information-poor brain activity. Lastly, neuronal excitability is highly malleable. This malleability arises from the process of intrinsic plasticity, whereby neuronal excitability is modulated by the neuron's past activity (5, 63). Indeed, learning is accompanied by changes in voltage- and calcium-activated channels that are principally involved in setting resting membrane potential, input resistance, and rheobase (63). These channels are also altered in a number of neuropsychiatric conditions, including epilepsy (65). Excitability thus represents a local parameter tuned to the activity of each neuron within the network in which it is embedded.

In light of these results, various forms of neuronal (44–46) and glial (36) heterogeneity support the resilience and qualitative invariance of neural circuits over extended time scales. This holds true in healthy brains despite continuous external and internal changes driven by factors including modulatory inputs (6–9), environmental fluctuations and/or stimuli (1, 22, 62), and changes in connectivity such as those resulting from synaptic plasticity (66, 67). It also holds true for development and aging, where brain dynamics remain stable over many decades despite the structural changes that accompany time, and for pathological processes, where failure to regulate brain activity in the face of insults predisposes the brain to dynamic volatility (15, 18). A confluence of experimental (1, 4, 22, 33–37) and theoretical studies (37–43, 59) has highlighted the role of heterogeneity in brain dynamics and stability. Notably, phenotypic diversity has been shown to promote the stability of brain function and its associated dynamics through degeneracy, redundancy, and covariation (1, 22, 107). In particular, in ref. 32, the authors provided a comprehensive overview of the destabilizing influence of motif heterogeneity—the variability in connection degree or, alternatively, a lack of redundancy in connectivity—on complex random graphs. Recontextualized from the perspective of neural systems, these results and ours suggest that networks exhibiting redundant connectivity motifs alongside node heterogeneity will generically exhibit enhanced stability and resilience, corroborating numerous experimental findings (1).

As with all computational and theoretical work, there are limitations to the contexts in which these results apply. First, our model represents a balance between neurophysiological relevance and mathematical tractability; more detailed and biophysically rich models are certainly required to provide a more comprehensive understanding of the role of diversity in network stability. Second, phenotypic diversity certainly impacts neural activity beyond excitability. A more thorough characterization of neural variability is surely warranted to reinforce the alignment between our model and experimental data, notably to broaden the scope of our predictions. Third, our model does not consider stochastic fluctuations, which are known to be ubiquitous in neural systems (108) and to influence their stability (79, 109, 110). We have neglected this source of variation due to the long time scales considered here and the moderate nonlinearity of the system (i.e., parameterized by the firing rate response gain

*β*). Future work is required to incorporate noise in both our simulations and analyses.

In summary, our results position excitability heterogeneity and, possibly more generally, neuronal diversity (from transcriptomic and functional studies) as a critical design feature of the brain, ensuring that its rich dynamics are preserved across a wide set of network parameters. Furthermore, resilience of dynamics to scale (physical scale, as in the number of neurons, or connectivity, as in the strength of connections) is critically important for a growing, developing brain as well as an aging one. Excitability heterogeneity might provide the dynamical resilience required to stabilize brain dynamics through these profound structural changes.

## Materials and Methods

### Network Model.

We consider a large network of

*N* neurons whose activity evolves according to the interplay between local relaxation, recurrent synaptic connectivity, and slowly varying modulatory input. This model provides a description of dynamics unfolding over extended time scales, hence quantifying mean neuronal activity. The mean somatic membrane potential of neurons *u*_{i}(*t*), *i* ∈ [1, *N*], obeys the following set of nonlinear differential equations:$$\begin{array}{c}\hfill \tau \frac{d}{\mathit{dt}}\mathbf{u}=\mathbf{L}[\mathbf{u}]+\mathbf{W}f[\mathbf{u}+\mathbf{H}]+\mathbf{B}+\mathbf{S}(t),\end{array}$$

[1]

where

*τ* ≫ 1 represents the slow time scale at which the dynamics occur. In the following, we rescale time by *t* → *tτ* for convenience. The model in Eq. **1** is both flexible and general, encompassing the mean behavior of a wide scope of interconnected neuron models involving excitatory and inhibitory interactions, such as the celebrated Wilson–Cowan and Jansen–Rit models. Depending on the spatial scale considered, which remains here undefined, such models can be considered either neuron-based (where nodes represent individual neurons—the perspective we adopt here) or population-based (where nodes represent assemblies of such neurons), geared toward the characterization of neuronal mean activity across extended time scales. The term $\mathbf{L}[\mathbf{u}]=\ell \mathbf{u}$ is a linear local relaxation term with rate ℓ < 0, and ${(\mathbf{f}[\mathbf{u}])}_{i}=f({u}_{i})=\frac{1}{2}(1+\phantom{\rule{0.333333em}{0ex}}\text{erf}[\beta {u}_{i}])$ represents the firing rate response function of neurons, with the Gaussian error function erf[⋅]. This function relates the membrane potential activity to the firing rate of the neurons. The choice of such response functions is motivated by the fact that they can be generalized to various classes of excitability by a proper adjustment of the response gain *β*. For instance, small values of *β* yield class I excitability, while larger values correspond to class II (76). Both cases are examined in the subsequent analysis. The vector-valued term $\mathbf{H}$ implements node diversity through spatially heterogeneous neuronal excitability, i.e., variable firing rate thresholds between neurons. The entries of $\mathbf{H}$ are sampled from a zero-mean Gaussian distribution of variance ${\sigma}_{\mathbf{H}}^{2}$. The neurons' baseline activity $\mathbf{B}<\mathbf{0}$ is a constant offset term used to set the neurons in a subthreshold regime in the absence of input. Last, the network in Eq. **1** is further subjected to a slow modulatory input $\mathbf{S}(t)$.

The connectivity matrix $\mathbf{W}$ in Eq.

**1** specifies synaptic coupling between any pair of neurons. We assume randomly distributed excitatory and inhibitory coupling (89), with connection probability *ρ*. This connectivity motif corresponds to a weighted Erdős–Rényi random graph; we emphasize, however, that the following results may be easily extended to other topologies (e.g., ref. 32). The strength of these synaptic connections is Gaussian-distributed with means *μ*_{e} and *μ*_{i}, variances *σ*_{e}^{2} and *σ*_{i}^{2}, and probability density functions *p*_{e} and *p*_{i}, respectively. We ensure that there are no self-connections, i.e., *W*_{ii} = 0 ∀ *i*. In addition, we parameterize the relative density of excitatory versus inhibitory connections by a coefficient *f*, 0 ≤ *f* ≤ 1. Consequently, the probability density function of synaptic weights *W*_{ij} may be written as *p* = *ρf* *p*_{e} − *ρ*(1 − *f*) *p*_{i} (89). Moreover, we choose balanced connectivity with $\sum _{j=1}^{N}{W}_{\mathit{ij}}=0$, i.e., the sum over excitatory and inhibitory synaptic connections vanishes at each node, yielding *μ*_{W} = 0 and a corresponding synaptic connectivity variance *σ*_{W}^{2} (see *SI Appendix* for details). The mean network activity ⟨*u*⟩(*t*) is defined as the average activity across all neurons, $\u27e8u\u27e9(t)=\sum _{i=1}^{N}{u}_{i}(t)/N$.

### Stability.

By construction, the network in Eq.

**1** subscribes to the circular law of random matrix theory (61, 82). According to this law, the distribution of eigenvalues—reflecting stability—is constrained with high probability within a disk in the complex plane centered around the local relaxation gain ℓ (called the spectral disk), whose radius *Γ* (called the spectral radius) can be determined analytically. If the disk is bounded to the left of the imaginary axis (i.e., all eigenvalues have negative real parts), the network is said to be stable, and its activity invariably relaxes back to its equilibrium after a perturbation. In our network model, such a stable equilibrium is characterized by weak, asynchronous neuronal firing. If some eigenvalues cross the imaginary axis (i.e., the spectral disk is too large and some eigenvalues possess positive real parts), the network is said to be unstable, leading to activity that diverges, is synchronous, and/or chaotic. In the intermediate case, when dominant eigenvalues (those possessing the largest real part) lie close to the imaginary axis, the network is said to reside at a critical point between stability and instability, commonly referred to as metastable. Through this framework, equilibria of Eq. **1** are stable whenever *Γ* < |ℓ|, metastable if *Γ* ≈ |ℓ|, and unstable otherwise. One may hence determine the spectral radius *Γ* and assess whether stability persists under changes in modulatory input and other control parameters. At short time scales (*τ* ≪ 1), the slowly varying modulatory input in Eq. **1** can be considered constant, i.e., **S**(*t*) = *S*_{o}(1, …, 1)^{t}. Consequently, if all neurons share the same excitability profile, i.e., $\mathbf{H}=0$, we find the stability condition$$\begin{array}{cc}\hfill {\mathrm{\Gamma}}_{\mathbf{H}=\mathbf{0},\mathbf{S}\ne \mathbf{0}}& =\sqrt{\frac{{\sigma}_{W}^{2}(N-1)\rho {\beta}^{2}}{\pi}}{e}^{-{\beta}^{2}{({S}_{o}+B)}^{2}}<|\ell |.\hfill \end{array}$$

[2]
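As a quick consistency check (a sketch with assumed parameter values, not part of the paper's analysis pipeline; the balanced row-sum constraint is replaced here by simple zero-mean sampling), the prediction of Eq. **2** can be compared against the eigenvalues of a sampled weight matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
N, rho, sigma_w, beta, S_o, B = 500, 0.2, 1.0, 1.0, 0.0, -0.5

# Zero-mean sparse weights as a stand-in for the balanced E/I graph.
mask = rng.random((N, N)) < rho
W = np.where(mask, rng.normal(0.0, sigma_w, (N, N)), 0.0)
np.fill_diagonal(W, 0.0)

# Homogeneous gain f'(S_o + B) of f(u) = (1 + erf(beta*u))/2.
gain = beta / np.sqrt(np.pi) * np.exp(-beta**2 * (S_o + B) ** 2)

# Empirical radius of the coupling disk vs the analytic radius of Eq. 2.
empirical = np.abs(np.linalg.eigvals(W * gain)).max()
analytic = np.sqrt(sigma_w**2 * (N - 1) * rho * beta**2 / np.pi) \
    * np.exp(-beta**2 * (S_o + B) ** 2)
print(f"empirical radius {empirical:.3f} vs Eq. 2 prediction {analytic:.3f}")
```

For a network of this assumed size, the sampled and analytic radii agree to within a few percent, as expected from the circular law.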

Eq.

**2** further reveals that increasing the network size (*N*), the connection probability (*ρ*), the variance of synaptic weights (*σ*_{W}^{2}; implying increases of the mean (*μ*) and variances *σ*_{e}^{2}, *σ*_{i}^{2}), as well as the firing rate response gain (*β*) all cause an expansion of the spectral radius *Γ*, as does the modulatory input amplitude *S*_{o}. The dependence of Eq. **2** on these various parameters is plotted in Fig. 3. Consequently, they all promote instability of Eq. **1**, in line with previous studies (32, 110).

The influence of excitability heterogeneity on stability may also be exposed by investigating how diversity in excitability thresholds, i.e., $\mathbf{H}\ne \mathbf{0}$, impacts the spectral radius

*Γ*. Recall that elements of $\mathbf{H}$ are random and sampled from a Gaussian distribution of mean 0 and variance ${\sigma}_{\mathbf{H}}^{2}$. In this case, the stability condition becomes instead$$\begin{array}{c}\hfill {\mathrm{\Gamma}}_{\mathbf{H}\ne \mathbf{0},\mathbf{S}\ne \mathbf{0}}=\sqrt{\frac{{\sigma}_{W}^{2}(N-1)\rho {\beta}^{2}}{\pi \sqrt{\gamma}}}{e}^{-{({S}_{o}+B)}^{2}{\beta}^{2}/{d}^{2}\gamma}<|\ell |,\end{array}$$

[3]

where $\gamma =1+4{\beta}^{2}({\sigma}_{{\mathbf{u}}^{o}}^{2}+{\sigma}_{\mathbf{H}}^{2})$ (see

*SI Appendix* for a detailed derivation).

### Volatility and Resilience.

Resilience refers to the qualitative invariance of dynamical states and the absence of stability transitions. To measure resilience, one may quantify the sensitivity of the spectral radius, i.e., how much

*Γ* fluctuates when exposed to changes in a given control parameter *P*. To do so, we defined the spectral volatility measure *κ*_{P}, reflecting the sensitivity of the spectral disk to changes in the parameter *P* over the range of values that this parameter can take (cf. *SI Appendix* for more details). Small volatility reflects persistence of the eigenvalue distribution and its overall resistance to stability transitions. We may thus introduce the reciprocal of *κ*_{P} to quantify resilience, i.e., ℛ_{P} = 1/(1 + *κ*_{P}). Note that low (high) volatility corresponds to high (low) resilience.

Given that we have derived the spectral radius analytically in Eq.

**3**, we are in a position to compute the spectral volatility and resilience analytically. Specifically, if one considers *P* = *S*_{o}, one obtains$$\begin{array}{c}\hfill {\kappa}_{{S}_{o}}=2{\sigma}_{W}\sqrt{\frac{(N-1)\rho \beta}{\pi \sqrt{\gamma}}},\\ \hfill \end{array}$$

[4]

$$\begin{array}{c}\hfill {\mathcal{R}}_{{S}_{o}}=\frac{\sqrt{\pi \sqrt{\gamma}}}{\sqrt{\pi \sqrt{\gamma}}+2{\sigma}_{W}\sqrt{(N-1)\rho \beta}},\end{array}$$

[5]

where $\gamma =1+4{\beta}^{2}({\sigma}_{{\mathbf{u}}^{o}}^{2}+{\sigma}_{\mathbf{H}}^{2})$. Eqs.

**4** and **5** show that the volatility and resilience of the network with respect to the modulatory input amplitude both depend on excitability heterogeneity through the factor *γ*. As *σ*_{H}^{2} > 0 increases, the spectral volatility *κ*_{So} decreases and the resilience ℛ_{So} increases.

### Network Statistics.

To generate representative spiking activity in Fig. 2

*A* and *B* associated with the dynamics of our network model (i.e., Eq. **1**), we simulated Poisson spike trains whose firing rates match those generated by neurons in our network. Specifically, we simulated Poisson processes *X*_{i} → Poisson(*r*_{i}), where *r*_{i} = *f*_{i} = *f*[*u*_{i} + *h*_{i}] corresponds to the firing rate of individual neurons. The resulting exemplar spiking activity corresponds to a given realization of such Poisson processes whose rates are identical to the ones predicted by our model.

Normalized mean network firing rates $\overline{r}$ in Fig. 2

*C* can be computed with and without heterogeneity. Since $\overline{r}=\sum _{i=1}^{N}{f}_{i}/N$, in the *N* ≫ 1 limit of large networks, we find$$\begin{array}{c}\hfill \overline{r}\approx \frac{1}{2}\left(1+\phantom{\rule{0.333333em}{0ex}}\text{erf}\left[\frac{\beta ({S}_{o}+B)}{\sqrt{1+2{\beta}^{2}({\sigma}_{{\mathbf{u}}^{o}}^{2}+{\sigma}_{\mathbf{H}}^{2})}}\right]\right),\end{array}$$

[6]

see

*SI Appendix* for more details.

Excitability heterogeneity also shapes pairwise firing rate correlations. Over long time scales (i.e.,

*τ* ≫ 1), the modulatory input is not constant and may be considered a normally distributed, uncorrelated random process of mean 0 and variance 2*S*_{o}^{2}, resulting in correlated fluctuations in neuronal activity. One may further consider the weak coupling limit, where the recurrent connectivity is small compared to the modulatory input amplitude (*σ*_{W}^{2} ≪ *S*_{o}^{2}). In this regime, the activity of individual neurons in Eq. **1** is strongly driven by modulatory inputs, exhibiting a normally distributed probability density function of mean 0 and variance *S*_{o}^{2}. We note that these approximations hold whenever the network is stable (i.e., *Γ* < |ℓ|).

For neurons

*i* and *j* with excitability thresholds *h*_{i} and *h*_{j}, respectively, the mean pairwise correlation coefficient *c*_{ij} between the firing rates *r*_{i} = *f*[*u*_{i} + *h*_{i}] and *r*_{j} = *f*[*u*_{j} + *h*_{j}] may be expressed as$$\begin{array}{c}\hfill {c}_{\mathit{ij}}=\frac{\phantom{\rule{0.333333em}{0ex}}\text{cov}[{r}_{i}{r}_{j}](\mathrm{\Delta}{h}_{\mathit{ij}})}{\sqrt{\phantom{\rule{0.333333em}{0ex}}\text{Var}[{r}_{i}]\phantom{\rule{0.333333em}{0ex}}\text{Var}[{r}_{j}]}},\end{array}$$

[7]

where cov[

*r*_{i}*r*_{j}](*Δh*_{ij}) corresponds to the covariance expressed as a function of the difference *Δh*_{ij} = *h*_{j} − *h*_{i} in excitability between two neurons *i* and *j*, and Var[*r*_{i,j}] is the firing rate variance (see *SI Appendix* for further details).

## Data, Materials, and Software Availability

Code and data have been deposited in GitHub (https://github.com/Jeremie-Lefebvre/Hutt-et-al-PNAS-2023) (111).

## Acknowledgments

We thank the National Sciences and Engineering Research Council of Canada (NSERC Grants RGPIN-2017-06662 to J.L. and RGPIN-2015-05936 to T.A.V.) and the Krembil Foundation (Krembil Seed Grant to J.L. and T.A.V.) for support of this research.

### Author contributions

A.H., S.R., T.A.V., and J.L. designed research; A.H. and J.L. performed research; A.H. and J.L. analyzed data; and A.H., S.R., T.A.V., and J.L. wrote the paper.

### Competing interests

The authors declare no competing interest.

## Supporting Information

Appendix 01 (PDF)


## References

1. E. Marder, J. M. Goaillard, Variability, compensation and homeostasis in neuron and network function. *Nat. Rev. Neurosci.* **7**, 563–574 (2006).
2. S. J. Altschuler, L. F. Wu, Cellular heterogeneity: Do differences make a difference? *Cell* **141**, 559–563 (2010).
3. S. J. Tripathy, K. Padmanabhan, R. C. Gerkin, N. N. Urban, Intermediate intrinsic diversity enhances neural population coding. *Proc. Natl. Acad. Sci. U.S.A.* **110**, 8248–8253 (2013).
4. H. Moradi Chameh et al., Diversity amongst human cortical pyramidal neurons revealed via their sag currents and frequency preferences. *Nat. Commun.* **12** (2021).
5. G. G. Turrigiano, S. B. Nelson, Homeostatic plasticity in the developing nervous system. *Nat. Rev. Neurosci.* **5**, 97–107 (2004).
6. A. M. Graybiel, Neurotransmitters and neuromodulators in the basal ganglia. *Trends Neurosci.* **13**, 244–254 (1990).
7. E. Marder, Neuromodulation of neuronal circuits: Back to the future. *Neuron* **76**, 1–11 (2012).
8. S. H. Lee, Y. Dan, Neuromodulation of brain states. *Neuron* **76**, 209–222 (2012).
9. S. Rich, M. Zochowski, V. Booth, Effects of neuromodulation on excitatory–inhibitory neural network dynamics depend on network connectivity structure. *J. Nonlinear Sci.* **30**, 2171–2194 (2020).
10. M. A. Lynch, Long-term potentiation and memory. *Physiol. Rev.* **84**, 87–136 (2004).
11. C. S. Holling, Resilience and stability of ecological systems. *Annu. Rev. Ecol. Syst.* **4**, 1–23 (1973).
12. K. S. McCann, The diversity–stability debate. *Nature* **405**, 228–233 (2000).
13. H. Kitano, Biological robustness. *Nat. Rev. Genet.* **5**, 826–837 (2004).
14. A. R. Ives, S. R. Carpenter, Stability and diversity of ecosystems. *Science* **317**, 58–62 (2007).
15. M. Scheffer et al., Anticipating critical transitions. *Science* **338**, 344–348 (2012).
16. H. H. Jasper, *Jasper's Basic Mechanisms of the Epilepsies* (OUP USA, 2012), vol. 80.
17. C. J. Honey, T. Valiante, Neuroscience: When a single image can cause a seizure. *Curr. Biol.* **27**, R394–R397 (2017).
18. M. O. Baud et al., Multi-day rhythms modulate seizure risk in epilepsy. *Nat. Commun.* **9**, 1–10 (2018).
19. M. A. Kramer, H. E. Kirsch, A. J. Szeri, Pathological pattern formation and cortical propagation of epileptic seizures. *J. R. Soc. Interface* **2**, 113–127 (2005).
20. Z. Zhang, T. Valiante, P. Carlen, Transition to seizure: From “macro”- to “micro”-mysteries. *Epilepsy Res.* **97**, 290–299 (2011).
21. V. K. Jirsa, W. C. Stacey, P. P. Quilichini, A. I. Ivanov, C. Bernard, On the nature of seizure dynamics. *Brain* **137**, 2210–2230 (2014).
22. J. M. Goaillard, E. Marder, Ion channel degeneracy, variability, and covariation in neuron and circuit resilience. *Annu. Rev. Neurosci.* **44**, 335–357 (2021).
23. R. Levins, Some demographic and genetic consequences of environmental heterogeneity for biological control. *Am. Entomol.* **15**, 237–240 (1969).
24. S. A. Levin, *Some Mathematical Questions in Biology* (American Mathematical Society, 1974).
25. R. M. May, Qualitative stability in model ecosystems. *Ecology* **54**, 638–641 (1973).
26. I. Hanski, M. Gilpin, Metapopulation dynamics: Brief history and conceptual domain. *Biol. J. Linnean Soc.* **42**, 3–16 (1991).
27. T. Elmqvist et al., Response diversity, ecosystem change, and resilience. *Front. Ecol. Environ.* **1**, 488–494 (2003).
28. S. Allesina, S. Tang, Stability criteria for complex ecosystems. *Nature* **483**, 205–208 (2012).
29. P. Landi, H. O. Minoarivelo, Å. Brännström, C. Hui, U. Dieckmann, “Complexity and stability of adaptive ecological networks: A survey of the theory in community ecology” in *Systems Analysis Approach for Complex Global Challenges* (Springer, 2018), pp. 209–248.
30. R. M. May, Will a large complex system be stable? *Nature* **238**, 413–414 (1972).
31. R. M. May, Stability in multispecies community models. *Math. Biosci.* **12**, 59–79 (1971).
32. G. Yan, N. Martinez, Y. Y. Liu, Degree heterogeneity and stability of ecological networks. *J. R. Soc. Interface* **14**, 20170189 (2017).
33. H. Markram et al., Interneurons of the neocortical inhibitory system. *Nat. Rev. Neurosci.* **5**, 793–807 (2004).
34. I. Mody, R. A. Pearce, Diversity of inhibitory neurotransmission through GABA(A) receptors. *Trends Neurosci.* **27**, 569–575 (2004).
35. I. Soltesz et al., *Diversity in the Neuronal Machine: Order and Variability in Interneuronal Microcircuits* (Oxford University Press, 2006).
36. I. Matias, J. Morgado, F. C. A. Gomes, Astrocyte heterogeneity: Impact to brain aging and disease. *Front. Aging Neurosci.* **11**, 59 (2019).
37. S. Rich, H. M. Chameh, J. Lefebvre, T. A. Valiante, Loss of neuronal heterogeneity in epileptogenic human tissue impairs network resilience to sudden changes in synchrony. *Cell Rep.* **39**, 110863 (2022).
38. C. Börgers, N. Kopell, Synchronization in networks of excitatory and inhibitory neurons with sparse, random connectivity. *Neural Comput.* **15**, 509–538 (2003).
39. J. Mejias, A. Longtin, Optimal heterogeneity for coding in spiking neural networks. *Phys. Rev. Lett.* **108**, 228102 (2012).
40. M. Y. Yim, A. Aertsen, S. Rotter, Impact of intrinsic biophysical diversity on the activity of spiking neurons. *Phys. Rev. E* **87**, 032710 (2013).
41. J. Mejias, A. Longtin, Differential effects of excitatory and inhibitory heterogeneity on the gain and asynchronous state of sparse cortical networks. *Front. Comput. Neurosci.* **8**, 107 (2014).
42. J. Gjorgjieva, G. Drion, E. Marder, Computational implications of biophysical diversity and multiple timescales in neurons and synapses for circuit performance. *Curr. Opin. Neurobiol.* **37**, 44–52 (2016).
43. R. Gast, S. A. Solla, A. Kennedy, Macroscopic dynamics of neural networks with heterogeneous spiking thresholds. *Phys. Rev. E* **107**, 024306 (2023).
44. M. S. Cembrowski, V. Menon, Continuous variation within cell types of the nervous system.

*Trends Neurosci.***41**, 337–348 (2018).45

F. Scala et al., Phenotypic variation of transcriptomic cell types in mouse motor cortex.

*Nature***598**, 144–150 (2021).46

H. Zeng, What is a cell type and how to define it?

*Cell***185**, 2739–2755 (2022).47

B. Tasic et al., Shared and distinct transcriptomic cell types across neocortical areas.

*Nature***563**, 72–78 (2018).48

X. J. Wang, Macroscopic gradients of synaptic excitation and inhibition in the neocortex.

*Nat. Rev. Neurosci.***21**, 169–178 (2020).49

A. Renart, P. Song, X. J. Wang, Robust spatial working memory through homeostatic synaptic scaling in heterogeneous cortical networks.

*Neuron***38**, 473–485 (2003).50

R. Darshan, A. Rivkind, Learning to represent continuous variables in heterogeneous neural networks.

*Cell Rep.***39**, 110612 (2022).51

G. R. Yang, M. R. Joglekar, H. F. Song, W. T. Newsome, X. J. Wang, Task representations in neural networks trained to perform many cognitive tasks.

*Nat. Neurosci.***22**, 297–306 (2019).52

K. Kaleb, V. Pedrosa, C. Clopath, Network-centered homeostasis through inhibition maintains hippocampal spatial map and cortical circuit function.

*Cell Rep.***36**, 109577 (2021).53

F. Zeldenrust, B. Gutkin, S. Denéve, Efficient and robust coding in heterogeneous recurrent networks.

*PLoS Comput. Biol.***17**, e1008673 (2021).54

A. Seeholzer, M. Deger, W. Gerstner, Stability of working memory in continuous attractor networks under the control of short-term plasticity.

*PLoS Comput. Biol.***15**, e1006928 (2019).55

C. M. Kim, C. C. Chow, Learning recurrent dynamics in spiking networks.

*eLife***7**, e37124 (2018).56

A. Bernacchia, X. J. Wang, Decorrelation by recurrent inhibition in heterogeneous neural circuits.

*Neural Comput.***25**, 1732–1767 (2013).57

N. Perez-Nieves, V. C. Leung, P. L. Dragotti, D. F. Goodman, Neural heterogeneity promotes robust learning.

*Nat. Commun.***12**, 5791 (2021).58

M. Deistler, J. H. Macke, P. J. Gonçalves, Energy-efficient network activity from disparate circuit parameters.

*Proc. Natl. Acad. Sci. U.S.A.***119**, e2207632119 (2022).59

I. Aradi, I. Soltesz, Modulation of network behaviour by changes in variance in interneuronal properties.

*J. Physiol.***538**, 227–251 (2002).60

Z. Ma, G. G. Turrigiano, R. Wessel, K. B. Hengen, Cortical circuit dynamics are homeostatically tuned to criticality in vivo.

*Neuron***104**, 655–664 (2019).61

V. L. Girko, Circular law.

*Theory Probab. Its Appl.***29**, 694–706 (1985).62

L. S. Tang, A. L. Taylor, A. Rinberg, E. Marder, Robustness of a rhythmic circuit to short- and long-term temperature changes.

*J. Neurosci.***32**, 10075–10085 (2012).63

W. Zhang, D. J. Linden, The other side of the engram: Experience-driven changes in neuronal intrinsic excitability.

*Nat. Rev. Neurosci.***4**, 885 (2003).64

D. Debanne, Y. Inglebert, M. Russier, Plasticity of intrinsic neuronal excitability.

*Curr. Opin. Neurobiol.***54**, 73–82 (2019).65

H. Beck, Y. Yaari, Plasticity of intrinsic neuronal properties in CNS disorders.

*Nat. Rev. Neurosci.***9**, 357–369 (2008).66

F. Zenke, G. Hennequin, W. Gerstner, Synaptic plasticity in neural networks needs homeostasis with a fast rate detector.

*PLoS Comput. Biol.***9**, e1003330 (2013).67

F. Zenke, W. Gerstner, Hebbian plasticity requires compensatory processes on multiple timescales.

*Philos. Trans. R. Soc. B: Biol. Sci.***372**, 20160259 (2017).68

M. N. Shadlen, W. T. Newsome, Noise, neural codes and cortical organization.

*Curr. Opin. Neurobiol.***4**, 569–579 (1994).69

M. N. Shadlen, W. T. Newsome, The variable discharge of cortical neurons: Implications for connectivity, computation, and information coding.

*J. Neurosci.***18**, 3870–3896 (1998).70

L. C et al., Local connectivity and synaptic dynamics in mouse and human neocortex.

*Science***375**, e2211449119 (2022).71

C. van Vreeswijk, H. Sompolinsky, Chaos in neuronal networks with balanced excitatory and inhibitory activity.

*Science***274**, 1724–1726 (1996).72

C. Freitas, E. Macau, A. Pikovsky, Partial synchronization in networks of non-linearly coupled oscillators: The deserter hubs model.

*Chaos***25**, 043119 (2015).73

M. C. Ng, J. Jing, M. B. Westover,

*Atlas of Intensive Care Quantitative EEG*(Springer Publishing Company, 2019).74

K. Padmanabhan, N. N. Urban, Intrinsic biophysical diversity decorrelates neuronal firing while increasing information content.

*Nat. Neurosci.***13**, 1276–1282 (2010).75

M. Di Volo, A. Destexhe, Optimal responsiveness and information flow in networks of heterogeneous neurons.

*Sci. Rep.***11**, 17611 (2021).76

W. Gerstner, W. M. Kistler, R. Naud, L. Paninski,

*Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition*(Cambridge University Press, 2014).77

K. J et al., Excitability of neocortical neurons in a mouse model of amyotrophic lateral sclerosis are not specific to corticospinal neurons and are modulated by advancing disease.

*J. Neurosci.***37**, 9037–9053 (2017).78

J. Lefebvre, A. Hutt, J. Knebel, K. Whittingstall, M. Murray, Stimulus statistics shape oscillations in nonlinear recurrent neural networks.

*J. Neurosci.***35**, 2895–2903 (2015).79

S. Rich, A. Hutt, F. K. Skinner, T. A. Valiante, J. Lefebvre, Neurostimulation stabilizes spiking neural networks by disrupting seizure-like oscillatory transitions.

*Sci. Rep.***10**, 1–17 (2020).80

J. Lengler, F. Jug, A. Steger, Reliable neuronal systems: The importance of heterogeneity.

*PLoS ONE***8**, e80694 (2013).81

S. Denève, C. K. Machens, Efficient codes and balanced networks.

*Nat. Neurosci.***19**, 375–382 (2016).82

R. M. May,

*Stability and Complexity in Model Ecosystems*(Princeton University Press, 2001).83

G. Christodoulou, T. P. Vogels, The eigenvalue value (in neuroscience).

*OSF Preprints*, (2022). https://doi.org/10.31219/osf.io/evqhy.84

O. Harish, D. Hansel, Asynchronous rate chaos in spiking neuronal circuits.

*PLoS Comput. Biol.***11**, e1004266 (2015).85

S. Ostojic, Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons.

*Nat. Neurosci.***17**, 592–600 (2014).86

F. Mastrogiuseppe, S. Ostojic, Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons.

*Neuron***99**, 609–623.e29 (2018).87

J. Ginibre, Statistical ensembles of complex, quaternion, and real matrices.

*J. Math. Phys.***6**, 440–449 (1965).88

V. K. Jirsa, M. Ding, Will a large complex system with time delays be stable?

*Phys. Rev. Lett.***93**, 070602 (2004).89

K. Rajan, L. Abbott, Eigenvalue spectra of random matrices for neural networks.

*Phys. Rev. Lett.***97**, 188104 (2006).90

A. Levina, J. M. Herrmann, T. Geisel, Dynamical synapses causing self-organized criticality in neural networks.

*Nat. Phys.***3**, 857–860 (2007).91

L. Cocchi, L. L. Gollo, A. Zalesky, M. Breakspear, Criticality in the brain: A synthesis of neurobiology, models and cognition.

*Progr. Neurobiol.***158**, 132–152 (2017).92

S. Ratté, Y. Zhu, K. Y. Lee, S. A. Prescott, Criticality and degeneracy in injury-induced changes in primary afferent excitability and the implications for neuropathic pain.

*eLife***3**, e02370 (2014).93

W. C. Abraham, Metaplasticity: Tuning synapses and networks for plasticity.

*Nat. Rev. Neurosci.***9**, 387 (2008).94

F. Tang et al., RNA-Seq analysis to capture the transcriptome landscape of a single cell.

*Nat. Protocols***5**, 516–535 (2010).95

S. J. Tripathy et al., Transcriptomic correlates of neuron electrophysiological diversity.

*PLoS Comput. Biol.***13**, e1005814 (2017).96

C. Bomkamp et al., Transcriptomic correlates of electrophysiological and morphological diversity within and across excitatory and inhibitory neuron classes.

*PLoS Comput. Biol.***15**, e1007113 (2019).97

R. Yuste, From the neuron doctrine to neural networks.

*Nat. Rev. Neurosci.***16**, 487–497 (2015).98

J. J. Jun et al., Fully integrated silicon probes for high-density recording of neural activity.

*Nature***551**, 232–236 (2017).99

L. Luo, E. M. Callaway, K. Svoboda, Genetic dissection of neural circuits: A decade of progress.

*Neuron***98**, 256–281 (2018).100

A. Lakshmanan et al., Molecular engineering of acoustic protein nanostructures.

*ACS Nano***10**, 7314–7322 (2016).101

G. T. Einevoll et al., The scientific case for brain simulations.

*Neuron***102**, 735–744 (2019).102

E. Pigani, D. Sgarbossa, S. Suweisa, A. Maritana, S. Azaelea, Delay effects on the stability of large ecosystems.

*Proc. Natl. Acad. Sci. U.S.A.***119**, e2211449119 (2022).103

J. Aljadeff, M. Stern, T. Sharpee, Transition to chaos in random networks with cell-type-specific connectivity.

*Phys. Rev. Lett.***114**, 088101 (2015).104

S. Tang, S. Pawar, S. Allesina, Correlation between interaction strengths drives stability in large ecological networks.

*Ecol. Lett.***17**, 1094–1100 (2014).105

M. L. Saggio et al., A taxonomy of seizure dynamotypes.

*eLife***9**, e55632 (2020).106

D. Arendt et al., The origin and evolution of cell types.

*Nat. Rev. Genet.***17**, 744–757 (2016).107

B. E. Mizusaki, C. O’Donnell, Neural circuit function redundancy in brain disorders.

*Curr. Opin. Neurobiol.***70**, 74–80 (2021).108

A. A. Faisal, L. P. Selen, D. M. Wolpert, Noise in the nervous system.

*Nat. Rev. Neurosci.***9**, 292–303 (2008).109

A. Hutt, A. Longtin, L. Schimansky-Geier, Additive noise-induced Turing transitions in spatial systems with application to neural fields and the Swift–Hohenberg equation.

*Physica D***237**, 755–773 (2008).110

A. Hutt, J. Lefebvre, D. Hight, H. Kaiser, Phase coherence induced by additive Gaussian and non-Gaussian noise in excitable networks with application to burst suppression-like brain signals.

*Front. Appl. Math. Stat.***5**, 69 (2020).111

H. Axel, R. Scott, V. Taufik, A. L. Jérémie, Hutt-et-al-PNAS-2023. Github. https://github.com/Jeremie-Lefebvre/Hutt-et-al-PNAS-2023. Deposited 13 June 2023.

## Information & Authors

### Information


#### Copyright

Copyright © 2023 the Author(s). Published by PNAS. This article is distributed under Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND).

#### Data, Materials, and Software Availability

Code and data have been deposited in GitHub (https://github.com/Jeremie-Lefebvre/Hutt-et-al-PNAS-2023) (111).

#### Submission history

**Received**: November 3, 2022

**Accepted**: May 19, 2023

**Published online**: July 3, 2023

**Published in issue**: July 11, 2023


#### Acknowledgments

We thank the Natural Sciences and Engineering Research Council of Canada (NSERC Grants RGPIN-2017-06662 to J.L. and RGPIN-2015-05936 to T.A.V.) and the Krembil Foundation (Krembil Seed Grant to J.L. and T.A.V.) for support of this research.

##### Author Contributions

A.H., S.R., T.A.V., and J.L. designed research; A.H. and J.L. performed research; A.H. and J.L. analyzed data; and A.H., S.R., T.A.V., and J.L. wrote the paper.

##### Competing Interests

The authors declare no competing interest.

#### Notes

This article is a PNAS Direct Submission.
