Multiscale complex network of protein conformational fluctuations in single-molecule time series

Edited by R. Stephen Berry, University of Chicago, Chicago, IL, and approved November 20, 2007 (received for review August 6, 2007)
Abstract
Conformational dynamics of proteins can be interpreted as itinerant motions as the protein traverses from one state to another on a complex network in conformational space or, more generally, in state space. Here we present a scheme to extract a multiscale state space network (SSN) from a single-molecule time series. Analysis by this method enables us to lift degeneracy—different physical states having the same value of a measured observable—as much as possible. A state, or node in the network, is defined not by the value of the observable at each time but by a set of subsequences of the observable over time. The length of the subsequence tells us the extent to which the memory of the system is required to predict the next state. As an illustration, we investigate the conformational fluctuation dynamics probed by single-molecule electron transfer (ET), detected on a photon-by-photon basis. We show that the topographical features of the SSNs depend on the time scale of observation; the longer the time scale, the simpler the underlying SSN becomes, leading to a transition of the dynamics from anomalous diffusion to normal Brownian diffusion.
Optical single-molecule spectroscopy has provided unique insights into both the distribution of molecular properties and their dynamic behavior, which are inaccessible using ensemble-averaged measurements (1–5). In principle, the complexity observed in the dynamics and kinetics of a protein originates in the underlying multidimensional energy landscape (6–12). The dynamics can be understood as the protein traversing from one state (node) to another along a complex network in conformational space or, more generally, in state space. The network properties of biological systems can provide a new perspective for addressing the nature of their hierarchical organization in multidimensional state space (10, 11, 13, 14), enabling us to ask such questions as: Is there any distinctive network topology that is characteristic of the native basin into which a protein folds? Are there any common network features that biological systems may have evolved by adapting to changes in the environment? Motivated by questions of this nature, we address how one can extract the state space network (SSN) of multiscale biological systems explicitly from a single-molecule time series, free from a priori assumptions about the underlying physical model or rules.
Fluorescence resonance energy transfer (FRET) and electron transfer (ET) are among the most widely used techniques for measuring the dynamics of protein conformational fluctuations (15) and folding (16–19). For example, Yang et al. (15) used a single-molecule electron transfer experiment to reveal the complexity of protein fluctuations in the NADH:flavin oxidoreductase (Fre) complex. The fluorescence lifetimes showed that the distance between flavin adenine dinucleotide (FAD) and a nearby tyrosine (Tyr) in a single Fre molecule fluctuates over a broad range of time scales (10^{−3} s to 1 s). Although the overall dynamics of the distance fluctuation are non-Brownian, they reflect normal diffusion on longer time scales. To gain further understanding of such anomalous behavior in protein conformational fluctuations, several analytical models have been proposed in terms of the generalized Langevin equation with fractional Gaussian noise (20) and simplified discrete (21) and continuous (22) chain models. Recently, all-atom simulations (23) were performed to extrapolate the physical origin of the anomalous FAD–Tyr distance fluctuation observed in experiment (>10^{−3} s) from simulation time scales of nanoseconds.
These theoretical studies underscore the difficulties in establishing a minimal physical model for the origin of complexity in the kinetics or dynamics of biomolecules. Instead of postulating or constructing a physical model to characterize experimental results, we take a different approach to “let the system speak for itself” through the single-molecule time series. Such unbiased solutions, which are data-driven instead of model-driven, have been provided in the context of single-molecule FRET experiments (24) and emission intermittency, demonstrated in resolving quantum dot blinking states (25), but defining the states and the corresponding networks from a single-molecule time series remains a challenging problem. For instance, even while the system travels among different physical states, the values of the measured observable can be the same (26–28), i.e., degenerate, due to the finite resolution of the observation, noise contamination, and the limited number of measurable physical observables in the experiment. Such degeneracy can give rise to apparent long-term memory along the sequence of transitions even when the transitions among states are Markovian (29).
In this article, we present a method to extract the hierarchical SSN spanning several decades of time scales from a single-molecule time series. Within the limited information available from a scalar time series, this method lifts degeneracy as much as possible. The SSN is expected to capture the manner in which the network is organized, which may be relevant to some functions of biological systems. The crux of our approach is the combination of computational mechanics (CM), developed by Crutchfield et al. (30, 31) in information theory, and wavelet decomposition of the single-molecule time series. The states are defined not only in terms of the value of the observable at each time but also in terms of the historical information of a set of multiscale wavelet components along the course of time evolution. Using the single-molecule ET time series of the Fre/FAD complex, we demonstrate that the multiscale SSNs provide analytical expressions for the multi-time correlation function with a physical basis for the long-term conformational fluctuations.
Results and Discussion
SSN: One-Dimensional Brownian Motion in a Harmonic Well.
We briefly describe how the original CM (31) defines “states” and constructs their network from a scalar time series [see the detailed descriptions in supporting information (SI) Text]. For a given time series x = (x(t_{1}), x(t_{2}), …, x(t_{N})) of the physical observable x, which is continuous in value and could be the intramolecular distance reported by fluorescent probes, we first discretize it to the symbolic sequence s = (s(t_{1}), s(t_{2}), …, s(t_{N})), where s(t_{i}) denotes the symbolized observable at time t_{i} (see Fig. 1). An upper bound for the number of symbols may be determined by the experimental resolution. As we will see below, CM requires a statistical sampling of the subsequences in the symbolic time series s. Therefore, the choice of discretization scheme should depend not only on the experimental resolution but also on the statistical properties of the time series. A reasonable discretization is such that the topological properties of the constructed network are insensitive to an increase in the number of partitions. Second, we trace along the time series s for each time step t_{i} to record which subsequence of length L_{future}, s_{A}^{future} = (s(t_{i+1}), …, s(t_{i+L_{future}})), follows consecutively after the subsequence of length L_{past}, s_{B}^{past} = (s(t_{i−L_{past}+1}), …, s(t_{i})). Here A, B, … represent different symbolic subsequences that appear in s (see Fig. 1). The transition probability from s_{B}^{past} to s_{A}^{future} [denoted by P(A|B)] can then be obtained from the time series s.
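The discretization and subsequence bookkeeping described above can be sketched as follows (a minimal illustration, not the authors' code; the function names and the rank-transform implementation of the equal-probability partition are our assumptions):

```python
import numpy as np

def symbolize(x, n_symbols=12):
    """Equal-probability partition: map x to symbols 0..n_symbols-1 so that
    each symbol occurs (almost) equally often along the series."""
    ranks = np.argsort(np.argsort(x))        # rank of each sample in the series
    return (ranks * n_symbols) // len(x)     # equal-count bins of the ranks

def transition_counts(s, l_past=1, l_future=1):
    """Count how often each future subsequence of length l_future follows each
    past subsequence of length l_past, i.e. the statistics behind P(A|B)."""
    counts = {}
    for i in range(l_past - 1, len(s) - l_future):
        past = tuple(s[i - l_past + 1 : i + 1])
        future = tuple(s[i + 1 : i + 1 + l_future])
        counts.setdefault(past, {}).setdefault(future, 0)
        counts[past][future] += 1
    return counts
```

Normalizing each past's future counts by their sum gives the empirical transition probabilities P(A|B) used in the next step.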
Third, we define a “state” (denoted by S_{i} herein) as the set of past subsequences {s_{B′}^{past}, s_{B″}^{past}, …} of length L_{past} that make transitions to the future subsequences s_{A}^{future} with the same transition probabilities [i.e., P(A|B′) = P(A|B″) = P(A|B‴) = … for all A]. A directed link (i.e., transition) from a state S_{i} to another state S_{j} can be drawn to represent the transition probability P(s_{A}^{future}|S_{i}) if the subsequence s_{A}^{future} is generated by a transition from S_{i} to S_{j} along the time series s. The extraction of all states and the transitions among them yields the SSN associated with the time series s. The most attractive feature of CM is that it extracts (instead of postulating) the underlying SSN from the time series: the length of memory L_{past} is chosen so as to make all transitions among the states Markovian; namely, the next state to visit is solely determined by the current state. The inferred SSN is hence regarded as a kind of hidden Markov model extracted from the data. It has been established mathematically that such an SSN inferred from the time series, with memory effects automatically included, provides us with a minimal and optimally predictive machinery that can best reproduce the time series s in a statistical sense (31) (see also SI Text for more details).
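The grouping of past subsequences into states by the equality of their conditional future distributions can be sketched like this (our own greedy illustration with a finite-sample tolerance, not the exact algorithm of ref. 31):

```python
def causal_states(counts, tol=0.05):
    """Merge pasts whose conditional future distributions agree within `tol`
    (total-variation distance); each resulting group is one candidate state."""
    def to_dist(c):
        total = sum(c.values())
        return {f: n / total for f, n in c.items()}

    def tv(p, q):
        keys = set(p) | set(q)
        return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

    states = []  # list of (representative distribution, member pasts)
    for past, c in counts.items():
        p = to_dist(c)
        for rep, members in states:
            if tv(rep, p) <= tol:     # same future statistics -> same state
                members.append(past)
                break
        else:
            states.append((p, [past]))
    return [members for _, members in states]
```

With exact (infinite-data) statistics the tolerance would be zero; for finite series a small tolerance absorbs sampling noise in the estimated transition probabilities.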
As an example, Fig. 2 illustrates the actual SSNs constructed from the time series x(t) of a one-dimensional normal Brownian trajectory in a harmonic well. Here we discretize x using the equal-probability partition with 12 symbols, i.e., each symbol has the same occurrence probability. The symbolic time series s is resampled every m time steps, from which the SSN is constructed. Each node represents a state composed of the set of symbolic subsequences that have the same transition probability, as described above. The area of a node is proportional to the resident probability of the state along the time series s. The directed links connecting the nodes represent the transitions between the states. The abscissa of the circle (state) center corresponds to the average value of x over the set of symbolic subsequences assigned to that state, and the ordinate denotes the average distribution–distribution (D–D) distance D_{distrib}(i), which measures on average how different the transition probability of the ith state is from the others (the detailed definition is given later).
It was found that past subsequences with only one symbol (L_{past} = 1) are sufficient to construct the underlying SSN for the cases of m = 1/λ and m = 3/λ (where 1/λ characterizes the correlation time scale of x), such that a further increase of L_{past} does not change the network topology. This is due to the Markovian nature of Brownian motion, which requires only the present value of the observable to “predict” the future. When the step size m ≲ 1/λ, the system cannot jump across the entire accessible range of s, and thus there are 12 states (no identical transition probabilities exist). The states extracted from s on such a short time scale simply mimic a “trajectory” of a stochastic variable in the discretized space.
As m increases, some of the nearby states start to merge, and eventually only a single state is obtained at m = 5/λ as the autocorrelation decays to almost zero, where the required L_{past} is found to be zero. This shows that no information about the current value is required to predict the future when the correlation is negligible. A time series recorded every 5/λ time steps or longer is statistically equivalent to tossing a 12-sided die. Moreover, one can see from this Langevin case that the subsequences contained in a state are localized in the physical observable space (e.g., in x) for time scales shorter than the correlation time scale (e.g., at 1/λ). However, such localization is lost for much longer time scales (e.g., at 5/λ). A more general discussion of how the localization properties of the states in the physical observable space change as a function of time scale, and their relation to the similarity of the state transition probabilities, is given in SI Text.
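The harmonic-well Brownian example can be reproduced with an Ornstein–Uhlenbeck process. The following sketch (our own, with an arbitrary λ and step size) shows the correlation surviving at short lags but vanishing by a lag of 5/λ, which is why the SSN collapses to a single state there:

```python
import numpy as np

def ou_trajectory(n, lam=0.02, dt=1.0, seed=0):
    """Euler-Maruyama integration of Brownian motion in a harmonic well:
    dx = -lam * x * dt + dW (an Ornstein-Uhlenbeck process)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = 0.0
    kicks = rng.normal(scale=np.sqrt(dt), size=n - 1)
    for i in range(1, n):
        x[i] = x[i - 1] - lam * x[i - 1] * dt + kicks[i - 1]
    return x

def autocorr(x, lag):
    """Normalized autocorrelation of a series at a given lag."""
    x = x - x.mean()
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

x = ou_trajectory(200_000)
# memory is strong at lag 1 (~exp(-lam)) but negligible at lag 5/lam = 250
```

Symbolizing such a trajectory and rebuilding the SSN at sampling intervals m = 1/λ and 5/λ reproduces the merging of states described above.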
One can expect that CM is able to extract the time scale on which the system loses memory in the observable. It also reveals how the system smears out the fine structure of the state space in terms of the time-scale-dependent SSN. Such a “model-free” approach is crucial for capturing the complexity in the kinetics and dynamics observed in single-molecule experiments. However, there are several practical drawbacks to the standard form of CM, especially for systems with hierarchical time and space scales.
First, the number of possible past subsequences s_{A}^{past} grows exponentially with L_{past}, and the sampling of s_{A}^{past} rapidly becomes worse due to the finite length of the time series. It is therefore difficult to properly resolve the SSN when long-term memories exist. Although the CM discussed above, using skipped time steps, works well for Markovian Brownian dynamics, it skips (and thus neglects) the information between consecutive sampling steps, which may contain important non-Markovian properties in real single-molecule time series. Second, CM relies on the concept of stationarity of the underlying processes. This implies that the statistical properties of the system change only slowly within the length of the time series from which the SSN is constructed. However, this is not necessarily the case for real systems, where the existence of hierarchical time and space scales gives rise to diverse dynamical properties over different scales. Therefore, a decomposition of the observable time series into a set of hierarchical, stationary (and nonstationary) processes with different time scales is highly desirable as a prescription for CM.
Most importantly, after one extracts the underlying dynamics for each characteristic time scale associated with the long-memory process, there may exist “mutual correlation” or “nonadiabatic coupling” across the hierarchies of different time scales. Hence, the incorporation of the mutual correlation across the decomposed hierarchies is important for establishing the correct SSN hidden in the time series of multiscale complex systems.
Below we propose a scheme of multiscale CM based on the discrete wavelet decomposition, which not only overcomes the existing difficulties in the current form of CM but also resolves, as far as possible, the cumbersome degeneracy problem in single-molecule measurements. Here, we apply our method to the delay-time time series of the Fre/FAD complex (15) in the ET experiment. We note, however, that our method is general and should be applicable to any time series.
Hierarchical SSN: Anomalous Conformational Fluctuation in Fre/FAD Complex.
The protein structure of the Fre/FAD complex and the position of the tyrosine residue Tyr^{35} relative to the FAD substrate are shown in Fig. 4 A. Fig. 3 illustrates the multiscale CM scheme based on the discrete wavelet decomposition of the delay-time time series. The delay times of the fluorescence photons are recorded with respect to the excitation pulse as a function of the chronological arrival time of the detected photons (15). Discrete wavelet decomposition (33) produces a family of hierarchically organized decompositions from a scalar time series τ = (τ_{1}, …, τ_{N}), where A^{(j)} = (A_{1}^{(j)}, …, A_{N}^{(j)}) and D^{(j)} = (D_{1}^{(j)}, …, D_{N}^{(j)}) are called the j-level approximation and detail, respectively. In the case of dyadic decomposition, which is applied in this paper, A^{(j)} approximates τ with a time resolution of 2^{j} time steps by discarding fluctuations with time scales smaller than 2^{j} time steps, and D^{(j)} captures the fluctuation of τ over the time scale of 2^{j} time steps. The time series can be reconstructed as τ = A^{(n)} + Σ_{j=1}^{n} D^{(j)}, i.e., by adding back all fluctuations with time scales smaller than or equal to that of the approximation. Moreover, approximations of different time scales are related by A^{(j)} = A^{(j+1)} + D^{(j+1)} with j ≥ 0. In this paper, the Haar wavelet is adopted for its simple interpretation: A_{i}^{(j)} and D_{i}^{(j)} of the Haar wavelet are the mean and the mean fluctuation over a bin of 2^{j} time steps, respectively (see SI Text for details).
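The Haar relations just stated (bin means as approximations, details as the differences between successive approximations) can be sketched as follows; this is our own full-length rendering of those relations, not the authors' implementation:

```python
import numpy as np

def bin_average(x, j):
    """A^(j): replace each length-2^j bin of x by its mean, kept at full length."""
    w = 2 ** j
    x = np.asarray(x, dtype=float)
    means = x[: len(x) // w * w].reshape(-1, w).mean(axis=1)
    return np.repeat(means, w)

def haar_approx_detail(x, n):
    """Approximations A^(j) and details D^(j) = A^(j-1) - A^(j) for j = 1..n,
    so that A^(j) = A^(j+1) + D^(j+1) holds exactly (A^(0) is x itself)."""
    A = {0: np.asarray(x, dtype=float)[: len(x) // 2 ** n * 2 ** n]}
    D = {}
    for j in range(1, n + 1):
        A[j] = bin_average(A[0], j)
        D[j] = A[j - 1] - A[j]
    return A, D
```

Adding all details back onto the coarsest approximation recovers the original series, which is the decomposition into hierarchical stationary components used below.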
Fig. 3 A exemplifies the discrete wavelet decomposition with n = 3 using the delay-time time series (denoted here by τ). The sub-SSNs, denoted by ε^{A(n)} and ε^{D(j)}, are constructed from the time series of A^{(n)} and of D^{(j)} (j ≤ n), respectively, by the same algorithm used in Fig. 2. Fig. 3 B and C present ε^{A(3)} and ε^{D(3)}, extracted from the time series of A^{(3)} and D^{(3)}, both with time step 2^{3}. Due to the nature of A^{(n)} (the binned average), the constructed sub-SSN ε^{A(n)} averages out the information contained in each bin. This suppresses the noise from photon statistics but, on the other hand, suffers from information loss inside the bins (24). Therefore, a combined SSN should be constructed by incorporating into ε^{A(n)} the SSNs of the fluctuations inside the bins (the details) and the correlations among them. For instance, the incorporation of ε^{D(n)} and ε^{D(n−1)} into ε^{A(n)} gives the SSN that describes dynamics on the time scale 2^{n} while taking into account fluctuations down to the bin size of 2^{n−1}.
Fig. 3 D demonstrates how ε^{A(3)} and ε^{D(3)} can form the combined SSN ε^{A(3),D(3)}. By tracing the A^{(3)} and D^{(3)} time series, one can identify which state the system visits at each time step in ε^{A(3)} and ε^{D(3)}, respectively. The possible states of the combined SSN ε^{A(3),D(3)} are given by the product set {S_{i}^{A(3)}, S_{j}^{D(3)}} (≡ S_{ij}), where S_{i}^{A(3)} and S_{j}^{D(3)} denote the ith and jth states in ε^{A(3)} and ε^{D(3)} (with i = 1, …, N^{A(3)} and j = 1, …, N^{D(3)}). The resident probability of S_{ij}, denoted by P_{A(3),D(3)}(S_{ij}), can then be computed as P_{A(3),D(3)}(S_{ij}) = N(S_{ij})/N, where N(S_{ij}) is the number of simultaneous occurrences of the states S_{i}^{A(3)} and S_{j}^{D(3)} along the time series, and N is the number of data points of the series. In general, P_{A(3),D(3)}(S_{ij}) ≠ P_{A(3)}(S_{i}^{A(3)}) P_{D(3)}(S_{j}^{D(3)}) because the two time series A^{(3)} and D^{(3)} are statistically correlated. Similarly, the transition probability from S_{ij} to S_{i′j′} can be obtained as P(S_{i′j′}|S_{ij}) = N(S_{i′j′}, S_{ij})/N(S_{ij}), where N(S_{i′j′}, S_{ij}) is the number of visits to S_{i′j′} at 2^{3} time steps after visiting S_{ij}. In general, a transition from one state to another in ε^{A(n),D(n)} takes 2^{n} time steps, as in ε^{A(n)}.
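Given the per-step state labels obtained by tracing the approximation and detail series, the resident and transition probabilities of the product states can be estimated by simple counting; a minimal sketch (the label format and function name are ours):

```python
from collections import Counter

def combined_probabilities(labels_a, labels_d):
    """Resident probability P(S_ij) and transition probability P(S_i'j' | S_ij)
    for the product states of two aligned state-label sequences."""
    joint = list(zip(labels_a, labels_d))        # S_ij visited at each step
    n = len(joint)
    resident = {s: c / n for s, c in Counter(joint).items()}
    pair = Counter(zip(joint[:-1], joint[1:]))   # consecutive-visit counts
    visits = Counter(joint[:-1])
    trans = {(s, t): c / visits[s] for (s, t), c in pair.items()}
    return resident, trans
```

In general the resident probability of a product state differs from the product of the two marginal probabilities, reflecting the statistical correlation between the approximation and detail series noted in the text.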
The combined SSN ε^{A(n),D(n)} corresponds to a “splitting” of the states S_{i}^{A(n)} of the approximation into S_{ij} with 1 ≤ j ≤ N^{D(n)} by incorporating the fluctuations inside the bins. Similarly, the other sub-SSNs (ε^{D(2)}, ε^{D(1)}) can be incorporated into ε^{A(3),D(3)} one by one, depending on how finely one wishes to resolve the fluctuations.
Moreover, because the original τ is decomposed into a vector time series with the approximation and details as components, degeneracy is expected to be further lifted by this multiscale CM compared with the original CM applied to the scalar time series τ. The stationarity of the approximation and details can be inspected by evaluating their autocorrelations. The autocorrelation of D^{(j)} decays rapidly on a time scale of 2^{j} time steps, with small oscillations at longer times. Therefore, the D^{(j)} are “approximately” stationary on the time scale of 2^{j}. On the other hand, the autocorrelation of A^{(j)} remains approximately constant for 2^{j} steps and shows behavior similar to that of τ for time scales longer than 2^{j}. This indicates that A^{(j)} captures all the nonstationarity of τ with time scales longer than 2^{j} (see also SI Text for more details). Hence, the wavelet decomposition enables us to naturally decompose the original time series into a set of hierarchical stationary processes (the details) at different time scales and their nonstationary counterpart (the approximation).
Lifetime Spectrum and the Average Interdye Distance Associated with a State in the Multiscale SSN.
Once the multiscale SSN is extracted up to the desired level by combining the sub-SSNs, one can build up a unique delay-time distribution for each state in the network, as shown in the inset of Fig. 3 D. The delay-time probability density function P(τ) is related to the spectrum of lifetimes α(γ^{−1}) by P(τ) = ∫_{0}^{∞} dγ^{−1} α(γ^{−1}) γe^{−γτ}, with ∫ dτ P(τ) = 1 (see SI Text for details). The conformational information of a state can be obtained from α(γ^{−1}) and γ^{−1}(t) = [γ_{0} + k_{ET}(t)]^{−1} ≈ k_{ET}^{−1}(t) [γ_{0} is the fluorescence decay rate in the absence of quencher(s) and k_{ET} the ET rate] as follows. The average rate (inverse of lifetime) of the Ith combined state, γ̄_{I}, can be calculated easily from its lifetime spectrum α_{I}(γ^{−1}) and delay-time probability density function P_{I}(τ) as γ̄_{I} = ∫ dγ^{−1} γ α_{I}(γ^{−1}) = P_{I}(τ = 0). The averaged donor (D)–acceptor (A) distance R associated with the Ith state, R̄_{I}, can then be evaluated by R̄_{I} ≈ R_{0} − β^{−1} log γ̄_{I} under the assumption that k_{ET}(t) ∼ exp[−βR(t)], with β = 1.4 Å^{−1} for proteins (34).
The Autocorrelation of Lifetime Fluctuation.
What kinds of physical quantities can be extracted from such multiscale SSNs? For instance, the autocorrelation function of the lifetime fluctuation, C(t) = ⟨δγ^{−1}(t)δγ^{−1}(0)⟩, where δγ^{−1}(t) = γ^{−1}(t) − ⟨γ^{−1}⟩, is readily evaluated from the inferred SSNs. The autocorrelation function C(t = 2^{n}) is represented from the multiscale SSN with a transition step of 2^{n} as C(t = 2^{n}) = Σ_{I,J} δγ̄_{J}^{−1} δγ̄_{I}^{−1} P_{2^{n}}(S_{J}, S_{I}), where δγ̄_{I}^{−1} = γ̄_{I}^{−1} − ⟨γ^{−1}⟩ and P_{2^{n}}(S_{J}, S_{I}) = P_{2^{n}}(S_{J}|S_{I})P(S_{I}) is the joint probability of visiting S_{I} followed by S_{J} after 2^{n} steps; higher-order correlation functions can also be derived straightforwardly (see SI Text for details). Fig. 4 B shows that the multiscale SSN naturally reproduces the autocorrelation function, in good agreement with the photon-by-photon-based calculation (15, 35). The physical origin of the anomaly present in the autocorrelation function C(t) of the fluorescence lifetime fluctuation was conjectured as follows (15): the conformational states corresponding to local minima on the multidimensional energy landscape have vastly different trapping times because the energy barrier heights for interconversion among local minima are expected to be broadly distributed. Such a broad distribution of trapping times at a particular D–A distance R should give rise to a rugged “transient” potential on short time scales, resulting in subdiffusion and the stretched exponential in C(t). However, on longer time scales, the apparent potential becomes a smooth harmonic mean-force potential and converges to a single state. In the following, we show that our multiscale SSN naturally reveals such time-dependent topographical features of the underlying network in the state space.
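Given the per-state mean lifetimes, the resident probabilities, and the joint visiting probabilities, the SSN-based autocorrelation reduces to a quadratic form; a minimal sketch (the matrix convention is our assumption):

```python
import numpy as np

def ssn_autocorrelation(mean_lifetimes, resident, joint):
    """C(t) = sum_{I,J} (tau_J - <tau>) (tau_I - <tau>) P_t(S_J, S_I),
    where tau_I is the mean lifetime of state I, resident[I] = P(S_I), and
    joint[J, I] = P_t(S_J, S_I) is the joint probability of I followed by J."""
    tau = np.asarray(mean_lifetimes, dtype=float)
    p = np.asarray(resident, dtype=float)
    d = tau - np.dot(p, tau)                 # lifetime fluctuations about the mean
    return float(d @ np.asarray(joint, dtype=float) @ d)
```

When the joint probability factorizes into the product of the marginals (memory lost), C vanishes; when each state always maps to itself, C equals the equilibrium variance of the lifetime.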
Time Scale-Dependent Topographical Features of the SSN.
Fig. 5 illustrates how the SSN topography depends on the time scale by projecting the network onto a two-dimensional plane composed of the average FAD–Tyr distance R̄_{I} − R_{0} and the average D–D distance of each state from the others, where a state is represented by a circle as in Fig. 2. Here, three combined SSNs are shown with transition time steps of 2^{4}, 2^{6}, and 2^{8}, approximately corresponding to 30, 120, and 500 ms, respectively. The average D–D distance of the Ith state to all other states in the network is defined by D_{distrib}(I) = Σ_{J=1}^{N_S} P(S_{J})d_{H}(I, J), where P(S_{J}) and N_{S} denote the resident probability of the Jth state, S_{J}, and the total number of states, respectively. d_{H}(I, J) is the Hellinger distance (36) between two distributions, [∫_{−∞}^{∞} (P_{I}(η)^{1/2} − P_{J}(η)^{1/2})^{2} dη]^{1/2}, where P_{I}(η) and P_{J}(η) are the transition probabilities P(η^{future}|S_{I}) and P(η^{future}|S_{J}). The smaller the value of D_{distrib}(I), the closer the Ith state is located to the center of the network. Furthermore, the variance of D_{distrib}(I) over the set of states in the network (see the black arrows in Fig. 5) measures how diverse the transition probabilities of the states are. One can see in Fig. 5 that, as the time scale increases from 2^{4} to 2^{8} steps, the variance of D_{distrib}(I) decreases and the network becomes more compact. This indicates a trend for the states to merge together on longer time scales.
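The D–D distance can be computed from the states' transition distributions as sketched below (discrete distributions for illustration; the function names are ours):

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance [sum (sqrt(p) - sqrt(q))^2]^(1/2) between two
    discrete transition distributions over the same future outcomes."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

def d_distrib(i, trans_dists, resident):
    """Average D-D distance of state i to all states, weighted by their
    resident probabilities: D_distrib(i) = sum_J P(S_J) d_H(i, J)."""
    return sum(resident[j] * hellinger(trans_dists[i], trans_dists[j])
               for j in range(len(trans_dists)))
```

A small D_distrib means the state's outgoing statistics resemble those of the heavily populated states, placing it near the center of the projected network.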
The radius of the circles reflects the stability of the corresponding states. One can see that, as R̄_{I} − R_{0} → 0, the states in the networks tend to be more stabilized, implying that R_{0} corresponds to the equilibrium FAD–Tyr distance of the mean-force potential with respect to R. The more striking feature is that more than one state can exist at a given value of the FAD–Tyr distance R, and there exists a broad distribution in the size of the states, especially around R_{0}, at the short time scale of 2^{4} steps. The latter is in stark contrast to Fig. 2, where only a single state is present at a given value of x (no degeneracy) because of the one-dimensional nature of the system. This clearly demonstrates the degeneracy-lifting property: the multiscale SSN can differentiate states having almost the same value of the observable and therefore reflects the multidimensional nature of the underlying landscape. Furthermore, compared with longer time steps such as 2^{8}, some “isolated” states with larger circles (weighted more) exist in regions far away from R_{0} at 2^{4} steps (note that the time scale of m = 1/λ ≈ 580 ms in Fig. 2 is expected to be close to 2^{8} steps here). This is a manifestation of frustration on the multidimensional energy landscape, resulting in a vast number of different trapping times at short time scales, as inferred in ref. 15.
Further insight into the nature of the conformational dynamics can be acquired by considering how different states are connected in the SSN. The simplest measure of connectivity among the nodes is the degree of a node, k_{I}, that is, the number of transitions or links from the Ith node to the others. In Fig. 5, the color of the states (the circles) indicates the value of the normalized degree k_{I}/k_{max}, where k_{max} = max{k_{1}, …, k_{N_S}} is the maximum degree among all states in the chosen SSN. As indicated by the color bar in the figure, more links or transitions from a state are denoted by a color toward the red end of the spectrum. A saturated red color signifies that the state connects or communicates with almost all of the states in the network. As the time scale increases, say, from 2^{4} to 2^{8}, the nodes tend to acquire more connections on average, indicated by the shift of color to the red end. This reflects the fact that the system, given more time, can explore the remote regions of the energy landscape more thoroughly.
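The normalized degree used for the coloring can be computed from the link list as follows (a sketch; we take k_I as the number of distinct states reachable from I in one transition):

```python
from collections import defaultdict

def normalized_degrees(links):
    """k_I / k_max for each source state, where k_I is the number of distinct
    outgoing links from I and k_max is the maximum degree in the network."""
    out = defaultdict(set)
    for src, dst in links:
        out[src].add(dst)
    k = {s: len(d) for s, d in out.items()}
    k_max = max(k.values())
    return {s: deg / k_max for s, deg in k.items()}
```

A value near 1 marks a hub-like state that communicates with most of the network, which is the saturated-red case described above.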
On the other hand, one can see from the degree dependence of the stability of states in the multiscale SSNs (SI Fig. 9) that the greater the degree of a node, the larger the node size. This can be regarded as the first experimental manifestation observed so far in the network of the multidimensional conformational space of biomolecules (10, 11); that is, a state tends to be more stabilized when more transition paths exist from that state. Moreover, a large diversity of degrees for a given state size (or stability) is observed at short time scales (e.g., 2^{4} and 2^{6}), which provides evidence of heterogeneity in the state connectivity. Its implications and the degree dependence of the stability of states are discussed in more detail in SI Text.
In summary, at a typical time scale of “subdiffusion,” e.g., 2^{4} steps as shown in Fig. 5 A, the underlying network exhibits strong diversity in the transition and morphological features of the state space, which should arise from the frustration of the multidimensional energy landscape. However, on the time scale of 2^{8} steps, which can be regarded as a turning point from the subdiffusion regime to the Brownian diffusion regime, the topographical features of the underlying network become relatively compact, leading to the consolidation of all states so that the number of links from each state becomes uniformly close to the maximum.
Conclusions
In this article, we have presented a method to extract the multiscale network in state space from a single-molecule time series, with the ability to lift the degeneracy inherent to a finite scalar time series. In contrast to models postulated for the underlying physical mechanism, the multiscale SSN can objectively provide us with rules about the underlying dynamics that one can learn “directly” from the experimental single-molecule time series. The network topography depends on the time scale of observation; in general, the longer the observation, the less complex the underlying network appears.
Our method also provides a means to introduce into single-molecule studies several concepts of complex networks, which have been developed extensively across fields sharing similar organization, such as biology, technology, and sociology (13). For instance, modules or communities and “small-world” concepts in biological networks are expected to be relevant to specific functions and the hierarchical organization of these systems. The multiscale SSN can also be used to examine the time scale on which the concept of a Markovian process is valid. Most importantly, it provides a natural way of investigating how multiscale systems evolve in time with mutual interference across the hierarchical dynamics at different time scales.
As for future work, a rigorous connection between the concepts of the multiscale SSNs and those of dynamical systems theory could further enhance our understanding of the multiscale dynamics of complex systems. Moreover, we expect that, by monitoring the change of the multiscale SSN constructed locally from a set of finite consecutive periods along the course of time (37), it will be possible to shed light on how the system adapts to time-dependent external stimuli under thermal fluctuation.
Acknowledgments
We thank Satoshi Takahashi and Mikito Toda for valuable comments, X. S. Xie and G. Luo for providing time series of the ET singlemolecule experiment of Fre/FAD complex, and Kazuto Sei for helping us to draw Fig. 3. Parts of this work were supported by JSPS, JST/CREST, Priority Areas “Systems Genomics,” “Molecular Theory for Real Systems” and the 21st century COE (Center of Excellence) of Earth and Planetary Sciences, Kobe University, MEXT (to T.K.) and by U.S. National Science Foundation (to H.Y.).
Footnotes
 ^{‖}To whom correspondence should be addressed. Email: tamiki{at}es.hokudai.ac.jp

Author contributions: C.B.L. and T.K. designed research; C.B.L. and T.K. performed research; C.B.L., H.Y., and T.K. contributed new reagents/analytic tools; C.B.L. analyzed data; and C.B.L., H.Y., and T.K. wrote the paper.

↵ ^{‡}Present address: Research Institute for Electronic Science, Hokkaido University, Kita 12, Nishi 6, Kita-ku, Sapporo 060-0812, Japan.

The authors declare no conflict of interest.

This article is a PNAS Direct Submission.

This article contains supporting information online at www.pnas.org/cgi/content/full/0707378105/DC1.
 © 2008 by The National Academy of Sciences of the USA
References
Frauenfelder H, Sligar SG, Wolynes PG
Stillinger FH
Wales DJ
Krivov SV, Karplus M
Gfeller D, Rios PDL, Caflisch A, Rao F
Ball KD, Berry RS, Kunz RE, Li FY, Proykova A, Wales DJ
Gallos LK, Song C, Havlin S, Makse HA
Yang H, Luo G, Karnchanaphanurach P, Louie TM, Rech I, Cova S, Xun L, Xie XS
Talaga DS, Lau WL, Roder H, Tang JY, Jia YW, DeGrado WF, Hochstrasser RM
Rhoades E, Gussakovsky E, Haran G
Kinoshita M, Kamagata K, Maeda M, Goto Y, Komatsuzaki T, Takahashi S
Edman L, Rigler R
Zwanzig R
Shalizi CR, Crutchfield JP
Daubechies I