Poster No:
1800
Submission Type:
Abstract Submission
Authors:
Robert Englert1, Balint Kincses1, Raviteja Kotikalapudi1, Giuseppe Gallitto1, Jialin Li2, Kevin Hoffschlag1, Choong-Wan Woo3, Tor Wager4, Dagmar Timmann1, Ulrike Bingel1, Tamás Spisák1
Institutions:
1University Medicine Essen, Essen, NRW, 2Max Planck School of Cognition, Leipzig, Germany, 3Sungkyunkwan University, Suwon-si, Gyeonggi-do, 4Dartmouth College, Hanover, NH
First Author:
Robert Englert
University Medicine Essen
Essen, NRW
Co-Author(s):
Jialin Li
Max Planck School of Cognition
Leipzig, Germany
Introduction:
Brain function is characterized by the continuous activation and deactivation of anatomically distributed neuronal populations, giving rise to complex fluctuations commonly termed "brain states". These dynamics have been explored with a variety of descriptive techniques, revealing their neuroscientific relevance (Greene et al., 2023; Vidaurre et al., 2017). Understanding brain dynamics nevertheless remains challenging, as we strive to bridge the explanatory gap between biophysical and cognitive perspectives (Breakspear, 2017). Here we propose the functional connectivity-based Hopfield neural network (fcHNN) as a novel model of macro-scale brain dynamics that combines the advantages of large-scale network models and data-driven approaches.
Methods:
The fcHNN framework uses the functional connectome to model recurrent activity flow (Cole et al., 2016) across large-scale brain regions and conceptualizes brain dynamics as trajectories on an n-dimensional energy landscape whose local minima are commonly referred to as 'attractor states'. Although fcHNNs are a type of artificial neural network, the network weights are not trained explicitly; instead, they are set empirically to the partial-correlation-based functional connectivity matrix. Through the so-called Hopfield relaxation procedure, fcHNNs naturally converge towards one of a finite number of attractor states, characterized by minimal energy (Hopfield, 1982) (Figure 1A). Introducing weak noise during relaxation, referred to as 'stochastic update' (Figure 1B), prevents the system from settling into a single attractor state and introduces heteroclinic dynamics in which the system continually traverses the energy landscape. This can serve as a high-level generative model of macro-scale brain dynamics, guided by the connectome's topology (Figure 2A). The first two principal components of the simulated activity patterns, termed the fcHNN projection, provide a lower-dimensional representation of the state space. This projection effectively separates the attractor-state basins and serves as a tool for examining brain dynamics.
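The core procedure can be illustrated with a minimal sketch (for orientation only; the function name, transfer-function slope, noise level, parcellation size, and the random stand-in connectome are illustrative assumptions, not the authors' reference implementation):

```python
# Minimal sketch of the fcHNN procedure outlined above. All parameter values,
# names, and the random stand-in weight matrix are assumptions for illustration.
import numpy as np
from sklearn.decomposition import PCA

def hopfield_relax(W, state, beta=0.04, n_iter=1000, noise_sd=0.0, rng=None):
    """Iterate connectome-based activity flow from an initial activation pattern.

    W        : (n_regions, n_regions) weight matrix, e.g. a partial-correlation
               functional connectivity matrix with zero diagonal (assumption)
    state    : (n_regions,) initial activation pattern
    noise_sd : 0 gives deterministic relaxation towards an attractor state;
               > 0 gives the 'stochastic update' that keeps the system moving
    """
    rng = rng or np.random.default_rng()
    trajectory = [state.copy()]
    for _ in range(n_iter):
        state = np.tanh(beta * W @ state)            # recurrent activity flow step
        if noise_sd > 0:
            state = state + rng.normal(0.0, noise_sd, size=state.shape)
        trajectory.append(state.copy())
    return np.array(trajectory)

# Build the fcHNN projection from simulated stochastic dynamics
rng = np.random.default_rng(0)
n_regions = 122                                      # parcellation size (assumption)
W = rng.normal(size=(n_regions, n_regions))
W = (W + W.T) / 2                                    # symmetric stand-in for an empirical FC matrix
np.fill_diagonal(W, 0)
sim = hopfield_relax(W, np.zeros(n_regions), noise_sd=0.3, n_iter=10000, rng=rng)
fchnn_projection = PCA(n_components=2).fit(sim)      # first two PCs of simulated activity
coords = fchnn_projection.transform(sim)             # 2D state-space embedding
```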

Results:
We found that the brain's large-scale attractor states are neuroscientifically meaningful, closely resembling known large-scale brain systems, which showcases the model's potential for understanding complex brain dynamics. The fcHNN projection, based on the first two principal components of the fcHNN state space, demonstrates significantly higher explanatory power (p<0.0001) than principal components extracted directly from real resting-state fMRI data. In addition to replicating various aspects of spontaneous brain dynamics, fcHNNs are suitable for modeling responses to perturbations, as demonstrated by their ability to detect changes in neural activity patterns related to pain perception (Woo et al., 2015). Flow analysis on the fcHNN projection (the difference in the average timeframe-to-timeframe transition direction) reveals that during pain, brain activity gravitates toward a distinct "ghost attractor" (Figure 2B). To assess the clinical relevance of the proposed method, we compared data from autism spectrum disorder (ASD) patients and typically developing controls (Di Martino et al., 2014). Flow analysis predicted an increased likelihood of states returning towards the middle of the internal-external axis and transitioning towards the extremes of the action-perception axis in ASD (Figure 2B); these findings were statistically significant and consistent with the patterns observed in real data.
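A minimal sketch of the flow analysis referenced above (bin count, extent, and variable names are illustrative assumptions): average the timeframe-to-timeframe transition vectors within spatial bins of the 2D projection, then contrast conditions by subtracting the resulting flow fields.

```python
# Hypothetical sketch of the flow analysis: per-bin mean transition direction
# on the 2D fcHNN projection; bin count and extent are assumptions.
import numpy as np

def flow_field(coords, n_bins=10, extent=(-3.0, 3.0)):
    """coords: (n_timeframes, 2) projected activity; returns an (n_bins, n_bins, 2)
    array of mean timeframe-to-timeframe transition vectors per spatial bin."""
    steps = np.diff(coords, axis=0)                           # transition directions
    edges = np.linspace(extent[0], extent[1], n_bins + 1)
    ix = np.clip(np.digitize(coords[:-1, 0], edges) - 1, 0, n_bins - 1)
    iy = np.clip(np.digitize(coords[:-1, 1], edges) - 1, 0, n_bins - 1)
    flow = np.full((n_bins, n_bins, 2), np.nan)
    for i in range(n_bins):
        for j in range(n_bins):
            in_bin = (ix == i) & (iy == j)
            if in_bin.any():
                flow[i, j] = steps[in_bin].mean(axis=0)
    return flow

# Condition contrast, e.g. pain vs. rest (coords_pain / coords_rest assumed to be
# the fcHNN-projected timeframes of the two conditions):
# flow_difference = flow_field(coords_pain) - flow_field(coords_rest)
```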

Conclusions:
We present the functional connectivity-based Hopfield neural network (fcHNN), a lightweight computational framework initialized with empirical functional connectomes. The model accurately captures and predicts diverse aspects of brain dynamics and offers a mechanistic account for the emergence of brain states, gradients, and autocorrelation structures.
Modeling and Analysis Methods:
Connectivity (e.g. functional, effective, structural)
fMRI Connectivity and Network Modeling 1
Methods Development 2
Keywords:
Computational Neuroscience
FUNCTIONAL MRI
Modeling
1|2 Indicates the priority used for review
References:
Vidaurre, D., Smith, S. M., & Woolrich, M. W. (2017). Brain network dynamics are hierarchically organized in time. Proceedings of the National Academy of Sciences, 114(48), 12827-12832.
Greene, A. S., Horien, C., Barson, D., Scheinost, D., & Constable, R. T. (2023). Why is everyone talking about brain state? Trends in Neurosciences.
Breakspear, M. (2017). Dynamic models of large-scale brain activity. Nature Neuroscience, 20(3), 340-352.
Cole, M. W., Ito, T., Bassett, D. S., & Schultz, D. H. (2016). Activity flow over resting-state networks shapes cognitive task activations. Nature Neuroscience, 19(12), 1718-1726.
Hopfield, J. J. (1982). Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences, 79(8), 2554-2558.
Woo, C. W., Roy, M., Buhle, J. T., & Wager, T. D. (2015). Distinct brain systems mediate the effects of nociceptive input and self-regulation on pain. PLoS Biology, 13(1), e1002036.
Di Martino, A., Yan, C. G., Li, Q., Denio, E., Castellanos, F. X., Alaerts, K., ... & Milham, M. P. (2014). The autism brain imaging data exchange: towards a large-scale evaluation of the intrinsic brain architecture in autism. Molecular Psychiatry, 19(6), 659-667.