Emergence and reconfiguration of modular structure for synaptic neural networks during learning
Poster No:
1065
Submission Type:
Abstract Submission
Authors:
Shi Gu1
Institutions:
1University of Electronic Science and Technology of China, Chengdu, Sichuan
First Author:
Shi Gu
University of Electronic Science and Technology of China
Chengdu, Sichuan
Introduction:
To study the learning dynamics of Hebbian feedforward (HebbFF) networks, we developed a multi-pronged approach that examines the dynamic reconfiguration of these networks from several perspectives. We begin with an analysis of modularity across temporal scales and its relationship to variations in task accuracy and distribution entropy under diverse learning paradigms. We then examine the synchronicity of network states during training and its correlation with accuracy. We show that network modularity increases with learning and that network flexibility serves as a robust metric of model performance, in line with neuroscientific findings in biological organisms. We hope that our findings will shed light on the interplay between network modularity, accuracy, and learning dynamics, and ultimately advance our understanding of both artificial neural networks and their biological counterparts.
Methods:
Owing to abstract format constraints, the methods are described in full in the companion preprint at https://arxiv.org/abs/2311.05862; an illustrative sketch of the modularity and flexibility computation follows.
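As a rough illustration (not the preprint's exact pipeline), the sketch below computes Louvain modularity of a network's connectivity at successive training checkpoints and derives a simple per-unit flexibility score from module switching between checkpoints. The toy `checkpoints` data, the absolute-weight graph construction, and the greedy label alignment are all assumptions made for this sketch.

import numpy as np
import networkx as nx

def partition(W, seed=0):
    """Louvain partition and Newman modularity Q of a weighted graph."""
    G = nx.from_numpy_array(np.abs(W))  # undirected graph on |weights|
    comms = nx.community.louvain_communities(G, weight="weight", seed=seed)
    Q = nx.community.modularity(G, comms, weight="weight")
    labels = np.empty(W.shape[0], dtype=int)
    for c, nodes in enumerate(comms):
        labels[list(nodes)] = c
    return Q, labels

def align(prev, cur):
    """Greedily relabel `cur` for maximum overlap with `prev` (a heuristic;
    multilayer community detection would give principled label continuity)."""
    out = cur.copy()
    for c in np.unique(cur):
        out[cur == c] = np.bincount(prev[cur == c]).argmax()
    return out

rng = np.random.default_rng(0)
# Toy stand-in for saved training checkpoints: five random 120-unit
# connectivity matrices (in practice, hidden-layer weights over training).
checkpoints = [rng.standard_normal((120, 120)) for _ in range(5)]

Qs, labels = [], []
for W in checkpoints:
    Q, lab = partition(W)
    labels.append(lab if not labels else align(labels[-1], lab))
    Qs.append(Q)

L = np.stack(labels)                     # (n_checkpoints, n_units)
flexibility = (L[1:] != L[:-1]).mean(0)  # per-unit module-switching rate
print("modularity over training:", np.round(Qs, 3))
print("mean network flexibility:", flexibility.mean())

Flexibility here is the fraction of checkpoint transitions in which a unit changes module allegiance; it is this kind of switching statistic that we relate to model performance.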
Results:
We explored the dynamic reconfiguration of artificial neural networks, focusing on a class of recurrent neural networks (RNNs) known as the Hebbian feedforward network (HebbFF). This network determines the familiarity of a stimulus based on whether it matches a stimulus encountered at a prior time step. We used a network with 120 hidden units to provide sufficient representational power for encoding the input. More detailed results can be found at https://arxiv.org/abs/2311.05862.
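For concreteness, here is a minimal numpy sketch of a HebbFF-style forward pass, assuming the usual construction for this model family: effective input weights that are the sum of fixed weights W and a fast Hebbian component A that decays each step. All parameter values (including the sign of the plasticity rate eta, which may be anti-Hebbian) are placeholders, not the trained settings from our experiments.

import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden = 25, 120                  # 120 hidden units, as in the text

W = rng.standard_normal((d_hidden, d_in)) / np.sqrt(d_in)    # fixed weights
w2 = rng.standard_normal(d_hidden) / np.sqrt(d_hidden)       # readout weights
b1, b2 = np.zeros(d_hidden), 0.0
lam, eta = 0.9, -0.5                      # plastic decay and learning rates

def run(stimuli):
    """Process a stimulus sequence; return familiarity probabilities."""
    A = np.zeros((d_hidden, d_in))        # fast plastic synapses
    probs = []
    for x in stimuli:
        h = np.tanh((W + A) @ x + b1)                   # hidden state
        A = lam * A + eta * np.outer(h, x)              # Hebbian update of A
        probs.append(1 / (1 + np.exp(-(w2 @ h + b2))))  # P(stimulus familiar)
    return np.array(probs)

# Demo: the second half of the sequence repeats the first half. With trained
# parameters the repeats would elicit higher familiarity readouts; the random
# parameters above only illustrate the forward dynamics.
stim = rng.choice([-1.0, 1.0], size=(10, d_in))
print(np.round(run(np.vstack([stim, stim])), 2))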
Conclusions:
In sum, our findings contribute to the broader aim at the intersection of AI and neuroscience: using AI not only to replicate but also to understand and learn from the intricate workings of the brain. The tools and methods we developed open new opportunities to study learning dynamics in both artificial and biological neural networks. Such cross-fertilization of ideas can lead to more efficient, adaptable, and robust AI systems while providing insights into the neuroscience of learning and memory. We note that our research has so far focused on memory tasks, which are well suited to Hebbian feedforward networks and inherently amenable to modular descriptions. Future work should extend our approach to other cognitive tasks, such as multimodal matching, value-based decision-making, and perception, as well as to other classes of ANNs, including RNNs and deep feedforward networks. We hope that multimodal continual tasks learned by complex networks can serve as a digital analogue of the brain in terms of cognitive execution and may provide novel insights into how functional modules reconfigure to support complex tasks.
Learning and Memory:
Learning and Memory Other 1
Modeling and Analysis Methods:
Classification and Predictive Modeling 2
Keywords:
Computing
Plasticity
1|2 Indicates the priority used for review

Figure: Schematics

Figure: Main
References:
Gu, S., et al. (2023). Emergence and reconfiguration of modular structure for synaptic neural networks during continual familiarity detection. arXiv preprint arXiv:2311.05862.