CN-121998002-A - Shared balance potential gate control pulse neuron and method for constructing neural network

CN121998002A

Abstract

The invention provides a shared-equilibrium-potential gated spiking neuron and a method for constructing a neural network. The method comprises the following steps: S1, constructing the shared-equilibrium-potential gated spiking neuron; S2, constructing an initial dynamic spiking neural network from the shared-equilibrium-potential gated spiking neurons; and S3, training the initial dynamic spiking neural network by backpropagation through time to obtain an optimized dynamic spiking neural network.

Inventors

  • YU QIANG
  • BAI QIANYI
  • WANG HAITENG

Assignees

  • Tianjin University (天津大学)

Dates

Publication Date
2026-05-08
Application Date
2026-02-03

Claims (4)

  1. A shared-equilibrium-potential gated spiking neuron, characterized in that, when updating the membrane potential, the gated spiking neuron multiplexes, for each input synapse j, a gating weight g_j corresponding to that synapse, wherein the original synaptic weight w_j is replaced by the product of an equilibrium potential and the gating weight g_j, and the equilibrium potentials of all input synapses connected to the gated spiking neuron are unified to the same shared parameter E; the neuron comprises: an input processing module, for generating the input spike information x_j^t; a gating modulation module, for generating and updating the gating variable β^t = Clip(λ − Σ_j g_j · x_j^t), wherein λ is the static decay coefficient of the neuron and Clip(·) is a truncation function; a membrane potential update module, for updating the neuronal membrane potential under the shared-equilibrium-potential constraint, u^t = β^t · u^{t−1} · (1 − s^{t−1}) + E · Σ_j g_j · x_j^t, wherein u^t is the membrane potential of the neuron at time t, θ is the neuron threshold, and t is the simulation time step; and a spike generation module, for generating the spike output s^t = H(u^t − θ) based on the membrane potential, wherein during training the derivative of H(·), the Heaviside unit step function, is replaced by a surrogate derivative.
  2. A method of constructing a dynamic spiking neural network using the neurons of claim 1, comprising: selecting a conventional speech data set or an event data set, and dividing it into a training set and a test set; for each input synapse of a neuron, setting the equilibrium potential to a shared parameter and multiplexing the gating weight of the input synapse to remodel the synaptic weights, thereby realizing gated spiking neurons that share an equilibrium potential; constructing an initial dynamic spiking neural network from the shared-equilibrium-potential gated spiking neurons; and training the initial dynamic spiking neural network by backpropagation through time to obtain the optimized dynamic spiking neural network.
  3. The method of constructing a dynamic spiking neural network according to claim 2, wherein, for each input synapse of a neuron, setting the equilibrium potential to a shared parameter and multiplexing the gating weight of the input synapse to remodel the synaptic weight, realizing a gated spiking neuron sharing an equilibrium potential, comprises: remodelling the trainable weight of each input synapse j of neuron i as the product of the shared equilibrium potential E_i and the gating weight g_{ij}, so that the membrane potential of neuron i at time t is obtained as u_i^t = β_i^t · u_i^{t−1} · (1 − s_i^{t−1}) + E_i · Σ_j g_{ij} · x_j^t, wherein: θ is the neuron threshold; t is the simulation time step; λ is the static decay parameter; g_{ij} is the gating weight corresponding to the j-th synapse; x_j^t is the input information of the j-th synapse connected to neuron i; E_i is the shared equilibrium potential parameter corresponding to the neuron; and β_i^t is the dynamic membrane potential decay parameter, computed as β_i^t = Clip(λ − Σ_j g_{ij} · x_j^t), wherein Clip(·) is a truncation function limiting the result to the interval [0, 1]; and obtaining the spike output of the neuron at time t from its membrane potential at time t and its spike at the previous time step, s_i^t = H(u_i^t − θ), wherein during training the derivative of H(·), the Heaviside unit step function, is replaced by a surrogate derivative.
  4. The method of constructing a dynamic spiking neural network according to claim 2, wherein training the initial dynamic spiking neural network by backpropagation through time to obtain the optimized dynamic spiking neural network comprises: computing the output of the initial dynamic spiking neural network at time t as o^t = W_out · s^{L,t}, wherein W_out is the weight of the output layer, s^{L,t} is the output spike of the L-th layer, and o^t is the output data of the network at time t; computing the classification label from the output data of the initial dynamic spiking neural network over all time steps, ŷ = argmax Σ_t o^t; computing the loss function from the classification output and the true label as L = L_CE(o, y), wherein y is the true label and L_CE is the cross-entropy loss function; and, according to the loss function L, propagating gradients along the computational graph in both the temporal and spatial dimensions to update the gating weight g_{ij} and the equilibrium potential E_i of each neuron.
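The per-step dynamics described in claims 1 and 3 can be sketched in a few lines of NumPy. This is a minimal illustrative sketch, not the patented implementation: the function name `segn_step`, the vectorized layout, and the concrete parameter values are assumptions; only the update rule (dynamic decay truncated to [0, 1], shared equilibrium potential, threshold firing with reset) follows the claims.

```python
import numpy as np

def segn_step(u, s_prev, x, g, E, lam, theta):
    """One simulation step of a layer of shared-equilibrium gated neurons.

    u      : membrane potentials at t-1, shape (N,)
    s_prev : spikes emitted at t-1, shape (N,)
    x      : input spikes at t, shape (M,)
    g      : gating weights, shape (N, M)
    E      : shared equilibrium potential, one scalar per neuron, shape (N,)
    lam    : static decay coefficient
    theta  : firing threshold
    """
    drive = g @ x                           # total gated input per neuron
    beta = np.clip(lam - drive, 0.0, 1.0)   # dynamic decay, truncated to [0, 1]
    u = beta * u * (1.0 - s_prev) + E * drive  # reset after a spike, pull toward E
    s = (u >= theta).astype(u.dtype)        # Heaviside spike generation
    return u, s
```

Note that each neuron carries a single scalar E rather than one equilibrium potential per synapse, which is the parameter saving the patent claims over per-synapse DGNs.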

Description

Shared balance potential gate control pulse neuron and method for constructing neural network

Technical Field

The invention relates to the field of brain-inspired computing and spiking neural networks, and in particular to a shared-equilibrium-potential gated spiking neuron and a method for constructing a network from it.

Background

With the rapid evolution of artificial intelligence, brain-inspired computing has become a key research direction driving its progress. Its core is the construction of artificial neural networks that reproduce the basic characteristics of biological neurons and their connection structures. Artificial neural networks based on deep learning have made breakthrough progress on tasks such as image recognition and natural language processing, on some tasks performing near or even beyond the human level. However, such networks typically consume considerable energy, restricting their deployment in energy-sensitive scenarios such as edge computing and mobile devices. In contrast, the human brain consumes only about 20 watts while performing complex cognitive tasks, exhibiting excellent energy efficiency. The advantages of the human brain in cognitive function and energy consumption have therefore become an important reference in brain-inspired artificial intelligence research. Since brain information processing relies on neural signaling in the form of spikes, researchers have accordingly proposed Spiking Neural Networks (SNNs). Such networks not only constitute key components of neuromorphic computing systems, but are also regarded as powerful tools for understanding the brain's information processing and learning mechanisms.
However, the nonlinear dynamics and time-dependent characteristics of spiking neurons make their behavior complex, which poses significant challenges to training SNNs and limits further improvement of their learning performance. Integrating the mature training methods of artificial neural networks and drawing on related findings from neuroscience has therefore become an important research path for improving the learning ability of SNNs. Currently, the most common spiking neuron in engineering applications is the leaky integrate-and-fire (LIF) model. It generally adopts a fixed membrane-potential leak parameter or a fixed time constant to describe the potential decay process. Its computational structure is simple, but the membrane-potential decay mechanism is independent of input activity, so the time scale over which information is retained cannot be adaptively adjusted according to the input; as a result, the neuron suffers from limited memory capacity and sensitivity to noise disturbance in complex temporal tasks. To overcome these defects, some studies introduce a dynamic conductance or gating mechanism so that the evolution of the neuron's membrane potential is regulated by synaptic input activity, enhancing the modeling of temporal information. The Dynamic Gated Neuron (DGN) introduces dynamic gating parameters and equilibrium potentials for the synapses, so that membrane-potential changes depend on the dynamic state of each synapse, improving the computing capacity and robustness of the network to a certain extent. However, existing DGNs typically set an independent equilibrium-potential parameter for each synapse, so the number of trainable parameters grows linearly with the number of synapses.
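The fixed-decay LIF update that the passage contrasts against can be sketched as follows. This is a minimal illustrative sketch (function name `lif_step` and parameter defaults are assumptions, not from the patent); note that the leak coefficient `lam` is a constant, independent of the input, which is exactly the limitation the dynamic gating mechanism addresses.

```python
import numpy as np

def lif_step(u, s_prev, x, w, lam=0.9, theta=1.0):
    """One step of leaky integrate-and-fire (LIF) neurons.

    Unlike the gated neuron, the decay coefficient `lam` is fixed:
    it does not depend on the input activity x.
    """
    u = lam * u * (1.0 - s_prev) + w @ x   # fixed leak + weighted input, reset after spike
    s = (u >= theta).astype(u.dtype)       # fire when the threshold is crossed
    return u, s
```

Running two steps shows the behavior: a strong input drives the potential over threshold, the neuron fires, and the potential resets on the next step.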
When the network is large, this design significantly increases the parameter count, computational complexity, and storage cost of the network, and hinders deployment on resource-limited neuromorphic chips, embedded systems, and low-power edge devices. Meanwhile, the many equilibrium-potential parameters are partly redundant, and their contribution to the core computing capacity is limited. How to reduce the number of equilibrium-potential parameters, and thereby the complexity and energy consumption of the network, while maintaining the core computing mechanism by which dynamic gated neurons regulate the information flow based on input activity, has therefore become a technical problem to be solved.

Disclosure of Invention

Aiming at the problems of the prior art, the invention provides a method for constructing a lightweight dynamic gated spiking neural network based on shared-equilibrium gated neurons (Shared Equilibrium Gated Neuron, SEGN): all synapses of a given neuron share a single equilibrium-potential parameter, maintaining the ability to dynamically regulate information memory according to the input,