https://doi.org/10.1140/epjb/e2003-00114-7
Efficient Hopfield pattern recognition on a scale-free neural network
1 School of Physics and Astronomy, Raymond and Beverly Sackler Faculty of Exact Sciences, Tel Aviv University, Ramat Aviv, Tel Aviv 69978, Israel
2 Institute for Theoretical Physics, Cologne University, 50923 Köln, Germany
3 Department of Physics, Technion-IIT, Haifa 32000, Israel
4 Cybernetic Vision Research Group, IFSC-USP, Caixa Postal 369, 13560-970 São Carlos, SP, Brazil
Received: 12 January 2003 / Published online: 11 April 2003
Neural networks are supposed to recognise blurred images (or patterns)
of N pixels (bits) each. Application of the network to an initial blurred
version of one of P pre-assigned patterns should converge to the correct
pattern. In the “standard" Hopfield model,
the N “neurons” are connected to
each other via N2 bonds which contain the information on the stored patterns.
Thus computer time and memory in general grow with N2. The Hebb rule assigns
synaptic coupling strengths proportional to the overlap of the stored patterns
at the two coupled neurons.
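For concreteness, here is a minimal sketch of standard Hopfield recall with Hebbian couplings. The pattern count, noise level and zero-temperature asynchronous update rule are illustrative assumptions, not the paper's exact simulation setup.

```python
# Minimal sketch of the standard (fully connected) Hopfield model.
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                                # N neurons (pixels), P stored patterns

patterns = rng.choice([-1, 1], size=(P, N))  # random +/-1 patterns

# Hebb rule: coupling J_ij proportional to the overlap of the stored
# patterns at neurons i and j; N^2 couplings in total.
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def recall(state, sweeps=10):
    """Zero-temperature asynchronous dynamics: align each neuron
    with its local field until the state (hopefully) reaches a
    stored pattern."""
    state = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = J[i] @ state
            state[i] = 1 if h >= 0 else -1
    return state

# Blur pattern 0 by flipping 10% of its bits, then recall it.
blurred = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
blurred[flip] *= -1
retrieved = recall(blurred)
print("overlap:", retrieved @ patterns[0] / N)   # ~1.0 on successful retrieval
```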
Here we simulate the Hopfield model on the Barabási-Albert scale-free network,
in which each newly added neuron is connected to only m other
neurons, and at the end the number of neurons with q neighbours decays
as 1/q³.
Although the quality of retrieval decreases for small m, we find good
associative memory for 1 ≪ m ≪ N. Hence, these networks gain a
factor N/m ≫ 1 in the computer memory and time.
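The scale-free variant can be sketched by storing Hebb couplings only on the roughly mN bonds of a Barabási-Albert graph, which is where the N/m saving in memory and time comes from. The growth loop below is the standard preferential-attachment construction; all parameter values are illustrative assumptions.

```python
# Hedged sketch: Hopfield recall with Hebb couplings restricted to the
# edges of a Barabasi-Albert scale-free graph (each new node attached
# to m earlier nodes), so storage scales as m*N instead of N^2.
import numpy as np

rng = np.random.default_rng(1)
N, m, P = 1000, 10, 3

# Grow a BA graph by preferential attachment, using the repeated-endpoint
# list so existing nodes are chosen with probability proportional to
# their degree q (yielding the 1/q^3 tail quoted in the abstract).
targets = list(range(m))              # start from m initial nodes
endpoints = []                        # each node appears once per edge end
neighbours = [[] for _ in range(N)]
for new in range(m, N):
    for t in set(targets):
        neighbours[new].append(t)
        neighbours[t].append(new)
        endpoints.extend([new, t])
    targets = [endpoints[rng.integers(len(endpoints))] for _ in range(m)]

patterns = rng.choice([-1, 1], size=(P, N))

# Hebb couplings only on the ~mN existing bonds (the overall scale is
# irrelevant for the sign-based zero-temperature dynamics).
J = {}
for i in range(N):
    for j in neighbours[i]:
        J[(i, j)] = patterns[:, i] @ patterns[:, j] / N

def recall(state, sweeps=20):
    state = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = sum(J[(i, j)] * state[j] for j in neighbours[i])
            state[i] = 1 if h >= 0 else -1
    return state

blurred = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
blurred[flip] *= -1
print("overlap:", recall(blurred) @ patterns[0] / N)
```

Replacing the dense N×N coupling matrix with a per-neighbour dictionary is one way to realise the mN scaling; any sparse representation of the diluted couplings would do.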
PACS: 05.40.-a – Fluctuation phenomena, random processes, noise, and Brownian motion / 05.50.+q – Lattice theory and statistics (Ising, Potts, etc.) / 87.18.Sn – Neural networks
© EDP Sciences, Società Italiana di Fisica, Springer-Verlag, 2003