A New Strategy to Modify Hopfield by Using XOR Operation

Authors

DOI:

https://doi.org/10.21123/bsj.2024.9238

Keywords:

Auto-Associative Memory, Hopfield Network, Neural Network, Pattern Recognition, XOR Operation

Abstract

The Hopfield network is one of the simplest neural network types: every neuron connects to every other neuron, which is why it is called a fully connected neural network. It is also considered an auto-associative memory, because the network returns a stored pattern immediately upon recognizing it. However, the network has many limitations, including limited memory capacity, discrepancy, the requirement of orthogonality between patterns, weight symmetry, and convergence to local minima. This paper proposes a new strategy for designing a Hopfield network based on the XOR operation. The strategy addresses these limitations through a new algorithm for the network's design, increasing the network's performance by modifying its architecture and its training and convergence phases. The proposed strategy depends on the pattern size but avoids learning similar patterns many times, and the new XOR strategy shows tolerance to noise-distorted patterns, infinite storage capacity, and recognition of pattern inverses. Experiments showed that the suggested method produced promising results by avoiding the majority of the Hopfield network's limitations. In addition, it learns to recognize an unlimited number of patterns of varying sizes while preserving a suitable noise ratio.
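For context, the classical Hopfield baseline that the paper modifies can be sketched in a few lines of NumPy: Hebbian training builds a symmetric weight matrix with a zero diagonal, and recall iterates sign updates until the state stabilizes. This is a minimal illustration of the standard network and its behavior only; the paper's XOR-based algorithm is not reproduced here, and the function names are illustrative.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian rule: W is the sum of outer products of the bipolar (+1/-1)
    patterns, with the diagonal zeroed (no self-connections) and weights
    kept symmetric, as the classical model requires."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / patterns.shape[0]

def recall(W, state, max_iters=20):
    """Synchronous sign updates until the state stops changing
    (convergence to a stored pattern or a spurious local minimum)."""
    s = state.copy()
    for _ in range(max_iters):
        new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s

# Store one 8-bit bipolar pattern, corrupt one bit, then recall it.
p = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train_hopfield(p[None, :])
noisy = p.copy()
noisy[0] = -noisy[0]
restored = recall(W, noisy)
```

The limitations the abstract lists are visible in this baseline: the Hebbian rule can reliably store only on the order of 0.14N patterns for N neurons, correlated (non-orthogonal) patterns interfere, and recall may settle into a spurious local minimum or the inverse of a stored pattern.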


Author Biography

Rusul Hussein Hasan, College of Law, University of Baghdad, Baghdad, Iraq



Published

2024-09-01

Issue

Section

article

How to Cite

A New Strategy to Modify Hopfield by Using XOR Operation. Baghdad Sci.J [Internet]. 2024 Sep. 1 [cited 2024 Dec. 19];21(9):3052. Available from: https://bsj.uobaghdad.edu.iq/index.php/BSJ/article/view/9238
