Mean-Field Theory in Hopfield Neural Network for Doing 2 Satisfiability Logic Programming
Authors: Saratha Sathasivam, Shehab Abdulhabib Alzaeemi, Muraly Velavan
Journal: International Journal of Modern Education and Computer Science (IJMECS)
Issue: Vol. 12, No. 4, 2020.
Free access
The dynamical behavior of an artificial neural network depends heavily on the construction of the network. The outputs of artificial neural networks suffer from a lack of interpretability and from variation, which severely limits their practical usability for logic programming. Implementing a logic program in a Hopfield neural network revolves around minimizing the network's energy function to reach the best global solution, yet the search ordinarily returns local minimum solutions as well. Nevertheless, this problem can be overcome by employing the hyperbolic tangent activation function and the Boltzmann Machine in the Hopfield neural network. The foremost purpose of this article is to explore the quality of the solutions obtained from the Hopfield neural network for 2 Satisfiability logic (2SAT) by using the Mean-Field Theory algorithm. The idea is to replace the actual, fluctuating local field of each neuron in the network with its average local field value. From the solution of the deterministic Mean-Field Theory (MFT) equations, the system derives training algorithms in which time-consuming stochastic ensemble averaging is replaced. By evaluating the global minima ratio (zM), Root Mean Square Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE), together with central processing unit (CPU) time, as benchmarks, we find that MFT successfully captures the best global solutions through relaxation of the energy function.
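As an illustrative sketch of the mean-field relaxation described above (not the authors' implementation), the Python fragment below iterates the deterministic self-consistent update m_i = tanh((sum_j W_ij m_j + h_i) / T), which replaces each neuron's fluctuating local field with its average, and then reports the RMSE, MAE and MAPE benchmarks; the toy weights, bias vector, temperature and helper names are assumptions made for the example.

import numpy as np

def mft_relaxation(W, h, T=1.0, steps=200, tol=1e-6):
    # Deterministic mean-field relaxation for a Hopfield network:
    # each neuron's fluctuating local field is replaced by its average,
    #   m_i <- tanh((sum_j W[i, j] * m_j + h[i]) / T)
    # W, h, T and the stopping rule are illustrative assumptions.
    m = np.random.uniform(-0.1, 0.1, size=len(h))  # average activations in [-1, 1]
    for _ in range(steps):
        field = W @ m + h                  # mean local field per neuron
        m_new = np.tanh(field / T)         # hyperbolic tangent activation
        if np.max(np.abs(m_new - m)) < tol:
            return m_new                   # self-consistency reached
        m = m_new
    return m

def benchmarks(states, target):
    # Error measures quoted in the abstract (RMSE, MAE, MAPE).
    err = states - target
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    mape = np.mean(np.abs(err / target)) * 100.0
    return rmse, mae, mape

# Toy example: hypothetical weights and biases standing in for a small
# 2SAT-derived energy function; real values would come from the clause set.
W = np.array([[0.00, 0.25, 0.25],
              [0.25, 0.00, 0.25],
              [0.25, 0.25, 0.00]])
h = np.array([0.25, 0.25, 0.25])

m = mft_relaxation(W, h, T=0.5)
final = np.sign(m)                         # harden averages to bipolar states
print("mean-field averages:", m)
print("RMSE, MAE, MAPE:", benchmarks(final, np.ones(3)))

In a full experiment the global minima ratio (zM) would then be estimated as the fraction of independent runs whose hardened state attains the global minimum energy of the 2SAT cost function, with CPU time recorded per run.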
Keywords: Logic program, Neural networks, Mean field theory, 2 Satisfiability
Short address: https://sciup.org/15017595
IDR: 15017595 | DOI: 10.5815/ijmecs.2020.04.03
References: Mean-Field Theory in Hopfield Neural Network for Doing 2 Satisfiability Logic Programming
- R. Rojas, (2013). Neural networks: A systematic introduction. Springer Science & Business Media.
- J. J. Hopfield, & D. W. Tank, (1985). Neural computation of decisions in optimization problems. Biological Cybernetics, 52, 141-152.
- S. A. S. Alzaeemi, & S. Sathasivam, (2018). Hopfield neural network in agent based modeling. MOJ App Bio Biomech, 2(6), 334-341.
- S. Sathasivam, M. Mamat, M. Mansor, & M. S. M. Kasihmuddin, (2020). Hybrid Discrete Hopfield Neural Network based Modified Clonal Selection Algorithm for VLSI Circuit Verification. Pertanika Journal of Science & Technology, 28(1).
- S. Haykin, (1992). Neural Networks: A Comprehensive Foundation. New York: Macmillan College Publishing.
- S. Sathasivam, (2015). Acceleration Technique for Neuro Symbolic Integration. Applied Mathematical Sciences, 9(9), 409-417.
- R. A. Kowalski, (1979). Logic for Problem Solving. New York: Elsevier Science Publishing.
- S. A. Alzaeemi, M. A. Mansor, M. S. M. Kasihmuddin, & S. Sathasivam, (2019, December). Comparing the logic programming between Hopfield neural network and radial basis function neural network. In AIP Conference Proceedings (Vol. 2184, No. 1, p. 060044). AIP Publishing LLC.
- W. A. T. Wan Abdullah, (1991). Neural Network Logic. In Benhar, O., Bosio, C., del Giudice, P., & Tabet, E. (eds.), Neural Networks: From Biology to High Energy Physics, ETS Editrice, Pisa, pp. 135-142.
- S. Sathasivam, (2010). Upgrading logic programming in Hopfield network. Sains Malaysiana, 39(1), 115-118.
- S. Sathasivam, (2009, August). Energy relaxation for Hopfield Network with the new learning rule. In AIP Conference Proceedings (Vol. 1159, No. 1, pp. 118-122). American Institute of Physics.
- S. Sathasivam, N. P. Fen, & M. Velavan, (2013, June). Boltzmann machine and reverse analysis method. In 2013 IEEE 7th International Power Engineering and Optimization Conference (PEOCO) (pp. 52-56). IEEE.
- M. Velavan, Z. R. bin Yahya, M. N. bin Abdul Halif, & S. Sathasivam, (2016). Mean field theory in doing logic programming using Hopfield network. Modern Applied Science, 10(1), 154.
- J. J. Hopfield, (1982). Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences USA, 79, 2554-2558. http://dx.doi.org/10.1073/pnas.79.8.2554
- Y. Ding, L. Dong, L. Wang, & G. Wu, (2010). A High Order Neural Network to Solve Crossbar Switch Problem. In: Wong, K.W. et al. (eds.) ICONIP 2010, Part II, LNCS 6444, Heidelberg: Springer, pp. 692–699.
- M. S. M. Kasihmuddin, M. A. Mansor, M. F. M. Basir, & S. Sathasivam, (2018). Discrete mutation Hopfield neural network in propositional satisfiability. Mathematics, 7, 1133.
- C. Peterson, & J. Anderson, (1988). A Mean Field Theory Learning Algorithm for Neural Networks. Complex Systems, 1, 995-1019.
- T. G. Pedersen, P. M. Johansen, N. C. R. Holme, P. S. Ramanujam, & S. Hvilsted, (1998). Mean-field theory of photoinduced formation of surface reliefs in side-chain azobenzene polymers. Physical review letters, 80(1), 89.
- G. Kotliar, S. Y. Savrasov, K. Haule, V. S. Oudovenko, O. Parcollet, & C. A. Marianetti, (2006). Electronic structure calculations with dynamical mean-field theory. Reviews of Modern Physics, 78(3), 865.
- H. Park, K. Haule, & G. Kotliar, (2008). Cluster dynamical mean field theory of the Mott transition. Physical review letters, 101(18), 186403.
- H. Yurtseven, & M. G. Şenol, (2015). Calculation of the T–P phase diagram for oxygen using the mean field theory. Calphad, 51, 272-281.
- L. F. Arsenault, O. A. von Lilienfeld, & A. J. Millis, (2015). Machine learning for many-body physics: efficient solution of dynamical mean-field theory. arXiv preprint arXiv:1506.08858.
- J. Javanainen, & J. Ruostekoski, (2016). Light propagation beyond the mean-field theory of standard optics. Optics express, 24(2), 993-1001.
- A. B. Rubenstein, M. A. Pethe, & S. D. Khare, (2017). MFPred: Rapid and accurate prediction of protein-peptide recognition multispecificity using self-consistent mean field theory. PLoS Computational Biology, 13(6), e1005614.
- M. S. M. Kasihmuddin, M. A. Mansor, & S. Sathasivam, (2017). Robust Artificial Bee Colony in the Hopfield Network for 2-Satisfiability Problem. Pertanika Journal of Science & Technology, 25(2).
- M. S. M. Kasihmuddin, M. A. Mansor, S. Alzaeemi, M. F. M. Basir, & S. Sathasivam, (2019). Quality Solution of Logic Programming in Hopfield Neural Network. In Journal of Physics: Conference Series (Vol. 1366, No. 1, p. 012094). IOP Publishing.
- M. Mansor, Z. Jamaludin, M. Kasihmuddin, S. Alzaeemi, F. Basir, & S. Sathasivam, (2020). Systematic Boolean Satisfiability Programming in Radial Basis Function Neural Network. Processes, 8(2), 214.
- M. S. M. Kasihmuddin, M. A. Mansor, & S. Sathasivam, (2018). Discrete Hopfield Neural Network in Restricted Maximum k-Satisfiability Logic Programming. Sains Malaysiana, 47(6), 1327-1335.
- M. S. M. Kasihmuddin, S. Sathasivam, & M. A. Mansor, (2017, August). Hybrid genetic algorithm in the Hopfield network for maximum 2-satisfiability problem. In AIP Conference Proceedings (Vol. 1870, No. 1, p. 050001). AIP Publishing LLC.
- L. C. Kho, M. S. M. Kasihmuddin, M. A. Mansor, & S. Sathasivam, (2019, December). 2 satisfiability logical rule by using ant colony optimization in Hopfield Neural Network. In AIP Conference Proceedings (Vol. 2184, No. 1, p. 060009). AIP Publishing LLC.
- S. Sathasivam, & M. Velavan, (2014). Boltzmann Machine and Hyperbolic Activation Function in Higher Order Network. Modern Applied Science, 8(3), 154-160.
- S. Sathasivam, & W. A. T. Abdullah, (2011). Logic mining in neural network: reverse analysis method. Computing, 91(2), 119-133.