The current development of logic satisfiability in discrete Hopfield neural networks (DHNN) has been segregated into systematic and non-systematic logic. Most research aims to improve non-systematic logical rules in various ways, such as introducing a ratio of negative literals or a flexible hybrid logical structure that combines systematic and non-systematic structures. However, existing non-systematic logical rules exhibit a drawback concerning the impact of negative literals within the logical structure. Therefore, this paper presents a novel class of non-systematic logic called conditional random k satisfiability for k = 1, 2, which intentionally disregards both positive literals in second-order clauses. The proposed logic is embedded into the discrete Hopfield neural network with the ultimate goal of minimizing the cost function. Moreover, a novel non-monotonic Smish activation function is introduced with the aim of enhancing the quality of the final neuronal state. The performance of the proposed logic with the new activation function was compared with other state-of-the-art logical rules in conjunction with five different types of activation functions. Based on the findings, the proposed logic obtained a lower learning error, the highest total neuron variation (TV = 857), and the lowest average Jaccard index (JSI = 0.5802). In addition, the Smish activation function demonstrates its capability in the DHNN based on the resulting ratio of improvement in Zm and TV. The ratio of improvement for Smish is consistently the highest across all activation functions, showing that Smish outperforms the other activation functions in terms of Zm and TV. This new logical rule, combined with the non-monotonic Smish activation function, presents an alternative strategy for the logic mining technique. These findings will be of particular interest to researchers in artificial neural networks, logic satisfiability in DHNN, and activation functions.
Citation: Nurshazneem Roslan, Saratha Sathasivam, Farah Liyana Azizan. Conditional random k satisfiability modeling for k = 1, 2 (CRAN2SAT) with non-monotonic Smish activation function in discrete Hopfield neural network[J]. AIMS Mathematics, 2024, 9(2): 3911-3956. doi: 10.3934/math.2024193
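For readers unfamiliar with the components named in the abstract, the following is a minimal, illustrative Python sketch rather than the authors' implementation. It assumes the commonly reported Smish form, Smish(x) = x · tanh(ln(1 + sigmoid(x))), a simple bipolar sign rule for the final neuron state, and a set-based reading of the Jaccard index over neurons in the +1 state; the helper names smish, dhnn_update, and jaccard_index are hypothetical and stand in for whatever the paper's own routines do.

```python
import numpy as np

def smish(x):
    """Smish activation: x * tanh(ln(1 + sigmoid(x))) -- the commonly reported form."""
    sigmoid = 1.0 / (1.0 + np.exp(-x))
    return x * np.tanh(np.log1p(sigmoid))

def dhnn_update(weights, states, bias):
    """One synchronous DHNN retrieval step (illustrative only): compute the local
    field, pass it through Smish, then threshold to a bipolar state in {-1, +1}."""
    local_field = weights @ states + bias      # h_i = sum_j W_ij * S_j + b_i
    return np.where(smish(local_field) >= 0.0, 1, -1)

def jaccard_index(state_a, state_b):
    """Jaccard index between two bipolar states, read here as the overlap of the
    index sets of neurons holding +1 (one plausible interpretation of JSI)."""
    a = set(np.flatnonzero(state_a == 1))
    b = set(np.flatnonzero(state_b == 1))
    return len(a & b) / len(a | b) if (a | b) else 1.0

# Toy usage: five neurons with a symmetric, zero-diagonal weight matrix standing in
# for synaptic weights that a SAT-derived cost function would normally supply.
rng = np.random.default_rng(0)
W = rng.standard_normal((5, 5))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)
S = rng.choice([-1, 1], size=5)
S_next = dhnn_update(W, S, bias=np.zeros(5))
print(S_next, jaccard_index(S, S_next))
```

In a full CRAN2SAT pipeline, the weight matrix would instead be derived from the logical cost function; the random matrix here only serves to make the sketch runnable end to end.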