A New Physics-Inspired Discriminative Classifier

Document Type: Research Article

Authors

1 Department of Electrical and Computer Engineering, University of Neyshabur, Neyshabur, Iran

2 Fiber Optics Group, Institute of Science and High Technology and Environmental Sciences, Kerman, Iran

3 Department of Computer and Information Technology, Institute of Science and High Technology and Environmental Sciences, Kerman, Iran

4 Faculty of Engineering, University of Zabol, Zabol, Iran

Abstract

Concepts and laws of physics have long been a valuable source of inspiration for engineers seeking to overcome practical challenges. Classification is an important example of such a problem, playing a major role in many fields of engineering science. It has been shown that discriminative classifiers tend to outperform their generative counterparts, especially when sufficient labeled training data are available. In this paper, we present a new physics-inspired discriminative classification method based on minimum-potential lines. We first consider two groups of fixed point charges (representing the two classes of data) and a movable classifier line between them. We then find a stable position for the classifier line by minimizing the total potential integral along the line due to the two groups of point charges. Surprisingly, the resulting classifier turns out to be an uncertainty-based classifier that minimizes the total uncertainty along the classifier line. Experimental results demonstrate the effectiveness of the proposed approach.
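To make the construction concrete, the following is a minimal numerical sketch of the idea in two dimensions. It assumes the two classes carry opposite unit charges, parameterizes the candidate classifier line by an anchor point and an angle, and minimizes the integral of the squared Coulomb potential along a finite segment. All of these choices, including the function names, the synthetic data, and the squared-potential objective, are illustrative assumptions and not the paper's exact formulation:

```python
import numpy as np
from scipy.optimize import minimize

def potential(points, charges, locs):
    # Coulomb-style potential at each query point (physical constants
    # dropped): V(r) = sum_i q_i / |r - r_i|
    d = np.linalg.norm(points[:, None, :] - locs[None, :, :], axis=2)
    return (charges[None, :] / np.maximum(d, 1e-9)).sum(axis=1)

def line_objective(params, charges, locs, n_samples=200, half_len=5.0):
    # Parameterize the classifier line by an anchor point (x0, y0) and
    # an angle theta; sample points uniformly along a finite segment.
    x0, y0, theta = params
    t = np.linspace(-half_len, half_len, n_samples)
    pts = np.stack([x0 + t * np.cos(theta), y0 + t * np.sin(theta)], axis=1)
    v = potential(pts, charges, locs)
    # Approximate the line integral of V^2 by the mean squared potential
    # times the segment length (assumption: squaring keeps the objective
    # bounded and drives the line toward the zero-potential surface
    # lying between the two oppositely charged groups).
    return (v**2).mean() * (2 * half_len)

# Two synthetic classes, modeled as fixed point charges of opposite sign.
rng = np.random.default_rng(0)
class_a = rng.normal([-2.0, 0.0], 0.5, size=(20, 2))  # charges +1
class_b = rng.normal([+2.0, 0.0], 0.5, size=(20, 2))  # charges -1
locs = np.vstack([class_a, class_b])
charges = np.concatenate([np.ones(len(class_a)), -np.ones(len(class_b))])

# Find a stable (minimum-potential) position for the classifier line.
res = minimize(line_objective, x0=[0.5, 0.3, np.pi / 2],
               args=(charges, locs), method="Nelder-Mead")
x0, y0, theta = res.x
print("line anchor:", (x0, y0), "angle:", theta)
```

With opposite charges on the two classes, this squared-potential objective pulls the segment toward the zero-potential equipotential separating the two charge groups, which matches the abstract's intuition of a "stable" line between the classes; a new point could then be labeled by the side of the fitted line it falls on, though the paper's actual decision rule may differ.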
