Abstract
In the last decade, twin support vector machine (TWSVM) classifiers have received considerable attention for pattern classification tasks. However, the TWSVM formulation still suffers from two shortcomings: (1) TWSVM requires inverse matrix calculations in its Wolfe-dual problems, which is intractable for large-scale datasets with numerous features and samples, and (2) TWSVM minimizes the empirical risk instead of the structural risk in its formulation. With the huge amounts of data available today, these disadvantages render TWSVM an ineffective choice for pattern classification tasks. In this article, we propose an efficient large-scale least squares twin support vector machine (LS-LSTSVM) for pattern classification that rectifies both of these shortcomings. The proposed LS-LSTSVM introduces different Lagrangian functions to eliminate the need for calculating inverse matrices. For the non-linear case, the proposed LS-LSTSVM does not employ kernel-generated surfaces and instead uses the kernel trick directly, which makes it superior to the original TWSVM and LSTSVM. Lastly, LS-LSTSVM minimizes the structural risk, in keeping with statistical learning theory, and this change can consequently improve classification accuracy. The proposed LS-LSTSVM is solved using the sequential minimal optimization (SMO) technique, making it well suited to large-scale problems, and we further prove its convergence. Exhaustive experiments on several real-world benchmarks and NDC-based large-scale datasets demonstrate that the proposed LS-LSTSVM is feasible for large datasets and, in most cases, performs better than existing algorithms.
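The paper's SMO-based LS-LSTSVM solver is not reproduced here, but the central motivation stated above (solving a least-squares dual one coordinate at a time so that no matrix inverse is ever formed explicitly) can be sketched on a simplified, bias-free least-squares SVM dual. The following is an illustrative sketch under our own assumptions, not the authors' algorithm: an RBF kernel, Gauss-Seidel-style coordinate updates, and toy hyperparameters, with all names chosen here for illustration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_coordinate_descent(X, y, C=10.0, gamma=1.0, tol=1e-6, max_iter=1000):
    """Solve the bias-free LS-SVM dual system (K + I/C) alpha = y
    one coordinate at a time (Gauss-Seidel), so no explicit matrix
    inverse is ever formed -- the same motivation as SMO-style solvers."""
    n = len(y)
    A = rbf_kernel(X, X, gamma) + np.eye(n) / C  # SPD system matrix
    alpha = np.zeros(n)
    for _ in range(max_iter):
        max_delta = 0.0
        for i in range(n):
            # Optimal alpha_i with all other coordinates held fixed.
            residual = y[i] - A[i] @ alpha + A[i, i] * alpha[i]
            new_ai = residual / A[i, i]
            max_delta = max(max_delta, abs(new_ai - alpha[i]))
            alpha[i] = new_ai
        if max_delta < tol:  # stop once no coordinate moves appreciably
            break
    return alpha

def predict(X_train, alpha, X_test, gamma=1.0):
    # Kernel expansion of the decision function, sign gives the class.
    return np.sign(rbf_kernel(X_test, X_train, gamma) @ alpha)
```

Because the system matrix is symmetric positive definite, these coordinate sweeps are guaranteed to converge; the full paper additionally handles the twin (two-hyperplane) structure and proves convergence of its SMO scheme.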