Forecasting of Campus Placement for Students Using Ensemble Voting Classifier

Shawni Dutta
Samir Kumar Bandyopadhyay

Abstract

Campus placement is a measure of students’ performance in a course. This paper proposes a forecasting method for predicting the likely campus placement outcome of students at an institution. Data mining and knowledge discovery processes are applied to students’ academic records, and classifiers based on supervised machine learning techniques are used for this task. An ensemble voting classifier combines the best-performing classifier models and thereby achieves better results than the individual classifiers. Experimental results indicate an accuracy of 86.05% for the ensemble-based approach, which is significantly better than that of the other classifiers.
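
As a rough illustration of the approach summarised above, the sketch below builds a hard-voting ensemble with scikit-learn from several of the classifier families cited in the references (decision tree, k-nearest neighbours, naïve Bayes, random forest). The CSV file name and column names (sl_no, status, salary) are assumptions based on the public Kaggle campus recruitment dataset listed in the references; they are not the authors' exact preprocessing pipeline or model configuration.

# Minimal sketch of an ensemble voting classifier for placement prediction.
# File and column names below are assumptions from the public Kaggle
# campus recruitment dataset, not taken from the paper itself.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

df = pd.read_csv("Placement_Data_Full_Class.csv")  # assumed Kaggle file name
# Target: 1 if the student was placed, 0 otherwise; drop the id column and
# the post-outcome salary column, one-hot encode the categorical features.
y = (df["status"] == "Placed").astype(int)
X = pd.get_dummies(df.drop(columns=["sl_no", "status", "salary"]))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

# Hard voting: each base learner predicts a label and the majority wins.
ensemble = VotingClassifier(
    estimators=[
        ("dt", DecisionTreeClassifier(random_state=42)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("nb", GaussianNB()),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=42)),
    ],
    voting="hard",
)
ensemble.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))

Switching to voting="soft" averages the base learners' predicted class probabilities instead of taking a majority vote, which is one common way of weighting the stronger classifier models within the ensemble.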

Keywords:
Campus placement prediction, ensemble voting classifier, automated tool, higher education system, machine learning.

Article Details

How to Cite
Dutta, S., & Bandyopadhyay, S. K. (2020). Forecasting of Campus Placement for Students Using Ensemble Voting Classifier. Asian Journal of Research in Computer Science, 5(4), 1-12. https://doi.org/10.9734/ajrcos/2020/v5i430138
Section
Original Research Article

References

Manvitha P, Swaroopa N. Campus placement prediction using supervised machine learning techniques. Int J Appl Eng Res. 2019;14(9):2188–2191.

Harinath S, Prasad A, SHS, Mathew T. Student placement prediction using machine learning. Int Res J Eng Technol. 2019;6(4):4577–4579.

Bhatt H, Mehta S, D’mello L. Use of ID3 decision tree algorithm for placement prediction. Int J Comput Sci Inf Technol. 2015;6(5):4785–4789.

Sreenivasa Rao K, Swapna N, Praveen Kumar P. Educational data mining for student placement prediction using machine learning algorithms. Int J Eng Technol. 2017;7(1.2):43-46.
DOI: 10.14419/ijet.v7i1.2.8988

Jeevalatha T, Ananthi N, Kumar DS. Performance analysis of undergraduate students placement selection using decision tree algorithms. Int J Comput Appl. 2014;108(15):27–31.
DOI: 10.5120/18988-0436

Giri A, Bhagavath MVV, Pruthvi B, Dubey N. A placement prediction system using k-nearest neighbors classifier. Proc 2016 2nd Int Conf on Cognitive Computing and Information Processing (CCIP). 2016;3–6.
DOI: 10.1109/CCIP.2016.7802883

Kabakchieva D. Predicting student performance by using data mining methods for classification. Cybernetics and Information Technologies. 2013;13(1):61–72.
DOI: 10.2478/cait-2013-0006

Jantawan B, Tsai C. The application of data mining to build classification model for predicting graduate employment. International Journal of Computer Science and Information Security. 2013;11(10).

Ben Roshan D. Campus Recruitment, Version 1; 2020.
[Retrieved April 21, 2020]
Available: https://www.kaggle.com/benroshan/factors-affecting-campus-placement

Murtagh F. Multilayer perceptrons for classification and regression. Neurocomputing. 1991;2(5–6):183–197.
DOI: 10.1016/0925-2312(91)90023-5

Rish I. An empirical study of the naïve Bayes classifier. In: IJCAI Workshop on Empirical Methods in AI. 2001;41–46.

Walters DE. Bayes’s Theorem and the Analysis of Binomial Random Variables. Biometrical J. 1988;30(7):817–825.
DOI: 10.1002/bimj.4710300710

Sharma H, Kumar S. A survey on decision tree algorithms of classification in data mining. Int J Sci Res. 2016;5(4):2094–2097.
DOI: 10.21275/v5i4.nov162954

Cunningham P, Delany SJ. K-Nearest Neighbour Classifiers. Mult Classif Syst. 2007;1–17.
DOI: 10.1016/S0031-3203(00)00099-6

Wilbur WJ, Kim W. Stochastic gradient descent and the prediction of MeSH for PubMed records. AMIA Annu Symp Proc. 2014;2014:1198–1207.

Breiman L. Random forests. Mach Learn. 2001;45(1):5–32.
DOI: 10.1023/A:1010933404324

Biggio B, Corona I, Fumera G, Giacinto G, Roli F. Bagging classifiers for fighting poisoning attacks in adversarial classification tasks. Lect Notes Comput Sci (including Subser Lect Notes Artif Intell Lect Notes Bioinformatics). 2011;6713:350–359.
DOI: 10.1007/978-3-642-21557-5_37.

Friedman J, Hastie T, Tibshirani R. Additive logistic regression: A statistical view of boosting. Ann Stat. 2000; 28(2):337–407.
DOI: 10.1214/aos/1016218223

Geurts P, Ernst D, Wehenkel L. Extremely randomized trees. Mach Learn. 2006; 63(1):3–42.
DOI: 10.1007/s10994-006-6226-1

Natekin A, Knoll A. Gradient boosting machines, a tutorial. Front Neurorobot. 2013;7.
DOI: 10.3389/fnbot.2013.00021

ASL, Senthil NC. Heavy rainfall prediction using Gini index in decision tree. International Journal of Recent Technology and Engineering (IJRTE). 2019;8(4):4558–4562.
DOI: 10.35940/ijrte.D8503.118419

Hossin M, Sulaiman MN. A review on evaluation metrics for data classification evaluations. Int J Data Min Knowl Manag Process. 2015;5(2):1–11.
DOI: 10.5121/ijdkp.2015.5201

Vieira SM, Kaymak U, Sousa JMC. Cohen’s kappa coefficient as a performance measure for feature selection. In: 2010 IEEE World Congress on Computational Intelligence (WCCI); 2010.
DOI: 10.1109/FUZZY.2010.5584447

Saha S, Ekbal A. Combining multiple classifiers using vote based classifier ensemble technique for named entity recognition. Data Knowl Eng. 2013;85:15–39.
DOI: 10.1016/j.datak.2012.06.003