Campus placement is an important indicator of students' performance in an academic programme. This paper proposes a forecasting method to predict the likely campus placement outcomes of an institution's students. Data mining and knowledge discovery techniques are applied to students' academic records, and supervised machine learning classifiers are trained on the resulting data. An ensemble-based voting classifier combines the best-performing classifier models to achieve better results than any single classifier. Experimental results show that the ensemble approach attains 86.05% accuracy, significantly better than the individual classifiers.
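The voting-ensemble idea described above can be sketched as follows. This is an illustrative sketch only, not the paper's exact pipeline: the base classifiers, hyperparameters, and synthetic stand-in data are assumptions, since the abstract does not specify them.

```python
# Hedged sketch of a hard-voting ensemble over supervised classifiers,
# in the spirit of the approach the abstract describes. Synthetic data
# stands in for real student academic records (an assumption).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for student records (e.g. marks, aptitude scores);
# the binary target plays the role of placed / not placed.
X, y = make_classification(n_samples=400, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

# Hard voting: the ensemble predicts the majority label among its members.
ensemble = VotingClassifier(
    estimators=[
        ("nb", GaussianNB()),
        ("dt", DecisionTreeClassifier(random_state=42)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=42)),
        ("lr", LogisticRegression(max_iter=1000)),
    ],
    voting="hard",
)
ensemble.fit(X_train, y_train)
accuracy = ensemble.score(X_test, y_test)
print(f"Ensemble test accuracy: {accuracy:.2%}")
```

In practice the constituent models would be selected by comparing their individual validation scores, keeping only the strongest in the ensemble.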