Study of Parametric Performance Evaluation of Machine Learning and Statistical Classifiers

Author: Yugal Kumar, G. Sahoo

Journal: International Journal of Information Technology and Computer Science (IJITCS)

Article in issue: No. 6, Vol. 5, 2013.

Free access

Most researchers and scientists currently face a data explosion problem: vast amounts of data are generated in science, industry, business, surveys and many other areas. The main task is to prune these data and extract the valuable information that can be used for decision making, and data mining provides the answer to this question. Data mining remains a popular research topic, and much work in the field is still unexplored. A large number of data mining tools are available for extracting valuable information from datasets and drawing new conclusions from the mined information. These tools use different types of classifiers to classify the data, and many researchers have applied different tools and classifiers to obtain the desired results. In this paper, three types of classifiers, i.e. Bayes, Neural Network and Tree, are applied to two datasets, and their performance is analyzed in terms of Mean Absolute Error, Root Mean Squared Error, Time Taken, Correctly Classified Instances, Incorrectly Classified Instances and the Kappa Statistic.
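To illustrate how such a parametric evaluation is typically carried out, the sketch below uses the Weka toolkit, which provides the Bayes Net, Naive Bayes and J48 classifiers named in the keywords, and computes the listed metrics by 10-fold cross-validation. This is a minimal sketch, not the authors' actual experimental code: the dataset file name "dataset.arff", the choice of a multilayer perceptron for the Neural Network family, and the particular classifier classes are assumptions made for illustration.

  // Hedged sketch: evaluating several Weka classifiers on one ARFF dataset.
  // The file name "dataset.arff" and the classifier selection are illustrative assumptions.
  import java.util.Random;
  import weka.classifiers.Classifier;
  import weka.classifiers.Evaluation;
  import weka.classifiers.bayes.BayesNet;
  import weka.classifiers.bayes.NaiveBayes;
  import weka.classifiers.functions.MultilayerPerceptron;
  import weka.classifiers.trees.J48;
  import weka.core.Instances;
  import weka.core.converters.ConverterUtils.DataSource;

  public class ClassifierComparison {
      public static void main(String[] args) throws Exception {
          // Load the dataset and mark the last attribute as the class label.
          Instances data = DataSource.read("dataset.arff");
          data.setClassIndex(data.numAttributes() - 1);

          Classifier[] classifiers = { new BayesNet(), new NaiveBayes(),
                                       new MultilayerPerceptron(), new J48() };

          for (Classifier cls : classifiers) {
              long start = System.currentTimeMillis();
              Evaluation eval = new Evaluation(data);
              // 10-fold cross-validation with a fixed seed for repeatability.
              eval.crossValidateModel(cls, data, 10, new Random(1));
              long elapsed = System.currentTimeMillis() - start;

              System.out.println(cls.getClass().getSimpleName());
              System.out.println("  Correctly classified instances:   " + eval.correct());
              System.out.println("  Incorrectly classified instances: " + eval.incorrect());
              System.out.println("  Kappa statistic:                  " + eval.kappa());
              System.out.println("  Mean absolute error:              " + eval.meanAbsoluteError());
              System.out.println("  Root mean squared error:          " + eval.rootMeanSquaredError());
              System.out.println("  Time taken (ms):                  " + elapsed);
          }
      }
  }

In the abstract's terms, eval.correct() and eval.incorrect() give the correctly and incorrectly classified instances, while eval.kappa(), eval.meanAbsoluteError() and eval.rootMeanSquaredError() report the remaining parameters; the Tree and Neural Network classifier families are represented here by J48 and a multilayer perceptron respectively.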


Bayes Net, J48, Mean Absolute Error, Naive Bayes, Root Mean-Squared Error

Short address: https://sciup.org/15011914

IDR: 15011914

Scientific article