Determining Contribution of Features in Clustering Multidimensional Data Using Neural Network
Authors: Suneetha Chittineni, Raveendra Babu Bhogapathi
Journal: International Journal of Information Technology and Computer Science (IJITCS)
Issue: No. 10, Vol. 4, 2012.
Free access
Feature contribution identifies which features participate most in grouping data patterns and thereby maximize the system's ability to classify object instances. In this paper, a modified K-means fast learning artificial neural network (K-FLANN) is used to cluster multidimensional data. The operation of the network depends on two parameters: tolerance (δ) and vigilance (ρ). By setting the vigilance parameter, it is possible to extract significant attributes from an array of input attributes and thus determine the principal features that contribute to a particular output. Exhaustive search and heuristic search techniques are applied to determine the features that contribute to clustering the data. Experiments are conducted to assess the network's ability to extract important factors from the presented test data, and the two search methods are compared.
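The sketch below illustrates how such a network can operate, assuming the matching rule common in the K-FLANN literature: a pattern may join a cluster only when the fraction of features lying within the per-feature tolerance δ meets or exceeds the vigilance ρ; otherwise it seeds a new cluster, and centroids are then refined K-means style. Function and parameter names are hypothetical illustrations, not the authors' code.

```python
import numpy as np
from itertools import combinations

def kflann_cluster(X, delta, rho, max_iter=50):
    """K-FLANN-style clustering sketch: vigilance-gated assignment
    followed by a K-means centroid update, repeated until stable."""
    X = np.asarray(X, dtype=float)
    delta = np.asarray(delta, dtype=float)
    centroids = [X[0].copy()]            # first pattern seeds the first cluster
    labels = np.zeros(len(X), dtype=int)
    for _ in range(max_iter):
        changed = False
        for i, x in enumerate(X):
            # Vigilance test: fraction of features whose squared deviation
            # from the centroid is within the squared tolerance.
            passing = [k for k, c in enumerate(centroids)
                       if np.mean((delta ** 2 - (c - x) ** 2) >= 0) >= rho]
            if passing:
                # Among qualifying clusters, take the nearest centroid.
                k = min(passing, key=lambda j: np.sum((centroids[j] - x) ** 2))
            else:
                centroids.append(x.copy())   # no cluster qualifies: open a new one
                k = len(centroids) - 1
            if labels[i] != k:
                labels[i], changed = k, True
        # K-means step: move each centroid to the mean of its members.
        for k in range(len(centroids)):
            members = X[labels == k]
            if len(members):
                centroids[k] = members.mean(axis=0)
        if not changed:
            break
    return labels, np.vstack(centroids)
```

Given such a clusterer, the exhaustive search mentioned in the abstract can be sketched as scoring every non-empty feature subset; a heuristic search would instead prune this space, for example by greedy forward selection. Again a hedged illustration, with `score` standing in for any external clustering-quality measure (e.g. scikit-learn's `adjusted_rand_score`):

```python
def exhaustive_feature_search(X, y, delta, rho, score):
    """Evaluate every non-empty feature subset: cluster on the subset
    alone and keep the subset whose grouping scores best against y."""
    X = np.asarray(X, dtype=float)
    delta = np.asarray(delta, dtype=float)
    n_features = X.shape[1]
    best_subset, best_score = None, -np.inf
    for r in range(1, n_features + 1):
        for subset in combinations(range(n_features), r):
            cols = list(subset)
            labels, _ = kflann_cluster(X[:, cols], delta[cols], rho)
            s = score(y, labels)
            if s > best_score:
                best_subset, best_score = subset, s
    return best_subset, best_score
```

As an illustrative call (values chosen arbitrarily, not taken from the paper), one might run `exhaustive_feature_search(X, y, delta=0.3 * X.std(axis=0), rho=0.8, score=adjusted_rand_score)` and read off the returned subset as the features that contribute most to the clustering.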
Keywords: Clustering, Feature Selection, Heuristic Search, Fast Learning Artificial Neural Network
Short address: https://sciup.org/15011767
IDR: 15011767