
Dr. Amrita
Papers by Dr. Amrita
method. In practice, it has been observed that the high dimensionality of the feature vector degrades classification performance. To reduce dimensionality without compromising performance, a new hybrid feature selection method is introduced, and its performance is measured on the KDD Cup'99 dataset with the Naïve Bayes and C4.5 classifiers. Three sets of experiments were conducted: with the full feature set, with reduced feature sets obtained by four well-known feature selection methods, namely Correlation-based Feature Selection (CFS), Consistency-based Feature Selection (CON), Information Gain (IG), and Gain Ratio (GR), and with the proposed method, on the same dataset and classifiers. In the first experiment, Naïve Bayes and C4.5 yielded classification accuracies of 97.5% and 99.8%, respectively. In the second set of experiments, the best accuracies of these classifiers, 99.1% and 99.8%, were obtained with IG. In the third experiment, 6 features were selected by the proposed method, and the corresponding accuracies were 99.4% and 99.9%. The proposed hybrid feature selection method outperformed the earlier mentioned methods on various metrics.
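
The abstract's evaluation pipeline (feature selection followed by classification on KDD Cup'99) can be illustrated with a minimal sketch in scikit-learn. This is not the paper's implementation: the file path is hypothetical, Information Gain is approximated by mutual information, and DecisionTreeClassifier stands in for C4.5, which scikit-learn does not provide.

```python
# Minimal sketch of the evaluation pipeline described above, using scikit-learn.
# Assumptions (not from the paper): a CSV copy of KDD Cup'99 with numeric features
# and the class label in the last column; Information Gain approximated by mutual
# information; DecisionTreeClassifier as a stand-in for C4.5.
import pandas as pd
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Load a preprocessed (numeric) copy of the dataset -- the path is hypothetical.
data = pd.read_csv("kddcup99_numeric.csv")
X, y = data.iloc[:, :-1], data.iloc[:, -1]

# Rank features by an Information-Gain-like score and keep the top k
# (k=6 here, mirroring the 6 features reported for the proposed hybrid method).
selector = SelectKBest(score_func=mutual_info_classif, k=6)
X_reduced = selector.fit_transform(X, y)

# Compare classification accuracy with the full and the reduced feature sets.
for name, clf in [("Naive Bayes", GaussianNB()),
                  ("Decision tree (C4.5 stand-in)", DecisionTreeClassifier())]:
    full = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    reduced = cross_val_score(clf, X_reduced, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: full-feature accuracy={full:.3f}, reduced accuracy={reduced:.3f}")
```

The same loop generalizes to the other selectors compared in the paper by swapping the scoring function or selector object; the accuracies printed here would depend on the preprocessing and are not expected to reproduce the reported figures exactly.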