Presentation
GA-Facilitated Classifier Optimization with Varying Similarity Measures
GECCO '05 Proceedings of the 2005 Conference on Genetic and Evolutionary Computation
  • Michael R. Peterson, Wright State University - Main Campus
  • Travis E. Doom, Wright State University - Main Campus
  • Michael L. Raymer, Wright State University - Main Campus
Document Type
Conference Proceeding
Publication Date
6-1-2005
Abstract

Genetic algorithms are powerful tools for k-nearest neighbors (knn) classification. Traditional knn classifiers employ Euclidean distance to assess neighbor similarity, though other measures may also be used. GAs can search for optimal linear feature weights to improve knn performance under both Euclidean distance and cosine similarity. GAs also optimize additive feature offsets, searching for an optimal point of reference from which to assess angular similarity under the cosine measure. This poster explores weight and offset optimization for knn with varying similarity measures, including Euclidean distance (weights only), cosine similarity, and Pearson correlation. The use of offset optimization here represents a novel technique for enhancing Pearson/knn classification performance. Experiments compare optimized and non-optimized classifiers on public domain datasets. While unoptimized Euclidean knn often outperforms its cosine and Pearson counterparts, optimized Pearson and cosine knn classifiers show equal or improved accuracy compared to weight-optimized Euclidean knn.
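The similarity measures described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the per-feature weight vector `w` and offset vector `o` stand in for the parameters a GA would evolve, and `knn_predict` is a hypothetical helper showing how a similarity function plugs into knn voting.

```python
import numpy as np

def weighted_euclidean(x, y, w):
    # GA-weighted Euclidean distance; lower means more similar.
    return np.sqrt(np.sum(w * (x - y) ** 2))

def cosine_similarity(x, y, o):
    # Cosine similarity measured from a GA-chosen reference point:
    # the additive offset vector o shifts the origin before the
    # angular comparison. Higher means more similar.
    xs, ys = x - o, y - o
    return np.dot(xs, ys) / (np.linalg.norm(xs) * np.linalg.norm(ys))

def pearson_similarity(x, y, o):
    # Pearson correlation of the offset-shifted vectors. Because the
    # offset is per-feature, it does not cancel under mean-centering,
    # so optimizing o can change the induced neighbor ranking.
    xs, ys = x - o, y - o
    xs, ys = xs - xs.mean(), ys - ys.mean()
    return np.dot(xs, ys) / (np.linalg.norm(xs) * np.linalg.norm(ys))

def knn_predict(query, X_train, y_train, k, sim):
    # Rank training points by similarity (higher = closer) and
    # return the majority label among the top k neighbors.
    scores = np.array([sim(query, x) for x in X_train])
    top = np.argsort(scores)[-k:]
    labels, counts = np.unique(y_train[top], return_counts=True)
    return labels[np.argmax(counts)]
```

A GA fitness function would then score a candidate (w, o) chromosome by the classification accuracy such a knn achieves on held-out data; distance measures can be used with `knn_predict` by negating them into similarities.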

Comments

This paper was presented at GECCO '05, the Genetic and Evolutionary Computation Conference, Washington, DC, June 25-29, 2005.

Citation Information
Michael R. Peterson, Travis E. Doom, and Michael L. Raymer. "GA-Facilitated Classifier Optimization with Varying Similarity Measures." GECCO '05: Proceedings of the 2005 Conference on Genetic and Evolutionary Computation (2005), pp. 1549-1550. ISBN: 1-59593-010-8
Available at: http://works.bepress.com/travis_doom/20/