Unpublished Paper
Notes on Support Vector Machines
(2012)
  • Matt Bogard, Western Kentucky University
Abstract
The most basic idea of support vector machines is to find a line (or hyperplane) that separates classes of data and to use that line to classify new examples. But we do not want just any separating line; we want the line that maximizes the distance between the classes. This line turns out to be a separating hyperplane that is equidistant from the supporting hyperplanes that ‘support’ the sets making up each distinct class. The notes that follow discuss the concepts of supporting and separating hyperplanes and inner products as they relate to support vector machines (SVMs). Using simple examples, much detail is given to the mathematical notation used to represent hyperplanes and to how SVM classification works.
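The classification rule sketched in the abstract can be illustrated with a short example: a point is assigned to one class or the other according to the sign of the inner product w·x plus an offset b, i.e., which side of the hyperplane w·x + b = 0 it falls on. This is a minimal sketch, not code from the paper; the weight vector w and offset b below are illustrative assumptions rather than values fitted by an SVM.

```python
# Sketch: classifying points with a fixed separating hyperplane w.x + b = 0.
# The hyperplane here (x1 + x2 - 3 = 0) is an assumed toy example,
# not one trained by a support vector machine.

def inner(u, v):
    """Inner (dot) product of two equal-length vectors."""
    return sum(ui * vi for ui, vi in zip(u, v))

def classify(w, b, x):
    """Return +1 or -1 depending on which side of the hyperplane x lies on."""
    return 1 if inner(w, x) + b >= 0 else -1

# Toy hyperplane x1 + x2 - 3 = 0 separating two classes of points
w, b = (1.0, 1.0), -3.0
print(classify(w, b, (2.5, 2.5)))  # -> 1, point lies above the line
print(classify(w, b, (0.5, 0.5)))  # -> -1, point lies below the line
```

In a trained SVM, w and b would be chosen so that this hyperplane maximizes the margin between the two classes; the classification step itself is exactly this sign computation.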
Keywords
  • support vector machines
  • data mining
  • machine learning
Publication Date
May 2012
Citation Information
Matt Bogard. "Notes on Support Vector Machines" (2012)
Available at: http://works.bepress.com/matt_bogard/20/