stem/AI/Classification/Supervised/SVM.md
Andy Pack efa7a84a8b vault backup: 2023-12-27 21:56:22
tags: ai, classification
Towards Data Science: SVM, an overview

  • Dividing line between two classes
    • Optimal hyperplane for the space
    • Margin-maximising hyperplane
  • Can be used as an alternative to Eigenmodels for supervised classification
  • Best for smaller datasets
    • Hard to scale to larger sets

  • Support vector points
    • The data points closest to the hyperplane
    • These closest points are the support vectors
  • Maximise the margin between the classes

  • Classify a test point by taking its dot product with the weight vector perpendicular to the hyperplane

  • The sign of the result determines the class
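The decision rule above can be sketched in a few lines; the weight vector `w` and bias `b` here are hypothetical stand-ins for parameters that would come from training:

```python
import numpy as np

# Hypothetical trained parameters: w is perpendicular to the hyperplane,
# b is the bias term. Real values would come from fitting an SVM.
w = np.array([2.0, -1.0])
b = -0.5

def classify(x):
    """Sign of w.x + b determines which side of the hyperplane x is on."""
    return 1 if np.dot(w, x) + b >= 0 else -1

print(classify(np.array([3.0, 1.0])))   # 2*3 - 1 - 0.5 = 4.5, positive side
print(classify(np.array([0.0, 2.0])))   # -2 - 0.5 = -2.5, negative side
```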

Pros

  • Linear or non-linear discrimination
  • Effective in higher dimensions
  • Effective when the number of features exceeds the number of training examples
  • Best when the classes are well separated
  • Outliers have less impact

Cons

  • Training takes a long time on larger datasets
  • Doesn't perform well when classes overlap
  • Selecting an appropriate kernel can be difficult

Parameters

  • C
    • Controls how smooth the decision boundary is (regularisation)
    • Larger C gives a more complex, curvier boundary
  • Gamma
    • Controls each data point's area of influence
    • High gamma reduces the influence of faraway points
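Gamma's effect is visible directly in the RBF kernel, $k(x, x') = \exp(-\gamma\|x-x'\|^2)$, which is the kernel gamma usually parameterises. A minimal sketch (the points and gamma values are illustrative):

```python
import numpy as np

def rbf_kernel(x, x_prime, gamma):
    """RBF kernel: similarity decays with squared distance, scaled by gamma."""
    return np.exp(-gamma * np.sum((x - x_prime) ** 2))

a = np.array([0.0, 0.0])
far = np.array([3.0, 0.0])    # squared distance 9 from a

# Low gamma: the faraway point still has noticeable influence
print(rbf_kernel(a, far, gamma=0.1))   # exp(-0.9) ≈ 0.41
# High gamma: the faraway point's influence is essentially zero
print(rbf_kernel(a, far, gamma=10.0))  # exp(-90) ≈ 0
```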

Hyperplane

$$\beta_0+\beta_1X_1+\beta_2X_2+\cdots+\beta_pX_p=0$$
  • $p$-dimensional space
  • If $X$ satisfies the equation
    • It lies on the plane
  • Maximal margin hyperplane
    • Take the perpendicular distance from each observation to a candidate plane
    • The best plane maximises the smallest such distance
  • If support vector points shift
    • Plane shifts
    • Hyperplane only depends on the support vectors
      • Rest don't matter
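The perpendicular distance used above is $|\beta_0 + \beta^\top x| \,/\, \|\beta\|$. A minimal sketch with hypothetical coefficients:

```python
import numpy as np

# Hypothetical hyperplane coefficients: beta0 + beta.x = 0
beta0 = -1.0
beta = np.array([3.0, 4.0])

def distance_to_hyperplane(x):
    """Perpendicular distance |beta0 + beta.x| / ||beta||."""
    return abs(beta0 + np.dot(beta, x)) / np.linalg.norm(beta)

# Point (2, 1): |-1 + 6 + 4| / 5 = 1.8
print(distance_to_hyperplane(np.array([2.0, 1.0])))
```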

Linearly Separable

  • Some data (e.g. a cluster surrounded by a ring) is not linearly separable in 2D
  • Add another dimension
    • $z=x^2+y^2$
    • The square of each point's distance from the origin
  • The classes are now separable by a plane
  • Let $z=k$
    • $k$ is a constant
  • Project the linear separator back to 2D
    • It becomes the circle $x^2+y^2=k$
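The lifting trick above can be sketched numerically; the two radii (0.5 and 2) are illustrative stand-ins for an inner cluster and a surrounding ring:

```python
import numpy as np

# Two hypothetical classes on circles of radius 0.5 and 2.0:
# not linearly separable in the 2D (x, y) plane.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 50)
inner = np.column_stack([0.5 * np.cos(theta), 0.5 * np.sin(theta)])
outer = np.column_stack([2.0 * np.cos(theta), 2.0 * np.sin(theta)])

# Lift to a third dimension: z = x^2 + y^2 (squared distance from origin)
z_inner = (inner ** 2).sum(axis=1)   # all = 0.25
z_outer = (outer ** 2).sum(axis=1)   # all = 4.0

# Any plane z = k with 0.25 < k < 4 separates the classes; projected
# back to 2D it is the circle x^2 + y^2 = k.
k = 1.0
print(bool(np.all(z_inner < k) and np.all(z_outer > k)))  # True
```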