---
tags:
- ai
- classification
---
[Towards Data Science: SVM](https://towardsdatascience.com/support-vector-machines-svm-c9ef22815589)
[Towards Data Science: SVM an overview](https://towardsdatascience.com/https-medium-com-pupalerushikesh-svm-f4b42800e989)
- Dividing line between two classes
- The optimal hyperplane for a feature space
- i.e. the margin-maximising hyperplane
- Can be used for
    - [Classification](../Classification.md)
        - SVC
    - Regression
        - SVR
    - See the usage sketch below the figure
- Alternative to Eigenmodels for [supervised](../../Learning.md#Supervised) classification
- Best suited to smaller datasets
    - Hard to scale to larger datasets
![](../../../img/svm.png)
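
A minimal usage sketch of the SVC/SVR split mentioned above, assuming scikit-learn (the datasets and parameter values here are illustrative, not from the note):

```python
# Minimal sketch of SVC (classification) and SVR (regression) with scikit-learn
from sklearn.datasets import make_classification, make_regression
from sklearn.svm import SVC, SVR

# Classification: fit a support vector classifier and predict class labels
X_cls, y_cls = make_classification(n_samples=200, n_features=4, random_state=0)
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_cls, y_cls)
print(clf.predict(X_cls[:5]))

# Regression: fit a support vector regressor and predict continuous targets
X_reg, y_reg = make_regression(n_samples=200, n_features=4, random_state=0)
reg = SVR(kernel="rbf", C=1.0)
reg.fit(X_reg, y_reg)
print(reg.predict(X_reg[:5]))
```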
- Support vector points
    - The points closest to the hyperplane
    - The lines from these points to the hyperplane are the support vectors
- Maximise the margin between the classes
- To classify a test point, take its dot product with the vector perpendicular to the hyperplane
    - The sign determines the class (decision rule below)
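
As a sketch of that decision rule, writing $w$ for the normal vector and $b$ for the offset (symbols chosen here, not taken from the note):
$$f(x)=\operatorname{sign}(w\cdot x+b)$$
- $f(x)=+1$ puts the test point in one class, $f(x)=-1$ in the other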
# Pros
- Linear or non-linear discrimination
- Effective in higher dimensions
- Effective when the number of features exceeds the number of training examples
- Best when classes are separable
- Outliers have less impact
# Cons
- Long training time on larger datasets
- Doesn't perform well when classes overlap
- Selecting an appropriate kernel can be difficult
# Parameters
- C
    - Regularisation parameter: controls how smooth the decision boundary is
    - Larger C penalises misclassification more, giving a curvier (more complex) boundary
    - ![](../../../img/svm-c.png)
- Gamma
    - Controls each data point's area of influence
    - High gamma reduces the influence of faraway points, so only nearby points shape the boundary
- Both are shown in the sketch below
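
A small sketch of how C and gamma behave, assuming scikit-learn's RBF-kernel `SVC` (the dataset and values are illustrative):

```python
# Sketch: effect of C and gamma on an RBF-kernel SVC
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=300, noise=0.2, random_state=0)

# Larger C penalises misclassified training points more -> curvier boundary
# Larger gamma shrinks each point's area of influence -> only nearby points matter
for C in (0.1, 1.0, 100.0):
    for gamma in (0.1, 1.0, 10.0):
        clf = SVC(kernel="rbf", C=C, gamma=gamma).fit(X, y)
        print(f"C={C:<6} gamma={gamma:<5} train accuracy={clf.score(X, y):.2f}")
```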
# Hyperplane
$$\beta_0+\beta_1X_1+\beta_2X_2+\cdots+\beta_pX_p=0$$
- Defines a hyperplane in $p$-dimensional space
- If $X$ satisfies the equation
    - It lies on the plane
- Maximal margin hyperplane
    - Take the perpendicular distance from each observation to a given plane
    - The best plane is the one whose smallest such distance (the margin) is largest
- If the support vector points shift
    - The plane shifts
- The hyperplane depends only on the support vectors
    - The rest of the points don't matter (sketch below the figure)
![](../../../img/svm-optimal-plane.png)
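
A sketch of the "only the support vectors matter" point, assuming scikit-learn with a linear kernel (dataset and parameters are illustrative); if the classes are separable, refitting on the support vectors alone should recover essentially the same plane:

```python
# Sketch: the hyperplane beta_0 + beta_1*x_1 + ... + beta_p*x_p = 0
# depends only on the support vectors
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=100, centers=2, random_state=0)

clf = SVC(kernel="linear", C=1.0).fit(X, y)
print("beta_1..beta_p:", clf.coef_)           # hyperplane coefficients
print("beta_0:", clf.intercept_)              # hyperplane intercept
print("support vectors:", len(clf.support_))  # number of support vectors

# Refit using only the support vectors: the plane should barely change
sv = clf.support_  # indices of the support vectors
clf_sv = SVC(kernel="linear", C=1.0).fit(X[sv], y[sv])
print("refit beta_1..beta_p:", clf_sv.coef_)
print("refit beta_0:", clf_sv.intercept_)
```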
# Linearly Separable
- Sometimes the data is not linearly separable in its original space
![](../../../img/svm-non-linear.png)
- Add another dimension
    - $z=x^2+y^2$
    - The squared distance of the point from the origin
![](../../../img/svm-non-linear-project.png)
- Now separable
- Let $z=k$
    - $k$ is a constant
- Project the linear separator back to 2D
    - This gives a circle (sketch below the figure)
![](../../../img/svm-non-linear-separated.png)
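
A sketch of that lift-and-separate idea, assuming scikit-learn and numpy (the `make_circles` dataset stands in for the figures above):

```python
# Sketch: not linearly separable in 2D, separable after adding z = x^2 + y^2
import numpy as np
from sklearn.datasets import make_circles
from sklearn.svm import LinearSVC

X, y = make_circles(n_samples=300, factor=0.3, noise=0.05, random_state=0)

# Original 2D space: a linear SVM can't separate the inner blob from the ring
print("2D accuracy:", LinearSVC(C=1.0, max_iter=10000).fit(X, y).score(X, y))

# Add z = x^2 + y^2, the squared distance from the origin, as a third feature
z = (X ** 2).sum(axis=1).reshape(-1, 1)
X3 = np.hstack([X, z])
print("3D accuracy:", LinearSVC(C=1.0, max_iter=10000).fit(X3, y).score(X3, y))

# The separator is roughly z = k for a constant k; projecting back to 2D
# gives the circle x^2 + y^2 = k between the two classes
```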