[Towards Data Science: SVM](https://towardsdatascience.com/support-vector-machines-svm-c9ef22815589)
[Towards Data Science: SVM an overview](https://towardsdatascience.com/https-medium-com-pupalerushikesh-svm-f4b42800e989)
- Dividing line between two classes
- Optimal hyperplane for a space
- Margin maximising hyperplane
- Can be used for
- Classification
- SVC
- Regression
- SVR
- Alternative to Eigenmodels for supervised classification
- For smaller datasets
	- Hard to scale to larger datasets
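A minimal sketch of both uses, assuming scikit-learn is available — `SVC` for classification and `SVR` for regression (the toy data here is illustrative only):

```python
# Sketch: SVC classifies, SVR regresses (scikit-learn assumed)
from sklearn.svm import SVC, SVR

X = [[0, 0], [1, 1], [2, 2], [3, 3]]
y_class = [0, 0, 1, 1]          # two classes along the diagonal
y_reg = [0.0, 1.0, 2.0, 3.0]    # linear target for regression

clf = SVC(kernel="linear").fit(X, y_class)
reg = SVR(kernel="linear").fit(X, y_reg)

print(clf.predict([[0.5, 0.5]]))   # near the class-0 cluster
print(reg.predict([[1.5, 1.5]]))   # interpolates near 1.5
```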
![](../../../img/svm.png)
- Support vectors
	- The training points closest to the hyperplane
	- These points alone define the margin
- Maximise margin between classes
- Classify by taking the dot product of the test point with the vector perpendicular to the hyperplane
- Sign determines class
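The dot-product rule above can be sketched with scikit-learn (assumed available): `decision_function` returns $w \cdot x + b$, where $w$ is perpendicular to the hyperplane, and its sign gives the class:

```python
# Sign of w.x + b determines the class (scikit-learn assumed)
import numpy as np
from sklearn.svm import SVC

X = np.array([[0, 0], [0, 1], [3, 3], [3, 4]])
y = np.array([-1, -1, 1, 1])

clf = SVC(kernel="linear").fit(X, y)

test_point = np.array([[2.5, 3.0]])
score = clf.decision_function(test_point)[0]  # w . x + b
print(np.sign(score))  # positive => class +1
```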
# Pros
- Linear or non-linear discrimination
- Effective in higher dimensions
- Effective when the number of features exceeds the number of training examples
- Best for when classes are separable
- Outliers have less impact
# Cons
- Long training time on larger datasets
- Doesn't perform well when classes overlap
- Selecting an appropriate kernel is non-trivial
# Parameters
- C
	- Controls how smooth the decision boundary is by penalising misclassification
	- Larger C penalises misclassification more, giving a curvier boundary
- ![](../../../img/svm-c.png)
- Gamma
	- Controls each data point's area of influence (RBF kernel coefficient)
	- High gamma reduces the influence of faraway points
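An illustrative sketch of the two parameters, assuming scikit-learn: with a very high gamma each point's influence shrinks to a tiny bubble, so a large C lets an RBF SVM memorise even random labels, while small C and gamma give a much smoother, more regularised boundary:

```python
# Effect of C and gamma on an RBF SVM (scikit-learn assumed; toy data)
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = (rng.random(40) > 0.5).astype(int)  # random labels: no real structure

# High gamma + high C: each point sits in its own bubble => memorisation
overfit = SVC(kernel="rbf", C=100.0, gamma=100.0).fit(X, y)
print(overfit.score(X, y))  # 1.0 on the training set

# Low gamma + low C: smooth, heavily regularised boundary
smooth = SVC(kernel="rbf", C=0.1, gamma=0.1).fit(X, y)
print(smooth.score(X, y))   # typically much lower training accuracy
```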
# Hyperplane
$$\beta_0+\beta_1X_1+\beta_2X_2+\cdot\cdot\cdot+\beta_pX_p=0$$
- $p$-dimensional space
- If $X$ satisfies equation
- On plane
- Maximal margin hyperplane
	- Perpendicular distance from each observation to a candidate plane
	- Best plane maximises the smallest such distance (the margin)
- If support vector points shift
- Plane shifts
- Hyperplane only depends on the support vectors
- Rest don't matter
![](../../../img/svm-optimal-plane.png)
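The point that only the support vectors matter can be checked directly — a sketch assuming scikit-learn, where a very large C approximates a hard margin and dropping a non-support point leaves the fitted hyperplane unchanged:

```python
# Hyperplane depends only on the support vectors (scikit-learn assumed)
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 1.0],
              [3.0, 3.0], [3.0, 4.0], [4.0, 4.0]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # large C ~ hard margin
print(clf.support_)  # indices of the support vectors

# Refit without one non-support point: hyperplane is unchanged
non_sv = [i for i in range(len(X)) if i not in clf.support_][0]
mask = np.ones(len(X), dtype=bool)
mask[non_sv] = False
clf2 = SVC(kernel="linear", C=1e6).fit(X[mask], y[mask])
print(np.allclose(clf2.coef_, clf.coef_, atol=1e-3))
```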
# Not Linearly Separable
- Data can't be separated by a line in the original space
![](../../../img/svm-non-linear.png)
- Add another dimension
- $z=x^2+y^2$
- Square of the distance of the point from the origin
![](../../../img/svm-non-linear-project.png)
- Now separable
- Let $z=k$
- $k$ is a constant
- Project linear separator back to 2D
- Get circle
![](../../../img/svm-non-linear-separated.png)
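The projection above can be sketched in NumPy (toy data: two concentric circles, radii chosen for illustration) — adding $z = x^2 + y^2$ makes the classes separable by a constant threshold $z = k$, which projects back to the circle $x^2 + y^2 = k$ in 2D:

```python
# z = x^2 + y^2 makes concentric-circle data linearly separable
import numpy as np

rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 100)

inner = np.c_[0.5 * np.cos(theta), 0.5 * np.sin(theta)]  # class 0, radius 0.5
outer = np.c_[2.0 * np.cos(theta), 2.0 * np.sin(theta)]  # class 1, radius 2

z_inner = (inner ** 2).sum(axis=1)  # = 0.25 for every inner point
z_outer = (outer ** 2).sum(axis=1)  # = 4.0 for every outer point

# Any constant k between the bands separates the classes; back in 2D
# the separator z = k is the circle x^2 + y^2 = k
k = 1.0
print((z_inner < k).all(), (z_outer > k).all())  # True True
```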