# LeNet
- 1990s
![lenet-1989](../../../img/lenet-1989.png)
- 1989
![lenet-1998](../../../img/lenet-1998.png)
- 1998
# AlexNet
2012
- [[Activation Functions#ReLu|ReLU]]
- Local response normalisation (sketch below)
![alexnet](../../../img/alexnet.png)
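A minimal PyTorch sketch (my own illustration, not from the note) of an AlexNet-style first stage, showing where the ReLU non-linearity and local response normalisation sit; the filter count and kernel size follow the original paper, and `stage1` is just an illustrative name.

```python
import torch.nn as nn

# First AlexNet-style stage: convolution, ReLU, then local response normalisation
stage1 = nn.Sequential(
    nn.Conv2d(3, 96, kernel_size=11, stride=4, padding=2),       # 96 large 11x11 filters
    nn.ReLU(inplace=True),                                        # non-saturating activation
    nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),   # normalisation across channels
    nn.MaxPool2d(kernel_size=3, stride=2),                        # overlapping pooling
)
```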
# VGG
2015
- 16 weight layers compared with AlexNet's 8
- Addresses the vanishing gradient problem
- Xavier (Glorot) initialisation
- Small 3×3 kernels throughout
- Gradual filter increase (see sketch below)
![vgg-spec](../../../img/vgg-spec.png)
![vgg-arch](../../../img/vgg-arch.png)
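A hedged PyTorch sketch of those ideas (filter counts follow the VGG-16 conv stack; `vgg_block` is an illustrative helper, not from the note): every conv uses the same 3×3 kernel, filters grow gradually per stage, and weights are Xavier-initialised.

```python
import torch.nn as nn

def vgg_block(in_ch, out_ch, n_convs):
    # Every conv uses the same small 3x3 kernel; depth comes from stacking
    layers = []
    for i in range(n_convs):
        layers += [nn.Conv2d(in_ch if i == 0 else out_ch, out_ch,
                             kernel_size=3, padding=1),
                   nn.ReLU(inplace=True)]
    layers.append(nn.MaxPool2d(kernel_size=2, stride=2))
    return nn.Sequential(*layers)

# Gradual filter increase: 64 -> 128 -> 256 -> 512 -> 512 (VGG-16 conv stack)
features = nn.Sequential(
    vgg_block(3, 64, 2),
    vgg_block(64, 128, 2),
    vgg_block(128, 256, 3),
    vgg_block(256, 512, 3),
    vgg_block(512, 512, 3),
)

# Xavier (Glorot) initialisation, aimed at the vanishing gradient problem
for m in features.modules():
    if isinstance(m, nn.Conv2d):
        nn.init.xavier_uniform_(m.weight)
        nn.init.zeros_(m.bias)
```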
# GoogLeNet
2015
- [[Inception Layer]]s
- Multiple [[Deep Learning#Loss Function|Loss]] Functions
![googlenet](../../../img/googlenet.png)
## [Inception Layer](Inception%20Layer.md)
![googlenet-inception](../../../img/googlenet-inception.png)
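A PyTorch sketch of the module shown above, assuming the standard GoogLeNet layout: parallel 1×1, 3×3, 5×5, and pooled branches, each with a 1×1 reduction, concatenated along the channel dimension. The channel counts in the usage line are those of inception 3a; `InceptionModule` is an illustrative name.

```python
import torch
import torch.nn as nn

class InceptionModule(nn.Module):
    # Parallel branches at different receptive fields, concatenated on channels
    def __init__(self, in_ch, c1x1, c3x3_red, c3x3, c5x5_red, c5x5, pool_proj):
        super().__init__()
        self.b1 = nn.Sequential(nn.Conv2d(in_ch, c1x1, 1), nn.ReLU(inplace=True))
        self.b2 = nn.Sequential(
            nn.Conv2d(in_ch, c3x3_red, 1), nn.ReLU(inplace=True),      # 1x1 bottleneck
            nn.Conv2d(c3x3_red, c3x3, 3, padding=1), nn.ReLU(inplace=True))
        self.b3 = nn.Sequential(
            nn.Conv2d(in_ch, c5x5_red, 1), nn.ReLU(inplace=True),      # 1x1 bottleneck
            nn.Conv2d(c5x5_red, c5x5, 5, padding=2), nn.ReLU(inplace=True))
        self.b4 = nn.Sequential(
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Conv2d(in_ch, pool_proj, 1), nn.ReLU(inplace=True))

    def forward(self, x):
        # Stack branch outputs channel-wise
        return torch.cat([self.b1(x), self.b2(x), self.b3(x), self.b4(x)], dim=1)

# Inception 3a: 192 channels in, 64 + 128 + 32 + 32 = 256 channels out
module = InceptionModule(192, 64, 96, 128, 16, 32, 32)
out = module(torch.randn(1, 192, 28, 28))   # -> (1, 256, 28, 28)
```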
## Auxiliary [[Deep Learning#Loss Function|Loss]] Functions
- Two additional softmax classifier blocks
- Help train a very deep network
- Mitigate the vanishing gradient problem (sketch below)
![googlenet-auxilliary-loss](../../../img/googlenet-auxilliary-loss.png)
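A minimal training-loss sketch, assuming a hypothetical `model` (plus `images` and `labels`) that returns the main logits and the two auxiliary-head logits; the paper weights the auxiliary losses by 0.3 and discards the heads at inference time.

```python
import torch.nn.functional as F

# model, images and labels are assumed to exist; the model returns three sets of logits
main_logits, aux1_logits, aux2_logits = model(images)

# The auxiliary softmax heads inject gradient part-way through the network,
# counteracting the vanishing gradient problem in a very deep stack
loss = (F.cross_entropy(main_logits, labels)
        + 0.3 * F.cross_entropy(aux1_logits, labels)
        + 0.3 * F.cross_entropy(aux2_logits, labels))
loss.backward()
```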