# LeNet
- 1990s
![lenet-1989](../../../img/lenet-1989.png)
- 1989
![lenet-1998](../../../img/lenet-1998.png)
- 1998
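
A minimal LeNet-5-style sketch in PyTorch. The layer sizes follow the 1998 paper; the exact activation and pooling choices here are simplifying assumptions:

```python
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    """LeNet-5-style CNN: two conv/pool stages, then fully connected layers."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),   # 32x32x1 -> 28x28x6
            nn.Tanh(),
            nn.AvgPool2d(2),                  # -> 14x14x6
            nn.Conv2d(6, 16, kernel_size=5),  # -> 10x10x16
            nn.Tanh(),
            nn.AvgPool2d(2),                  # -> 5x5x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),
            nn.Tanh(),
            nn.Linear(120, 84),
            nn.Tanh(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = LeNet5()
out = model(torch.randn(1, 1, 32, 32))  # expects 32x32 greyscale input
```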
# AlexNet
2012
- [[Activation Functions#ReLu|ReLU]]
- Local response normalisation (see the sketch below)
![alexnet](../../../img/alexnet.png)
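
A sketch of AlexNet's first stage, showing ReLU followed by local response normalisation in PyTorch. The hyperparameters follow the 2012 paper, but treat the exact values as assumptions:

```python
import torch
import torch.nn as nn

# First conv stage of an AlexNet-style network:
# large 11x11 kernels with stride 4, ReLU, then local response normalisation.
stage1 = nn.Sequential(
    nn.Conv2d(3, 96, kernel_size=11, stride=4),
    nn.ReLU(inplace=True),
    nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),
    nn.MaxPool2d(kernel_size=3, stride=2),
)

x = torch.randn(1, 3, 227, 227)
print(stage1(x).shape)  # torch.Size([1, 96, 27, 27])
```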
# VGG
2015
- 16 weight layers vs AlexNet's 8
- Addresses the vanishing gradient problem
- Xavier initialisation (sketched below)
- Small (3×3) kernel size throughout
- Gradual filter increase with depth (64 → 128 → 256 → 512)
![vgg-spec](../../../img/vgg-spec.png)
![vgg-arch](../../../img/vgg-arch.png)
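
A sketch of VGG's repeated pattern in PyTorch: stacks of 3×3 convolutions with a doubling filter count, plus Xavier initialisation. The block depths here are illustrative, not the full VGG-16 spec:

```python
import torch
import torch.nn as nn

def vgg_block(in_ch: int, out_ch: int, num_convs: int) -> nn.Sequential:
    """A VGG-style block: num_convs 3x3 convs (same padding), then 2x2 max-pool."""
    layers = []
    for _ in range(num_convs):
        layers += [nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
                   nn.ReLU(inplace=True)]
        in_ch = out_ch
    layers.append(nn.MaxPool2d(kernel_size=2, stride=2))
    return nn.Sequential(*layers)

# Gradual filter increase: 64 -> 128 -> 256 -> 512
features = nn.Sequential(
    vgg_block(3, 64, 2),
    vgg_block(64, 128, 2),
    vgg_block(128, 256, 3),
    vgg_block(256, 512, 3),
)

# Xavier initialisation to keep gradient magnitudes stable through the deep stack
for m in features.modules():
    if isinstance(m, nn.Conv2d):
        nn.init.xavier_uniform_(m.weight)
        nn.init.zeros_(m.bias)

print(features(torch.randn(1, 3, 224, 224)).shape)  # torch.Size([1, 512, 14, 14])
```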
# GoogLeNet
2015
- [[Inception Layer]]s
- Multiple [[Deep Learning#Loss Function|Loss]] Functions
![googlenet](../../../img/googlenet.png)
## [[Inception Layer]]
![googlenet-inception](../../../img/googlenet-inception.png)
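
A minimal Inception-module sketch in PyTorch: parallel 1×1, 3×3, and 5×5 convolutions plus a pooled branch, concatenated along the channel axis. The branch widths are illustrative:

```python
import torch
import torch.nn as nn

class Inception(nn.Module):
    """Inception module: four parallel branches concatenated on channels.
    1x1 convs in front of the 3x3/5x5 branches reduce dimensionality first."""
    def __init__(self, in_ch: int):
        super().__init__()
        self.b1 = nn.Conv2d(in_ch, 64, kernel_size=1)
        self.b2 = nn.Sequential(
            nn.Conv2d(in_ch, 96, kernel_size=1), nn.ReLU(inplace=True),
            nn.Conv2d(96, 128, kernel_size=3, padding=1),
        )
        self.b3 = nn.Sequential(
            nn.Conv2d(in_ch, 16, kernel_size=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, 32, kernel_size=5, padding=2),
        )
        self.b4 = nn.Sequential(
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
            nn.Conv2d(in_ch, 32, kernel_size=1),
        )

    def forward(self, x):
        # All branches preserve spatial size, so concatenation is valid
        return torch.cat([self.b1(x), self.b2(x), self.b3(x), self.b4(x)], dim=1)

x = torch.randn(1, 192, 28, 28)
print(Inception(192)(x).shape)  # torch.Size([1, 256, 28, 28])
```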
## Auxiliary [[Deep Learning#Loss Function|Loss]] Functions
- Two additional SoftMax classifier blocks (sketched below)
- Help train a very deep network
- Counter the vanishing gradient problem
![googlenet-auxilliary-loss](../../../img/googlenet-auxilliary-loss.png)
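
A sketch of how the auxiliary losses combine during training. The 0.3 weighting follows the GoogLeNet paper; the logits here are placeholders for the outputs of the main and auxiliary SoftMax heads:

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

def googlenet_loss(main_logits, aux1_logits, aux2_logits, targets):
    """Total loss = main loss + 0.3 * each auxiliary SoftMax loss.
    The auxiliary heads inject gradient into the middle of the network,
    countering the vanishing gradient problem; they are dropped at inference."""
    main = criterion(main_logits, targets)
    aux = criterion(aux1_logits, targets) + criterion(aux2_logits, targets)
    return main + 0.3 * aux

# Illustrative shapes: batch of 8, 1000 classes
logits = [torch.randn(8, 1000, requires_grad=True) for _ in range(3)]
targets = torch.randint(0, 1000, (8,))
loss = googlenet_loss(*logits, targets)
loss.backward()
```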