# LeNet
1990s
![[lenet-1989.png]]
- 1989 version
![[lenet-1998.png]]
- 1998 version
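As a concrete reference, a minimal PyTorch sketch of the 1998 LeNet-5 layout (assumptions: 32×32 single-channel input, with `nn.Tanh` and `nn.AvgPool2d` standing in for the original scaled tanh and trainable subsampling):

```python
import torch.nn as nn

class LeNet5(nn.Module):
    """Approximation of the 1998 LeNet-5 for 32x32 single-channel input."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),   # 32x32 -> 6 x 28x28
            nn.Tanh(),
            nn.AvgPool2d(2),                  # -> 6 x 14x14
            nn.Conv2d(6, 16, kernel_size=5),  # -> 16 x 10x10
            nn.Tanh(),
            nn.AvgPool2d(2),                  # -> 16 x 5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),
            nn.Tanh(),
            nn.Linear(120, 84),
            nn.Tanh(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))
```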
# AlexNet
2012
- [[Activation Functions#ReLu|ReLU]]
- Local response normalisation
![[alexnet.png]]
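A short PyTorch fragment of AlexNet's first two stages, showing the ReLU plus local response normalisation pattern (LRN parameters k=2, n=5, α=1e-4, β=0.75 as reported in the 2012 paper; an illustrative fragment, not the full network):

```python
import torch.nn as nn

# First two stages of AlexNet: convolution -> ReLU -> local response
# normalisation -> max-pooling (channel counts from the 2012 paper).
alexnet_stem = nn.Sequential(
    nn.Conv2d(3, 96, kernel_size=11, stride=4, padding=2),
    nn.ReLU(inplace=True),
    nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),
    nn.MaxPool2d(kernel_size=3, stride=2),
    nn.Conv2d(96, 256, kernel_size=5, padding=2),
    nn.ReLU(inplace=True),
    nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),
    nn.MaxPool2d(kernel_size=3, stride=2),
)
```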
# VGG
2015
- 16 weight layers versus AlexNet's 8
- Tackles the vanishing gradient problem
- Xavier initialisation
- Uniform 3×3 kernels throughout
- Gradual filter increase per stage (64 doubling up to 512; see the sketch below)
![[vgg-spec.png]]
![[vgg-arch.png]]
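A sketch of the VGG-16 convolutional stages in PyTorch, illustrating the uniform 3×3 kernels and the per-stage filter doubling (the three fully-connected layers that bring the count to 16 weight layers are omitted):

```python
import torch.nn as nn

def vgg_block(in_ch: int, out_ch: int, n_convs: int) -> nn.Sequential:
    """One VGG stage: n_convs 3x3 convolutions, then 2x2 max-pooling."""
    layers = []
    for _ in range(n_convs):
        layers += [nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
                   nn.ReLU(inplace=True)]
        in_ch = out_ch
    layers.append(nn.MaxPool2d(kernel_size=2, stride=2))
    return nn.Sequential(*layers)

# VGG-16 feature extractor: 13 conv layers, channels doubling 64 -> 512.
features = nn.Sequential(
    vgg_block(3, 64, 2),
    vgg_block(64, 128, 2),
    vgg_block(128, 256, 3),
    vgg_block(256, 512, 3),
    vgg_block(512, 512, 3),
)
```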
# GoogLeNet
2015
- [[Inception Layer]]s
- Multiple [[Deep Learning#Loss Function|loss]] functions
![[googlenet.png]]
## [[Inception Layer]]
![[googlenet-inception.png]]
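A sketch of an inception module in PyTorch: four parallel branches (1×1, 3×3, 5×5, and max-pool) concatenated along the channel axis, with 1×1 convolutions reducing channels before the larger kernels; the example channel counts follow the paper's inception (3a) block:

```python
import torch
import torch.nn as nn

class Inception(nn.Module):
    """Inception module with dimension reductions: four parallel
    branches whose outputs are concatenated along the channel axis."""
    def __init__(self, in_ch, c1, c3_red, c3, c5_red, c5, pool_proj):
        super().__init__()
        self.b1 = nn.Sequential(nn.Conv2d(in_ch, c1, 1), nn.ReLU(inplace=True))
        self.b2 = nn.Sequential(
            nn.Conv2d(in_ch, c3_red, 1), nn.ReLU(inplace=True),
            nn.Conv2d(c3_red, c3, 3, padding=1), nn.ReLU(inplace=True))
        self.b3 = nn.Sequential(
            nn.Conv2d(in_ch, c5_red, 1), nn.ReLU(inplace=True),
            nn.Conv2d(c5_red, c5, 5, padding=2), nn.ReLU(inplace=True))
        self.b4 = nn.Sequential(
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Conv2d(in_ch, pool_proj, 1), nn.ReLU(inplace=True))

    def forward(self, x):
        return torch.cat([self.b1(x), self.b2(x), self.b3(x), self.b4(x)], dim=1)

# Example: the "inception (3a)" block, 192 channels in, 64+128+32+32 out.
inception_3a = Inception(192, 64, 96, 128, 16, 32, 32)
```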
## Auxiliary [[Deep Learning#Loss Function|Loss]] Functions
- Two additional softmax classifier heads attached to intermediate layers
- Help train a very deep network by injecting gradient earlier in the stack
- Mitigates the vanishing gradient problem (loss combination sketched below)
![[googlenet-auxilliary-loss.png]]
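A sketch of how the three losses combine during training (the 0.3 auxiliary weight is from the paper; `main_logits`, `aux1_logits`, and `aux2_logits` are hypothetical outputs of the final and two auxiliary classifier heads):

```python
import torch.nn.functional as F

def googlenet_loss(main_logits, aux1_logits, aux2_logits, targets):
    """Total training loss: main cross-entropy plus down-weighted
    auxiliary losses; the auxiliary heads are discarded at inference."""
    main = F.cross_entropy(main_logits, targets)
    aux1 = F.cross_entropy(aux1_logits, targets)
    aux2 = F.cross_entropy(aux2_logits, targets)
    return main + 0.3 * (aux1 + aux2)
```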