# LeNet

- 1990s

![[lenet-1989.png]]

- 1989

![[lenet-1998.png]]

- 1998
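A minimal PyTorch sketch of the 1998 LeNet-5 layout (two conv/pool stages, then a small fully connected stack) for 32×32 greyscale input; this is an illustrative reconstruction under those assumptions, not code from the note or the paper.

```python
import torch
import torch.nn as nn

# Illustrative LeNet-5-style layout (1998): two conv/pool stages, then a small FC stack
class LeNet5(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),   # 32x32 -> 28x28
            nn.Tanh(),
            nn.AvgPool2d(2),                  # 28x28 -> 14x14
            nn.Conv2d(6, 16, kernel_size=5),  # 14x14 -> 10x10
            nn.Tanh(),
            nn.AvgPool2d(2),                  # 10x10 -> 5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),
            nn.Tanh(),
            nn.Linear(120, 84),
            nn.Tanh(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

print(LeNet5()(torch.randn(1, 1, 32, 32)).shape)  # torch.Size([1, 10])
```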
# AlexNet

2012

- [[Activation Functions#ReLu|ReLu]]
- Local response normalisation

![[alexnet.png]]
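A sketch of the first AlexNet-style stage, assuming PyTorch: a large strided convolution followed by ReLU and local response normalisation, the two points listed above. Parameters follow the 2012 paper, but the block is illustrative only.

```python
import torch
import torch.nn as nn

# First AlexNet-style stage: strided 11x11 conv, ReLU, local response normalisation, max-pool
stage1 = nn.Sequential(
    nn.Conv2d(3, 96, kernel_size=11, stride=4, padding=2),
    nn.ReLU(inplace=True),                                       # ReLU instead of tanh/sigmoid
    nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),  # normalisation across channels
    nn.MaxPool2d(kernel_size=3, stride=2),
)

print(stage1(torch.randn(1, 3, 224, 224)).shape)  # torch.Size([1, 96, 27, 27])
```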
|
# VGG

2015

- 16 layers over AlexNet's 8
- Addresses the vanishing gradient problem that comes with the extra depth
- Xavier (Glorot) initialisation
- Same small (3×3) kernel size throughout
- Gradual filter increase with depth

![[vgg-spec.png]]

![[vgg-arch.png]]
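A sketch of the VGG-16 feature stack, assuming PyTorch, to illustrate the bullets above: the same small 3×3 kernel in every convolution, a gradual filter increase per block, and Xavier (Glorot) initialisation to help gradients survive the depth. This is an illustration, not the paper's reference code.

```python
import torch
import torch.nn as nn

def vgg_block(in_ch: int, out_ch: int, n_convs: int) -> nn.Sequential:
    """Stack of 3x3 convolutions (same kernel size throughout) followed by a 2x2 max-pool."""
    layers = []
    for _ in range(n_convs):
        layers += [nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True)]
        in_ch = out_ch
    layers.append(nn.MaxPool2d(2))
    return nn.Sequential(*layers)

# Gradual filter increase: 64 -> 128 -> 256 -> 512 -> 512 (VGG-16 feature stack)
features = nn.Sequential(
    vgg_block(3, 64, 2),
    vgg_block(64, 128, 2),
    vgg_block(128, 256, 3),
    vgg_block(256, 512, 3),
    vgg_block(512, 512, 3),
)

# Xavier (Glorot) initialisation to counter vanishing gradients in the deep stack
for m in features.modules():
    if isinstance(m, nn.Conv2d):
        nn.init.xavier_normal_(m.weight)
        nn.init.zeros_(m.bias)

print(features(torch.randn(1, 3, 224, 224)).shape)  # torch.Size([1, 512, 7, 7])
```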
# GoogLeNet

2015

- [[Inception Layer]]s
- Multiple [[Deep Learning#Loss Function|Loss]] Functions

![[googlenet.png]]
## [[Inception Layer]]

![[googlenet-inception.png]]
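A minimal PyTorch sketch of a single Inception module: four parallel branches (1×1, 1×1 then 3×3, 1×1 then 5×5, and a pooled 1×1 projection) concatenated along the channel axis. The channel counts are those of the paper's inception (3a) block; the code itself is an illustration, not GoogLeNet's reference implementation.

```python
import torch
import torch.nn as nn

class Inception(nn.Module):
    """Parallel 1x1, 3x3, 5x5 and pooled branches, concatenated on the channel dimension."""
    def __init__(self, in_ch, c1, c3_red, c3, c5_red, c5, pool_proj):
        super().__init__()
        self.b1 = nn.Sequential(nn.Conv2d(in_ch, c1, 1), nn.ReLU(inplace=True))
        self.b2 = nn.Sequential(
            nn.Conv2d(in_ch, c3_red, 1), nn.ReLU(inplace=True),
            nn.Conv2d(c3_red, c3, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.b3 = nn.Sequential(
            nn.Conv2d(in_ch, c5_red, 1), nn.ReLU(inplace=True),
            nn.Conv2d(c5_red, c5, 5, padding=2), nn.ReLU(inplace=True),
        )
        self.b4 = nn.Sequential(
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Conv2d(in_ch, pool_proj, 1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return torch.cat([self.b1(x), self.b2(x), self.b3(x), self.b4(x)], dim=1)

# Inception (3a): output channels 64 + 128 + 32 + 32 = 256
block = Inception(192, 64, 96, 128, 16, 32, 32)
print(block(torch.randn(1, 192, 28, 28)).shape)  # torch.Size([1, 256, 28, 28])
```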
## Auxiliary [[Deep Learning#Loss Function|Loss]] Functions

- Two additional SoftMax blocks attached to intermediate layers
- Help to train a very deep network
- Mitigate the vanishing gradient problem

![[googlenet-auxilliary-loss.png]]
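A minimal sketch of how the auxiliary losses combine during training, assuming PyTorch and the paper's 0.3 discount weight on each auxiliary head; the logits tensors are hypothetical placeholders for the outputs of the three classifier heads.

```python
import torch
import torch.nn.functional as F

# Hypothetical logits from the main head and the two auxiliary SoftMax heads
main_logits = torch.randn(8, 1000, requires_grad=True)
aux1_logits = torch.randn(8, 1000, requires_grad=True)
aux2_logits = torch.randn(8, 1000, requires_grad=True)
targets = torch.randint(0, 1000, (8,))

# Auxiliary losses are discounted (0.3 in the paper) and added to the main loss,
# injecting gradient into earlier layers and easing the vanishing gradient problem
loss = (
    F.cross_entropy(main_logits, targets)
    + 0.3 * F.cross_entropy(aux1_logits, targets)
    + 0.3 * F.cross_entropy(aux2_logits, targets)
)
loss.backward()  # in a real network this reaches all three heads
```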