stem/AI/Neural Networks/CNN/Examples.md
# LeNet
- 1990s
![lenet-1989](../../../img/lenet-1989.png)
- 1989
![lenet-1998](../../../img/lenet-1998.png)
- 1998
# AlexNet
2012
- [[Activation Functions#ReLu|ReLU]]
- Local Response Normalisation
![alexnet](../../../img/alexnet.png)
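A minimal sketch of the ReLU activation AlexNet popularised (pure Python, illustrative only):

```python
def relu(x):
    # ReLU passes positive values through unchanged and zeroes out
    # negatives, avoiding the saturation of sigmoid/tanh activations
    return max(0.0, x)

print([relu(v) for v in (-2.0, 0.0, 3.5)])  # → [0.0, 0.0, 3.5]
```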
# VGG
2015
- 16 layers vs AlexNet's 8
- Addresses the vanishing gradient problem
- Xavier initialisation
- Same small 3×3 kernel size throughout
- Gradual filter increase (64 → 128 → 256 → 512)
![vgg-spec](../../../img/vgg-spec.png)
![vgg-arch](../../../img/vgg-arch.png)
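A back-of-the-envelope check (pure Python, channel count chosen for illustration) of why VGG stacks small 3×3 kernels: two stacked 3×3 convolutions cover the same 5×5 receptive field as a single 5×5 convolution, but with fewer weights.

```python
def conv_params(k, c_in, c_out):
    # weight count (ignoring biases) for a k x k convolution
    return k * k * c_in * c_out

c = 256                               # hypothetical channel count
stacked = 2 * conv_params(3, c, c)    # two 3x3 layers: 5x5 receptive field
single = conv_params(5, c, c)         # one 5x5 layer: same receptive field
print(stacked, single)                # stacked uses 18/25 of the weights
```

The stacked form is also deeper, so it interleaves an extra non-linearity between the two convolutions.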
# GoogLeNet
2015
- [Inception Layer](Inception%20Layer.md)s
- Multiple [[Deep Learning#Loss Function|Loss]] Functions
![googlenet](../../../img/googlenet.png)
## [Inception Layer](Inception%20Layer.md)
![googlenet-inception](../../../img/googlenet-inception.png)
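A shape-level sketch (pure Python, hypothetical branch widths) of what an Inception layer does: parallel branches all see the same input, and their outputs are concatenated along the channel axis.

```python
def inception_out_channels(branch_channels):
    # each branch (1x1, 3x3, 5x5, pooled projection) runs in parallel
    # on the same input; outputs are concatenated channel-wise, so the
    # layer's output width is simply the sum of the branch widths
    return sum(branch_channels)

# hypothetical branch widths for the four parallel paths
print(inception_out_channels([64, 128, 32, 32]))  # → 256
```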
## Auxiliary [[Deep Learning#Loss Function|Loss]] Functions
- Two additional SoftMax classifier blocks
- Help train a really deep network
- Mitigate the vanishing gradient problem
![googlenet-auxilliary-loss](../../../img/googlenet-auxilliary-loss.png)
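During training, GoogLeNet adds the auxiliary classifiers' losses to the main loss with a discount weight of 0.3 (per the original paper); the auxiliary heads are discarded at inference. A minimal sketch with made-up loss values:

```python
def total_loss(main, aux_losses, aux_weight=0.3):
    # auxiliary losses inject gradient signal into the middle of the
    # network, easing training of the very deep main path
    return main + aux_weight * sum(aux_losses)

# hypothetical loss values for the main head and two auxiliary heads
print(total_loss(1.0, [0.8, 0.6]))
```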