stem/AI/Neural Networks/CNN/Examples.md

LeNet

  • 1990s
  • 1989 ![[lenet-1989.png]]
  • 1998 ![[lenet-1998.png]]

AlexNet

2012

![[alexnet.png]]

VGG

2015

  • 16 layers over AlexNet's 8
  • Addresses the vanishing gradient problem
    • Xavier initialisation
  • Small 3×3 kernels used consistently throughout
  • Gradual increase in filter count with depth (see the sketch below)

![[vgg-spec.png]]
![[vgg-arch.png]]
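A minimal PyTorch sketch of a VGG-style stage, assuming the standard recipe of stacked 3×3 convolutions, Xavier-initialised weights, and a filter count that grows stage by stage; the layer and channel counts here are illustrative rather than the exact VGG-16 configuration.

```python
import torch.nn as nn

def vgg_stage(in_channels: int, out_channels: int, num_convs: int) -> nn.Sequential:
    """One VGG-style stage: repeated 3x3 convolutions, then 2x2 max-pooling."""
    layers = []
    for i in range(num_convs):
        conv = nn.Conv2d(in_channels if i == 0 else out_channels,
                         out_channels, kernel_size=3, padding=1)
        nn.init.xavier_uniform_(conv.weight)  # Xavier init to ease vanishing gradients
        layers += [conv, nn.ReLU(inplace=True)]
    layers.append(nn.MaxPool2d(kernel_size=2, stride=2))  # halve spatial resolution
    return nn.Sequential(*layers)

# Gradual filter increase across stages: 64 -> 128 -> 256 -> 512
features = nn.Sequential(
    vgg_stage(3, 64, 2),
    vgg_stage(64, 128, 2),
    vgg_stage(128, 256, 3),
    vgg_stage(256, 512, 3),
)
```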

GoogLeNet

2015

![[googlenet.png]]

Inception Layer

![[googlenet-inception.png]]
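A minimal PyTorch sketch of an Inception-style block, assuming the usual four parallel branches (1×1, 3×3 and 5×5 convolutions plus a pooled projection) concatenated along the channel dimension; the channel counts are illustrative, roughly in the spirit of the first GoogLeNet Inception block.

```python
import torch
import torch.nn as nn

class Inception(nn.Module):
    """Parallel 1x1 / 3x3 / 5x5 / pool branches, concatenated channel-wise."""
    def __init__(self, in_ch, c1, c3_red, c3, c5_red, c5, pool_proj):
        super().__init__()
        self.b1 = nn.Sequential(nn.Conv2d(in_ch, c1, 1), nn.ReLU(inplace=True))
        self.b3 = nn.Sequential(nn.Conv2d(in_ch, c3_red, 1), nn.ReLU(inplace=True),
                                nn.Conv2d(c3_red, c3, 3, padding=1), nn.ReLU(inplace=True))
        self.b5 = nn.Sequential(nn.Conv2d(in_ch, c5_red, 1), nn.ReLU(inplace=True),
                                nn.Conv2d(c5_red, c5, 5, padding=2), nn.ReLU(inplace=True))
        self.bp = nn.Sequential(nn.MaxPool2d(3, stride=1, padding=1),
                                nn.Conv2d(in_ch, pool_proj, 1), nn.ReLU(inplace=True))

    def forward(self, x):
        # All branches keep the spatial size, so outputs stack along channels
        return torch.cat([self.b1(x), self.b3(x), self.b5(x), self.bp(x)], dim=1)

# Illustrative sizes: 192 input channels -> 64+128+32+32 = 256 output channels
block = Inception(192, 64, 96, 128, 16, 32, 32)
```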

Auxiliary [[Deep Learning#Loss Function|Loss]] Functions

  • Two additional SoftMax classifier blocks
  • Help train a very deep network (see the sketch below)
    • Mitigates the vanishing gradient problem

![[googlenet-auxilliary-loss.png]]
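A minimal sketch of how the auxiliary classifiers contribute to training: their SoftMax losses are added to the main loss with a small weight so that useful gradient reaches the earlier layers. The 0.3 down-weighting follows the GoogLeNet paper; a model returning main and auxiliary logits is assumed.

```python
import torch.nn as nn

criterion = nn.CrossEntropyLoss()  # SoftMax + negative log-likelihood

def googlenet_loss(main_logits, aux1_logits, aux2_logits, targets):
    """Total training loss: main classifier plus down-weighted auxiliary classifiers."""
    main_loss = criterion(main_logits, targets)
    aux_loss = criterion(aux1_logits, targets) + criterion(aux2_logits, targets)
    return main_loss + 0.3 * aux_loss  # auxiliary heads are discarded at inference time
```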