stem/AI/Neural Networks/CNN/FCN/ResNet.md
andy 25f73797e3 vault backup: 2023-05-27 23:02:51
  • Residual networks
  • Up to 152 layers (ResNet-152)
  • Skip connection every two layers
    • Residual block
  • Later layers can learn the identity function
    • Skips make this easy: the block only has to drive its residual F(x) to zero
    • A deep network should then be at least as good as a shallower one, since extra layers can do very little
  • Vanishing gradient
    • Skips provide shortcut paths for gradients to flow back through
  • Accuracy saturation (degradation)
    • Without skips, adding more layers to a suitably deep network increases training error
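A residual block computes y = F(x) + x, so zeroing the residual F recovers the identity. A minimal NumPy sketch (the two-matrix form of F and the shapes are illustrative assumptions, not ResNet's actual convolutional layers):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    """y = ReLU(F(x) + x), where F(x) is two weight layers.

    Identity shortcut: if W1 and W2 are near zero, F(x) is ~0 and the
    block passes x through (up to the final ReLU), so extra blocks
    need not hurt accuracy.
    """
    out = relu(x @ W1)    # first weight layer + non-linearity
    out = out @ W2        # second weight layer (no ReLU before the add)
    return relu(out + x)  # elementwise addition with the skip path

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
# Zero weights: F(x) = 0, so the block reduces to ReLU(x)
y = residual_block(x, np.zeros((8, 8)), np.zeros((8, 8)))
```

The skip path also gives gradients a direct route back to earlier layers, which is the vanishing-gradient argument above.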

Design

  • Skips across pairs of Convolutional Layers
    • Elementwise addition of skip path and block output
  • 3x3 kernels in every layer
  • Spatial size halves at each stage boundary
  • Filter count doubles at each stage boundary
  • FCN: no fully connected hidden layers (global average pooling before the classifier)
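The halve-spatial/double-filters schedule keeps per-layer computation roughly constant; it can be checked with simple arithmetic (the starting 56x56x64 and four stages loosely follow the ResNet-34 configuration, used here as an illustrative assumption):

```python
# Spatial size halves while filter count doubles at each stage boundary.
size, filters = 56, 64      # after the initial conv and max-pool
stages = []
for _ in range(4):          # four residual stages
    stages.append((size, filters))
    size //= 2              # stride-2 convolution halves H and W
    filters *= 2            # channel count doubles

# Progression: 56x56x64 -> 28x28x128 -> 14x14x256 -> 7x7x512
```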

![[imagenet-error.png]]

![[resnet-arch.png]] ![[resnet-arch2.png]]