CNN

Before 2010s

  • Data hungry
    • Need lots of training data
  • Processing power
    • Hardware was too limited to train them at scale
  • Niche
    • Few people knew or cared about CNNs

After

Fully Connected

MLP

  • Move from Convolutional Layer operations to a flattened vector output
  • Stochastic drop-out
    • Sub-sample the channels and only connect some to the MLP layers (see the sketch below)

As a Descriptor

  • Most powerful as a deeply learned feature extractor
  • The MLP classifier at the end isn't fantastic
    • Instead, take the activations at the penultimate layer and classify them with an SVM (sketch after the figure)

![[cnn-descriptor.png]]
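A minimal sketch of the descriptor idea, assuming torchvision and scikit-learn (neither is named in the notes) and dummy inputs/labels: cut off the network's final classification layer so the forward pass stops at the penultimate layer, then fit an SVM on those feature vectors.

```python
import torch
import torch.nn as nn
from sklearn.svm import SVC
from torchvision import models

# Pre-trained CNN used purely as a learned descriptor
net = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
net.fc = nn.Identity()   # stop at the penultimate (512-d) layer
net.eval()

with torch.no_grad():
    X = net(torch.randn(20, 3, 224, 224)).numpy()  # 20 descriptors (dummy images)
y = torch.randint(0, 2, (20,)).numpy()             # dummy binary labels

svm = SVC(kernel="rbf").fit(X, y)  # SVM replaces the MLP/soft-max head
print(svm.predict(X[:5]))
```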

Finetuning

  • Observations
  • Reuse weights from a network pre-trained on another task
  • Freeze the weights in the first 3-5 Convolutional Layers
    • Equivalent to a learning rate of 0 for those layers
  • For the remaining layers, either randomly initialise them or continue from the existing weights (sketch after the figure)

![[fine-tuning-freezing.png]]
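A minimal sketch of the freezing recipe, assuming torchvision's VGG16 (the notes don't name a model) and a hypothetical 10-class target task: disable gradients for the early conv blocks, which is equivalent to giving them a learning rate of 0, and randomly re-initialise the final classifier layer while the rest continues from the pre-trained weights.

```python
import torch.nn as nn
from torchvision import models

# Reuse ImageNet weights from another network
net = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)

# Freeze the first four conv layers (within the 3-5 range in the notes):
# no gradient flow == learning rate of 0 for those weights
for layer in list(net.features.children())[:10]:
    for p in layer.parameters():
        p.requires_grad = False

# Randomly re-initialise the last layer for the new 10-class task;
# all other unfrozen layers continue from the existing weights
net.classifier[6] = nn.Linear(4096, 10)

# Only the trainable parameters should go to the optimiser, e.g.
# torch.optim.SGD(filter(lambda p: p.requires_grad, net.parameters()), lr=1e-3)
```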

Training

![[under-over-fitting.png]]