# CNN

## Before 2010s

- Data hungry
  - Need lots of training data
- Processing power
  - Training was impractically slow on contemporary hardware
- Niche
  - Hardly anyone cared about or knew of CNNs

## After

- ImageNet
  - 16m images, 1,000 classes
- GPUs
  - General-purpose computing on GPUs (GPGPU)
  - CUDA
- NIPS/ECCV 2012
  - Double-digit percentage gain in ImageNet accuracy (AlexNet)

## Fully Connected

MLP

- Move from Convolutional Layer operations towards vector output
- Stochastic drop-out
  - Sub-sample channels and only connect some to MLP layers (see the sketch below)
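
A minimal PyTorch sketch of such a head, assuming illustrative sizes (the 256 × 6 × 6 feature-map shape and 1,000 classes are placeholders, not from the note): the convolutional feature maps are flattened into a vector, with drop-out applied between the fully connected layers.

```python
import torch
import torch.nn as nn

# Fully connected head over conv feature maps; all sizes are illustrative.
head = nn.Sequential(
    nn.Flatten(),                  # (N, 256, 6, 6) -> (N, 9216)
    nn.Dropout(p=0.5),             # stochastic drop-out on the flattened units
    nn.Linear(256 * 6 * 6, 4096),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(4096, 1000),         # one logit per class
)

x = torch.randn(8, 256, 6, 6)      # stand-in batch of conv feature maps
logits = head(x)                   # shape: (8, 1000)
```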

## As a Descriptor

- Most powerful as a deeply learned feature extractor
- The MLP classifier at the end isn't fantastic
  - Instead, take the penultimate layer's activations as a descriptor and classify them with an SVM

![[cnn-descriptor.png]]
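
A minimal sketch of this, assuming PyTorch/torchvision and scikit-learn; ResNet-18 and the random tensors are stand-ins for whichever CNN and dataset are actually in use.

```python
import torch
import torch.nn as nn
from torchvision import models
from sklearn.svm import LinearSVC

# Use a pretrained CNN as a descriptor: chop off the classifier head and
# train an SVM on the penultimate-layer features instead.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = nn.Identity()              # expose penultimate-layer output
backbone.eval()

images = torch.randn(32, 3, 224, 224)    # placeholder image batch
labels = [i % 2 for i in range(32)]      # placeholder binary labels

with torch.no_grad():
    features = backbone(images)          # (32, 512) descriptors

clf = LinearSVC().fit(features.numpy(), labels)
```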

## Finetuning

- Observations
- Reuse weights from a network trained on another task
- Freeze the weights in the first 3-5 Convolutional Layers (a freezing sketch follows the figure below)
  - Learning rate = 0 for the frozen layers
- For the remaining layers, either:
  - Randomly initialise them, or
  - Continue from the existing weights

![[fine-tuning-freezing.png]]
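
A minimal PyTorch sketch of the freezing step, with torchvision's ResNet-18 as a stand-in: its stem plus first two blocks approximate "the first few conv layers", and `requires_grad = False` has the same effect as a zero learning rate.

```python
import torch
import torch.nn as nn
from torchvision import models

# Start from weights learned on another task (ImageNet here).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the early convolutional layers: no gradients, so no updates,
# equivalent to setting their learning rate to 0.
for module in (model.conv1, model.bn1, model.layer1, model.layer2):
    for p in module.parameters():
        p.requires_grad = False

# Randomly re-initialise the classifier for the new task's class count;
# the unfrozen conv layers continue from their existing weights.
model.fc = nn.Linear(model.fc.in_features, 10)

# The optimiser only sees parameters that still require gradients.
optimiser = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
```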

## Training

![[under-over-fitting.png]]
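
The figure contrasts under- and over-fitting. One common guard against over-fitting, sketched below on toy data (the linear model and random tensors are placeholders for a real CNN pipeline), is early stopping: halt training once validation loss stops improving while training loss keeps falling.

```python
import torch
import torch.nn as nn

# Early-stopping sketch: stop when validation loss has not improved
# for `patience` epochs, a typical symptom of over-fitting.
torch.manual_seed(0)
x_tr, y_tr = torch.randn(64, 10), torch.randn(64, 1)
x_va, y_va = torch.randn(32, 10), torch.randn(32, 1)
model = nn.Linear(10, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

best_val, patience, bad = float("inf"), 5, 0
for epoch in range(200):
    opt.zero_grad()
    loss_fn(model(x_tr), y_tr).backward()
    opt.step()
    with torch.no_grad():
        val = loss_fn(model(x_va), y_va).item()
    if val < best_val - 1e-4:
        best_val, bad = val, 0
    else:
        bad += 1
        if bad >= patience:
            break  # validation loss flat or rising: likely over-fitting
```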