CNN

Before 2010s

  • Data hungry
    • Need lots of training data
  • Processing power
  • Niche
    • No-one cared/knew about CNNs

After

  • ImageNet
    • 16m images, 1000 classes
  • GPUs
    • General-purpose GPU computing (GPGPU)
    • CUDA
  • NIPS/ECCV 2012
    • Double-digit percentage-point gain in ImageNet accuracy (AlexNet)

Fully Connected

Dense

  • Move from convolutional operations towards vector output
  • Stochastic drop-out
    • Randomly sub-sample channels/units so only some connect to the dense layers during training (see the sketch below)
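
A minimal sketch of such a dense head, assuming PyTorch; the layer sizes and 1000-class output are illustrative only, not taken from the notes above.

```python
# Conv feature maps flattened into a dense classifier head with drop-out.
import torch
import torch.nn as nn

head = nn.Sequential(
    nn.Flatten(),                  # conv feature maps -> vector
    nn.Linear(256 * 6 * 6, 4096),  # sizes are illustrative only
    nn.ReLU(inplace=True),
    nn.Dropout(p=0.5),             # stochastic drop-out of dense units
    nn.Linear(4096, 1000),         # ImageNet-style 1000-class output
)

x = torch.randn(8, 256, 6, 6)      # dummy batch of conv feature maps
logits = head(x)                   # shape: (8, 1000)
```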

As a Descriptor

  • Most powerful as a deeply learned feature extractor
  • Dense classifier at the end isn't fantastic
    • Instead, take features from the penultimate layer and classify with an SVM (see the sketch below)

cnn-descriptor
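
A minimal sketch of the descriptor idea, assuming PyTorch/torchvision and scikit-learn: the pre-trained network's dense classifier is replaced with an identity so the penultimate-layer features can be fed to a linear SVM. The choice of resnet18 and the random stand-in data are assumptions for illustration.

```python
# Use the CNN as a fixed, deeply learned feature extractor; classify with an SVM.
import torch
import torch.nn as nn
from torchvision import models
from sklearn.svm import LinearSVC

backbone = models.resnet18(weights="IMAGENET1K_V1")  # pre-trained descriptor
backbone.fc = nn.Identity()        # drop the dense classifier, keep the features
backbone.eval()

with torch.no_grad():
    images = torch.randn(32, 3, 224, 224)    # stand-in for a real batch
    feats = backbone(images).numpy()          # (32, 512) penultimate features

labels = torch.randint(0, 2, (32,)).numpy()   # stand-in labels
svm = LinearSVC()                             # linear SVM on deep features
svm.fit(feats, labels)
```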

Finetuning

  • Observations
    • Most CNNs have similar weights in conv1
    • Most useful CNNs have several conv layers
      • Many weights
      • Lots of training data
    • Training data is hard to get
      • Labelling
  • Reuse weights from another, pre-trained network
  • Freeze weights in the first 3-5 conv layers (see the sketch below)
    • Learning rate = 0 for the frozen layers
    • Randomly initialise the remaining layers, or
    • Continue from the existing weights

fine-tuning-freezing
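
A minimal fine-tuning sketch, assuming PyTorch/torchvision: the early conv blocks of a pre-trained network are frozen (gradients disabled, i.e. learning rate effectively 0), the final dense layer is randomly re-initialised for the new task, and only the unfrozen parameters are optimised. Treating the stem plus the first two residual stages as the "first 3-5 conv layers" is an assumption here.

```python
# Freeze early conv blocks of a pre-trained network, fine-tune the rest.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights="IMAGENET1K_V1")    # reuse pre-trained weights

# Freeze the stem and first residual stages (roughly "first 3-5 conv layers")
for module in [model.conv1, model.bn1, model.layer1, model.layer2]:
    for p in module.parameters():
        p.requires_grad = False                     # learning rate effectively 0

model.fc = nn.Linear(model.fc.in_features, 10)      # randomly initialised head

# Only the unfrozen parameters are handed to the optimiser
optimiser = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3, momentum=0.9
)
```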

Training

  • Validation & training loss
  • Early
    • Under-fitting
    • Training data not representative
  • Later
    • Overfitting
  • Validation loss can help adjust the learning rate
    • Or indicate when to stop training (early stopping, see the sketch below)

under-over-fitting
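
A minimal, self-contained sketch of using validation loss to adjust the learning rate and stop training, assuming PyTorch; the linear model and random data are toy stand-ins for a real CNN and dataloaders.

```python
# Watch validation loss: reduce the learning rate on plateau, stop when it rises.
import torch
import torch.nn as nn

model = nn.Linear(10, 2)                              # toy stand-in for a CNN
optimiser = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimiser, factor=0.1)
loss_fn = nn.CrossEntropyLoss()

x_tr, y_tr = torch.randn(64, 10), torch.randint(0, 2, (64,))
x_va, y_va = torch.randn(32, 10), torch.randint(0, 2, (32,))

best_val, patience, bad_epochs = float("inf"), 5, 0
for epoch in range(100):
    model.train()
    optimiser.zero_grad()
    loss_fn(model(x_tr), y_tr).backward()
    optimiser.step()

    model.eval()
    with torch.no_grad():
        val_loss = loss_fn(model(x_va), y_va).item()
    scheduler.step(val_loss)            # reduce LR when validation loss plateaus

    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0   # still improving
    else:
        bad_epochs += 1
        if bad_epochs >= patience:           # rising validation loss: overfitting
            break                            # early stopping
```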