Before 2010s
- Data hungry
- Need lots of training data
- Processing power
- Niche
- No-one cared/knew about CNNs
After
- ImageNet
- 16m images, 1000 classes
- GPUs
	- General-purpose GPUs (GPGPU)
- CUDA
- NIPS/ECCV 2012
- Double digit % gain on ImageNet accuracy
Fully Connected
- Move from convolutional operations towards vector output
- Stochastic dropout
	- Randomly sub-sample channels so only some connect to the dense layers on each training pass
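The flatten → dropout → dense pattern above can be sketched in NumPy (a minimal illustration, not any specific framework's API; shapes and the inverted-dropout scaling are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    # Fully connected layer: flattened vector in, class scores out
    return x @ w + b

def dropout(x, p, training=True):
    # Stochastic dropout: zero each unit with probability p and scale
    # survivors by 1/(1-p) so the expected activation is unchanged
    if not training:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

# Stand-in for feature maps from the last conv block: 8 channels of 4x4
feature_maps = rng.standard_normal((8, 4, 4))
x = feature_maps.reshape(-1)              # flatten to a 128-vector
w = rng.standard_normal((128, 10)) * 0.01
b = np.zeros(10)
logits = dense(dropout(x, p=0.5), w, b)   # 10 class scores
```

At inference time `training=False` disables the sub-sampling, so the dense layer sees the full feature vector.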
As a Descriptor
- Most powerful as a deeply learned feature extractor
- Dense classifier at the end isn't fantastic
	- Replace it with an SVM trained on the penultimate layer's features
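Using the network as a descriptor might look like the following sketch, where a linear SVM is fit on activations taken from the penultimate layer (the random clusters here are a hypothetical stand-in for real features from a truncated forward pass):

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Stand-in for penultimate-layer activations of two classes; in practice
# these come from running images through the CNN and stopping before the
# final dense classifier
feats_a = rng.standard_normal((50, 64)) + 3.0
feats_b = rng.standard_normal((50, 64)) - 3.0
X = np.vstack([feats_a, feats_b])
y = np.array([0] * 50 + [1] * 50)

svm = LinearSVC()        # linear SVM replaces the dense softmax head
svm.fit(X, y)
acc = svm.score(X, y)
```

The CNN stays fixed as a feature extractor; only the SVM is trained on the target task.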
Finetuning
- Observations
- Most CNNs have similar weights in conv1
- Most useful CNNs have several conv layers
- Many weights
- Lots of training data
- Training data is hard to get
- Labelling
- Reuse weights from another network
- Freeze weights in first 3-5 conv layers
- Learning rate = 0
- Randomly initialise remaining layers
- Continue with existing weights
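The freeze/re-initialise recipe above can be sketched with plain NumPy (a toy stand-in, not a real training loop; layer counts and shapes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in: one weight matrix per layer of a pretrained network
pretrained = [rng.standard_normal((4, 4)) for _ in range(7)]

layers = []
for i, w in enumerate(pretrained):
    if i < 3:
        # First 3 conv layers: reuse pretrained weights, frozen
        # (equivalent to setting their learning rate to 0)
        layers.append({"w": w.copy(), "trainable": False})
    else:
        # Remaining layers: randomly re-initialised and trained
        layers.append({"w": rng.standard_normal(w.shape) * 0.01,
                       "trainable": True})

def sgd_step(layers, grads, lr=0.1):
    # Frozen layers are skipped; only the trainable tail is updated
    for layer, g in zip(layers, grads):
        if layer["trainable"]:
            layer["w"] -= lr * g

frozen_before = layers[0]["w"].copy()
tail_before = layers[3]["w"].copy()
grads = [np.ones((4, 4)) for _ in layers]
sgd_step(layers, grads)
```

After the step, the frozen conv weights are untouched while the re-initialised layers have moved, which is exactly the fine-tuning behaviour described above.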