vault backup: 2023-05-27 23:02:51

Affected files:
.obsidian/graph.json
.obsidian/workspace.json
STEM/AI/Neural Networks/Activation Functions.md
STEM/AI/Neural Networks/CNN/CNN.md
STEM/AI/Neural Networks/CNN/FCN/FCN.md
STEM/AI/Neural Networks/CNN/FCN/FlowNet.md
STEM/AI/Neural Networks/CNN/FCN/Highway Networks.md
STEM/AI/Neural Networks/CNN/FCN/ResNet.md
STEM/AI/Neural Networks/CNN/FCN/Skip Connections.md
STEM/AI/Neural Networks/CNN/GAN/DC-GAN.md
STEM/AI/Neural Networks/CNN/GAN/GAN.md
STEM/AI/Neural Networks/CNN/UpConv.md
STEM/img/highway-vs-residual.png
STEM/img/imagenet-error.png
STEM/img/resnet-arch.png
STEM/img/resnet-arch2.png
STEM/img/skip-connections 1.png
STEM/img/upconv-matrix-result.png
STEM/img/upconv-matrix-transposed-result.png
STEM/img/upconv-matrix.png
STEM/img/upconv-transposed-matrix.png
STEM/img/upconv.png
andy 2023-05-27 23:02:51 +01:00
parent 33ac3007bc
commit 25f73797e3
20 changed files with 67 additions and 19 deletions

STEM/AI/Neural Networks/Activation Functions.md

@@ -53,7 +53,7 @@ Rectilinear
- For deep networks
- $y=max(0,x)$
- CNNs
- Breaks associativity of successive [[convolution]]s
- Critical for learning complex functions
- Sometimes small scalar for negative
- Leaky ReLu
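The two variants above can be sketched in NumPy (a minimal illustration; the 0.01 negative slope is an assumed default, not from the note):

```python
import numpy as np

def relu(x):
    # y = max(0, x): zeroes out negatives
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # small scalar slope for negative inputs (alpha is an assumed default)
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, 0.0, 3.0])
relu(x)        # -> [0.0, 0.0, 3.0]
leaky_relu(x)  # -> [-0.02, 0.0, 3.0]
```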

STEM/AI/Neural Networks/CNN/CNN.md

@@ -15,7 +15,7 @@
# Fully Connected
[[MLP|Dense]]
- Move from [[Convolutional Layer|convolutional]] operations towards vector output
- Stochastic drop-out
- Sub-sample channels and only connect some to [[MLP|dense]] layers
@@ -28,14 +28,14 @@
# Finetuning
- Observations
- Most CNNs have similar weights in [[Convolutional Layer|conv1]]
- Most useful CNNs have several [[Convolutional Layer|conv layers]]
- Many weights
- Lots of training data
- Training data is hard to get
- Labelling
- Reuse weights from other network
- Freeze weights in first 3-5 [[Convolutional Layer|conv layers]]
- Learning rate = 0
- Randomly initialise remaining layers
- Continue with existing weights
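The freezing recipe above can be sketched framework-free (a hypothetical plain-NumPy SGD step; layer names and shapes are illustrative). Freezing a layer is equivalent to giving it a learning rate of 0:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretrained weights, conv1..conv5 plus a dense head (shapes illustrative).
layers = {n: rng.standard_normal((3, 3)) for n in
          ["conv1", "conv2", "conv3", "conv4", "conv5", "fc"]}

# Freeze the first three conv layers: reuse their weights unchanged.
frozen = {"conv1", "conv2", "conv3"}

def sgd_step(layers, grads, lr=0.1):
    # Frozen layers keep their pretrained weights (learning rate 0);
    # the remaining layers continue training on the new task.
    return {n: w if n in frozen else w - lr * grads[n]
            for n, w in layers.items()}

grads = {n: np.ones_like(w) for n, w in layers.items()}
updated = sgd_step(layers, grads)
```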

STEM/AI/Neural Networks/CNN/FCN/FCN.md

@@ -1,6 +1,6 @@
Fully [[Convolution]]al Network
[[Convolutional Layer|Convolutional]] and [[UpConv|up-convolutional layers]] with [[Activation Functions#ReLu|ReLu]] but no others (pooling)
- All are some form of Encoder-Decoder
Contractive → [[UpConv]]
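The contractive → up-convolutional shape arithmetic can be sketched with the standard output-size formulas (kernel 3, stride 2, padding 1 are assumed choices, not from the note):

```python
def conv_out(size, kernel=3, stride=2, pad=1):
    # strided convolution halves spatial size (contractive path)
    return (size + 2 * pad - kernel) // stride + 1

def upconv_out(size, kernel=3, stride=2, pad=1, out_pad=1):
    # transposed convolution doubles it back (expansive path)
    return (size - 1) * stride - 2 * pad + kernel + out_pad

sizes = [64]
for _ in range(3):                 # encoder: 64 -> 32 -> 16 -> 8
    sizes.append(conv_out(sizes[-1]))
for _ in range(3):                 # decoder: 8 -> 16 -> 32 -> 64
    sizes.append(upconv_out(sizes[-1]))
```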

STEM/AI/Neural Networks/CNN/FCN/FlowNet.md

@@ -7,7 +7,7 @@ Optical Flow
![[flownet.png]]
# [[Skip Connections]]
- Further through the network, information is condensed
- Less high frequency information
- Link encoder layers to [[upconv]] layers

STEM/AI/Neural Networks/CNN/FCN/Highway Networks.md

@@ -0,0 +1,9 @@
- [[Skip connections]] across individual layers
- Conditionally
- Soft gates
- Learn vs carry
- Gradients propagate further
- Inspired by [[LSTM]] [[RNN]]s
![[highway-vs-residual.png]]
![[skip-connections 1.png]]
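The soft gate above can be sketched as $y = T(x)\,H(x) + (1 - T(x))\,x$ (a minimal NumPy layer; tanh for the transform and this gate parameterisation are assumed, not from the note):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def highway(x, W_h, b_h, W_t, b_t):
    h = np.tanh(x @ W_h + b_h)    # learned transform H(x)
    t = sigmoid(x @ W_t + b_t)    # soft gate T(x) in (0, 1)
    return t * h + (1.0 - t) * x  # learn vs carry

# A strongly negative gate bias makes the layer start near the identity
# (carry), which is why gradients propagate further through deep stacks.
```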

STEM/AI/Neural Networks/CNN/FCN/ResNet.md

@@ -12,14 +12,18 @@
# Design
- Skips across pairs of [[Convolutional Layer|conv layers]]
- Elementwise addition
- All layers use 3x3 kernels
- Spatial size halves each layer
- Filter count doubles each layer
- [[FCN|Fully convolutional]]
- No fc layer
- No [[Max Pooling|pooling]]
- Except at end
- No dropout
![[imagenet-error.png]]
![[resnet-arch.png]]
![[resnet-arch2.png]]
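The skip-across-a-pair-of-conv-layers design can be sketched in 1-D (a toy stand-in: single channel, 3-tap 'same' convolutions in place of 3x3 kernels):

```python
import numpy as np

def conv3(x, k):
    # 3-tap 'same' convolution: stand-in for one 3x3 conv layer
    return np.convolve(x, k, mode="same")

def residual_block(x, k1, k2):
    out = np.maximum(0.0, conv3(x, k1))  # first conv + ReLU
    out = conv3(out, k2)                 # second conv
    return np.maximum(0.0, out + x)      # elementwise skip addition

# With zero kernels the block reduces to ReLU(x): the skip path alone
# carries the signal, which is what keeps very deep stacks trainable.
```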

STEM/AI/Neural Networks/CNN/FCN/Skip Connections.md

@@ -1,16 +1,16 @@
- Output of [[Convolutional Layer|conv]], c, layers are added to inputs of [[upconv]], d, layers
- Element-wise, not channel appending
- Propagate high frequency information to later layers
- Two types
- Additive
- [[ResNet]]
- [[Super-resolution]] auto-encoder
- Concatenative
- Densely connected architectures
- DenseNet
- [[FlowNet]]
![[STEM/img/skip-connections.png]]
[AI Summer - Skip Connections](https://theaisummer.com/skip-connections/)
[Arxiv - Visualising the Loss Landscape](https://arxiv.org/abs/1712.09913)
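The two types differ only in how encoder features meet decoder features (a minimal shape sketch; the 8x16 feature map is illustrative):

```python
import numpy as np

enc = np.ones((8, 16))    # encoder features: (spatial, channels)
dec = np.zeros((8, 16))   # decoder features at the matching resolution

additive = enc + dec                          # ResNet-style: element-wise, shapes must match
concat = np.concatenate([enc, dec], axis=-1)  # DenseNet/FlowNet-style: channels appended

additive.shape  # (8, 16) - channel count unchanged
concat.shape    # (8, 32) - channels doubled
```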

STEM/AI/Neural Networks/CNN/GAN/DC-GAN.md

@@ -1,4 +1,4 @@
Deep [[Convolution]]al [[GAN]]
![[dc-gan.png]]
- Generator
@@ -13,7 +13,7 @@ Deep Convolutional [[GAN]]
- Discriminator
- Contractive
- Cross-entropy [[Deep Learning#Loss Function|loss]]
- [[Convolutional Layer|Conv]] and leaky [[Activation Functions#ReLu|ReLu]] layers only
- Normalised output via [[Activation Functions#Sigmoid|sigmoid]]
## [[Deep Learning#Loss Function|Loss]]
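A minimal sketch of that cross-entropy loss on the discriminator's sigmoid output (NumPy; labels 1 = real, 0 = fake are the usual convention, assumed here):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(p, y):
    # binary cross-entropy on the normalised output p = sigmoid(logit)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

p = sigmoid(0.0)       # undecided discriminator: p = 0.5
cross_entropy(p, 1.0)  # -> ln 2, about 0.693
```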

STEM/AI/Neural Networks/CNN/GAN/GAN.md

@@ -1,4 +1,4 @@
# Fully [[Convolution]]al
- Remove [[Max Pooling]]
- Use strided [[upconv]]
- Remove [[MLP|FC]] layers

STEM/AI/Neural Networks/CNN/UpConv.md

@@ -0,0 +1,35 @@
- Fractionally strided convolution
- Transposed [[convolution]]
- Like a deep interpolation
- Convolution with a fractional input stride
- Up-sampling is convolution 'in reverse'
- Not an actual inverse convolution
- For scaling up by a factor of $f$
- Consider as a [[convolution]] of stride $1/f$
- Could specify kernel
- Or learn
- Can have multiple upconv layers
- Separated by [[Activation Functions#ReLu|ReLu]]
- For non-linear up-sampling conv
- Interpolation is linear
![[upconv.png]]
# Convolution Matrix
## Normal
![[upconv-matrix.png]]
- Equivalent operation with a flattened input
- Row per kernel location
- Many-to-one operation
![[upconv-matrix-result.png]]
[Understanding transposed convolutions](https://www.machinecurve.com/index.php/2019/09/29/understanding-transposed-convolutions/)
## Transposed
![[upconv-transposed-matrix.png]]
- One-to-many
![[upconv-matrix-transposed-result.png]]
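The normal/transposed pair above can be sketched in 1-D (an assumed length-3 kernel [1, 2, 3], valid convolution, stride 1, with the flattened-input matrix written out by hand):

```python
import numpy as np

# Convolution matrix: one row per kernel location of a
# length-3 kernel sliding over a flattened length-4 input.
C = np.array([[1., 2., 3., 0.],
              [0., 1., 2., 3.]])

x = np.array([1., 0., 0., 0.])
y = C @ x     # many-to-one: 4 inputs -> 2 outputs
up = C.T @ y  # one-to-many: each value spread over kernel-width positions

y.tolist()   # [1.0, 0.0]
up.tolist()  # [1.0, 2.0, 3.0, 0.0]
```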

BIN img/highway-vs-residual.png (new file, not shown, 88 KiB)

BIN img/imagenet-error.png (new file, not shown, 56 KiB)

BIN img/resnet-arch.png (new file, not shown, 1.0 MiB)

BIN img/resnet-arch2.png (new file, not shown, 1.1 MiB)

BIN img/skip-connections 1.png (new file, not shown, 123 KiB)

Binary file not shown. (new file, 42 KiB)

Binary file not shown. (new file, 51 KiB)

BIN img/upconv-matrix.png (new file, not shown, 733 KiB)

Binary file not shown. (new file, 130 KiB)

BIN img/upconv.png (new file, not shown, 29 KiB)