- Fractionally strided convolution
- Transposed [[convolution]]
- Like a deep interpolation
- Convolution with a fractional input stride
- Up-sampling is convolution 'in reverse'
    - Not an actual inverse convolution
- For scaling up by a factor of $f$
    - Consider as a [[convolution]] of stride $1/f$
- Could specify kernel
    - Or learn
- Can have multiple upconv layers (see the sketch below)
    - Separated by [[Activation Functions#ReLu|ReLU]]
    - Gives a non-linear up-sampling
        - Interpolation is linear

![[upconv.png]]
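
A minimal sketch of the stacked, learned up-convolution idea, assuming PyTorch; the channel counts and kernel parameters here are illustrative only:

```python
import torch
import torch.nn as nn

# Two learned upconv layers separated by a ReLU (non-linear up-sampling).
# kernel_size=4, stride=2, padding=1 doubles H and W: out = (in - 1)*2 - 2 + 4
upsample = nn.Sequential(
    nn.ConvTranspose2d(in_channels=64, out_channels=32, kernel_size=4, stride=2, padding=1),
    nn.ReLU(),
    nn.ConvTranspose2d(in_channels=32, out_channels=16, kernel_size=4, stride=2, padding=1),
)

x = torch.randn(1, 64, 8, 8)   # low-resolution feature map
y = upsample(x)
print(y.shape)                 # torch.Size([1, 16, 32, 32]) -- 4x spatial up-scaling overall
```

Here the kernels are learned during training; alternatively a fixed kernel (e.g. bilinear) could be written into the weights to specify the up-sampling explicitly.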
# Convolution Matrix

## Normal

![[upconv-matrix.png]]
- Equivalent operation with a flattened input (see the matrix sketch below)
    - Row per kernel location
- Many-to-one operation

![[upconv-matrix-result.png]]
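
A small NumPy sketch of this view, with hypothetical sizes: a 3x3 kernel over a 4x4 input becomes a 4x16 convolution matrix acting on the flattened input, one row per kernel location, many input values feeding each output value:

```python
import numpy as np

# 3x3 kernel sliding over a 4x4 input (stride 1, no padding) -> 2x2 output.
kernel = np.arange(1, 10, dtype=float).reshape(3, 3)
x = np.arange(16, dtype=float).reshape(4, 4)

# Build the 4x16 convolution matrix: one row per kernel location,
# each row is the kernel written into a zeroed 4x4 grid, then flattened.
C = np.zeros((4, 16))
for row, (i, j) in enumerate([(0, 0), (0, 1), (1, 0), (1, 1)]):
    placed = np.zeros((4, 4))
    placed[i:i + 3, j:j + 3] = kernel
    C[row] = placed.ravel()

# Many-to-one: 16 input values are reduced to 4 output values.
y = C @ x.ravel()
print(y.reshape(2, 2))
```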
[Understanding transposed convolutions](https://www.machinecurve.com/index.php/2019/09/29/understanding-transposed-convolutions/)
## Transposed

![[upconv-transposed-matrix.png]]

- One-to-many (see the sketch below)

![[upconv-matrix-transposed-result.png]]
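
Continuing the NumPy sketch above: transposing the same convolution matrix gives a 16x4 operator, so a flattened 2x2 input fans out to a 4x4 output, with each input value contributing to many output positions:

```python
import numpy as np

# Rebuild the 4x16 convolution matrix from the previous sketch, then transpose it.
kernel = np.arange(1, 10, dtype=float).reshape(3, 3)
C = np.zeros((4, 16))
for row, (i, j) in enumerate([(0, 0), (0, 1), (1, 0), (1, 1)]):
    placed = np.zeros((4, 4))
    placed[i:i + 3, j:j + 3] = kernel
    C[row] = placed.ravel()

# One-to-many: C.T is 16x4, mapping a flattened 2x2 input up to a 4x4 output.
small = np.arange(4, dtype=float).reshape(2, 2)
up = C.T @ small.ravel()
print(up.reshape(4, 4))
```

Note that this uses the same connectivity pattern in reverse but is not an inverse: applying `C` and then `C.T` does not recover the original input.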