stem/AI/Neural Networks/CNN/UpConv.md
andy 7bc4dffd8b vault backup: 2023-06-06 11:48:49

- Fractionally strided convolution
- Transposed [Convolution](../../../Signal%20Proc/Convolution.md)
- Like a deep, learned interpolation
- Convolution with a fractional input stride
- Up-sampling is convolution 'in reverse'
    - Not an actual inverse convolution
- For scaling up by a factor of $f$
    - Consider as a [Convolution](../../../Signal%20Proc/Convolution.md) of stride $1/f$
- Kernel can be specified by hand
    - Or learnt during training
- Can stack multiple up-conv layers
    - Separated by [ReLU](../Activation%20Functions.md#ReLu)
    - Gives non-linear up-sampling
        - Plain interpolation is only linear
![upconv](../../../img/upconv.png)
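A stride-$1/f$ convolution can be sketched in 1-D: insert $f-1$ zeros between input samples, then run a normal stride-1 convolution (a minimal NumPy sketch; the kernel and sizes here are illustrative, not from the note):

```python
import numpy as np

def fractionally_strided_conv1d(x, kernel, f):
    """Up-sample x by factor f: insert f-1 zeros between samples
    (stride 1/f), then apply a normal stride-1 convolution."""
    up = np.zeros(len(x) * f)
    up[::f] = x                       # zero-insertion
    return np.convolve(up, kernel, mode="same")

x = np.array([1.0, 2.0, 3.0])
out = fractionally_strided_conv1d(x, np.array([0.5, 1.0, 0.5]), 2)
# out has f * len(x) = 6 samples: [1, 1.5, 2, 2.5, 3, 1.5]
```

With this particular triangular kernel the result is just linear interpolation of `x` (up to boundary effects); learning the kernel instead is what makes up-conv more than interpolation.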
# Convolution Matrix
## Normal
![upconv-matrix](../../../img/upconv-matrix.png)
- Equivalent operation with a flattened input
- Row per kernel location
- Many-to-one operation
![upconv-matrix-result](../../../img/upconv-matrix-result.png)
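The 'row per kernel location' construction can be sketched in 1-D with NumPy (kernel and sizes are illustrative):

```python
import numpy as np

kernel = np.array([1.0, 2.0, 1.0])
n = 6                                 # flattened input length
rows = n - len(kernel) + 1            # one row per kernel location
C = np.zeros((rows, n))
for i in range(rows):
    C[i, i:i + len(kernel)] = kernel  # kernel shifted along each row

x = np.arange(1.0, 7.0)               # flattened input [1..6]
y = C @ x                             # many-to-one: 6 values -> 4
```

Each output value gathers several input values through one row of `C` (shape `(4, 6)`), hence many-to-one.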
[Understanding transposed convolutions](https://www.machinecurve.com/index.php/2019/09/29/understanding-transposed-convolutions/)
## Transposed
![upconv-transposed-matrix](../../../img/upconv-transposed-matrix.png)
- One-to-many
![upconv-matrix-transposed-result](../../../img/upconv-matrix-transposed-result.png)
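Transposing that kind of 1-D convolution matrix reverses the mapping: each input value scatters a copy of the kernel across the output (a NumPy sketch with illustrative sizes):

```python
import numpy as np

kernel = np.array([1.0, 2.0, 1.0])
C = np.zeros((4, 6))                  # convolution matrix: 6 -> 4
for i in range(4):
    C[i, i:i + 3] = kernel

y = np.array([1.0, 2.0, 3.0, 4.0])    # small input to be up-sampled
x_up = C.T @ y                        # one-to-many: 4 values -> 6
```

A single element of `y` now influences several outputs (the kernel footprint), which is exactly the one-to-many behaviour of the transposed convolution.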