Fully Convolutional Network

Convolutional and up-convolutional layers with ReLU activation functions, but no other layer types (no pooling)

  • All are some sort of Encoder-Decoder

Contractive → UpConv
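
A minimal sketch of the idea (assuming PyTorch; `TinyFCN` and its layer sizes are illustrative, not a published architecture): strided convolutions contract the feature map, transposed convolutions expand it back to input resolution, ReLU is the only non-linearity, and there is no pooling or FC layer.

```python
import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    def __init__(self, num_classes=21):
        super().__init__()
        # Contractive (encoder) path: strided convs + ReLU, no pooling
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Up-convolutional (decoder) path: transposed convs back to input size
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, num_classes, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))  # per-pixel class scores

x = torch.randn(1, 3, 128, 128)
print(TinyFCN()(x).shape)  # torch.Size([1, 21, 128, 128])
```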

Image Segmentation

  • For visual output
    • Previously image → vector
  • Additional layers to up-sample the representation back to an image (see the sketch after the figures below)

!fcn-uses.png

!fcn-arch.png
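
How the up-sampled representation becomes a visual output, sketched with a hypothetical `logits` tensor (shapes follow the sketch above): arg-max over the class dimension gives one label per pixel.

```python
import torch

logits = torch.randn(1, 21, 128, 128)  # [batch, classes, H, W], e.g. from TinyFCN
seg_map = logits.argmax(dim=1)          # [batch, H, W]: one class label per pixel
print(seg_map.shape)                    # torch.Size([1, 128, 128])
```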

Training

  • Rarely from scratch
  • Pre-trained weights
  • Replace final layers
    • FC layers
    • White-noise initialised
  • Add upconv layer(s)
    • Fine-tune train the new layer(s)
    • Freeze the other layers
    • Annotated GT images
  • Can use summed per-pixel log loss
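
A sketch of this recipe (assuming PyTorch/torchvision; the VGG-16 backbone, layer sizes, and variable names are illustrative): freeze the pre-trained weights, attach a white-noise-initialised upconv head in place of the FC classifier, and fine-tune it with a summed per-pixel log loss (cross-entropy) against annotated GT masks.

```python
import torch
import torch.nn as nn
from torchvision import models

num_classes = 21

# Pre-trained convolutional backbone, frozen
backbone = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).features
for p in backbone.parameters():
    p.requires_grad = False

# New up-convolutional head, white-noise initialised (32x up-sampling)
head = nn.ConvTranspose2d(512, num_classes, kernel_size=64, stride=32, padding=16)
nn.init.normal_(head.weight, std=0.01)

optimiser = torch.optim.SGD(head.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss(reduction='sum')  # summed per-pixel log loss

# Stand-in batch of annotated GT data
images = torch.randn(2, 3, 224, 224)
gt_masks = torch.randint(0, num_classes, (2, 224, 224))

logits = head(backbone(images))         # [2, num_classes, 224, 224]
loss = criterion(logits, gt_masks)
optimiser.zero_grad()
loss.backward()                         # gradients only reach the new head
optimiser.step()
```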

Evaluation

!fcn-eval.png

  • SDS (Simultaneous Detection and Segmentation)
    • Classical method
    • 52% mAP
  • FCN
    • 62% mAP
  • Intersection over Union
    • IOU
    • Jaccard
    • Averaged over all images
    • $J(A,B)=\frac{|A\cap B|}{|A\cup B|}$
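
A sketch of the Jaccard/IoU computation (assuming NumPy; binary foreground masks, names illustrative), averaged over images as above.

```python
import numpy as np

def iou(pred: np.ndarray, gt: np.ndarray) -> float:
    """J(A,B) = |A ∩ B| / |A ∪ B| for boolean masks A (pred) and B (gt)."""
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return intersection / union if union > 0 else 1.0  # both masks empty: define as 1

# Mean IoU over a toy set of predicted / ground-truth mask pairs
preds = [np.random.rand(64, 64) > 0.5 for _ in range(4)]
gts   = [np.random.rand(64, 64) > 0.5 for _ in range(4)]
mean_iou = np.mean([iou(p, g) for p, g in zip(preds, gts)])
print(f"mean IoU: {mean_iou:.3f}")
```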