MLP

  • One of the basic feed-forward Architectures
  • A single hidden layer can approximate any function
    • Universal approximation theorem
  • Each hidden layer can act as a different feature-extraction layer (see the forward-pass sketch below)
  • Many weights (Weight Init) to learn
  • Back-Propagation is supervised

!mlp-arch.png
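
A minimal NumPy sketch of this forward pass (layer sizes, random weights, and the sigmoid activation are illustrative assumptions, not values from the note):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

x  = rng.normal(size=3)        # 3 input features (assumed)
W1 = rng.normal(size=(4, 3))   # hidden layer: 4 units, one weight row per unit
b1 = np.zeros(4)
W2 = rng.normal(size=(1, 4))   # output layer: 1 unit
b2 = np.zeros(1)

h = sigmoid(W1 @ x + b1)       # hidden activations act as extracted features
y = sigmoid(W2 @ h + b2)       # network output
print(h, y)
```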

Universal Approximation Theorem

An MLP of finite width with a single hidden layer can, in theory, approximate any continuous function to arbitrary accuracy
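
In symbols (Cybenko/Hornik form of the theorem, stated here for reference, with $\sigma$ a non-constant bounded activation): for any continuous $f$ on a compact set $K$ and any $\varepsilon > 0$ there exist a finite $N$ and parameters $v_i, w_i, b_i$ such that

$$
F(x) = \sum_{i=1}^{N} v_i \, \sigma\!\left(w_i^\top x + b_i\right), \qquad |F(x) - f(x)| < \varepsilon \quad \forall x \in K
$$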

!activation-function.png !mlp-arch-diagram.png

Weight Matrix

  • Layer outputs are computed with a matrix multiplication of weights and inputs
  • The TLU (threshold logic unit) is a hard limiter !tlu.png
  • o_1 to o_4 must all be 1 to overcome the -3.5 bias and force the output to 1 (see the sketch after this list) !mlp-non-linear-decision.png
  • Can generate a non-linear Decision Boundary
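
A small sketch of that hard-limiter unit, assuming unit weights on o_1 to o_4 (only the -3.5 bias comes from the note):

```python
import numpy as np

def tlu(activation):
    # Hard limiter: step function at zero
    return int(activation >= 0)

w = np.ones(4)      # weights on o_1..o_4 (assumed to be 1)
bias = -3.5

for o in ([1, 1, 1, 1], [1, 1, 1, 0], [0, 1, 1, 1]):
    net = w @ np.array(o) + bias   # weighted sum (dot product) plus bias
    print(o, "->", tlu(net))
# Only [1, 1, 1, 1] sums to 4 > 3.5, so only it produces output 1
```

Stacking such units across layers is what lets the network combine several linear boundaries into a non-linear Decision Boundary.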