Recurrent Neural Network

- Hard to train on long sequences
    - Weights hold the memory (see the sketch after this list)
        - Implicit
    - Lots to remember
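
A minimal sketch of the recurrence in NumPy (sizes, initialisation, and the `step` helper are illustrative assumptions, not from the source). Everything the network "remembers" must fit in the fixed-size state vector `h` and the learned weights, which is why long sequences are hard:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 8-dim state, 4-dim input
W_h = rng.normal(scale=0.3, size=(8, 8))  # recurrent weights: the "implicit" memory
W_x = rng.normal(scale=0.3, size=(8, 4))  # input weights
b = np.zeros(8)

def step(h, x):
    """Combine the previous state with the current token to form the new state."""
    return np.tanh(W_h @ h + W_x @ x + b)

h = np.zeros(8)                     # initial state
for x in rng.normal(size=(20, 4)):  # dummy 20-step sequence
    h = step(h, x)                  # the same weights are reused at every step
```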

Text Analysis

- Trained on sequences of text, character by character
    - Maintains a state vector representing the data seen up to the current token
    - Combines the state vector with the next token to create a new state vector
    - In theory, information from one token can propagate arbitrarily far down the sequence
        - In practice, suffers from the vanishing gradient problem (illustrated in the sketch after this list)
            - Can't extract precise information about previous tokens
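
A rough numerical illustration of why the gradient vanishes (toy sizes and random weights, all hypothetical). The gradient reaching the first token is a product of per-step Jacobians $\text{diag}(1 - h_t^2)\, W_h$; when their norms sit below 1, the product decays geometrically, so precise information about early tokens is lost:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
W_h = rng.normal(scale=0.3, size=(n, n))
W_x = rng.normal(scale=0.3, size=(n, n))

h = np.zeros(n)
J = np.eye(n)  # accumulated Jacobian dh_t / dh_0
for t in range(1, 51):
    x = rng.normal(size=n)
    h = np.tanh(W_h @ h + W_x @ x)
    # Jacobian of this step: tanh derivative times the recurrent weights
    J = (np.diag(1 - h**2) @ W_h) @ J
    if t % 10 == 0:
        print(t, np.linalg.norm(J))  # norm shrinks roughly geometrically
```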

*Figures: rnn-input, rnn-recurrence*