stem/AI/Neural Networks/Deep Learning.md
andy 5a592c8c7c vault backup: 2023-05-26 06:37:13

![[deep-digit-classification.png]]
# Loss Function
Also known as the objective function
- [[Back-Propagation]]
- Difference between predicted and target outputs
![[deep-loss-function.png]]
- Test accuracy worse than train accuracy = overfitting
- Dense = fully connected
- Automates feature engineering
![[ml-dl.png]]
These are the two essential characteristics of how deep learning learns from data: the incremental, layer-by-layer way in which increasingly complex representations are developed, and the fact that these intermediate incremental representations are learned jointly, each layer being updated to follow both the representational needs of the layer above and the needs of the layer below. Together, these two properties have made deep learning vastly more successful than previous approaches to machine learning.
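The loss-function idea above (a measure of the difference between predicted and target outputs) can be sketched in NumPy. This is a minimal illustration, not any particular framework's implementation; categorical cross-entropy is assumed as the example loss:

```python
import numpy as np

# Categorical cross-entropy: penalises the gap between predicted
# probabilities and one-hot target outputs.
def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    y_pred = np.clip(y_pred, eps, 1.0)  # avoid log(0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

y_true = np.array([[0, 1, 0], [1, 0, 0]])              # one-hot targets
y_good = np.array([[0.1, 0.8, 0.1], [0.9, 0.05, 0.05]])  # confident, correct
y_bad  = np.array([[0.4, 0.3, 0.3], [0.2, 0.5, 0.3]])    # diffuse, wrong

print(categorical_cross_entropy(y_true, y_good))  # small loss
print(categorical_cross_entropy(y_true, y_bad))   # larger loss
```

The optimiser's job during training is to adjust the weights so this number decreases.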
# Steps
1. Define the network structure
2. Compile
    - Loss function
        - Metric of the difference between output and target
    - Optimiser
        - How the network will update
    - Metrics to monitor
        - During testing and training
3. Preprocess data
    - Reshape each input frame into a linear array
    - Categorically encode labels
4. Fit
5. Predict
6. Evaluate
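The preprocessing steps above can be sketched in NumPy, assuming MNIST-like data (28x28 images, 10 classes — illustrative values, not from the note):

```python
import numpy as np

rng = np.random.default_rng(0)
images = rng.random((10, 28, 28))       # (samples, height, width)
labels = rng.integers(0, 10, size=10)   # integer class labels

# Reshape each input frame into a linear array of pixels
x = images.reshape((10, 28 * 28)).astype("float32")

# Categorically (one-hot) encode the labels
y = np.eye(10)[labels]

print(x.shape)  # (10, 784)
print(y.shape)  # (10, 10)
```

In a Keras-style workflow, `x` and `y` would then be passed to fit, and held-out data to evaluate.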
# Data Structure
- TensorFlow = channels last
    - (samples, height, width, channels)
- Vector data
    - 2D tensors of shape (samples, features)
- Time series or sequence data
    - 3D tensors of shape (samples, timesteps, features)
- Images
    - 4D tensors of shape (samples, height, width, channels) or (samples, channels, height, width)
- Video
    - 5D tensors of shape (samples, frames, height, width, channels) or (samples, frames, channels, height, width)
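The tensor ranks above can be checked with dummy NumPy arrays (channels-last convention; the sizes are illustrative):

```python
import numpy as np

vector_data = np.zeros((100, 20))             # (samples, features)
time_series = np.zeros((100, 50, 20))         # (samples, timesteps, features)
images      = np.zeros((100, 64, 64, 3))      # (samples, height, width, channels)
video       = np.zeros((100, 30, 64, 64, 3))  # (samples, frames, height, width, channels)

print(vector_data.ndim)  # 2
print(images.ndim)       # 4
print(video.ndim)        # 5
```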
![[photo-tensor.png]]
![[matrix-dot-product.png]]
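The matrix dot product pictured above is what a dense (fully connected) layer computes: a dot product of the input with a weight matrix, plus a bias, followed by an activation. A minimal NumPy sketch (all names and values are illustrative):

```python
import numpy as np

# Dense layer: ReLU(x . W + b)
def dense(x, w, b):
    return np.maximum(0.0, x @ w + b)

x = np.ones((2, 4))       # (samples, features)
w = np.full((4, 3), 0.5)  # (features, units)
b = np.zeros(3)           # (units,)

out = dense(x, w, b)
print(out.shape)  # (2, 3): one 3-unit output per sample
```

Each output unit is a weighted sum over all input features, which is why the layer is called fully connected.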