From e353f5fccb6aa71a2e2ad4327f976e30622b5e18 Mon Sep 17 00:00:00 2001
From: andy
Date: Sun, 4 Jun 2023 22:31:53 +0100
Subject: [PATCH] vault backup: 2023-06-04 22:31:53

Affected files:
STEM/AI/Neural Networks/MLP/Back-Propagation.md
---
 AI/Neural Networks/MLP/Back-Propagation.md | 24 +++++-------------------
 1 file changed, 5 insertions(+), 19 deletions(-)

diff --git a/AI/Neural Networks/MLP/Back-Propagation.md b/AI/Neural Networks/MLP/Back-Propagation.md
index 442a775..97f2bac 100644
--- a/AI/Neural Networks/MLP/Back-Propagation.md
+++ b/AI/Neural Networks/MLP/Back-Propagation.md
@@ -44,27 +44,13 @@ $$\Delta w_{ji}(n)=\eta\delta_j(n)y_i(n)$$
 ## Gradients
 #### Output Local
 $$\delta_j(n)=-\frac{\partial\mathfrak E (n)}{\partial v_j(n)}$$
-$$=-
-\frac{\partial\mathfrak E(n)}{\partial e_j(n)}
-\frac{\partial e_j(n)}{\partial y_j(n)}
-\frac{\partial y_j(n)}{\partial v_j(n)}$$
-$$=
-e_j(n)\cdot
-\varphi_j'(v_j(n))
-$$
+$$=-\frac{\partial\mathfrak E(n)}{\partial e_j(n)}\frac{\partial e_j(n)}{\partial y_j(n)}\frac{\partial y_j(n)}{\partial v_j(n)}$$
+$$=e_j(n)\cdot\varphi_j'(v_j(n))$$
 #### Hidden Local
-$$\delta_j(n)=-
-\frac{\partial\mathfrak E (n)}{\partial y_j(n)}
-\frac{\partial y_j(n)}{\partial v_j(n)}$$
-$$=-
-\frac{\partial\mathfrak E (n)}{\partial y_j(n)}
-\cdot
-\varphi_j'(v_j(n))$$
-$$\delta_j(n)=
-\varphi_j'(v_j(n))
-\cdot
-\sum_k \delta_k(n)\cdot w_{kj}(n)$$
+$$\delta_j(n)=-\frac{\partial\mathfrak E (n)}{\partial y_j(n)}\frac{\partial y_j(n)}{\partial v_j(n)}$$
+$$=-\frac{\partial\mathfrak E (n)}{\partial y_j(n)}\cdot\varphi_j'(v_j(n))$$
+$$\delta_j(n)=\varphi_j'(v_j(n))\cdot\sum_k \delta_k(n)\cdot w_{kj}(n)$$
 ## Weight Correction
 $$\text{weight correction = learning rate $\cdot$ local gradient $\cdot$ input signal of neuron $j$}$$
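The gradient formulas in the patched note can be sketched numerically. The following is a minimal NumPy sketch, not part of the patch itself: it assumes a sigmoid activation for $\varphi$, and the function names, shapes, and helper structure are illustrative choices, not anything defined in the note.

```python
import numpy as np

def sigmoid(v):
    """Activation phi(v) = 1 / (1 + exp(-v))."""
    return 1.0 / (1.0 + np.exp(-v))

def sigmoid_prime(v):
    """Derivative phi'(v) = phi(v) * (1 - phi(v)) for the sigmoid."""
    s = sigmoid(v)
    return s * (1.0 - s)

def output_delta(e, v):
    """Output local gradient: delta_j = e_j * phi'(v_j)."""
    return e * sigmoid_prime(v)

def hidden_delta(v_hidden, delta_next, W_next):
    """Hidden local gradient: delta_j = phi'(v_j) * sum_k delta_k * w_kj.

    W_next has shape (next_layer_size, this_layer_size), so the
    back-propagated sum over k is W_next.T @ delta_next.
    """
    return sigmoid_prime(v_hidden) * (W_next.T @ delta_next)

def weight_correction(eta, delta, y_prev):
    """Delta w_ji = eta * delta_j * y_i, as an outer product over the layer."""
    return eta * np.outer(delta, y_prev)
```

Note that `hidden_delta` mirrors the final equation above: the error signal a hidden neuron receives is the sum of the next layer's local gradients weighted by the connecting weights, scaled by the neuron's own activation derivative.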