vault backup: 2023-06-04 22:31:53

Affected files:
STEM/AI/Neural Networks/MLP/Back-Propagation.md
This commit is contained in:
andy 2023-06-04 22:31:53 +01:00
parent 5f167f25a4
commit e353f5fccb

@@ -44,27 +44,13 @@ $$\Delta w_{ji}(n)=\eta\delta_j(n)y_i(n)$$
## Gradients
#### Output Local
$$\delta_j(n)=-\frac{\partial\mathfrak E (n)}{\partial v_j(n)}$$
$$=-\frac{\partial\mathfrak E(n)}{\partial e_j(n)}\frac{\partial e_j(n)}{\partial y_j(n)}\frac{\partial y_j(n)}{\partial v_j(n)}$$
$$=e_j(n)\cdot\varphi_j'(v_j(n))$$
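For example, with a logistic sigmoid activation $\varphi_j(v)=1/(1+e^{-v})$ (an assumption, not fixed above), $\varphi_j'(v_j(n))=y_j(n)(1-y_j(n))$, so the output local gradient reduces to:
$$\delta_j(n)=e_j(n)\cdot y_j(n)\bigl(1-y_j(n)\bigr)$$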
#### Hidden Local
$$\delta_j(n)=-\frac{\partial\mathfrak E (n)}{\partial y_j(n)}\frac{\partial y_j(n)}{\partial v_j(n)}$$
$$=-\frac{\partial\mathfrak E (n)}{\partial y_j(n)}\cdot\varphi_j'(v_j(n))$$
$$\delta_j(n)=\varphi_j'(v_j(n))\cdot\sum_k \delta_k(n)\cdot w_{kj}(n)$$
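Under the same sigmoid assumption, the hidden local gradient becomes:
$$\delta_j(n)=y_j(n)\bigl(1-y_j(n)\bigr)\sum_k \delta_k(n)\cdot w_{kj}(n)$$
where $k$ ranges over the neurons in the next layer that neuron $j$ feeds.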
## Weight Correction
$$\text{weight correction = learning rate $\cdot$ local gradient $\cdot$ input signal of neuron $j$}$$
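A minimal NumPy sketch of how the quantities above (output local gradients, hidden local gradients, weight corrections) combine into one pattern-by-pattern update, assuming a single hidden layer, logistic activations, no bias terms, and $e_j(n)=d_j(n)-y_j(n)$; the names (`backprop_step`, `W1`, `W2`, `eta`) are illustrative, not from the note.

```python
import numpy as np

def sigmoid(v):
    # Assumed activation: logistic sigmoid, so phi'(v) = y * (1 - y).
    return 1.0 / (1.0 + np.exp(-v))

def backprop_step(x, d, W1, W2, eta=0.1):
    """One pattern-by-pattern update using the local-gradient formulas above."""
    # Forward pass: induced local fields v and activations y for each layer.
    v1 = W1 @ x          # hidden induced local fields
    y1 = sigmoid(v1)     # hidden outputs y_i(n)
    v2 = W2 @ y1         # output induced local fields
    y2 = sigmoid(v2)     # network outputs y_j(n)

    # Output local gradients: delta_j = e_j * phi'(v_j).
    e = d - y2
    delta2 = e * y2 * (1.0 - y2)

    # Hidden local gradients: delta_j = phi'(v_j) * sum_k delta_k * w_kj.
    delta1 = (y1 * (1.0 - y1)) * (W2.T @ delta2)

    # Weight correction: delta_w_ji = eta * delta_j * y_i, per layer.
    W2 += eta * np.outer(delta2, y1)
    W1 += eta * np.outer(delta1, x)
    return W1, W2

# Example usage with small random weights.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(3, 2))   # hidden layer: 3 units, 2 inputs
W2 = rng.normal(scale=0.5, size=(1, 3))   # output layer: 1 unit, 3 hidden inputs
x = np.array([0.5, -0.2])                 # input pattern
d = np.array([1.0])                       # desired response
W1, W2 = backprop_step(x, d, W1, W2)
```

The outer products apply $\Delta w_{ji}(n)=\eta\,\delta_j(n)\,y_i(n)$ to every $(j,i)$ pair in a layer at once.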