- Limits output values
- Squashing function
# Threshold
- For binary functions
- Not differentiable
- Sharp rise
- *Heaviside function*
- Unipolar
    - 0 <-> +1
- Bipolar
    - -1 <-> +1

![threshold-activation](../../img/threshold-activation.png)
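A minimal sketch of the unipolar and bipolar threshold (Heaviside) activations, assuming NumPy; the function names are illustrative:

```python
import numpy as np

def threshold_unipolar(v):
    """Heaviside step: 0 for v < 0, +1 for v >= 0."""
    return np.where(v >= 0, 1.0, 0.0)

def threshold_bipolar(v):
    """Bipolar step: -1 for v < 0, +1 for v >= 0."""
    return np.where(v >= 0, 1.0, -1.0)

v = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(threshold_unipolar(v))  # [0. 0. 1. 1. 1.]
print(threshold_bipolar(v))   # [-1. -1.  1.  1.  1.]
```

The gradient of either step is zero away from the jump and undefined at it, which is why gradient-based training prefers the smooth functions below.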
# Sigmoid
- Logistic function
- Normalises output into $(0, 1)$
- Introduces non-linearity
- Alternative is $\tanh$
    - -1 <-> +1
- Easy to take derivative

$$\frac d {dx} \sigma(x)=\frac d {dx} \left[\frac 1 {1+e^{-x}}\right]=\sigma(x)\cdot(1-\sigma(x))$$

![sigmoid](../../img/sigmoid.png)
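A minimal sketch of the logistic sigmoid and its $\tanh$ alternative, assuming NumPy:

```python
import numpy as np

def sigmoid(x):
    """Logistic function: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-5.0, 0.0, 5.0])
print(sigmoid(x))  # ~[0.0067 0.5    0.9933]  -> unipolar range (0, +1)
print(np.tanh(x))  # ~[-0.9999 0.    0.9999]  -> bipolar range (-1, +1)
```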
### Derivative
$$y_j(n)=\varphi_j(v_j(n))=\frac 1 {1+e^{-v_j(n)}}$$

$$\frac{\partial y_j(n)}{\partial v_j(n)}=\varphi_j'(v_j(n))=\frac{e^{-v_j(n)}}{(1+e^{-v_j(n)})^2}=y_j(n)(1-y_j(n))$$
- Nice derivative
- Max value of $\varphi_j'(v_j(n))$ is $0.25$, occurring when $y_j(n)=0.5$ (checked in the sketch below)
- Min value of 0 when $y_j=0$ or $1$
- Initial [weights](Weight%20Init.md) chosen so not saturated at 0 or 1
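A quick numeric check of these properties, assuming NumPy (the sigmoid is redefined here so the snippet stands alone):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    """phi'(v) = y(1 - y), with y = sigmoid(v)."""
    y = sigmoid(x)
    return y * (1.0 - y)

print(sigmoid_prime(0.0))    # 0.25     -> maximum, where y = 0.5
print(sigmoid_prime(10.0))   # ~4.5e-05 -> saturated near y = 1
print(sigmoid_prime(-10.0))  # ~4.5e-05 -> saturated near y = 0
```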
If $y=\frac u v$, where $u$ and $v$ are differentiable functions, then by the quotient rule:

$$\frac{dy}{dx}=\frac d {dx}\left(\frac u v\right)=\frac {v \frac d {dx}(u) - u\frac d {dx}(v)} {v^2}$$
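Applying this with $u=1$ and $v=1+e^{-v_j(n)}$ recovers the sigmoid derivative quoted above:

$$\varphi_j'(v_j(n))=\frac{(1+e^{-v_j(n)})\cdot 0-1\cdot(-e^{-v_j(n)})}{(1+e^{-v_j(n)})^2}=\frac{e^{-v_j(n)}}{(1+e^{-v_j(n)})^2}$$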
# ReLU
Rectified Linear Unit
- For deep networks
- $y=\max(0,x)$
- CNNs
    - Breaks the associativity of successive [convolutions](../../Signal%20Proc/Convolution.md), so stacked layers don't collapse into a single linear operation
    - Critical for learning complex functions
- Sometimes negative inputs are scaled by a small factor instead of clipped to zero
    - Leaky ReLU

![relu](../../img/relu.png)
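A minimal sketch of ReLU and Leaky ReLU, assuming NumPy; the 0.01 negative slope is a common but illustrative choice:

```python
import numpy as np

def relu(x):
    """y = max(0, x): passes positives through, zeroes out negatives."""
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    """Negatives are scaled by a small factor rather than clipped to zero."""
    return np.where(x >= 0, x, alpha * x)

x = np.array([-3.0, -0.5, 0.0, 2.0])
print(relu(x))        # [0. 0. 0. 2.]
print(leaky_relu(x))  # [-0.03  -0.005  0.     2.   ]
```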
# SoftMax
- Output is per-class vector of likelihoods
- Should be normalised into a probability vector
## AlexNet
$$f(x_i)=\frac{\text{exp}(x_i)}{\sum_{j=1}^{1000}\text{exp}(x_j)}$$
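A minimal sketch of this normalisation (the 1000 in the sum is AlexNet's ImageNet class count), assuming NumPy; subtracting the max before exponentiating is a standard numerical-stability trick that leaves the result unchanged:

```python
import numpy as np

def softmax(x):
    """Normalise per-class scores into a probability vector (non-negative, sums to 1)."""
    e = np.exp(x - np.max(x))  # shift by the max for numerical stability
    return e / np.sum(e)

scores = np.array([2.0, 1.0, 0.1])
p = softmax(scores)
print(p)        # ~[0.659 0.242 0.099]
print(p.sum())  # 1.0
```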