---
tags:
- ai
---

- [Self-attention](Attention.md)
    - Weighting the significance of each part of the input
    - The input can include the model's own recursive output (see the sketch after this list)
- Similar to [RNN](../RNN/RNN.md)s
    - Process sequential data
    - Translation & text summarisation
    - Differences
        - Process input all at once
- Largely replaced [LSTM](../RNN/LSTM.md) and gated recurrent units (GRU), which had added attention mechanisms
    - No recurrent structure
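
A minimal NumPy sketch of scaled dot-product self-attention, the operation behind the bullets above (toy shapes; learned projections stubbed with random matrices). Every position is scored against every other position in one matrix product, which is what lets the whole input be processed at once:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(x, wq, wk, wv):
    # x: (seq_len, d_model); wq/wk/wv: learned (d_model, d_k) projections
    q, k, v = x @ wq, x @ wk, x @ wv
    # scores[i, j] = how relevant position j is to position i
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # softmax(Q K^T / sqrt(d_k)) V: each output row is a weighted
    # sum of the value vectors, weighted by learned relevance
    return softmax(scores) @ v

# Toy example: 4 tokens, width 8 -- the whole sequence in one pass
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
wq, wk, wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(x, wq, wk, wv).shape)  # (4, 8)
```
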
![transformer-arch](../../../img/transformer-arch.png)

## Examples

- BERT
    - Bidirectional Encoder Representations from Transformers
    - Google
- Original GPT
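
As a hedged illustration (not part of the original note), such models ship as pretrained checkpoints; e.g. loading BERT via the Hugging Face `transformers` library:

```python
# pip install transformers torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Transformers process input all at once.", return_tensors="pt")
outputs = model(**inputs)
# One contextual vector per input token: (batch, seq_len, hidden_size=768)
print(outputs.last_hidden_state.shape)
```
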
[transformers-explained-visually-part-1-overview-of-functionality](https://towardsdatascience.com/transformers-explained-visually-part-1-overview-of-functionality-95a6dd460452)
# Architecture

## Input

- Byte-pair encoding tokeniser
- Tokens mapped via word embeddings into vectors
- Positional information added (see the sketch after this list)
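
A sketch of the input pipeline under toy assumptions: the byte-pair-encoding tokeniser is stubbed with a fixed word-to-id lookup (a real BPE tokeniser merges frequent byte pairs), and positions are injected with the sinusoidal encoding from the original paper:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Sinusoidal scheme from "Attention Is All You Need":
    # PE[pos, 2i] = sin(pos / 10000^(2i/d_model)), PE[pos, 2i+1] = cos(...)
    pos = np.arange(seq_len)[:, None]
    i = np.arange(0, d_model, 2)[None, :]
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

# Stand-in for a BPE tokeniser: a fixed vocabulary lookup (assumption)
vocab = {"the": 0, "cat": 1, "sat": 2, "down": 3}
token_ids = [vocab[w] for w in "the cat sat down".split()]

# Each token id selects a row of a learned embedding matrix
d_model = 16
embed = np.random.default_rng(0).normal(size=(len(vocab), d_model))
x = embed[token_ids]                                  # (seq_len, d_model)
x = x + positional_encoding(len(token_ids), d_model)  # add position info
```
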
## Encoder/Decoder

- Similar to seq2seq models
- Create an internal representation
- Encoder layers
    - Create encodings that contain information about which parts of the input are relevant to each other
    - Each subsequent encoder layer receives the previous encoder layer's output
- Decoder layers
    - Take the encodings and do the opposite
    - Use the incorporated contextual information to generate the output sequence
    - Use attention to draw information from the output of previous decoder steps before drawing from the encoder output
- Both use [Attention](Attention.md)
- Both use [dense](../MLP/MLP.md) layers for additional processing of the outputs
- Contain residual connections & layer-norm steps (see the sketch after this list)
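
A minimal sketch of one encoder layer and one decoder layer under the same toy assumptions as above (random weights, no causal mask or multi-head split), showing the attention, dense, residual and layer-norm steps and the decoder's self-attention-then-cross-attention order:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def layer_norm(x, eps=1e-5):
    # Normalise each position's vector to zero mean and unit variance
    return (x - x.mean(-1, keepdims=True)) / np.sqrt(x.var(-1, keepdims=True) + eps)

def attention(q_in, kv_in, w):
    # Queries from q_in, keys/values from kv_in: self-attention when both
    # are the same sequence, cross-attention when kv_in is the encoder output
    wq, wk, wv = w
    q, k, v = q_in @ wq, kv_in @ wk, kv_in @ wv
    return softmax(q @ k.T / np.sqrt(k.shape[-1])) @ v

def encoder_layer(x, attn_w, w1, w2):
    # Each sub-layer is wrapped in a residual connection + layer norm
    x = layer_norm(x + attention(x, x, attn_w))     # self-attention
    x = layer_norm(x + np.maximum(0, x @ w1) @ w2)  # position-wise dense
    return x

def decoder_layer(y, enc_out, self_w, cross_w, w1, w2):
    # NB: the causal mask on decoder self-attention is omitted for brevity
    y = layer_norm(y + attention(y, y, self_w))         # previous outputs
    y = layer_norm(y + attention(y, enc_out, cross_w))  # then the encoder
    y = layer_norm(y + np.maximum(0, y @ w1) @ w2)
    return y

# Toy run: encode a 5-token source, decode a 3-token target prefix
rng = np.random.default_rng(0)
d, d_ff = 8, 32
mk = lambda: tuple(rng.normal(size=(d, d)) for _ in range(3))
w1, w2 = rng.normal(size=(d, d_ff)), rng.normal(size=(d_ff, d))
enc_out = encoder_layer(rng.normal(size=(5, d)), mk(), w1, w2)
print(decoder_layer(rng.normal(size=(3, d)), enc_out, mk(), mk(), w1, w2).shape)  # (3, 8)
```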