I've got a background in deep learning and I still struggle to understand the attention mechanism. I know it's a key/value store but I'm not sure what it's doing to the tensor when it passes through different layers.
this post was submitted on 16 Dec 2022
AI
@behohippy @saint Instead of modeling the sequence timestep by timestep, attention lets us process the whole sequence in parallel, much like a fully connected network. Positional encodings tell the model where each token sits in the sequence, and keys that receive low attention weights contribute little to the output...
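To make the key/value-store intuition concrete, here's a minimal NumPy sketch of scaled dot-product attention. The function name, toy shapes, and self-attention setup (using the same matrix for Q, K, and V) are illustrative assumptions, not something from this thread; in a real transformer, Q, K, and V come from separate learned linear projections of the token embeddings.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Soft key/value lookup: each query scores every key, and the
    output is a weighted average of the values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (n_queries, n_keys)
    # softmax over the key axis, so each row of weights sums to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# toy self-attention: 3 tokens, embedding dimension 4
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(X, X, X)
print(out.shape)       # (3, 4): same shape in, same shape out
print(w.sum(axis=-1))  # each row of attention weights sums to ~1.0
```

Because every query attends to every key in one matrix multiply, the whole sequence is processed at once rather than step by step, which is what makes this parallelizable compared to an RNN.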
What are you eating that needs that large a napkin?