List of efficient attention modules

Topics: awesome, transformer, attention, attention-is-all-you-need, multihead-attention, reformer, self-attention, transformer-network, longformer, linformer
Updated Aug 23, 2021 · Python