Implementation of attention block in MANet #717
QueLastimaSenor started this conversation in General · Replies: 0
Hi! I've got a question about the implementation of the PAB in MANet (https://arxiv.org/pdf/2009.02130.pdf; I suppose it corresponds to the kernel attention in the paper).
![Screenshot from the MANet paper (Multi-Attention-Network for Semantic Segmentation of Fine Resolution Remote Sensing Images)](https://user-images.githubusercontent.com/76542612/218339436-15ab44b7-3c9c-4d1a-af1c-3b460ab845a0.png)
![Screenshot of decoder.py at master · qubvel/segmentation-models-pytorch](https://user-images.githubusercontent.com/76542612/218339520-57d24f38-4aff-4040-b2cd-e42a66a8f5c1.png)
I can't see any resemblance between the PAB implementation and the paper's kernel attention. It looks like a standard dot-product attention mechanism was written instead. Am I wrong?
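To make the distinction concrete, here is a minimal PyTorch sketch (my own code, not the repo's; function names and shapes are illustrative). The first function is the standard dot-product attention the decoder appears to implement; the second is a common kernel/linear factorization (softmax applied to Q and K separately, as in efficient/linear attention), which is the family the paper's kernel attention belongs to, as far as I can tell:

```python
import torch
import torch.nn.functional as F


def dot_product_attention(q, k, v):
    # Standard attention: softmax(Q K^T) V.
    # q, k: (B, N, d); v: (B, N, c); N = H * W spatial positions.
    scores = torch.bmm(q, k.transpose(1, 2))   # (B, N, N) -- quadratic in N
    attn = F.softmax(scores, dim=-1)           # normalize each query's row
    return torch.bmm(attn, v)                  # (B, N, c)


def kernel_attention(q, k, v):
    # Kernel/linear factorization (my reading of the paper's kernel attention):
    # softmax is applied to Q and K separately, so K^T V can be computed first
    # and the N x N similarity map is never materialized.
    q = F.softmax(q, dim=-1)                   # softmax over the feature dim
    k = F.softmax(k, dim=1)                    # softmax over spatial positions
    context = torch.bmm(k.transpose(1, 2), v)  # (B, d, c) -- linear in N
    return torch.bmm(q, context)               # (B, N, c)


if __name__ == "__main__":
    B, N, d, c = 2, 32 * 32, 32, 64            # illustrative sizes
    q, k, v = torch.randn(B, N, d), torch.randn(B, N, d), torch.randn(B, N, c)
    print(dot_product_attention(q, k, v).shape)  # torch.Size([2, 1024, 64])
    print(kernel_attention(q, k, v).shape)       # torch.Size([2, 1024, 64])
```

The practical difference is that the dot-product version materializes an N × N similarity map, quadratic in the number of pixels, while the kernel version computes K^T V first and stays linear in N, which is the whole point of the paper's kernel attention as I read it.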