about the query_pos and query_sine_embed #61

Open
Pujin823 opened this issue Mar 28, 2023 · 0 comments
@Pujin823

Hey, thanks for your wonderful work. I ran into a couple of questions while reading the code:

  1. What is the difference between query_pos and query_sine_embed? It seems that query_sine_embed is the positional embedding vector, but in your code the first decoder layer still computes q = q_content + q_pos. Why?
  2. What is the function of the hyperparameter self.keep_query_pos? In the original paper, your key insight seems to be keeping the content query and the positional query separate so that each can compute its own attention (see the sketch below for how I currently understand it).

I would appreciate it if you could give me some insight.
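For concreteness, here is a minimal sketch of how I currently understand the two embeddings being used, assuming a Conditional-DETR-style decoder layer in PyTorch (the projection names qpos_proj / qsine_proj and the shapes are purely illustrative, not this repository's actual identifiers):

```python
import torch
import torch.nn as nn

d_model, num_queries, bs = 256, 300, 2
qpos_proj = nn.Linear(d_model, d_model)    # projects query_pos
qsine_proj = nn.Linear(d_model, d_model)   # projects query_sine_embed

tgt = torch.randn(num_queries, bs, d_model)                # content queries
query_pos = torch.randn(num_queries, bs, d_model)          # learnable / anchor-derived positional embedding
query_sine_embed = torch.randn(num_queries, bs, d_model)   # sine encoding of the reference points

# Self-attention: query_pos is ADDED to the content query, as in vanilla DETR.
q_self = tgt + qpos_proj(query_pos)
k_self = tgt + qpos_proj(query_pos)

# Cross-attention: query_sine_embed is kept SEPARATE and concatenated with the
# content part, so content and position attend in their own channels.
is_first, keep_query_pos = True, False
q_content = tgt + qpos_proj(query_pos) if (is_first or keep_query_pos) else tgt
q_cross = torch.cat([q_content, qsine_proj(query_sine_embed)], dim=-1)

print(q_self.shape, q_cross.shape)  # (300, 2, 256) and (300, 2, 512)
```

Is this roughly what is happening, i.e. query_pos only enters additively (and only in the first layer unless keep_query_pos is set), while query_sine_embed is the part that stays decoupled in cross-attention?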