Location Embedding of End token in sequential prediction module. #7

anilbatra2185 opened this issue Aug 5, 2021 · 0 comments
Hi @LuoweiZhou,

Thanks for sharing the code for this interesting work. I am re-implementing it in PyTorch, as I am unable to install Lua Torch due to some outdated library dependencies.

Query about location embedding: the flattened output of the MaxPool layer has size 200, and this is what the location embedding is built over. However, only one additional token is used, i.e. the start embedding token. I could not find how the end token is identified, or how the sequential prediction is otherwise stopped.
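To make the query concrete, here is a minimal PyTorch sketch of my current understanding. All names and sizes here (`EMBED_DIM`, `MAX_STEPS`, the random stand-in predictor) are my own assumptions for illustration, not taken from the released code:

```python
import torch
import torch.nn as nn

NUM_LOCATIONS = 200          # flattened MaxPool output size
START_IDX = NUM_LOCATIONS    # index 200: the single extra <start> token
EMBED_DIM = 512              # assumed embedding width
MAX_STEPS = 10               # assumed cap; with no <end> token, a fixed
                             # step limit seems the only way to stop

# 201 rows: 200 locations plus the <start> token, but no <end> row.
loc_embedding = nn.Embedding(NUM_LOCATIONS + 1, EMBED_DIM)

prev = torch.tensor([START_IDX])
for _ in range(MAX_STEPS):
    emb = loc_embedding(prev)             # (1, EMBED_DIM) location embedding
    # A real predictor network would consume `emb` here; this random
    # score tensor is only a stand-in to keep the sketch runnable.
    scores = torch.randn(1, NUM_LOCATIONS)
    prev = scores.argmax(dim=-1)          # next predicted location index
    # Without an <end> row (a 202nd embedding), what signals "stop"?
```

Is a fixed step limit the intended stopping behavior, or should an explicit end token be added to the embedding table?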

Appreciate your response!
