pack_padded_sequence, flatten_parameters, and pad_packed_sequence? #2825
Unanswered · AutumnRoom asked this question in Q&A

Are the functions pack_padded_sequence, flatten_parameters, and pad_packed_sequence in PyTorch needed for BiLSTM calculations in Burn?
Replies: 1 comment
- We do not have any specific RNN utilities like the ones you mention from PyTorch at this time. If you have inputs of varying sequence lengths that you want to pad, you can pad the tensors yourself with the available tensor ops. Regarding flatten_parameters: we don't depend on cuDNN, for better portability, so this option isn't available. Maybe in the future we can have similar careful optimizations, but we don't have them at this time. Side note: you might be interested in this stacked LSTM implementation.
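
To make the manual-padding suggestion concrete, here is a minimal sketch of what that could look like in Burn. Everything below is illustrative rather than part of Burn's API: the `pad_and_stack` helper is hypothetical, and it assumes each sequence is a `[seq_len, features]` float tensor. It zero-pads every sequence to the batch's maximum length and also returns a {0, 1} mask marking the valid time steps, which plays roughly the role that the lengths returned by pad_packed_sequence play in PyTorch.

```rust
use burn::tensor::{backend::Backend, Tensor};

/// Hypothetical helper: zero-pad variable-length sequences to a common
/// length and stack them into one batch, returning the padded batch
/// together with a {0, 1} mask of valid (non-padded) positions.
fn pad_and_stack<B: Backend>(
    sequences: Vec<Tensor<B, 2>>, // each sequence is [seq_len, features]
    device: &B::Device,
) -> (Tensor<B, 3>, Tensor<B, 2>) {
    let max_len = sequences
        .iter()
        .map(|s| s.dims()[0])
        .max()
        .expect("batch must not be empty");

    let mut padded: Vec<Tensor<B, 2>> = Vec::with_capacity(sequences.len());
    let mut masks: Vec<Tensor<B, 1>> = Vec::with_capacity(sequences.len());

    for seq in sequences {
        let [len, features] = seq.dims();
        if len < max_len {
            // Append zero rows so every sequence has length `max_len`,
            // and mark the real time steps with 1.0, the padding with 0.0.
            let pad = Tensor::zeros([max_len - len, features], device);
            padded.push(Tensor::cat(vec![seq, pad], 0));
            masks.push(Tensor::cat(
                vec![
                    Tensor::ones([len], device),
                    Tensor::zeros([max_len - len], device),
                ],
                0,
            ));
        } else {
            padded.push(seq);
            masks.push(Tensor::ones([len], device));
        }
    }

    // Shapes: [batch, max_len, features] and [batch, max_len].
    (Tensor::stack(padded, 0), Tensor::stack(masks, 0))
}
```

After running a recurrent layer over the padded batch, one way to keep the padding from affecting a loss is to broadcast-multiply the layer's `[batch, max_len, hidden]` output by the mask expanded to `[batch, max_len, 1]`, so padded time steps contribute zeros.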