
Add padding direction #2121


Open
wants to merge 4 commits into main


Conversation

smsm8898

Gives a little more flexibility to the pad transform.

@joecummings
Contributor

@SM-Jang Thanks for the addition! A couple of notes:

  1. Can you refer to our style guide and make sure the code passes our linting?
  2. Can you add a brief note on the motivation behind this addition?

smsm8898 (Author) commented Mar 21, 2023

  1. Can you refer to our style guide and make sure the code passes our linting?
    Okay, I checked the code with flake8,
    and I renamed the argument to begin: bool with a default of False.
    To verify it, I updated the unit tests and they all pass
    (test/torchtext_unittest/test_transforms.py).

  2. Can you add a brief note on the motivation behind this addition?
    When I work on time-series modeling, I have to pad at either the beginning or the end of a sequence.
    The old PadTransform only pads in one direction, so I had to call torch.nn.functional.pad() directly.

For example:

...
self.query_transformer = Sequential(
    # Truncate(10),
    VocabTransform(query_vocab),
    ToTensor(),
)
...
x = self.query_transformer(x)
# manual right padding, since PadTransform exposes no direction option
x = torch.nn.functional.pad(x, (0, pad_amount), value=self.pad_value)
...

That's why I added the padding direction option; a rough sketch of the idea is below.
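
A minimal sketch of the idea, assuming the begin: bool = False flag described above (the class name PadDirectionTransform and the max_length / pad_value parameters are illustrative, not the final torchtext API):

import torch
from torch import Tensor


class PadDirectionTransform(torch.nn.Module):
    """Pad the last dimension up to max_length, either at the beginning or at the end."""

    def __init__(self, max_length: int, pad_value: int, begin: bool = False) -> None:
        super().__init__()
        self.max_length = max_length
        self.pad_value = pad_value
        self.begin = begin

    def forward(self, x: Tensor) -> Tensor:
        pad_amount = self.max_length - x.size(-1)
        if pad_amount <= 0:
            return x
        # torch.nn.functional.pad takes (left, right) amounts for the last dimension
        pad = (pad_amount, 0) if self.begin else (0, pad_amount)
        return torch.nn.functional.pad(x, pad, value=self.pad_value)


# Usage: pad a batch of token ids to length 10 at the beginning of the sequence
pad_begin = PadDirectionTransform(max_length=10, pad_value=0, begin=True)
padded = pad_begin(torch.tensor([[5, 6, 7]]))  # tensor([[0, 0, 0, 0, 0, 0, 0, 5, 6, 7]])

With a flag like this, the manual torch.nn.functional.pad() call in the pipeline above could be replaced by appending the transform to the Sequential after ToTensor().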
