Add save_every_iter option #173
base: master
Conversation
There seems to be a logical problem with some of the code ❤️
```
@@ -388,6 +378,63 @@ def to_type(inputs, type):
        raise ValueError(type + " is not a valid type. [Options: float, int]")
        return inputs

    @staticmethod
    def _save_adv_examples(
```
Perhaps adding type annotations or default values for each parameter here would give other maintainers a clear definition of the expected parameter types.
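A minimal sketch of what the reviewer is asking for. The parameter names and types below are assumptions based on the variables visible in the diff (`adv_input_list_cat`, `label_list_cat`, `save_path`, `save_type`), not the PR's actual signature:

```python
from typing import List, Optional

import torch


class Attack:
    @staticmethod
    def _save_adv_examples(
        adv_input_list: List[torch.Tensor],   # accumulated adversarial batches
        label_list: List[torch.Tensor],       # accumulated label batches
        save_path: Optional[str] = None,      # destination file; None disables saving
        save_type: str = "float",             # dtype tag stored alongside the tensors
    ) -> None:
        """Concatenate accumulated batches and save them to disk."""
        save_dict = {
            "adv_inputs": torch.cat(adv_input_list, dim=0),
            "labels": torch.cat(label_list, dim=0),
            "save_type": save_type,
        }
        if save_path is not None:
            torch.save(save_dict, save_path)
```

With annotations in place, a reader can tell at a glance which arguments are optional and what each one holds.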
torchattacks/attack.py
Outdated
```
save_dict = {
    "adv_inputs": adv_input_list_cat,
    "labels": label_list_cat,
}  # nopep8
```
The `# nopep8` comment here can be removed.
torchattacks/attack.py
Outdated
```
save_dict["save_type"] = save_type
torch.save(save_dict, save_path)
if save_path is not None and save_every_iter:
```
The input parameters of these two `self._save_adv_examples` calls appear to be identical, so why are there two of them?
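One way to address the reviewer's point is to collapse the duplicated calls into a single helper gated by the combined condition. This is a hypothetical sketch, not the PR's code; `maybe_save` and its parameters are invented for illustration:

```python
def maybe_save(save_path, save_every_iter, is_last_batch, saver, *args):
    """Invoke `saver` exactly once when a save is due: on every iteration
    if `save_every_iter` is set, otherwise only after the final batch.

    Returns True if a save happened, so callers can log or assert on it.
    """
    if save_path is None:
        return False
    if save_every_iter or is_last_batch:
        saver(*args)
        return True
    return False
```

Routing both code paths through one call site means the argument list only has to be kept correct in one place.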
PR Type and Checklist
What kind of change does this PR introduce?
When saving a large number of samples, repeatedly saving every batch consumes a lot of time. I added an option to control whether generated samples are saved at every iteration.
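The control flow the PR describes can be sketched as follows. The function and parameter names (`generate_and_save`, `saver`) are placeholders chosen for this example, not the PR's actual API; only the `save_every_iter` flag comes from the PR:

```python
def generate_and_save(batches, attack_fn, saver=None, save_every_iter=True):
    """Run `attack_fn` over all batches, accumulating results.

    When `save_every_iter` is True, `saver` is called after every batch
    (safe against interruption, but slow for many batches); when False,
    it is called once after the final batch.
    """
    adv_list, label_list = [], []
    for i, (inputs, labels) in enumerate(batches):
        adv_list.append(attack_fn(inputs, labels))
        label_list.append(labels)
        is_last = i == len(batches) - 1
        if saver is not None and (save_every_iter or is_last):
            saver(adv_list, label_list)
    return adv_list, label_list
```

The trade-off is crash-safety versus throughput: per-iteration saving rewrites the growing file every batch, while end-only saving writes once but loses progress on interruption.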
`model`
`.supported_mode`: whether the attack supports targeted mode.