
if "aff_loss_edge" and "aff_loss_nedge" do not have a coefficient of "dec" that decreases with "step", can "aff_loss" converge? #6

Open
ef-ever opened this issue Oct 17, 2018 · 1 comment

ef-ever commented Oct 17, 2018

Excuse me, if "aff_loss_edge" and "aff_loss_nedge" are not scaled by a coefficient "dec" that decreases with "step", can "aff_loss" converge? When I run "train_affinity.py" with the parameters given in "train_pspnet_affinity.sh", "aff_loss_edge" does not fall below 2.88.

twke18 (Owner) commented Oct 17, 2018

Yes. In our empirical experience, it is necessary to exponentially decay the affinity field loss during training.
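
For reference, here is a minimal sketch of what such a step-dependent coefficient could look like. The function name, the base of 10, and the 30,000-step schedule are assumptions for illustration, not necessarily what "train_affinity.py" implements:

```python
def decay_coefficient(step, num_steps, base=10.0):
    """Exponential decay: 1.0 at step 0, falling to 1/base at step num_steps.

    The base of 10 is an assumption for illustration.
    """
    return base ** (-step / num_steps)

num_steps = 30000  # illustrative total number of training steps
for step in (0, 15000, 30000):
    dec = decay_coefficient(step, num_steps)
    # The decayed coefficient would scale the affinity terms each batch:
    # aff_loss = dec * (aff_loss_edge + aff_loss_nedge)
    print(step, round(dec, 3))  # -> 1.0, 0.316, 0.1
```

One rationale for annealing like this: the affinity terms can regularize the predictions early in training, while the main segmentation loss dominates toward the end.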
