Loss without regularization #15

Open
Likelihoood opened this issue Jan 9, 2018 · 3 comments

Comments

@Likelihoood

Likelihoood commented Jan 9, 2018

[screenshot: loss_no_reg]

@VinceShieh
Owner

Regularization is applied in the weight update.

@Likelihoood
Author

Yes, I know it is done in the weight update.
I mean that after each iteration, the printed loss does not include the regularization loss (r1 * ||w|| + r2 * ||v||).
I added a function to calculate the reg loss:
[screenshot: reg_loss]
and after each iteration, print the reg loss:
[screenshot: add_reg_loss]
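For reference, a minimal sketch of what such a reg-loss helper could look like (the actual `reg_loss` function is in the screenshot; the names `w`, `v`, `r1`, `r2` follow the thread, and whether the project uses the plain norm or the squared norm is an assumption here):

```python
import numpy as np

def reg_loss(w, v, r1, r2):
    # r1 * ||w|| + r2 * ||v||, as written in the thread.
    # For a 2-D v, numpy.linalg.norm defaults to the Frobenius norm;
    # adjust if the project uses a squared norm instead.
    return r1 * np.linalg.norm(w) + r2 * np.linalg.norm(v)

# Snapshot the weights at the start of the iteration so the printed
# reg loss refers to the same weights as the reported training loss.
w = np.array([3.0, 4.0])                  # ||w|| = 5
v = np.array([[0.0, 5.0], [12.0, 0.0]])   # Frobenius norm = 13
print(reg_loss(w, v, r1=0.1, r2=0.01))    # ≈ 0.63
```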

But...
Because the weights are updated on every sample within an iteration, the weights used to calculate the reg loss are not the same as the ones at the beginning of the iteration.
Actually, tr_loss has the same problem.

Please consider this.

@VinceShieh
Owner

Hmm, I suggest the following:
1. Make a toy sample (e.g. 3 samples with 5 features).
2. Do not add regularization.
3. Print the weights at the beginning of each iteration and right after each update, IN THE SAME FUNCTION CALL.
4. Add one-way regularization, then go to step 3 and check again.
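The suggested procedure can be sketched like this (a stand-in only: it uses a plain linear model with per-sample SGD instead of the project's actual FM trainer, and `r1`, the learning rate, and the gradient are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(3, 5))      # step 1: toy sample, 3 samples x 5 features
y = np.array([1.0, -1.0, 1.0])

r1 = 0.0      # step 2: start with no regularization
lr = 0.1
w = np.zeros(5)

for it in range(2):
    # step 3: weights at the beginning of the iteration...
    print(f"iter {it} start: w = {w}")
    for xi, yi in zip(X, y):
        # squared-error gradient plus a one-way (L1-style) reg term
        grad = (xi @ w - yi) * xi + r1 * np.sign(w)
        w -= lr * grad
        # ...and right after each per-sample update, in the same loop
        print(f"  after update: w = {w}")

# step 4: set r1 > 0 and rerun to see how the printed weights
# (and any loss computed from them) shift within one iteration.
```

Comparing the start-of-iteration weights with the post-update weights makes it explicit which snapshot any printed loss actually corresponds to.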
