loss increases sometimes #22

Open
dswah opened this issue Oct 17, 2016 · 1 comment

Comments


dswah commented Oct 17, 2016

On some real problems, we see that the negative log likelihood (with L1 penalty) increases. The increase happens as a result of the update to theta.

[image]

Here is a plot of the difference in loss after the update to theta. It looks like sometimes the loss increases!

[image]

But the lambda updates are protected by the backtracking:

[image]
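For context, a backtracking (Armijo) line search guarantees that an accepted step never increases the loss: it keeps shrinking the step size until a sufficient-decrease condition holds. A minimal sketch in Python (a generic illustration, not pyGAM's actual code; the `loss`/`grad` callables and the constants are assumptions):

```python
import numpy as np

def backtracking_step(loss, grad, x, direction,
                      alpha=1.0, beta=0.5, c=1e-4, max_iter=50):
    """Armijo backtracking: shrink the step until the loss sufficiently decreases."""
    f0 = loss(x)
    slope = grad(x) @ direction  # negative for a descent direction
    for _ in range(max_iter):
        x_new = x + alpha * direction
        if loss(x_new) <= f0 + c * alpha * slope:
            return x_new  # sufficient decrease achieved
        alpha *= beta     # shrink the step and retry
    return x  # no acceptable step found; keep the current point

# toy example: minimize f(x) = ||x||^2 starting from [3, 4]
f = lambda x: float(x @ x)
g = lambda x: 2 * x
x = np.array([3.0, 4.0])
x1 = backtracking_step(f, g, x, -g(x))
assert f(x1) <= f(x)  # the accepted step never increases the loss
```

Because the step is only accepted when the sufficient-decrease condition holds, any update routed through this check (like the lambda updates above) is monotone by construction.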

dswah commented Oct 17, 2016

The loss should NEVER increase due to a theta update, because the objective is convex in theta and we have a closed-form solution for the update.
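To see why this claim holds: the exact minimizer of a convex objective can never yield a higher loss than any other point, including the previous iterate. A quick numeric check (using an L2 penalty for the sketch, since a quadratic admits the closed form; the data and `lam` here are made up, not from the issue):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = rng.normal(size=50)
lam = 0.1

def loss(theta):
    # penalized least squares: convex (quadratic) in theta
    return float(np.sum((y - X @ theta) ** 2) + lam * theta @ theta)

# closed-form minimizer: solves (X'X + lam*I) theta = X'y
theta_star = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)

# the exact minimizer never increases the loss, whatever the previous iterate was
for _ in range(10):
    theta_old = rng.normal(size=5)
    assert loss(theta_star) <= loss(theta_old)
```

So if the theta update really is the closed-form minimizer of a convex objective, an observed increase points to a bug elsewhere (a stale term in the objective, or the objective changing between the update and the loss evaluation).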
