On some real problems, we see that the negative log likelihood (with l1 penalty) increases. The increase happens as a result of the update to Theta.

Here is a plot of the difference in loss after the update to Theta; sometimes the loss increases:

![image](https://cloud.githubusercontent.com/assets/11619412/19448745/f5406f40-9457-11e6-9864-c705e6e0ef01.png)

The lambda updates, by contrast, are protected by the backtracking:

![image](https://cloud.githubusercontent.com/assets/11619412/19448768/09d86188-9458-11e6-9cff-99dc64ea6bc9.png)
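A minimal sketch of the kind of backtracking safeguard that protects the lambda updates and could equally guard the Theta step. The names (`backtracking_step`, `loss`, `direction`) are illustrative assumptions, not this package's API:

```python
import numpy as np

def backtracking_step(x, direction, loss, beta=0.5, max_halvings=30):
    """Shrink the step until the objective does not increase.

    Hypothetical safeguard: `loss` is any penalized objective and
    `direction` a proposed update; neither name is from this codebase.
    """
    f0 = loss(x)
    t = 1.0
    for _ in range(max_halvings):
        x_new = x + t * direction
        if loss(x_new) <= f0:
            return x_new  # accept: objective did not go up
        t *= beta  # halve the step and retry
    return x  # no improving step found; keep the old iterate

# Example: one safeguarded step on a simple quadratic f(x) = ||x||^2.
loss = lambda x: float(np.dot(x, x))
x = np.array([2.0, -1.0])
x_next = backtracking_step(x, direction=-x, loss=loss)
assert loss(x_next) <= loss(x)
```

A guard like this is cheap per iteration and would make an objective increase after the Theta update impossible by construction, at the cost of an extra loss evaluation.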
The loss should NEVER increase due to a Theta update: the objective is convex in Theta and we have a closed-form solution for the update, so the new Theta is the exact minimizer of that subproblem.
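The argument above can be illustrated on the simplest l1-penalized convex subproblem, the scalar quadratic, whose closed-form minimizer is the soft-threshold. This is a generic example of "exact minimizer implies no increase", not this solver's actual Theta update:

```python
import numpy as np

def soft_threshold(z, lam):
    # Closed-form minimizer of f(x) = 0.5*(x - z)**2 + lam*|x|.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def f(x, z, lam):
    return 0.5 * (x - z) ** 2 + lam * abs(x)

# The exact minimizer can never increase the objective relative to the
# previous iterate -- if the observed loss goes up, the update being
# applied is not actually the minimizer of the subproblem being measured.
z, lam = 3.0, 1.0
x_star = soft_threshold(z, lam)           # -> 2.0
assert f(x_star, z, lam) <= f(z, z, lam)  # minimizer beats the old iterate
```

So an observed increase points at a mismatch: either the closed-form update solves a different subproblem than the loss being plotted (e.g. different penalty handling on the diagonal), or a numerical issue in the update.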