
We built an optimization technique that, at each learning step, automatically learns the best learning rate to use for gradient descent.


Ravoxsg/UofT_Adaptive_LR_optimizer


Faster gradient descent via an adaptive learning rate

This repo contains code for work done by Mathieu Ravaut and Satya Krishna Gorti.

In this project, we propose a new adaptive learning rate algorithm for gradient descent and evaluate its effectiveness on the following tasks:

  • Linear regression on the Boston Housing Prices dataset
  • Logistic regression on the MNIST dataset
  • Image classification with neural networks on CIFAR-10 and CIFAR-100
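To illustrate the general idea of selecting a learning rate at every step, here is a minimal sketch in plain NumPy. It uses a simple candidate search (pick the rate that most reduces the loss) as a stand-in for the learned per-step rate described in the report; the function names, the candidate set, and the toy quadratic objective are all illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def adaptive_lr_gd(grad_fn, loss_fn, w0,
                   candidate_lrs=(1e-3, 1e-2, 1e-1, 1.0), n_steps=100):
    """Gradient descent that, at each step, tries several candidate
    learning rates and keeps the one giving the lowest loss.
    (Illustrative stand-in for a learned per-step learning rate.)"""
    w = np.asarray(w0, dtype=float)
    for _ in range(n_steps):
        g = grad_fn(w)
        # Evaluate each candidate step and keep the best one.
        trials = [(loss_fn(w - lr * g), lr) for lr in candidate_lrs]
        _, best_lr = min(trials)
        w = w - best_lr * g
    return w

# Toy quadratic: loss(w) = ||w - t||^2, minimized at w = t.
t = np.array([1.0, -2.0])
loss = lambda w: float(np.sum((w - t) ** 2))
grad = lambda w: 2.0 * (w - t)

w_star = adaptive_lr_gd(grad, loss, w0=np.zeros(2))
```

On this convex toy problem the per-step search converges to the minimizer without any manual learning rate tuning, which is the behavior the adaptive scheme is meant to automate.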

The report on this work can be found here.


