A Julia implementation of Simultaneous Perturbation Stochastic Approximation
Updated Apr 24, 2020 - Julia
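For context, SPSA estimates the gradient from only two evaluations of the objective per iteration by perturbing every coordinate simultaneously with a random ±1 vector. The sketch below is a generic Julia rendering of that update rule, not the linked repository's API; the function name and the default gain parameters are illustrative.

```julia
# Minimal SPSA sketch (illustrative, not the repository's API).
# Minimizes f over θ using two function evaluations per iteration.
function spsa(f, θ0; iters=1_000, a=1.0, c=0.1, A=100.0, α=0.602, γ=0.101)
    θ = float(copy(θ0))
    for k in 1:iters
        ak = a / (A + k)^α                      # decaying step size
        ck = c / k^γ                            # decaying perturbation size
        Δ  = rand((-1.0, 1.0), length(θ))       # simultaneous ±1 perturbation
        # Two-point gradient estimate from the single random direction
        ĝ = (f(θ .+ ck .* Δ) - f(θ .- ck .* Δ)) ./ (2 .* ck .* Δ)
        θ .-= ak .* ĝ                           # SPSA update
    end
    return θ
end

# Usage: estimate the minimizer of a simple quadratic (exact minimizer is [3, 3, 3, 3, 3])
θ_est = spsa(x -> sum(abs2, x .- 3.0), zeros(5))
```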
Gradient-free online optimization loosely based on Adaptive Moment Estimation (Adam)
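The description above suggests pairing a derivative-free gradient estimate with Adam's moment-based update. Below is one hypothetical way such a scheme could look: a two-point random-direction estimate (as in SPSA) fed through the standard Adam first- and second-moment recursion. This is a sketch of the general idea under that assumption, not the repository's actual implementation; the function name and hyperparameter defaults are assumed.

```julia
# Hypothetical combination (not this repository's code): Adam-style moment
# updates driven by a two-point random-direction gradient estimate.
function gradient_free_adam(f, θ0; iters=1_000, η=0.01, β1=0.9, β2=0.999, ϵ=1e-8, c=0.1)
    θ = float(copy(θ0))
    m = zero(θ)                                  # first-moment accumulator
    v = zero(θ)                                  # second-moment accumulator
    for t in 1:iters
        Δ = rand((-1.0, 1.0), length(θ))         # random ±1 direction
        ĝ = (f(θ .+ c .* Δ) - f(θ .- c .* Δ)) ./ (2 .* c .* Δ)  # gradient-free estimate
        m .= β1 .* m .+ (1 - β1) .* ĝ
        v .= β2 .* v .+ (1 - β2) .* ĝ .^ 2
        mhat = m ./ (1 - β1^t)                   # bias-corrected moments
        vhat = v ./ (1 - β2^t)
        θ .-= η .* mhat ./ (sqrt.(vhat) .+ ϵ)    # Adam-style step
    end
    return θ
end
```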