
MP

Mathematical programming and optimization methods

This folder contains implementations of several methods for finding extrema of mathematical functions. Each method has its own strengths and is suited to different types of optimization problems.

Methods Included:

  1. Gradient Method with Step Splitting

The gradient method with step splitting is an iterative optimization algorithm that uses the gradient (the vector of first-order partial derivatives) of a function to find its minimum, reducing ("splitting") the step size whenever a step fails to decrease the function value.
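As an illustration, here is a minimal sketch of the common variant in which the step is halved until the trial point improves the function value. The function names and the test function are placeholders for illustration, not code from this repository:

```python
import numpy as np

def gradient_step_splitting(f, grad, x0, step=1.0, beta=0.5, tol=1e-8, max_iter=1000):
    """Minimize f by gradient steps, halving the step until f decreases."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # gradient small enough: stop
            break
        t = step
        # Split (halve) the step until the trial point improves on f(x)
        while f(x - t * g) >= f(x) and t > 1e-16:
            t *= beta
        x = x - t * g
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 2*(y + 2)^2
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
print(gradient_step_splitting(f, grad, [0.0, 0.0]))  # ~ [1, -2]
```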

  2. Steepest Descent Method (Gradient Descent Method)

The steepest descent method, commonly known as the gradient descent method, is an iterative optimization algorithm used for finding the minimum of a function. At each iteration it steps in the direction of the negative gradient; in the steepest-descent variant, the step length is chosen by a one-dimensional line search along that direction.
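A minimal sketch of this idea, using `scipy.optimize.minimize_scalar` for the line search; the function names and test function are illustrative assumptions, not taken from this repository:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Steepest descent: line search for the best step along -grad each iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # One-dimensional minimization of phi(t) = f(x - t*g) over t >= 0
        phi = lambda t: f(x - t * g)
        t = minimize_scalar(phi, bounds=(0.0, 1e3), method="bounded").x
        x = x - t * g
    return x

f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
print(steepest_descent(f, grad, [0.0, 0.0]))  # ~ [1, -2]
```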

  3. Newton's Method

Newton's method is an iterative numerical technique for finding the roots of a function or, when applied to the gradient, its minima and maxima. It employs second-order derivative information (the Hessian matrix) in addition to the gradient.
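A minimal sketch of Newton's method for optimization, which solves the linear system H(x) d = -∇f(x) for the step at each iteration; the names and test function are illustrative assumptions:

```python
import numpy as np

def newton_method(grad, hess, x0, tol=1e-10, max_iter=100):
    """Newton's method for optimization: step along d solving H(x) d = -grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)  # Newton direction
        x = x + d
    return x

# f(x, y) = (x - 1)^2 + 2*(y + 2)^2 is quadratic, so Newton converges in one step
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
hess = lambda x: np.array([[2.0, 0.0], [0.0, 4.0]])
print(newton_method(grad, hess, [0.0, 0.0]))  # [1, -2]
```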

  4. Conjugate Gradient Method

The conjugate gradient method is an iterative technique for solving systems of linear equations with a symmetric positive-definite matrix, which is equivalent to minimizing the corresponding quadratic form, so it also applies to optimization problems. It combines aspects of the gradient descent method and direct methods for solving linear systems.
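A minimal sketch of linear conjugate gradient, assuming a symmetric positive-definite matrix A; the example system is illustrative:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A,
    equivalently minimizing 0.5 x^T A x - b^T x."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x   # residual = negative gradient of the quadratic
    p = r.copy()    # first search direction
    for _ in range(max_iter or len(b)):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)        # exact step along p
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)  # makes the next direction A-conjugate
        p = r_new + beta * p
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))  # ~ [0.0909, 0.6364]
```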

  5. Coordinate Descent Method

The coordinate descent method is an optimization algorithm that updates one variable at a time, holding the others fixed. It iteratively minimizes the function with respect to each variable.
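A minimal sketch of cyclic coordinate descent, using a one-dimensional scalar minimizer for each coordinate; the names and test function are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def coordinate_descent(f, x0, tol=1e-8, max_iter=100):
    """Cyclic coordinate descent: minimize f over one coordinate at a time."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(len(x)):
            # One-dimensional minimization over coordinate i, others held fixed
            def phi(t, i=i):
                y = x.copy()
                y[i] = t
                return f(y)
            x[i] = minimize_scalar(phi).x
        if np.linalg.norm(x - x_old) < tol:  # stop when a full sweep barely moves x
            break
    return x

f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] + 2) ** 2
print(coordinate_descent(f, [0.0, 0.0]))  # ~ [1, -2]
```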