The minfx Python package provides numerical optimization through a broad collection of minimization algorithms.
First off, several local optimization algorithms built on the line search framework are included: back-and-forth coordinate descent, steepest descent, quasi-Newton BFGS, Newton, and Newton conjugate gradient (Newton-CG).
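To illustrate the simplest of these, here is a minimal steepest descent sketch in plain Python with a fixed step size. This is only a toy illustration of the idea, not minfx's implementation, which pairs the descent direction with a proper line search.

```python
def steepest_descent(f_grad, x, step=0.1, tol=1e-8, max_iter=1000):
    """Toy steepest descent with a fixed step size (illustrative only)."""
    for _ in range(max_iter):
        g = f_grad(x)
        # Stop once the gradient norm is small enough.
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        # Move against the gradient, the direction of steepest descent.
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Minimize f(x, y) = (x - 1)^2 + (y + 2)^2, whose gradient is (2(x-1), 2(y+2)).
grad = lambda x: [2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)]
xmin = steepest_descent(grad, [0.0, 0.0])
```

The iterates converge to the minimum at (1, -2); the fixed step size is what the more sophisticated line search subalgorithms described later replace.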
For those who need trust-region methods, minfx has you covered with the Cauchy point, dogleg, CG-Steihaug, and exact trust region algorithms.
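The Cauchy point is the simplest of these: it minimizes the local quadratic model along the steepest descent direction, clipped to the trust radius. A sketch of the textbook formula (not minfx's own code) looks like this:

```python
def cauchy_point(g, B, delta):
    """Cauchy point of the model m(p) = g.p + p.B.p/2 within radius delta.
    Textbook formula sketch, not minfx's implementation."""
    n = len(g)
    g_norm = sum(gi * gi for gi in g) ** 0.5
    # Curvature of the model along -g: g^T B g.
    Bg = [sum(B[i][j] * g[j] for j in range(n)) for i in range(n)]
    gBg = sum(gi * bgi for gi, bgi in zip(g, Bg))
    if gBg <= 0.0:
        tau = 1.0  # Model unbounded along -g: step to the boundary.
    else:
        tau = min(g_norm ** 3 / (delta * gBg), 1.0)
    return [-tau * (delta / g_norm) * gi for gi in g]

# Gradient [1, 0], identity Hessian, radius 2: the unconstrained minimizer
# along -g lies inside the region, so the Cauchy point is p = [-1, 0].
p = cauchy_point([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], 2.0)
```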
If conjugate gradient methods are more your speed, you'll find four included in this package: Fletcher-Reeves, Polak-Ribiere, Polak-Ribiere+, and Hestenes-Stiefel.
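These variants differ only in how the coefficient beta, which mixes the previous search direction into the new one, is computed. A minimal Fletcher-Reeves sketch (again illustrative, not minfx's API) shows the structure shared by all four:

```python
def fletcher_reeves(grad, x, line_search, tol=1e-10, max_iter=100):
    """Fletcher-Reeves nonlinear conjugate gradient (illustrative sketch).
    line_search(x, d) must return a step length along direction d."""
    g = grad(x)
    d = [-gi for gi in g]  # First direction: steepest descent.
    for _ in range(max_iter):
        if sum(gi * gi for gi in g) < tol:
            break
        alpha = line_search(x, d)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # Fletcher-Reeves beta: ratio of successive squared gradient norms.
        beta = sum(gi * gi for gi in g_new) / sum(gi * gi for gi in g)
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

# Minimize f(x, y) = x^2 + 10 y^2 using an exact line search for this quadratic.
grad = lambda x: [2.0 * x[0], 20.0 * x[1]]
def exact_step(x, d):
    g = grad(x)
    dAd = 2.0 * d[0] ** 2 + 20.0 * d[1] ** 2  # d^T A d for A = diag(2, 20)
    return -(g[0] * d[0] + g[1] * d[1]) / dAd
xmin = fletcher_reeves(grad, [3.0, 1.0], exact_step)
```

On an n-dimensional quadratic with exact line searches, conjugate gradient converges in at most n iterations, here two.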
Miscellaneous optimization methods are also included in the minfx package: grid search, Nelder-Mead simplex, and Levenberg-Marquardt.
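Grid search is the bluntest of these: evaluate the target function at every point of a rectangular grid and keep the best. A self-contained sketch of that idea (not minfx's interface):

```python
import itertools

def grid_search(f, bounds, steps):
    """Brute-force grid search (illustrative sketch).
    bounds: list of (low, high) per dimension; steps: points per dimension."""
    axes = [[lo + i * (hi - lo) / (n - 1) for i in range(n)]
            for (lo, hi), n in zip(bounds, steps)]
    # Evaluate f at every grid point and return the best one.
    return min(itertools.product(*axes), key=f)

# 21 points per axis over [-5, 5] gives a spacing of 0.5, so the grid
# contains the true minimum of f(x, y) = (x - 1)^2 + (y + 2)^2 at (1, -2).
best = grid_search(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2,
                   [(-5.0, 5.0), (-5.0, 5.0)], [21, 21])
```

The cost grows exponentially with dimension, which is why grid search is usually reserved for generating starting points for the local optimizers above.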
In addition to these main algorithms, several step selection subalgorithms are available: backtracking line search, the Nocedal and Wright interpolation-based line search, the Nocedal and Wright line search for the Wolfe conditions, and the Moré and Thuente line search.
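Backtracking is the simplest of these subalgorithms: start with a full step and shrink it until the Armijo sufficient decrease condition holds. A sketch of the standard scheme (not minfx's implementation):

```python
def backtracking(f, g, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Backtracking line search enforcing the Armijo condition
    f(x + a*d) <= f(x) + c*a*(g.d). Standard textbook scheme."""
    fx = f(x)
    slope = sum(gi * di for gi, di in zip(g, d))  # Directional derivative g.d.
    while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + c * alpha * slope:
        alpha *= rho  # Shrink the step until sufficient decrease holds.
    return alpha

# For f(x) = x^2 at x = 1 with d = -grad = [-2], the full step overshoots
# to x = -1 (no decrease), so one halving yields the accepted step 0.5.
f = lambda x: x[0] ** 2
alpha = backtracking(f, [2.0], [1.0], [-2.0])
```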
Finally, several Hessian modification algorithms are included: eigenvalue modification, Cholesky with an added multiple of the identity, the Gill, Murray, and Wright modified Cholesky algorithm (GMW81), and the Schnabel and Eskow 1999 algorithm (SE99).
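These keep Newton-type methods stable when the Hessian is not positive definite. The "added multiple of the identity" approach can be sketched as follows; this follows the usual textbook recipe of retrying the factorization with a growing shift tau, and is not minfx's own code:

```python
def cholesky(a):
    """Plain Cholesky factorization; raises ValueError if the matrix is
    not positive definite."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = a[i][j] - sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                if s <= 0.0:
                    raise ValueError("not positive definite")
                L[i][i] = s ** 0.5
            else:
                L[i][j] = s / L[j][j]
    return L

def modified_hessian(h, beta=1e-3):
    """Cholesky with an added multiple of the identity (textbook sketch):
    increase tau until H + tau*I admits a Cholesky factorization."""
    n = len(h)
    min_diag = min(h[i][i] for i in range(n))
    tau = 0.0 if min_diag > 0.0 else -min_diag + beta
    while True:
        shifted = [[h[i][j] + (tau if i == j else 0.0) for j in range(n)]
                   for i in range(n)]
        try:
            cholesky(shifted)
            return shifted, tau
        except ValueError:
            tau = max(2.0 * tau, beta)  # Double the shift and retry.

# An indefinite Hessian: the shift tau must exceed 2 to make it positive definite.
shifted, tau = modified_hessian([[-2.0, 0.0], [0.0, 1.0]])
```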
Overall, the minfx project offers an impressively comprehensive range of numerical optimization algorithms that should meet the needs of nearly any programmer. Whether you're looking for local optimization, trust region methods, conjugate gradient methods, or anything in between, this package is definitely worth checking out.
Version 1.0.2