Welcome to oppy’s documentation!
The optimization package oppy is implemented in the programming language Python. Besides algorithms for solving constrained, unconstrained and nonlinear optimization problems, the package contains built-in iterative methods for solving linear systems.
Advanced optimization methods are included, such as SQP (Sequential Quadratic Programming), the Augmented Lagrangian method and several Newton-type methods. Furthermore, a number of Krylov subspace methods are implemented for solving linear systems in a numerically stable way.
The goal is to provide a straightforward integration of the library into other applications so that other methods can benefit from it. Currently we are working on combining optimal control problems with model order reduction methods.
For access, further questions, remarks and ideas please contact .
The idea behind oppy was to provide the optimization methods that are used frequently in the group of Prof. Dr. Volkwein. Over time, oppy grew into a complete optimization package.
The package is still under development. If you want to install oppy, use
pip install git+https://gitlab.inf.uni-konstanz.de/agvolkwein/oppy
Available subpackages
conOpt
Subpackage which provides methods for constrained optimization, i.e. for problems which are subject to equality and inequality constraints like
\[\min f(x)\]\[\text{s. t. } e(x) = 0\]\[g(x) \leq 0\]
we can use
- Penalty Method
- Augmented Lagrangian Method
- SQP with a BFGS update strategy (at the moment equality constraints only)
and for box-constrained problems like
\[\min f(x)\]\[\text{s. t. } x_a \leq x \leq x_b\]
we can use
- Projected Gradient Method
- L-BFGS-B Method
- Projected Newton-Krylov Method (if you can provide the action of the second derivative)
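To illustrate the idea behind the box-constrained solvers, here is a minimal sketch of a projected gradient method in plain Python. This is not oppy's interface; all function names and parameters below are purely illustrative, and the fixed step size is a simplification (oppy's methods use proper line searches).

```python
# Minimal sketch of a projected gradient method for
#   min f(x)  s.t.  xa <= x <= xb
# (illustrative only, not oppy's API).

def project(x, xa, xb):
    """Project x componentwise onto the box [xa, xb]."""
    return [min(max(xi, a), b) for xi, a, b in zip(x, xa, xb)]

def projected_gradient(grad, x0, xa, xb, step=0.1, tol=1e-8, max_iter=1000):
    x = project(x0, xa, xb)
    for _ in range(max_iter):
        g = grad(x)
        # Take a gradient step, then project back onto the box.
        x_new = project([xi - step * gi for xi, gi in zip(x, g)], xa, xb)
        if max(abs(a - b) for a, b in zip(x, x_new)) < tol:
            return x_new
        x = x_new
    return x

# Minimize f(x) = (x1 - 2)^2 + (x2 + 1)^2 over the box [0, 1] x [0, 1];
# the unconstrained minimizer (2, -1) gets projected to the corner (1, 0).
grad = lambda x: [2.0 * (x[0] - 2.0), 2.0 * (x[1] + 1.0)]
x_opt = projected_gradient(grad, [0.5, 0.5], [0.0, 0.0], [1.0, 1.0])
print(x_opt)  # close to [1.0, 0.0]
```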
itMet
Iterative methods for solving linear systems like
\[Ax = b.\]
Here we can use either stationary methods like
- Jacobi
- Gauß-Seidel
- SOR
or Krylov subspace methods like
- Steepest Descent
- CG
- GMRES
For a future release we are planning to add preconditioning to the Krylov methods; the stationary methods will then be usable as preconditioners.
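As an illustration of the stationary methods, here is a bare-bones Jacobi iteration in plain Python (not oppy's implementation; the function signature is made up for this sketch). The Jacobi method converges, for example, for strictly diagonally dominant matrices.

```python
# Sketch of the Jacobi iteration for solving Ax = b
# (illustrative only, not oppy's interface).

def jacobi(A, b, x0, tol=1e-10, max_iter=500):
    n = len(b)
    x = list(x0)
    for _ in range(max_iter):
        # Each component is updated using only values from the previous iterate.
        x_new = [
            (b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
            for i in range(n)
        ]
        if max(abs(u - v) for u, v in zip(x, x_new)) < tol:
            return x_new
        x = x_new
    return x

# Strictly diagonally dominant 2x2 system: 4x + y = 5, x + 3y = 4,
# with exact solution (1, 1).
x = jacobi([[4.0, 1.0], [1.0, 3.0]], [5.0, 4.0], [0.0, 0.0])
print(x)  # close to [1.0, 1.0]
```

Gauß-Seidel differs only in that the inner sum already uses the freshly updated components of the current iterate.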
leastSquares
Subpackage which provides methods for linear and nonlinear least squares problems, e.g.
\[\min \|Ax - b\|_2\]
and
\[\min \frac{1}{2}\|f(x)\|_2^2\]
Right now we can solve these kinds of problems with the following methods.

- Linear Least Squares
  - linear least squares (solving the normal equations)
- Nonlinear Least Squares
  - Gauss-Newton algorithm with several choices
  - Levenberg-Marquardt with line search or trust-region algorithm
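For the linear case, solving the normal equations A^T A x = A^T b can be sketched directly for a straight-line fit, where A has the columns (1, t_i). The helper below is purely illustrative (not oppy's API) and solves the resulting 2x2 system by Cramer's rule.

```python
# Sketch: linear least squares via the normal equations A^T A x = A^T b
# for fitting y ~ c0 + c1 * t (illustrative only, not oppy's API).

def line_fit(ts, ys):
    n = len(ts)
    # Entries of A^T A and A^T y for A with rows (1, t_i).
    s1, st, stt = n, sum(ts), sum(t * t for t in ts)
    sy, sty = sum(ys), sum(t * y for t, y in zip(ts, ys))
    # Solve the 2x2 normal system by Cramer's rule.
    det = s1 * stt - st * st
    c0 = (sy * stt - st * sty) / det
    c1 = (s1 * sty - st * sy) / det
    return c0, c1

# Data lying exactly on y = 1 + 2t, so the residual is zero.
c0, c1 = line_fit([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
print(c0, c1)  # intercept 1, slope 2
```

For nonlinear problems, Gauss-Newton repeats this idea with the Jacobian of f taking the role of A in each iteration.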
linOpt
Linear optimization methods. With the methods in this subpackage you can solve linear programs of the form
\[\text{max } c^T x\]\[\text{s. t. } Ax \leq b\]\[x \geq 0\]
with or without integer constraints. For this kind of problem we have the following methods:
- Simplex
- Branch and Bound
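For intuition about what the simplex method computes: the optimum of such an LP is attained at a vertex of the feasible polytope. The sketch below brute-forces all vertices of a two-variable LP by intersecting constraint boundaries; the simplex method visits these vertices far more cleverly, and this is in no way oppy's implementation, just an illustration of the problem class.

```python
from itertools import combinations

# Brute-force vertex enumeration for  max c^T x  s.t.  Ax <= b, x >= 0
# in two variables (illustration only; real solvers use the simplex method).

def solve_2d_lp(c, A, b):
    # All constraint lines a1*x1 + a2*x2 <= r, including x1 >= 0 and x2 >= 0.
    lines = [(a[0], a[1], bi) for a, bi in zip(A, b)]
    lines += [(-1.0, 0.0, 0.0), (0.0, -1.0, 0.0)]
    best, best_val = None, float("-inf")
    for (a1, b1, r1), (a2, b2, r2) in combinations(lines, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel boundaries, no intersection point
        x = ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)
        # Keep the vertex only if it satisfies every constraint.
        if all(l1 * x[0] + l2 * x[1] <= r + 1e-9 for l1, l2, r in lines):
            val = c[0] * x[0] + c[1] * x[1]
            if val > best_val:
                best, best_val = x, val
    return best, best_val

# max 3 x1 + 2 x2  s.t.  x1 + x2 <= 4,  x1 <= 2,  x >= 0
x_opt, val = solve_2d_lp([3.0, 2.0], [[1.0, 1.0], [1.0, 0.0]], [4.0, 2.0])
print(x_opt, val)  # optimum at (2, 2) with value 10
```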
multOpt
Scalarization methods for solving (possibly box-constrained) multiobjective optimization problems of the form
\[\min (f_1(x), \ldots, f_k(x)),\]\[\text{s. t. } x_a \leq x \leq x_b.\]
The general idea of scalarization methods is to transform the multiobjective optimization problem into a series of scalar optimization problems, which can then be solved using methods from unconstrained or constrained optimization (see the subpackages unconOpt and conOpt). Here we can use the following three scalarization methods:
- Weighted-Sum Method (WSM)
- Euclidean Reference Point Method (ERPM)
- Pascoletti-Serafini Method (PSM)
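The Weighted-Sum Method is easy to sketch: each weight vector turns the multiobjective problem into one scalar problem, whose minimizer yields one point on the Pareto front. The code below is a generic illustration (not oppy's interface) using plain gradient descent as the inner scalar solver.

```python
# Sketch of the Weighted-Sum Method (WSM): for weights (w1, w2) solve the
# scalar problem  min w1*f1(x) + w2*f2(x)  and collect the resulting
# objective values (illustrative only, not oppy's API).

def gradient_descent(grad, x0, step=0.1, tol=1e-10, max_iter=10000):
    x = x0
    for _ in range(max_iter):
        x_new = x - step * grad(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

f1 = lambda x: x ** 2            # first objective, minimizer at 0
f2 = lambda x: (x - 2.0) ** 2    # second objective, minimizer at 2

pareto = []
for w1 in [0.0, 0.25, 0.5, 0.75, 1.0]:
    w2 = 1.0 - w1
    # Gradient of the weighted sum w1*f1 + w2*f2.
    grad = lambda x, w1=w1, w2=w2: 2.0 * w1 * x + 2.0 * w2 * (x - 2.0)
    x_w = gradient_descent(grad, x0=1.0)
    pareto.append((f1(x_w), f2(x_w)))

print(pareto)  # points along the Pareto front between (0, 4) and (4, 0)
```

ERPM and PSM replace the weighted sum by distance-to-reference-point and directional scalarizations, respectively, which also reach non-convex parts of the Pareto front.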
unconOpt
Subpackage which provides methods for unconstrained optimization, e.g.
\[\min_{x \in \mathbb{R}^n} f(x)\]
Right now we can solve this kind of problem with line-search-based first- and second-order methods:
- Gradient Method
- Newton Method
- Nonlinear CG (with different strategies like Fletcher-Reeves)
- Quasi-Newton Methods (with different strategies like BFGS, Broyden, DFP, …)
where we can use the following line search methods:
- Armijo
- Wolfe-Powell
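The interplay of descent method and line search can be sketched as follows: the gradient method picks the steepest descent direction, and an Armijo backtracking search picks the step length. This is a generic textbook sketch, not oppy's API; all names and parameters are illustrative.

```python
# Sketch of a gradient method with Armijo backtracking line search
# (illustrative only, not oppy's interface).

def armijo_step(f, grad_x, x, d, sigma=1e-4, beta=0.5, t0=1.0):
    """Backtrack until f(x + t*d) <= f(x) + sigma * t * <grad_x, d>."""
    t, fx = t0, f(x)
    slope = sum(g * di for g, di in zip(grad_x, d))  # negative for descent d
    while f([xi + t * di for xi, di in zip(x, d)]) > fx + sigma * t * slope:
        t *= beta
    return t

def gradient_method(f, grad, x0, tol=1e-8, max_iter=5000):
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if max(abs(gi) for gi in g) < tol:
            break
        d = [-gi for gi in g]              # steepest descent direction
        t = armijo_step(f, g, x, d)        # Armijo step length
        x = [xi + t * di for xi, di in zip(x, d)]
    return x

# Minimize the quadratic f(x) = (x1 - 1)^2 + 10 * x2^2 with minimizer (1, 0).
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * x[1] ** 2
grad = lambda x: [2.0 * (x[0] - 1.0), 20.0 * x[1]]
x_min = gradient_method(f, grad, [0.0, 1.0])
print(x_min)  # close to [1.0, 0.0]
```

Newton and quasi-Newton methods keep the same line search loop but replace the steepest descent direction by a (quasi-)Newton direction.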