################################
Welcome to oppy's documentation!
################################

.. figure:: oppypipe.gif
   :align: center

   Heat development of a heating pipe model with a defective cooling pipe.
The optimization package oppy is implemented in the programming
language `Python <https://www.python.org/>`_. Besides algorithms for
solving constrained, unconstrained and nonlinear optimization problems,
the package contains built-in iterative methods for solving linear systems.
Advanced optimization methods are included, such as SQP
(Sequential Quadratic Programming), the Augmented Lagrangian method and
several Newton-type methods. Furthermore, a number of Krylov methods are
implemented for solving linear systems in a numerically stable way.
The goal is to make the library straightforward to integrate into
other applications so that other methods can benefit from it.
Currently we are working on combining optimal control
problems with model order reduction methods.
For access, further questions, remarks and ideas, please
contact :email:`us `.
The idea behind oppy was to provide optimization methods that are used
frequently in the group of Prof. Dr. Volkwein. Over time, oppy
grew into a full optimization package.

The package is still under development. To install oppy, run

.. code-block:: bash

   pip install git+https://gitlab.inf.uni-konstanz.de/ag-volkwein/oppy
Available subpackages
=====================
conOpt
------
Subpackage which provides methods for constrained optimization, i.e. for
problems subject to equality and inequality constraints like
.. math:: \min f(x)
.. math:: \text{s. t. } e(x) = 0
.. math:: g(x) \leq 0
we can use
#. Penalty Method
#. Augmented Lagrangian Method
#. SQP with a BFGS update strategy (at the moment equality constraints only)
and for box-constrained problems like
.. math:: \min f(x)
.. math:: \text{s. t. } x_a \leq x \leq x_b
we can use
#. Projected Gradient Method
#. The L-BFGS-B Method
#. Projected Newton-Krylov Method (if you can provide the
action of the second derivative)
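
To illustrate the idea behind the projected gradient method for box
constraints, here is a minimal plain-NumPy sketch (this is not oppy's API;
all names here are our own): each gradient step is followed by a clipping
of the iterate back onto the box.

```python
import numpy as np

def projected_gradient(f, grad, x0, lower, upper,
                       step=0.1, tol=1e-8, max_iter=1000):
    """Minimize f over the box [lower, upper] by projected gradient descent."""
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    for _ in range(max_iter):
        # gradient step followed by projection onto the box
        x_new = np.clip(x - step * grad(x), lower, upper)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Minimize f(x) = ||x - c||^2 with c outside the box [0, 1]^2;
# the minimizer is the projection of c onto the box.
c = np.array([2.0, -1.0])
f = lambda x: np.sum((x - c) ** 2)
grad = lambda x: 2 * (x - c)
x_star = projected_gradient(f, grad, x0=[0.5, 0.5], lower=0.0, upper=1.0)
# x_star ≈ [1.0, 0.0]
```

A fixed step size is used here only for brevity; in practice one would
combine the projection with a line search.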
itMet
-----
Iterative methods for solving linear systems like
.. math:: Ax = b.
Here we can use either stationary methods like
#. Jacobi
#. Gauß-Seidel
#. SOR
or we can use Krylov methods like
#. Steepest Descent
#. CG
#. GMRES
In a future release we are planning to add preconditioning to the
Krylov methods; the stationary methods will then be available as
preconditioners.
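
To show how a stationary method works, here is a minimal plain-NumPy
Jacobi sketch (not oppy's API; the function name is illustrative): the
diagonal part of :math:`A` is inverted exactly while the off-diagonal
remainder is moved to the right-hand side.

```python
import numpy as np

def jacobi(A, b, x0=None, tol=1e-10, max_iter=500):
    """Solve Ax = b with the Jacobi iteration (A should be diagonally dominant)."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float)
    D = np.diag(A)             # diagonal part of A
    R = A - np.diagflat(D)     # off-diagonal remainder
    for _ in range(max_iter):
        # x_{k+1} = D^{-1} (b - R x_k)
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

A = np.array([[4.0, 1.0], [2.0, 5.0]])  # strictly diagonally dominant
b = np.array([1.0, 2.0])
x = jacobi(A, b)
# A @ x ≈ b
```

The iteration converges whenever the spectral radius of
:math:`D^{-1}(A - D)` is below one, e.g. for strictly diagonally
dominant matrices as in this example.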
leastSquares
------------
Subpackage which provides methods for linear and nonlinear least
squares problems, e.g.
.. math:: \min_x \|Ax - b\|_2
and
.. math:: \min_x \frac{1}{2}\|f(x)\|_2^2
Right now we can solve these kinds of problems with the following methods.
* Linear Least Squares
#. linear least squares (solving normal equation)
* Nonlinear Least Squares
#. Gauss-Newton algorithm (with several variants)
#. Levenberg-Marquardt with a line-search or trust-region strategy
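
The linear case can be sketched directly with NumPy (this is generic
linear algebra, not oppy's API): the minimizer of
:math:`\|Ax - b\|_2` satisfies the normal equations
:math:`A^T A x = A^T b`.

```python
import numpy as np

# Solve min ||Ax - b||_2 via the normal equations A^T A x = A^T b.
# (For ill-conditioned A, a QR- or SVD-based solver is preferable.)
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])
x = np.linalg.solve(A.T @ A, A.T @ b)

residual = A @ x - b
# optimality: the residual is orthogonal to the columns of A,
# i.e. A^T residual ≈ 0
```

The orthogonality of the residual to the range of :math:`A` is exactly
the first-order optimality condition of the least squares problem.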
linOpt
------
Linear optimization methods. With the methods in this subpackage you
can solve linear programs of the form
.. math:: \text{max } c^T x
.. math:: \text{s. t. } Ax \leq b
.. math:: x \geq 0
with or without integer constraints. For that kind of problems we have
the following methods:
#. Simplex
#. Branch and Bound
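
To make the problem class concrete, here is a tiny brute-force check for
a two-variable LP (this is *not* the simplex method and not oppy's API;
it only exploits the fact that an LP optimum lies at a vertex of the
feasible polytope, so for a tiny problem we can enumerate all vertices).

```python
import itertools
import numpy as np

# max 3x + 2y   s.t.  x + y <= 4,  x <= 2,  x >= 0,  y >= 0
c = np.array([3.0, 2.0])
A = np.array([[1.0, 1.0],    # x + y <= 4
              [1.0, 0.0],    # x <= 2
              [-1.0, 0.0],   # -x <= 0  (i.e. x >= 0)
              [0.0, -1.0]])  # -y <= 0  (i.e. y >= 0)
b = np.array([4.0, 2.0, 0.0, 0.0])

best, best_x = -np.inf, None
for i, j in itertools.combinations(range(len(b)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:
        continue                        # parallel boundaries, no vertex
    x = np.linalg.solve(M, b[[i, j]])   # intersection of two boundaries
    if np.all(A @ x <= b + 1e-9):       # keep only feasible vertices
        if c @ x > best:
            best, best_x = c @ x, x
# optimum: best == 10.0 at best_x == [2.0, 2.0]
```

The simplex method visits such vertices selectively instead of
enumerating all of them, which is what makes it practical in higher
dimensions.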
multOpt
-------
Scalarization methods for solving (possibly box-constrained) multiobjective
optimization problems of the form
.. math:: \min (f_1(x), \ldots, f_k(x)),
.. math:: \text{s. t. } x_a \leq x \leq x_b.
The general idea of scalarization methods is to transform the
multiobjective optimization problem into a series of scalar optimization
problems, which can then be solved using methods from unconstrained or
constrained optimization (see the subpackages unconOpt and conOpt). Here we
can use the following three scalarization methods:
#. Weighted-Sum Method (WSM)
#. Euclidean Reference Point Method (ERPM)
#. Pascoletti-Serafini Method (PSM)
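
The Weighted-Sum Method can be sketched on a toy bi-objective problem
(plain Python, not oppy's API): each weight vector yields one scalar
problem, and its minimizer is one Pareto-optimal point. Here both
objectives are quadratic, so the scalar problems are solved in closed
form; in general one would call an unconstrained solver (see unconOpt).

```python
import numpy as np

# min (f1(x), f2(x)) with f1(x) = x^2 and f2(x) = (x - 2)^2,
# scalarized as min_x  w*f1(x) + (1 - w)*f2(x)  for weights w in [0, 1].
pareto_points = []
for w in np.linspace(0.0, 1.0, 5):
    # stationarity: d/dx [w x^2 + (1-w)(x-2)^2] = 0  =>  x = 2(1 - w)
    x = 2.0 * (1.0 - w)
    pareto_points.append((x**2, (x - 2.0)**2))
# pareto_points traces the Pareto front between (4, 0) and (0, 4)
```

Sweeping the weights recovers the whole Pareto front here because the
problem is convex; for nonconvex fronts the WSM misses points, which is
one motivation for the ERPM and PSM.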
unconOpt
--------
Subpackage which provides methods for unconstrained optimization, e.g.
.. math:: \min_{x \in \mathbb{R}^n} f(x)
Right now we can solve these kinds of problems with line-search-based
first- and second-order methods.
#. Gradient Method
#. Newton Method
#. Nonlinear CG (with different strategies like Fletcher-Reeves)
#. Quasi-Newton Methods (with different strategies like
BFGS, Broyden, DFP, ...)
where we can use the following line search methods:
#. Armijo
#. Wolfe-Powell
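
Combining the gradient method with an Armijo backtracking line search
can be sketched as follows (plain NumPy, not oppy's API; all names are
our own): the step size is halved until the sufficient-decrease
condition holds.

```python
import numpy as np

def armijo_step(f, grad, x, d, alpha0=1.0, beta=0.5, c1=1e-4):
    """Backtracking Armijo line search along a descent direction d."""
    alpha, fx, slope = alpha0, f(x), c1 * grad(x) @ d
    # shrink alpha until f(x + alpha d) <= f(x) + c1 * alpha * grad(x)^T d
    while f(x + alpha * d) > fx + alpha * slope:
        alpha *= beta
    return alpha

def gradient_method(f, grad, x0, tol=1e-8, max_iter=1000):
    """Steepest descent with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x + armijo_step(f, grad, x, -g) * (-g)
    return x

# Minimize the convex quadratic f(x) = x1^2 + 10 x2^2.
f = lambda x: x[0]**2 + 10 * x[1]**2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
x_star = gradient_method(f, grad, [3.0, 1.0])
# x_star ≈ [0, 0]
```

The Wolfe-Powell rule additionally enforces a curvature condition on the
accepted step, which matters for the Quasi-Newton updates listed above.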
.. toctree::
   :maxdepth: 2
   :caption: Contents:

   What is oppy/index_whatisoppy.rst
   Installation/index_installation.md
   Notebooks_Web/index_notebooks.rst
   oppy - How To's/index_howtos.md
   oppy API/index_choosing.rst
   Hall of fame/index_hof.rst
   Literature/index_literature.rst
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`