Multi-parametric Optimization and Control. Efstratios N. Pistikopoulos

conditions are the Karush–Kuhn–Tucker (KKT) Necessary Conditions and they are the basis for the solution of nonlinear optimization problems.

      1.1.2.2 Karush–Kuhn–Tucker First-Order Sufficient Optimality Conditions

      Consider the sets $J_1 = \{j : \bar{\mu}_j > 0\}$ and $J_2 = \{j : \bar{\mu}_j < 0\}$, where $\bar{\mu}_j$ denotes the Lagrange multiplier of the $j$th equality constraint at the KKT point $\bar{x}$. Then, if the following conditions hold:

        $f(x)$ is pseudo-convex at $\bar{x}$ with respect to all other feasible points $x$.

        $g_i(x)$ for all active inequality constraints $i$ are quasi-convex at $\bar{x}$ with respect to all other feasible points $x$.

        $h_j(x)$ for all $j \in J_1$ are quasi-convex at $\bar{x}$ with respect to all other feasible points $x$.

        $h_j(x)$ for all $j \in J_2$ are quasi-concave at $\bar{x}$ with respect to all other feasible points $x$.

      then $\bar{x}$ is a global optimum of problem (1.10). If the aforementioned conditions hold only within a ball of radius $\varepsilon > 0$ around $\bar{x}$, then $\bar{x}$ is a local optimum of problem (1.10).
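As a minimal numerical sketch (the problem instance is hypothetical, not from the text), the following checks a KKT point for the convex program $\min\, x_1^2 + x_2^2$ subject to $1 - x_1 - x_2 \le 0$. The objective is convex (hence pseudo-convex) and the constraint is linear (hence quasi-convex), so the sufficient conditions above certify a global optimum:

```python
import numpy as np

# Hypothetical instance: min x1^2 + x2^2  s.t.  g(x) = 1 - x1 - x2 <= 0.
# Candidate KKT point and multiplier, found analytically:
x_bar = np.array([0.5, 0.5])
lam = 1.0

grad_f = 2 * x_bar               # gradient of the objective at x_bar
grad_g = np.array([-1.0, -1.0])  # gradient of the (linear) constraint

stationarity = grad_f + lam * grad_g       # should be ~0
complementarity = lam * (1 - x_bar.sum())  # lam * g(x_bar) should be 0

print(np.allclose(stationarity, 0), np.isclose(complementarity, 0))
# f convex (pseudo-convex) and g linear (quasi-convex), so the
# sufficient conditions make x_bar a global optimum.
```

Since both checks return `True`, $\bar{x} = (0.5, 0.5)$ with $\bar{\lambda} = 1$ satisfies the KKT conditions, and convexity upgrades this to global optimality.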

      1.1.3 Interpretation of Lagrange Multipliers

Let $\bar{x}$ be the global minimum of problem (1.12), and assume that the gradients of the equality constraints are linearly independent. In addition, assume that the corresponding Lagrange multiplier vector is $\bar{\lambda}$. The vector $\Delta b$ is a perturbation vector. The solution of problem (1.12) is a function of the perturbation vector along with the multiplier. Hence, the Lagrange function can be written as

      (1.13) $L(x, \lambda; \Delta b) = f(x) + \lambda^T \left( h(x) - \Delta b \right)$

      Calculating the partial derivative of the Lagrange function with respect to the perturbation vector, we have

      (1.14) $\dfrac{\partial L(x, \lambda; \Delta b)}{\partial \Delta b} = -\lambda$

      which yields

      (1.15) $\dfrac{\partial f(\bar{x})}{\partial \Delta b} = -\bar{\lambda}$

      Hence, the Lagrange multipliers can be interpreted as a measure of sensitivity of the objective function with respect to the perturbation vector of the constraints at the optimum point $\bar{x}$.
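This interpretation can be checked numerically on a hypothetical instance (not from the text): for $\min\, x_1^2 + x_2^2$ subject to $x_1 + x_2 = b$, the optimal value is $f^*(b) = b^2/2$, and with the convention $L = f + \lambda\,(h(x) - b)$ the multiplier at the optimum is $\bar{\lambda} = -b$. A finite difference confirms that $\partial f^*/\partial b = -\bar{\lambda}$:

```python
import numpy as np

# Hypothetical instance: min x1^2 + x2^2  s.t.  x1 + x2 = b.
# Analytic solution: x = (b/2, b/2), so f*(b) = b^2 / 2.
# With L = f + lam*(h(x) - b), stationarity 2*x + lam = 0 gives lam = -b.

def f_star(b):
    return b**2 / 2.0

b0 = 3.0
lam = -b0        # Lagrange multiplier at the optimum for this convention
eps = 1e-6
dfdb = (f_star(b0 + eps) - f_star(b0 - eps)) / (2 * eps)  # central difference

print(np.isclose(dfdb, -lam))  # sensitivity of f* equals -lambda
```

The sign of the multiplier depends on how the Lagrange function is written; what is invariant is that the multiplier measures the rate of change of the optimal objective with respect to the constraint perturbation.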

      1.2.1 Basic Sensitivity Theorem

      (1.16) $z(\theta) = \min_{x} f(x, \theta) \quad \text{s.t.} \quad g_i(x, \theta) \leq 0, \;\; \forall i \in \mathcal{I}, \qquad h_j(x, \theta) = 0, \;\; \forall j \in \mathcal{J},$

      where $x \in \mathbb{R}^n$ is the vector of the continuous optimization variables, $\theta \in \mathbb{R}^q$ is the vector of the uncertain parameters, and the sets $\mathcal{I}$, $\mathcal{J}$ correspond to the inequality and equality constraint sets, respectively.

      Theorem 1.1 (Basic Sensitivity Theorem, [1])

       Let a general multi-parametric programming problem be described by (1.16). Assume that the functions defining problem (1.16) are twice differentiable in $x$, and that their gradients with respect to $x$ and the constraints are once continuously differentiable in $\theta$ in a neighborhood of $\theta_0$. In addition, assume that the second-order sufficient conditions for a local minimum of the problem hold at $x_0$ with associated Lagrange multipliers $\lambda_0$ and $\mu_0$. Lastly, let the gradients $\nabla_x g_i(x_0, \theta_0)$ (for $i$ such that $g_i(x_0, \theta_0) = 0$) and $\nabla_x h_j(x_0, \theta_0)$ be linearly independent (i.e. LICQ holds), and let $\lambda_{0,i} > 0$ for $i$ such that $g_i(x_0, \theta_0) = 0$, i.e. strict complementary slackness (SCS) holds.

       Then, the first‐order sensitivity results for a second‐order local minimizing point are known as the basic sensitivity theorem (BST), and the following properties hold:

        $x_0$ is a local isolated minimizing point of the problem and the associated Lagrange multipliers $\lambda_0$ and $\mu_0$ are unique.

       For $\theta$ in a neighborhood of $\theta_0$, there exists a unique, continuously differentiable vector function $y(\theta) = \left[ x(\theta)^T, \lambda(\theta)^T, \mu(\theta)^T \right]^T$ satisfying the second-order sufficient conditions for a local minimum of the problem with associated unique Lagrange multipliers $\lambda(\theta)$ and $\mu(\theta)$.

       For $\theta$ near $\theta_0$, the set of binding inequalities is unchanged, SCS holds, and the binding constraint gradients are linearly independent at $x(\theta)$.

       Proof

      See [1].
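A minimal sketch of the sensitivity computation behind the theorem, on a hypothetical one-dimensional instance (not from the text): for $\min\, \frac{1}{2}x^2 - \theta x$ subject to $x \le 1$, at $\theta_0 = 2$ the constraint is active with $x_0 = 1$ and $\lambda_0 = \theta_0 - x_0 = 1 > 0$, so SCS and LICQ hold. Differentiating the KKT system via the implicit function theorem gives the first-order sensitivities:

```python
import numpy as np

# Hypothetical instance: min 0.5*x^2 - theta*x  s.t.  x <= 1.
# At theta0 = 2 the constraint is active: x0 = 1, lambda0 = 1 > 0 (SCS holds).
theta0, x0, lam0 = 2.0, 1.0, 1.0

# KKT system F(y, theta) = 0 with y = (x, lambda):
#   F1 = x - theta + lambda      (stationarity)
#   F2 = lambda * (x - 1)        (complementarity)
J = np.array([[1.0, 1.0],        # dF/dy evaluated at (x0, lam0, theta0)
              [lam0, x0 - 1.0]])
dF_dtheta = np.array([-1.0, 0.0])

# Implicit function theorem: dy/dtheta = -(dF/dy)^{-1} (dF/dtheta)
dy_dtheta = -np.linalg.solve(J, dF_dtheta)
print(dy_dtheta)  # dx/dtheta = 0 (x pinned at the bound), dlambda/dtheta = 1
```

Here $x(\theta)$ stays pinned at the active constraint, so $dx/d\theta = 0$, while the multiplier grows one-for-one with $\theta$ ($\lambda(\theta) = \theta - 1$), exactly as the differentiated KKT system predicts.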

      (1.17) $M_0 = \begin{bmatrix} \nabla^2_x L & \nabla_x g_1 & \cdots & \nabla_x g_p & \nabla_x h_1 & \cdots & \nabla_x h_q \\ -\lambda_1 \nabla^T_x g_1 & -g_1 & & & & & \\ \vdots & & \ddots & & & & \\ -\lambda_p \nabla^T_x g_p & & & -g_p & & & \\ \nabla^T_x h_1 & & & & & & \\ \vdots & & & & & & \\ \nabla^T_x h_q & & & & & & \end{bmatrix}$

      and the vector $N_0$ is defined as follows:

      (1.18) $N_0 = \left[ \nabla^2_{\theta x} L, \; -\lambda_1 \nabla^T_\theta g_1, \; \ldots, \; -\lambda_p \nabla^T_\theta g_p, \; \nabla^T_\theta h_1, \; \ldots, \; \nabla^T_\theta h_q \right]^T$

      Furthermore, if there exists $(M_0)^{-1}$ for which

      (1.19) $\left. \dfrac{dy(\theta)}{d\theta} \right|_{\theta_0} = -(M_0)^{-1} N_0$
