Convex Optimization. Mikhail Moklyachuk


      1) whose determinant is not zero. Let this be the matrix composed of the first m + 1 columns of the matrix A. We construct the function F : ℝm+1 → ℝm+1 with the help of the functions fk(x), k = 0, …, m. Let

Fk(x1, …, xm+1) = fk−1(x1, …, xm+1, x̂m+2, …, x̂n) − fk−1(x̂),  k = 1, …, m + 1.

      Here x1, …, xm+1 are variables, and x̂m+2, …, x̂n are fixed quantities. If x̂ is a solution of the constrained optimization problem, then F(x̂1, …, x̂m+1) = 0. The functions Fk(x1, …, xm+1), k = 1, …, m + 1, satisfy the conditions of the inverse function theorem. Take y = (ε, 0, …, 0). For sufficiently small values of ε, there exists a vector (x1(ε), …, xm+1(ε)) such that

F(x1(ε), …, xm+1(ε)) = (ε, 0, …, 0),

      that is

f0(x(ε)) = f0(x̂) + ε,  fi(x(ε)) = 0,  i = 1, …, m,

      where x(ε) = (x1(ε), …, xm+1(ε), x̂m+2, …, x̂n) and x(ε) → x̂ as ε → 0. This contradicts the fact that x̂ is a solution of the constrained optimization problem [1.2], since for both positive and negative values of ε there are vectors close to x̂ at which the function f0(x(ε)) takes values both smaller and larger than f0(x̂). □

      Thus, to determine the m + n + 1 unknowns λ0, λ1, …, λm, x̂1, …, x̂n, we have the n + m equations

λ0 f0′(x̂) + λ1 f1′(x̂) + … + λm fm′(x̂) = 0,  fi(x̂) = 0,  i = 1, …, m.

      Note that the Lagrange multipliers are determined up to proportionality. If it is known that λ0 ≠ 0, then we can choose λ0 = 1. Then the number of equations and the number of unknowns are the same.
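      With λ0 normalized to 1, the stationarity conditions together with the constraints form a square system. As a sketch of this (my own example, not taken from the book), consider minimizing f0(x) = x1² + x2² subject to x1 + x2 − 1 = 0; here the resulting system is linear and can be solved directly:

```python
import numpy as np

# Illustrative example (not from the book): minimize
#   f0(x) = x1^2 + x2^2   subject to   f1(x) = x1 + x2 - 1 = 0.
# With lambda0 = 1, the Lagrange function is
#   L(x, lam) = x1^2 + x2^2 + lam*(x1 + x2 - 1),
# and stationarity dL/dx = 0 together with the constraint give a
# square linear system in the unknowns (x1, x2, lam):
#   2*x1 + lam = 0
#   2*x2 + lam = 0
#   x1 + x2    = 1
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0]])
b = np.array([0.0, 0.0, 1.0])

# Solution of the stationarity system: x1 = x2 = 1/2, lam = -1.
x1, x2, lam = np.linalg.solve(A, b)
print(x1, x2, lam)
```

      Three equations, three unknowns, exactly as the count above predicts once λ0 is fixed.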

      Linear independence of the vectors of derivatives f1′(x̂), …, fm′(x̂) is the regularity condition that guarantees that λ0 ≠ 0. However, verification of this condition is more complicated than direct verification of the fact that λ0 cannot be equal to zero.

      Since the time of Lagrange, for almost a century, the method of Lagrange multipliers has been used with λ0 = 1, despite the fact that in the general case this is wrong.
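      A standard illustration of this failure (my example, not taken from this book) is a problem whose only admissible point is degenerate:

```latex
\[
f_0(x) = x_1 \to \min, \qquad f_1(x) = x_1^2 + x_2^2 = 0.
\]
% The only admissible point is \(\hat{x} = (0, 0)\), where \(f_1'(\hat{x}) = (0, 0)\).
% The stationarity condition
\[
\lambda_0 f_0'(\hat{x}) + \lambda_1 f_1'(\hat{x})
  = \lambda_0 (1, 0) + \lambda_1 (0, 0) = (0, 0)
\]
% forces \(\lambda_0 = 0\): with \(\lambda_0 = 1\) the multiplier rule
% produces no stationary point at all, although \(\hat{x}\) is the minimum.
```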

      Denote by

L′′xx(x̂, λ, λ0) = ( ∂²L(x̂, λ, λ0)/∂xi ∂xj ),  i, j = 1, …, n,

      the matrix of the second-order partial derivatives of the Lagrange function L(x, λ, λ0).

      THEOREM 1.17.– Let the functions fi(x), i = 0, 1, …, m, be twice differentiable at a point x̂ and continuously differentiable in some neighborhood U of the point x̂, and let the gradients f1′(x̂), …, fm′(x̂) be linearly independent. If x̂ is a point of local minimum of the problem [1.2], then

⟨L′′xx(x̂, λ, λ0)h, h⟩ ≥ 0

      for all λ, λ0 satisfying the condition

λ0 f0′(x̂) + λ1 f1′(x̂) + … + λm fm′(x̂) = 0,

      and all h ∈ ℝn such that

⟨fi′(x̂), h⟩ = 0,  i = 1, …, m.

      THEOREM 1.18.– Let the functions fi(x), i = 0, 1, …, m, be twice differentiable at a point x̂ ∈ ℝn satisfying the conditions

fi(x̂) = 0,  i = 1, …, m.

      Assume that for some λ, λ0 the condition holds

λ0 f0′(x̂) + λ1 f1′(x̂) + … + λm fm′(x̂) = 0,

      and, in addition,

⟨L′′xx(x̂, λ, λ0)h, h⟩ > 0

      for all non-zero h ∈ ℝn satisfying the condition

⟨fi′(x̂), h⟩ = 0,  i = 1, …, m.

      Then x̂ is a point of local minimum of the problem [1.2].
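      The tangent-space condition in theorems 1.17 and 1.18 can be checked numerically. The sketch below (my own code, assuming a candidate point, multipliers, Hessian and constraint Jacobian are already known) tests positive definiteness of the Hessian of the Lagrange function restricted to the null space of the constraint Jacobian:

```python
import numpy as np

# Check the sufficient condition <L''xx(x^, lam, lam0) h, h> > 0 for all
# non-zero h with <fi'(x^), h> = 0, by forming the reduced Hessian
# Z^T L''xx Z, where the columns of Z span the null space of the Jacobian.
def reduced_hessian_is_positive_definite(hess, jac, tol=1e-10):
    """hess: n x n Hessian L''xx; jac: m x n Jacobian of the constraints."""
    # Orthonormal null-space basis Z of {h : jac @ h = 0} via the SVD.
    u, s, vt = np.linalg.svd(jac)
    rank = int(np.sum(s > tol))
    z = vt[rank:].T                      # n x (n - rank) basis matrix
    if z.shape[1] == 0:                  # tangent space is {0}: vacuously true
        return True
    reduced = z.T @ hess @ z
    return bool(np.all(np.linalg.eigvalsh(reduced) > tol))

# Example data: f0 = x1^2 + x2^2, f1 = x1 + x2 - 1 at x^ = (1/2, 1/2),
# lam0 = 1, lam1 = -1; here L''xx is constant.
hess = np.array([[2.0, 0.0], [0.0, 2.0]])   # L''xx(x^, lam, lam0)
jac = np.array([[1.0, 1.0]])                # f1'(x^)
print(reduced_hessian_is_positive_definite(hess, jac))
```

      Working in the null-space basis avoids testing definiteness of the full Hessian, which may be indefinite on ℝn even when the sufficient condition holds on the tangent space.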

      We can formulate the following rule of indeterminate Lagrange multipliers for solving constrained optimization problems with equality constraints:

      1) Write the Lagrange function

L(x, λ, λ0) = λ0 f0(x) + λ1 f1(x) + … + λm fm(x).

      2) Write the necessary conditions for the extremum of the function L, that is:

∂L/∂xj = 0,  j = 1, …, n.

      3) Find the stationary points, that is, the solutions of these equations, provided that not all Lagrange multipliers λ0, λ1, …, λm are zero.

      4) Find a solution of the problem among the stationary points or prove that the problem has no solutions.
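      The four steps above can be sketched end to end on a small problem of my own choosing (not one of the book's exercises):

```python
# Apply the four-step rule to  f0(x) = x1 + x2 -> min  subject to
# f1(x) = x1^2 + x2^2 - 2 = 0.
#
# Step 1: L(x, lam) = (x1 + x2) + lam*(x1^2 + x2^2 - 2), taking lam0 = 1
#         (the constraint gradient vanishes nowhere on the circle).
# Step 2: 1 + 2*lam*x1 = 0,  1 + 2*lam*x2 = 0,  x1^2 + x2^2 = 2.
# Step 3: stationarity forces x1 = x2 = -1/(2*lam); the constraint then
#         gives x1 = 1 (lam = -1/2) or x1 = -1 (lam = 1/2).
candidates = [(1.0, 1.0), (-1.0, -1.0)]

def f0(x):
    return x[0] + x[1]

# Step 4: pick the stationary point with the smallest objective value.
best = min(candidates, key=f0)
print(best, f0(best))  # (-1.0, -1.0) -2.0
```

      The minimum is attained at (−1, −1) with value −2; the other stationary point, (1, 1), is the constrained maximum.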

      1.4.2. Problems with equality and inequality constraints

      Let fi : ℝn → ℝ be differentiable functions of n real variables. The constrained optimization problem with equality and inequality constraints is the following problem:
