[Optimization] Question about the derivative with respect to a Lagrange multiplier

zero064 · New member · Joined Nov 6, 2015 · Messages: 2
The Lagrangian dual function is defined as follows:

\(\displaystyle \psi(\lambda) \,=\, \min_{\mathbf{x}}\, T(\mathbf{x}) \,=\, \min_{\mathbf{x}} \left[\, f(\mathbf{x}) \,+\, \sum_{j=1}^{\ell} \lambda_j\, h_j(\mathbf{x}) \,+\, \dfrac{1}{2}\, r \sum_{j=1}^{\ell} \left[h_j(\mathbf{x})\right]^2 \right]\)

Here T is the function:

\(\displaystyle T(\mathbf{x}) \,=\, f(\mathbf{x}) \,+\, \sum_{j=1}^{\ell} \lambda_j\, h_j(\mathbf{x}) \,+\, \dfrac{1}{2}\, r \sum_{j=1}^{\ell} \left[h_j(\mathbf{x})\right]^2\)
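To make T concrete, here is a small numerical sketch on a toy instance of my own (not from the original problem): \(f(\mathbf{x}) = x_1^2 + x_2^2\) with the single constraint \(h(\mathbf{x}) = x_1 + x_2 - 1 = 0\), so \(\ell = 1\). It verifies that the gradient of T with respect to x vanishes at the inner minimizer, which is the fact used below.

```python
import numpy as np

# Toy instance (illustrative only): f(x) = x1^2 + x2^2, one constraint
# h(x) = x1 + x2 - 1, multiplier lam and penalty weight r chosen arbitrarily.
r, lam = 10.0, 0.7

def h(x):
    return x[0] + x[1] - 1.0

def T(x):
    return x @ x + lam * h(x) + 0.5 * r * h(x) ** 2

# By symmetry the minimizer of T has x1 = x2 = t, and setting dT/dt = 0
# gives 4t + 2*lam + 2*r*(2t - 1) = 0, i.e. t = (r - lam) / (2 + 2r).
t = (r - lam) / (2.0 + 2.0 * r)
x_star = np.array([t, t])

# Check: grad_x T = 2x + (lam + r*h(x)) * (1, 1) vanishes at x_star.
grad = 2 * x_star + (lam + r * h(x_star)) * np.array([1.0, 1.0])
print(np.abs(grad).max())  # ~0 (up to floating-point rounding)
```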

Therefore, the total derivative of \(\psi\) with respect to \(\lambda_i\) should be:

\(\displaystyle \dfrac{d \psi}{d \lambda_i} \,=\, \dfrac{\partial T}{\partial \lambda_i} \,+\, \nabla_{\mathbf{x}} T^{\mathsf{T}}\, \dfrac{d \mathbf{x}}{d \lambda_i}\)

Because the gradient of T with respect to x is zero at the minimizer, the second term vanishes. But I don't understand why the final result is this:

\(\displaystyle \dfrac{d \psi}{d \lambda_i} \,=\, h_i(\mathbf{x}(\lambda)), \qquad i = 1, 2, \ldots, \ell\)
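This result can be checked numerically on the same kind of toy instance (my own example, not from the original problem): \(f(\mathbf{x}) = x_1^2 + x_2^2\) with \(h(\mathbf{x}) = x_1 + x_2 - 1\). A central finite difference of \(\psi\) in \(\lambda\) agrees with \(h(\mathbf{x}(\lambda))\):

```python
import numpy as np

# Toy instance (illustrative only): f(x) = x1^2 + x2^2, one constraint
# h(x) = x1 + x2 - 1, penalty weight r chosen arbitrarily.
r = 10.0

def h(x):
    return x[0] + x[1] - 1.0

def x_of_lam(lam):
    # Closed-form inner minimizer: x1 = x2 = (r - lam) / (2 + 2r).
    t = (r - lam) / (2.0 + 2.0 * r)
    return np.array([t, t])

def psi(lam):
    x = x_of_lam(lam)
    return x @ x + lam * h(x) + 0.5 * r * h(x) ** 2

lam0, eps = 0.7, 1e-6
fd = (psi(lam0 + eps) - psi(lam0 - eps)) / (2 * eps)  # central difference
print(fd, h(x_of_lam(lam0)))  # both are approximately -0.1545
```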

In my view, x is also a function of λ, so it should depend on λ. Therefore the partial derivative should be

\(\displaystyle h_i(\mathbf{x}(\lambda)) \,+\, \lambda_i\, \dfrac{d}{d \lambda_i}\, h_i(\mathbf{x}(\lambda))\)

Why does the second term vanish? What is wrong with my reasoning? My classmate also asked the professor this question, but he didn't answer.
 