Hi all,
I am currently facing the following optimization problem:
f(v) = rf + v′∗μ + (1/2)∗v′∗diag(Σ) − (1/2)∗v′∗Σ∗v + ((1−γ)/2)∗v′∗Σ∗v
s.t.
1′∗v = 1 (the elements of v sum to 1)
where Σ is the n×n variance-covariance matrix, diag(Σ) is the n×1 vector of variances (the diagonal of Σ), μ, γ and rf are constants, 1′ is the transpose of an n×1 vector of ones, and the variable of interest is the n×1 vector v = (v1, ..., vn)′.
I defined the Lagrangian as L = f(v) + λ∗(1′∗v − 1) and took the derivatives with respect to v and with respect to λ. That is where I am stuck: I cannot solve the resulting equations for λ and then for the variable of interest v. The goal is to find the vector v that maximizes f under the equality constraint above.
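For reference, writing the two derivatives out explicitly (assuming that μ is in fact an n×1 vector of expected returns, so that v′∗μ is a scalar, that Σ is invertible and symmetric, and that γ ≠ 0 — none of this is stated above, so please correct me if that reading is wrong), the first-order conditions are

∂L/∂v = μ + (1/2)∗diag(Σ) − Σ∗v + (1−γ)∗Σ∗v + λ∗1 = μ + (1/2)∗diag(Σ) − γ∗Σ∗v + λ∗1 = 0
∂L/∂λ = 1′∗v − 1 = 0

The short Python sketch below is only meant as a numerical sanity check for a candidate solution of this system; the data, the closed-form expressions for λ and v, and the use of scipy.optimize.minimize are illustrative assumptions on my part, not part of the original problem.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data, n = 3 (everything below is illustrative, not from the
# original problem). mu is assumed to be an n x 1 vector of expected returns.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
Sigma = A @ A.T + 3.0 * np.eye(3)        # positive-definite covariance matrix
mu = np.array([0.05, 0.07, 0.06])
gamma, rf = 4.0, 0.01
ones = np.ones(3)
d = np.diag(Sigma)                       # diag(Sigma) as an n-vector

def f(v):
    """Objective from the post (to be maximized)."""
    return (rf + v @ mu + 0.5 * v @ d
            - 0.5 * v @ Sigma @ v + 0.5 * (1.0 - gamma) * v @ Sigma @ v)

# Candidate from the first-order conditions above (assumes Sigma invertible)
Sigma_inv = np.linalg.inv(Sigma)
lam = (gamma - ones @ Sigma_inv @ (mu + 0.5 * d)) / (ones @ Sigma_inv @ ones)
v_candidate = Sigma_inv @ (mu + 0.5 * d + lam * ones) / gamma

# Numerical reference: maximize f subject to 1'v = 1 (SLSQP)
res = minimize(lambda v: -f(v), x0=np.full(3, 1.0 / 3.0),
               constraints=[{"type": "eq", "fun": lambda v: v.sum() - 1.0}])

print("closed-form candidate:", v_candidate)
print("numerical maximizer:  ", res.x)   # should agree up to solver tolerance
```

If the candidate from the first-order conditions and the numerical maximizer agree, the system above was solved correctly. Note also that the two quadratic terms combine to −(γ/2)∗v′∗Σ∗v, so the objective is concave whenever γ > 0 and Σ is positive definite, and the stationary point is indeed the constrained maximum.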
Could anybody help on how to solve this problem?
Thank you very much in advance.