
Quadratic Form Minimization Exercises?
I am trying to internalize this video and perhaps work an example or two. I understand the professor's point that quadratic form minimization can be used to solve Ax = b from linear algebra, using gradient descent from calculus (seemingly an exercise for computers more so than humans), but does it work the other way as well, allowing would-be calculus optimization problems to be solved via linear algebra?
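For the first direction (gradient descent on the quadratic form to solve Ax = b), here is a minimal sketch of how I understand it. The specific matrix, the exact line-search step size, and the function name are my own choices for illustration, not from the video:

```python
import numpy as np

# Sketch: for a symmetric positive definite A, the gradient of
# f(x) = (1/2) x^T A x - b^T x is A x - b, so descending the
# gradient of f drives x toward the solution of A x = b.
def solve_by_gradient_descent(A, b, tol=1e-10, max_iter=10_000):
    x = np.zeros_like(b)
    for _ in range(max_iter):
        r = b - A @ x                  # residual = negative gradient
        if np.linalg.norm(r) < tol:
            break
        alpha = (r @ r) / (r @ A @ r)  # exact line-search step size
        x = x + alpha * r
    return x

A = np.array([[2.0, -1.0],
              [-1.0, 6.0]])            # an example SPD matrix
b = np.array([-2.0, 0.0])
x = solve_by_gradient_descent(A, b)
print(x, np.allclose(A @ x, b))
```

The exact step size here comes from minimizing f along the residual direction (steepest descent); a small fixed step would also converge, just more slowly.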
For example, I'm imagining taking a given quadratic equation such as z = x^2 + 2x + 3y^2 - xy + 9, and solving it by first rewriting the RHS in the form (1/2)[x, y]A[x, y]^T - [x, y]b + c, where A is a symmetric matrix, b is a vector of constants, and c is a scalar constant, and then solving A[x y]^T = b for [x y]^T, where the resulting x and y are respectively the x- and y-coordinates of the minimum that would have been obtained using optimization from MV calculus. Then the z-coordinate could be found trivially. Is that the idea?
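Working that example through (assuming the sign between 3y^2 and xy is a minus, and matching coefficients term by term), the quadratic part x^2 + 3y^2 - xy gives A = [[2, -1], [-1, 6]] and the linear part +2x gives b = [-2, 0]:

```python
import numpy as np

# Sketch of the proposed idea, assuming the example is
# f(x, y) = x^2 + 2x + 3y^2 - x*y + 9, rewritten as
# (1/2) v^T A v - b^T v + c with v = [x, y].
A = np.array([[2.0, -1.0],
              [-1.0, 6.0]])  # from x^2 + 3y^2 - xy (note the factor of 1/2)
b = np.array([-2.0, 0.0])    # from +2x = -(b^T v)
c = 9.0

# A is symmetric positive definite, so the unique minimizer solves A v = b.
assert np.all(np.linalg.eigvalsh(A) > 0)
v_min = np.linalg.solve(A, b)

def f(v):
    return 0.5 * v @ A @ v - b @ v + c

z_min = f(v_min)                 # z-coordinate, found "trivially"
grad = A @ v_min - b             # calculus cross-check: grad f = 0 here
print(v_min, z_min, np.linalg.norm(grad))
```

This agrees with setting the partial derivatives 2x + 2 - y = 0 and 6y - x = 0 from MV calculus: both routes give x = -12/11, y = -2/11.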
I haven't found any such exercises online, and I don't know how to make up a quadratic equation which yields a positive definite A, which I understand to be essential for quadratic form minimization. Any help here would be appreciated.
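On manufacturing exercises with a positive definite A: one recipe I've seen (the variable names below are my own) is to take any matrix M and form A = M^T M + eps*I, which is automatically symmetric positive definite, then pick b and c freely:

```python
import numpy as np

# Sketch: generate a quadratic-form minimization exercise with a
# guaranteed positive definite A, so it has a unique minimum.
rng = np.random.default_rng(0)
M = rng.standard_normal((2, 2))
A = M.T @ M + 0.1 * np.eye(2)   # M^T M is PSD; adding eps*I makes it PD
b = rng.standard_normal(2)
c = rng.standard_normal()

assert np.all(np.linalg.eigvalsh(A) > 0)   # confirm positive definite
v_star = np.linalg.solve(A, b)             # the minimizer

def f(v):
    return 0.5 * v @ A @ v - b @ v + c

# Sanity check: nudging v_star in any coordinate direction increases f.
for d in np.eye(2):
    assert f(v_star + 1e-3 * d) > f(v_star)
print(v_star)
```

Expanding (1/2)v^T A v - b^T v + c back into scalar form then gives a concrete z = ... equation to pose as the exercise.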