Non-convex optimization

combat1818
Hi, I've been struggling with this task and would really appreciate some help:
We are given a differentiable, L-smooth, non-convex function [math]f: \mathbb{R}^{d} \to \mathbb{R}[/math] with a global minimum [math]x^{*}[/math]. We use a version of gradient descent of the form
[math]x_{t+1} = x_{t} - \eta_{t} \frac{\nabla f(x_{t})}{\|\nabla f(x_{t})\| + \beta_{t}}[/math]
where [math]\eta_t, \beta_t > 0[/math]. We want to find [math]\eta_t, \beta_t[/math] which do not depend on the parameter L and which guarantee
[math]\frac{1}{T} \sum_{t=0}^{T-1} \|\nabla f(x_t)\| = \tilde{O}(T^{-1/2}),[/math]
where the tilde in the O notation means logarithmic factors are ignored.
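
Just to make the setup concrete, here is a minimal Python sketch of the iteration on a toy non-convex objective. The schedules [imath]\eta_t = 1/\sqrt{t+1}[/imath] and [imath]\beta_t = 1/(t+1)[/imath] are only placeholders, since finding the right [imath]\eta_t, \beta_t[/imath] is exactly what the problem asks for:

[code]
import numpy as np

# Toy non-convex objective (just for illustration): f(x) = sum(x^2) + sum(sin(x))
def grad_f(x):
    return 2 * x + np.cos(x)

d = 5
T = 10_000
x = np.random.randn(d)

grad_norms = []
for t in range(T):
    g = grad_f(x)
    gnorm = np.linalg.norm(g)
    # Placeholder schedules -- these are NOT claimed to achieve the bound;
    # choosing eta_t, beta_t (independent of L) is the open part of the question.
    eta_t = 1.0 / np.sqrt(t + 1)
    beta_t = 1.0 / (t + 1)
    x = x - eta_t * g / (gnorm + beta_t)
    grad_norms.append(gnorm)

# The quantity the bound is about: the running average of gradient norms
print("average gradient norm over T steps:", np.mean(grad_norms))
[/code]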
Hello. Please share the work you've done so far. Thank you!
