arXiv Statistics @arxiv_stats@qoto.org

Acceleration and Implicit Regularization in Gaussian Phase Retrieval. (arXiv:2311.12888v1 [math.OC]) http://arxiv.org/abs/2311.12888

We study accelerated optimization methods in the Gaussian phase retrieval problem. In this setting, we prove that gradient methods with Polyak or Nesterov momentum enjoy implicit regularization similar to that of gradient descent. This implicit regularization ensures that the iterates remain in a benign region where the cost function is strongly convex and smooth, even though it is nonconvex in general, and hence that the accelerated methods achieve faster convergence rates than gradient descent. Experiments confirm that the accelerated methods indeed converge faster than gradient descent in practice.
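
As a rough, illustrative sketch of the setting (not code from the paper), the snippet below runs Polyak (heavy-ball) momentum on the standard quartic phase retrieval loss f(x) = (1/4m) sum_i ((a_i^T x)^2 - y_i)^2 with Gaussian sensing vectors. The loss form, the spectral-style initialization, and the step-size and momentum parameters are assumptions chosen for the demo, not values taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 100, 1000                     # signal dimension, number of measurements
    x_star = rng.standard_normal(n)
    x_star /= np.linalg.norm(x_star)     # unit-norm ground truth, for convenience
    A = rng.standard_normal((m, n))      # rows are Gaussian sensing vectors a_i
    y = (A @ x_star) ** 2                # phaseless (squared) measurements

    def grad(x):
        # Gradient of f(x) = (1/(4m)) * sum_i ((a_i^T x)^2 - y_i)^2
        Ax = A @ x
        return A.T @ ((Ax ** 2 - y) * Ax) / m

    # Spectral-style initialization: leading eigenvector of (1/m) * sum_i y_i a_i a_i^T,
    # scaled by an estimate of the signal norm (an assumption for this sketch)
    Y = (A * y[:, None]).T @ A / m
    _, V = np.linalg.eigh(Y)
    x = x_prev = np.sqrt(y.mean()) * V[:, -1]

    # Heavy-ball (Polyak momentum): x_{k+1} = x_k - eta * grad(x_k) + beta * (x_k - x_{k-1})
    eta, beta = 0.1, 0.5                 # illustrative step size and momentum
    for _ in range(500):
        x, x_prev = x - eta * grad(x) + beta * (x - x_prev), x

    # Report error up to the inherent global sign ambiguity of phase retrieval
    err = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
    print(f"relative error: {err / np.linalg.norm(x_star):.2e}")

Replacing the heavy-ball update with a Nesterov-style lookahead step would give the other accelerated variant mentioned in the abstract.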

November 23, 2023 at 3:20 AM · feed2toot