
Optimal design problem with thermal radiation arxiv.org/abs/2408.00021

This paper is concerned with configurations of two-material thermal conductors that minimize the Dirichlet energy for steady-state diffusion equations with nonlinear boundary conditions described mainly by maximal monotone operators. To find such configurations, a homogenization theorem is proved and applied to obtain an existence theorem for minimizers of a relaxation problem whose minimum value equals that of the original design problem. As a typical example of nonlinear boundary conditions, thermal radiation boundary conditions are taken as the focus; for these, the Fréchet derivative of the Dirichlet energy is derived and used to estimate the minimum value. Since optimal configurations of the relaxation problem involve so-called grayscale domains, which are not physically meaningful in general, a perimeter-constrained problem based on the positive part of the level set function is introduced as an approximation problem that avoids such domains, and an existence theorem for its minimizers is proved. It is also shown that, in a specific case, the limit of minimizers of the approximation problem is a minimizer of the relaxation problem, and candidates for minimizers of the approximation problem are constructed by employing time-discrete versions of nonlinear diffusion equations. Finally, it is shown that the optimized configurations depend strongly on the force terms, a characteristic of nonlinear problems, and the results are applied to real physical problems.
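
The abstract does not state the formulation explicitly, so the following is a hedged sketch of what a two-material design problem of this kind plausibly looks like. All notation (the material indicator chi, the conductivities sigma_1 and sigma_2, the maximal monotone graph beta, the emissivity and Stefan-Boltzmann constant) is an assumption chosen for illustration, not taken from the paper.

```latex
% Hedged sketch of a two-material design problem with a radiation-type boundary
% condition; all notation is assumed for illustration, not taken from the paper.
\begin{align*}
  &\text{minimize over configurations } \chi\in\{0,1\}
     \ (\text{relaxed to volume fractions } \theta\in[0,1]) \\
  &\qquad E(\chi) \;=\; \tfrac{1}{2}\int_{\Omega} \sigma_\chi\,\lvert\nabla u_\chi\rvert^{2}\,dx,
     \qquad \sigma_\chi \;=\; \chi\,\sigma_1 + (1-\chi)\,\sigma_2, \\
  &\text{where } u_\chi \text{ solves the steady-state diffusion problem} \\
  &\qquad -\operatorname{div}\!\bigl(\sigma_\chi\nabla u_\chi\bigr) = f \ \text{in } \Omega,
     \qquad \sigma_\chi\,\partial_\nu u_\chi + \beta(u_\chi) \ni g \ \text{on } \partial\Omega, \\
  &\text{with } \beta \text{ a maximal monotone graph; for thermal radiation, e.g. }
     \beta(u) = \varepsilon\,\sigma_B\,\lvert u\rvert^{3}u .
\end{align*}
```

In this reading, the grayscale domains mentioned in the abstract would be the regions where the relaxed density theta takes values strictly between 0 and 1.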

Convergence rates for the Adam optimizer arxiv.org/abs/2407.21078

Stochastic gradient descent (SGD) optimization methods are nowadays the method of choice for training deep neural networks (DNNs) in artificial intelligence systems. In practically relevant training problems, the employed optimization scheme is usually not the plain vanilla SGD method but a suitably accelerated and adaptive SGD variant. As of today, perhaps the most popular such accelerated and adaptive SGD optimization method is the famous Adam optimizer proposed by Kingma & Ba in 2014. Despite the popularity of the Adam optimizer in implementations, it has remained an open research problem to provide a convergence analysis for it even for simple quadratic stochastic optimization problems in which the objective function (the function one intends to minimize) is strongly convex. In this work we solve this problem by establishing optimal convergence rates for the Adam optimizer for a large class of stochastic optimization problems, covering, in particular, simple quadratic stochastic optimization problems. The key ingredient of our convergence analysis is a new vector field function which we propose to refer to as the Adam vector field. This Adam vector field accurately describes the macroscopic behaviour of the Adam optimization process but differs from the negative gradient of the objective function of the considered stochastic optimization problem. In particular, our convergence analysis reveals that the Adam optimizer typically does not converge to critical points of the objective function (zeros of its gradient) but instead converges, with rates, to zeros of this Adam vector field.
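
For reference, the iteration under analysis is the standard Adam update of Kingma & Ba. The abstract does not define the Adam vector field, so the minimal sketch below only shows the optimizer itself, applied to a toy strongly convex quadratic; the default hyperparameters and the test objective are illustrative assumptions, not the paper's setting.

```python
# Minimal sketch of the Adam iteration (common defaults from Kingma & Ba 2014).
# The quadratic objective below is only an illustrative stand-in, not the paper's setup.
import numpy as np

def adam(grad, x0, steps=1000, lr=1e-2, beta1=0.9, beta2=0.999, eps=1e-8):
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)   # first-moment (momentum) estimate
    v = np.zeros_like(x)   # second-moment estimate
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)   # bias correction
        v_hat = v / (1 - beta2**t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Illustrative strongly convex quadratic f(x) = 0.5 * x^T A x with A positive definite.
A = np.diag([1.0, 10.0])
x_star = adam(lambda x: A @ x, x0=[3.0, -2.0])
print(x_star)  # approaches the minimizer at the origin, up to oscillations of order lr
```

In this deterministic toy case the iterates approach the minimizer; the paper's point is that in the stochastic setting the limit is instead a zero of the Adam vector field, which in general differs from a critical point of the objective.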
