Gradient descent (and especially its stochastic variants) is the foundational iterative optimization tool for training deep neural networks, ranging from basic feed-forward networks to complex ...
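To make the iteration concrete, here is a minimal sketch of plain gradient descent on a toy least-squares problem; the loss, learning rate, and step count are illustrative assumptions, not taken from the snippet above.

```python
# A minimal sketch of (batch) gradient descent on a toy least-squares loss.
# The loss, learning rate, and step count are illustrative assumptions.
import numpy as np

def gradient_descent(grad, w0, lr=0.1, n_steps=100):
    """Iteratively step against the gradient: w <- w - lr * grad(w)."""
    w = w0.copy()
    for _ in range(n_steps):
        w -= lr * grad(w)
    return w

# Toy problem: minimize ||Xw - y||^2 / n, whose gradient is 2 X^T (Xw - y) / n.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
grad = lambda w: 2.0 * X.T @ (X @ w - y) / len(y)

w_hat = gradient_descent(grad, np.zeros(3), lr=0.1, n_steps=500)
print(w_hat)  # should approach w_true
```

A stochastic variant would simply evaluate the same gradient on a random mini-batch of rows of X at each step instead of on the full data.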
This repository contains the code and experiments for the Master's thesis on preconditioning strategies for iterative methods applied to the Vecchia-Laplace approximation, developed using the GPBoost ...
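As a point of reference for what such preconditioning targets, below is a minimal sketch of the preconditioned conjugate gradient (PCG) method on a toy symmetric positive definite system. The Jacobi (diagonal) preconditioner shown is an illustrative stand-in, not one of the thesis's strategies, and GPBoost's own API is not used here.

```python
# A minimal sketch of preconditioned conjugate gradient (PCG), the kind of
# iterative solver that preconditioning strategies aim to accelerate.
# The Jacobi preconditioner and toy system are illustrative assumptions.
import numpy as np

def pcg(A, b, M_inv, tol=1e-8, max_iter=1000):
    """Solve A x = b for SPD A, applying M_inv(r) as the preconditioner."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Toy SPD system with a Jacobi preconditioner M_inv(r) = r / diag(A).
rng = np.random.default_rng(1)
Q = rng.normal(size=(20, 20))
A = Q @ Q.T + 20 * np.eye(20)  # symmetric positive definite
b = rng.normal(size=20)
x = pcg(A, b, M_inv=lambda r: r / np.diag(A))
print(np.linalg.norm(A @ x - b))  # residual should be near zero
```

A good preconditioner clusters the eigenvalues of the preconditioned system, which is what reduces the iteration count in solvers like this.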
Abstract: An iterative procedure is presented which permits the determination of a rational transfer function in the Laplace transform variable s which is optimal with respect to given input and ...
Abstract: One way to find an analytic approximation of the input impedance of thin wire antennas is to make an educated guess of the current distribution, which is then used for the induced ...
Linear solvers are major computational bottlenecks in a wide range of decision support and optimization computations. The challenges become even more pronounced on heterogeneous hardware, where ...
Therefore, a line-search algorithm is an iterative process that optimizes a nonlinear function of a single parameter (the step length) within each iteration k of the optimization technique, which itself tries to optimize ...
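A minimal sketch of one common variant, a backtracking (Armijo) line search nested inside a steepest-descent outer loop, is given below; the test function and the constants c and shrink are illustrative assumptions.

```python
# A minimal sketch of a backtracking line search: inside iteration k of an
# outer optimizer, it tunes the single scalar step length t along a descent
# direction until the Armijo sufficient-decrease condition holds.
# The test function and the constants c and shrink are assumptions.
import numpy as np

def backtracking_line_search(f, grad_fx, x, direction, t=1.0, c=1e-4, shrink=0.5):
    """Shrink t until f(x + t*d) <= f(x) + c * t * <grad_f(x), d>."""
    fx = f(x)
    slope = grad_fx @ direction  # directional derivative, negative for descent
    while f(x + t * direction) > fx + c * t * slope:
        t *= shrink
    return t

# Usage inside a steepest-descent outer loop on a toy quadratic.
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x = np.array([3.0, -4.0])
for k in range(20):
    d = -grad(x)  # descent direction
    t = backtracking_line_search(f, grad(x), x, d)
    x = x + t * d
print(x)  # approaches the minimizer at the origin
```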
Abstract: To overcome the problem of calculation errors in the Born approximation when the forward accumulation effect is strong in VTI media, this article combines the De Wolf approximation method ...