Convergence of Gradient Methods with Deterministic and Bounded Noise

dc.contributor.author Abeynanda, H.
dc.contributor.author Lanel, G. H. J.
dc.date.accessioned 2023-04-10T04:43:48Z
dc.date.available 2023-04-10T04:43:48Z
dc.date.issued 2022
dc.identifier.citation Abeynanda, H. & Lanel, G. H. J. (2022). Convergence of Gradient Methods with Deterministic and Bounded Noise. SUIT International Conference on Advancements in Sciences and Humanities, (11) October, Colombo, 189-195, 2022. en_US
dc.identifier.uri http://dr.lib.sjp.ac.lk/handle/123456789/12718
dc.description.abstract In this paper, we analyse the effects of noise on gradient methods for solving a convex unconstrained optimization problem. Assuming that the objective function has Lipschitz continuous gradients, we analyse the convergence properties of the gradient method when the noise is deterministic and bounded. Our theoretical results show that the gradient algorithm converges to a neighbourhood of the optimum within some tolerance, where the tolerance depends on the noise bound, the step size, and the gradient Lipschitz continuity constant of the underlying objective function. Moreover, we consider an application in distributed optimization, where the objective function is a sum of two strongly convex functions. The related convergence properties are then established based on dual decomposition combined with gradient methods, where the associated noise arises from quantization errors. Finally, the theoretical results are verified using numerical experiments. en_US
dc.language.iso en en_US
dc.subject The gradient method; deterministic and bounded noise; distributed optimization; dual decomposition en_US
dc.title Convergence of Gradient Methods with Deterministic and Bounded Noise en_US
dc.type Article en_US
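
The iteration described in the abstract lends itself to a small numerical illustration. The following is a minimal sketch, not the authors' code: it runs the inexact gradient iteration x_{k+1} = x_k - alpha*(grad f(x_k) + e_k) with a deterministic noise sequence bounded in norm by eps, on a hypothetical least-squares objective f(x) = 0.5*||Ax - b||^2. The problem data A and b and the choices of alpha and eps are assumptions made here for illustration only.

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

# Gradient Lipschitz constant of f(x) = 0.5*||Ax - b||^2 is the
# largest eigenvalue of A^T A (spectral norm, since A^T A is symmetric PSD).
L = np.linalg.norm(A.T @ A, 2)
alpha = 1.0 / L        # step size within the usual (0, 2/L) range (assumption)
eps = 1e-3             # noise bound: ||e_k|| <= eps for all k (assumption)

# Exact minimiser, for reference only.
x_star = np.linalg.lstsq(A, b, rcond=None)[0]

x = np.zeros(5)
for k in range(2000):
    grad = A.T @ (A @ x - b)
    # Deterministic noise with ||e_k|| <= eps, since ||sin(.)|| <= sqrt(5).
    e = (eps / np.sqrt(5)) * np.sin(k + np.arange(5))
    x = x - alpha * (grad + e)

print("distance to optimum:", np.linalg.norm(x - x_star))

With eps set to zero the iterates converge to x_star; with eps > 0 they settle in a residual neighbourhood whose radius grows with eps, alpha, and L, consistent with the tolerance behaviour the abstract describes.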

