Anescu, George (2018) A Heuristic Fast Gradient Descent Method for Unimodal Optimization. Journal of Advances in Mathematics and Computer Science, 26 (5). pp. 1-20. ISSN 2456-9968
Anescu2652018JAMCS39798.pdf - Published Version
Download (823kB)
Abstract
Known gradient descent optimization methods applied to convex functions use the gradient's magnitude to adaptively determine the current step size. This paper presents a new heuristic fast gradient descent (HFGD) approach, which instead uses the change in the gradient's direction to adaptively determine the current step size. The new approach can be applied to classes of unimodal functions more general than convex functions (e.g., quasi-convex functions), or as a local optimization method in multimodal optimization. Testing conducted on a testbed of 16 test functions showed overall much better efficiency and an overall better success rate for the proposed HFGD method compared with three other known first-order gradient descent methods.
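To make the abstract's central idea concrete, here is a minimal sketch of a gradient descent loop that adapts its step size from the change in the gradient's direction rather than its magnitude. This is not the paper's HFGD algorithm — the update rule, the `grow`/`shrink` factors, and the test function are all assumptions for illustration; it only shows the general principle the abstract describes: keep growing the step while successive gradients point the same way, and shrink it when the direction reverses.

```python
import numpy as np

def direction_adaptive_descent(grad, x0, step=0.1, grow=1.2, shrink=0.5, iters=200):
    """Illustrative sketch (not the paper's HFGD): the step size is
    adapted from the change in gradient direction, not its magnitude.
    """
    x = np.asarray(x0, dtype=float)
    g_prev = grad(x)
    for _ in range(iters):
        # move along the (normalized) negative gradient
        x = x - step * g_prev / (np.linalg.norm(g_prev) + 1e-12)
        g = grad(x)
        # compare the new gradient's direction with the previous one
        if np.dot(g, g_prev) > 0:
            step *= grow    # direction stable: take bigger steps
        else:
            step *= shrink  # direction flipped: we overshot, back off
        g_prev = g
    return x

# usage: minimize the convex quadratic f(x) = x1^2 + x2^2
x_min = direction_adaptive_descent(lambda x: 2 * x, x0=[3.0, -4.0])
```

The direction test via the dot product of successive gradients is what lets the scheme apply beyond convex functions: it needs no curvature information, only the sign of the agreement between consecutive descent directions.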
| Item Type: | Article |
| --- | --- |
| Subjects: | STM Library Press > Mathematical Science |
| Depositing User: | Unnamed user with email support@stmlibrarypress.com |
| Date Deposited: | 26 Apr 2023 05:29 |
| Last Modified: | 05 Sep 2025 04:14 |
| URI: | http://archive.go4subs.com/id/eprint/1083 |