optimg: General-Purpose Gradient-Based Optimization

Provides general-purpose tools that help users implement steepest gradient descent methods for function optimization; for details, see Ruder (2016) <arXiv:1609.04747v2>. Two methods are currently implemented: Steepest 2-Groups Gradient Descent and Adaptive Moment Estimation (Adam). Other methods will be implemented in the future.
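A minimal usage sketch follows. It assumes the package's main entry point is an `optimg()` function with an `optim()`-like interface (`par`, `fn`, `method`) and that `"ADAM"` selects the Adam method; the exact argument names and return components should be checked against the reference manual (optimg.pdf).

```r
## Minimal sketch, assuming an optim()-like interface for optimg().
## Argument names and return components are assumptions; see help(optimg).
library(optimg)

## Rosenbrock function, a standard test problem for gradient-based optimizers;
## its global minimum is at c(1, 1).
rosenbrock <- function(x) {
  (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
}

## Minimize with Adam from the conventional starting point (-1.2, 1).
fit <- optimg(par = c(-1.2, 1), fn = rosenbrock, method = "ADAM")

fit$par    # estimated location of the minimum (assumed component name)
fit$value  # objective value at the estimate (assumed component name)
```

Because no gradient function is supplied here, the package would have to rely on a numerical gradient; supplying an analytic gradient, where the interface allows it, is generally faster and more stable.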

Version: 0.1.2
Imports: ucminf (≥ 1.1-4)
Published: 2021-10-07
Author: Vithor Rosa Franco
Maintainer: Vithor Rosa Franco <vithorfranco at gmail.com>
License: GPL-3
URL: https://github.com/vthorrf/optimg
NeedsCompilation: no
CRAN checks: optimg results
Reference manual: optimg.pdf
Package source: optimg_0.1.2.tar.gz
Windows binaries: r-devel: optimg_0.1.2.zip, r-release: optimg_0.1.2.zip, r-oldrel: optimg_0.1.2.zip
macOS binaries: r-release (arm64): optimg_0.1.2.tgz, r-release (x86_64): optimg_0.1.2.tgz, r-oldrel: optimg_0.1.2.tgz
Please use the canonical form https://CRAN.R-project.org/package=optimg to link to this page.