An implementation of various gradient-descent-based learning algorithms for regression tasks. The implemented variants of gradient descent are:

- Mini-Batch Gradient Descent (MBGD), which uses a subset (mini-batch) of the training data in each update to reduce the computational load.
- Stochastic Gradient Descent (SGD), which uses a single randomly selected training example per update to reduce the computational load drastically.
- Stochastic Average Gradient (SAG), an SGD-based algorithm that averages past gradients to reduce the variance of the stochastic steps.
- Momentum Gradient Descent (MGD), which adds a momentum term to speed up gradient descent learning.
- Accelerated Gradient Descent (AGD), which accelerates gradient descent learning with a look-ahead (Nesterov-style) step.
- Adagrad, a gradient-descent-based algorithm that accumulates past squared gradients to adapt the learning rate.
- Adadelta, a gradient-descent-based algorithm that uses a Hessian approximation (running averages of squared gradients and updates) for adaptive learning.
- RMSprop, a gradient-descent-based algorithm that combines the adaptive-learning ideas of Adagrad and Adadelta.
- Adam, a gradient-descent-based algorithm that uses estimates of the mean and variance moments of the gradients for adaptive learning.

A minimal sketch of the first variant is shown below.
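To make the mini-batch idea concrete, here is a minimal sketch of MBGD applied to linear regression. This is an illustration of the general technique only, not the gradDescent package's API; the function name `mbgd_sketch` and all of its parameters are hypothetical.

```r
# Illustrative mini-batch gradient descent for linear regression.
# A sketch of the technique; not the gradDescent package's internal code.
mbgd_sketch <- function(X, y, alpha = 0.2, batch_size = 10, max_iter = 2000) {
  X <- cbind(1, as.matrix(X))              # prepend an intercept column
  n <- nrow(X)
  theta <- rep(0, ncol(X))                 # start all coefficients at zero
  for (iter in seq_len(max_iter)) {
    idx <- sample(n, min(batch_size, n))   # draw a random mini-batch
    Xb  <- X[idx, , drop = FALSE]
    err <- Xb %*% theta - y[idx]           # residuals on the batch
    grad <- t(Xb) %*% err / length(idx)    # mini-batch gradient of 0.5 * MSE
    theta <- theta - alpha * grad          # take a descent step
  }
  drop(theta)
}

# Toy usage: recover y = 2 + 3x from noisy data.
set.seed(1)
x <- runif(200)
y <- 2 + 3 * x + rnorm(200, sd = 0.1)
mbgd_sketch(x, y)   # roughly c(2, 3)
```

The other variants differ mainly in how the step is computed from the mini-batch gradient (momentum terms, per-coordinate learning-rate scaling, moment estimates), while the sampling loop stays the same.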
| Version: | 2.0.1 |
| Published: | 2017-03-11 |
| Author: | Dendi Handian, Imam Fachmi Nasrulloh, Lala Septem Riza |
| Maintainer: | Lala Septem Riza <lala.s.riza at upi.edu> |
| License: | GPL-2 | GPL-3 | file LICENSE [expanded from: GPL (≥ 2) | file LICENSE] |
| URL: | https://github.com/drizzersilverberg/gradDescentR |
| NeedsCompilation: | no |
| In views: | MachineLearning |
| CRAN checks: | gradDescent results |
| Reference manual: | gradDescent.pdf |
| Package source: | gradDescent_2.0.1.tar.gz |
| Windows binaries: | r-devel: gradDescent_2.0.1.zip, r-release: gradDescent_2.0.1.zip, r-oldrel: gradDescent_2.0.1.zip |
| OS X El Capitan binaries: | r-release: gradDescent_2.0.1.tgz |
| OS X Mavericks binaries: | r-oldrel: gradDescent_2.0.1.tgz |
| Old sources: | gradDescent archive |
Please use the canonical form https://CRAN.R-project.org/package=gradDescent to link to this page.