gradDescent (3.0)
Gradient Descent for Regression Tasks.
https://github.com/drizzersilverberg/gradDescentR
http://cran.r-project.org/web/packages/gradDescent
An implementation of various learning algorithms based on gradient descent for dealing with regression tasks. The gradient descent variants are:
- Mini-Batch Gradient Descent (MBGD), an optimization that uses only part of the training data per step to reduce the computation load.
- Stochastic Gradient Descent (SGD), an optimization that uses a single randomly chosen data point per step to reduce the computation load drastically.
- Stochastic Average Gradient (SAG), an SGD-based algorithm that averages the stochastic steps.
- Momentum Gradient Descent (MGD), an optimization to speed up gradient descent learning.
- Accelerated Gradient Descent (AGD), an optimization to accelerate gradient descent learning.
- Adagrad, a gradient-descent-based algorithm that accumulates previous squared gradients to do adaptive learning.
- Adadelta, a gradient-descent-based algorithm that uses a Hessian approximation to do adaptive learning.
- RMSprop, a gradient-descent-based algorithm that combines the adaptive-learning abilities of Adagrad and Adadelta.
- Adam, a gradient-descent-based algorithm that uses mean and variance moment estimates to do adaptive learning.
- Stochastic Variance Reduced Gradient (SVRG), an SGD-based optimization that accelerates convergence by reducing the variance of the gradient.
- Semi-Stochastic Gradient Descent (SSGD), an SGD-based algorithm that combines GD and SGD, accelerating convergence by choosing one of the two gradients at a time.
- Stochastic Recursive Gradient Algorithm (SARAH), an optimization algorithm similar to SVRG that accelerates convergence using accumulated stochastic information.
- Stochastic Recursive Gradient Algorithm+ (SARAHPlus), a practical variant of SARAH that accelerates convergence and provides a possibility of earlier termination.
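To illustrate the core idea that all of these variants build on, here is a minimal sketch of plain batch gradient descent for simple linear regression in R. This is an illustrative example only, not the gradDescent package API; the function name, parameters, and data are made up for the demonstration.

```r
# Minimal batch gradient descent for simple linear regression (illustrative
# sketch only -- not part of the gradDescent package).
grad_descent <- function(x, y, alpha = 0.01, iters = 1000) {
  theta <- c(0, 0)                 # intercept and slope, initialized at zero
  n <- length(y)
  for (i in seq_len(iters)) {
    pred <- theta[1] + theta[2] * x
    err  <- pred - y
    # Gradient of the mean squared error with respect to each parameter
    grad <- c(sum(err) / n, sum(err * x) / n)
    theta <- theta - alpha * grad  # step against the gradient
  }
  theta
}

# Fit y = 1 + 2x on noiseless data; theta should approach c(1, 2)
x <- seq(0, 1, length.out = 50)
y <- 1 + 2 * x
theta <- grad_descent(x, y, alpha = 0.5, iters = 5000)
```

The variants listed above differ mainly in how the gradient is computed (full batch, a mini-batch, or a single random point) and in how the step is adapted (momentum terms, per-parameter learning rates, variance reduction).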
Maintainer:
Lala Septem Riza
Author(s): Galih Praja Wijaya, Dendi Handian, Imam Fachmi Nasrulloh, Lala Septem Riza, Rani Megasari, Enjun Junaeti
License: GPL (>= 2) | file LICENSE
Uses: Does not use any package
Released about 1 year ago.
2 previous versions
 gradDescent_2.0.1. Released about 2 years ago.
 gradDescent_2.0. Released over 2 years ago.
Reviews
No one has written a review of gradDescent yet.
Related packages: BayesTree, ElemStatLearn, GAMBoost, LogicReg, ROCR, RXshrink, arules, caret, e1071, earth, effects, elasticnet, gbm, glmpath, grplasso, ipred, kernlab, klaR, lars, lasso2 … (20 best matches, based on common tags.)
Search for gradDescent on Google, Google Scholar, R-help, R-devel.
Visit gradDescent on R Graphical Manual.