Class AdaGrad

java.lang.Object
org.tribuo.math.optimisers.AdaGrad
All Implemented Interfaces:
com.oracle.labs.mlrg.olcut.config.Configurable, com.oracle.labs.mlrg.olcut.provenance.Provenancable<com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance>, StochasticGradientOptimiser

public class AdaGrad extends Object implements StochasticGradientOptimiser
An implementation of the AdaGrad gradient optimiser.

Creates one copy of the parameters to store the accumulated squared gradients, which determine each parameter's effective learning rate; a minimal update sketch appears below the reference.

See:

 Duchi, J., Hazan, E., and Singer, Y.
 "Adaptive Subgradient Methods for Online Learning and Stochastic Optimization"
 Journal of Machine Learning Research, 12:2121-2159, 2011.
 
  • Constructor Details

    • AdaGrad

      public AdaGrad(double initialLearningRate, double epsilon)
      Constructs an AdaGrad optimiser with the specified learning rate and epsilon.
      Parameters:
      initialLearningRate - The initial learning rate.
      epsilon - The epsilon added to the denominator for numerical stability.
    • AdaGrad

      public AdaGrad(double initialLearningRate)
      Sets epsilon to 1e-6; see the usage sketch below.
      Parameters:
      initialLearningRate - The initial learning rate.
  • Method Details