Class AdaGrad

java.lang.Object
org.tribuo.math.optimisers.AdaGrad
All Implemented Interfaces:
com.oracle.labs.mlrg.olcut.config.Configurable, com.oracle.labs.mlrg.olcut.provenance.Provenancable<com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance>, StochasticGradientOptimiser

public class AdaGrad extends Object implements StochasticGradientOptimiser
An implementation of the AdaGrad gradient optimiser.

Creates one copy of the parameters to store learning rates.

See:

 Duchi, J., Hazan, E., and Singer, Y.
 "Adaptive Subgradient Methods for Online Learning and Stochastic Optimization"
 Journal of Machine Learning Research, 12 (2011), 2121-2159.
 
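For orientation, the update this class applies is the standard diagonal AdaGrad
rule from the paper above; the accumulated squared gradients are the single
extra copy of the parameters mentioned in the class description. A LaTeX sketch
(placing epsilon outside the square root is an assumption here, not something
this page states):

    % Per-coordinate update at step t, with learning rate \eta and
    % stability constant \epsilon:
    G_{t,i} = G_{t-1,i} + g_{t,i}^2
    \theta_{t,i} = \theta_{t-1,i} - \frac{\eta}{\sqrt{G_{t,i}} + \epsilon} \, g_{t,i}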
  • Constructor Details

    • AdaGrad

      public AdaGrad(double initialLearningRate, double epsilon)
      Creates an AdaGrad optimiser with the specified learning rate and epsilon.
      Parameters:
      initialLearningRate - The learning rate.
      epsilon - A small constant added to the denominator for numerical stability.
    • AdaGrad

      public AdaGrad(double initialLearningRate)
      Creates an AdaGrad optimiser with the specified learning rate; epsilon is set to 1e-6.
      Parameters:
      initialLearningRate - The learning rate.
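A minimal construction sketch; the learning rate value is illustrative, and a
fuller lifecycle example follows the method details below:

    import org.tribuo.math.optimisers.AdaGrad;

    public class AdaGradConstruction {
        public static void main(String[] args) {
            // Two-argument form with an explicit epsilon.
            AdaGrad explicit = new AdaGrad(0.1, 1e-6);
            // One-argument form; epsilon defaults to 1e-6 as documented above.
            AdaGrad defaulted = new AdaGrad(0.1);
            System.out.println(explicit);
            System.out.println(defaulted);
        }
    }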
  • Method Details

    • initialise

      public void initialise(Parameters parameters)
      Description copied from interface: StochasticGradientOptimiser
      Initialises the gradient optimiser.

      Configures any learning rate parameters.

      Specified by:
      initialise in interface StochasticGradientOptimiser
      Parameters:
      parameters - The parameters to optimise.
    • step

      public Tensor[] step(Tensor[] updates, double weight)
      Description copied from interface: StochasticGradientOptimiser
      Takes a Tensor array of gradients and transforms them according to the current weight and learning rates.

      Can return the same Tensor array or a new one. (A usage sketch covering initialise, step, and reset follows the method details.)

      Specified by:
      step in interface StochasticGradientOptimiser
      Parameters:
      updates - An array of gradients.
      weight - The weight for the current gradients.
      Returns:
      A Tensor array of gradients.
    • toString

      public String toString()
      Returns a human-readable description of the optimiser and its hyperparameters.
      Overrides:
      toString in class Object
    • reset

      public void reset()
      Description copied from interface: StochasticGradientOptimiser
      Resets the optimiser so it's ready to optimise a new Parameters object.
      Specified by:
      reset in interface StochasticGradientOptimiser
    • copy

      public AdaGrad copy()
      Description copied from interface: StochasticGradientOptimiser
      Copies a gradient optimiser with its configuration. Usually calls the copy constructor.
      Specified by:
      copy in interface StochasticGradientOptimiser
      Returns:
      A gradient optimiser with the same configuration, but independent state.
    • getProvenance

      public com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance getProvenance()
      Specified by:
      getProvenance in interface com.oracle.labs.mlrg.olcut.provenance.Provenancable<com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance>
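The sketch below ties the methods above together for a single optimisation
step. It is an illustration under stated assumptions, not the canonical Tribuo
training loop: Parameters.update and the origin of the gradient Tensor[] are
assumptions about the wider Tribuo API, and the hyperparameter values are
illustrative.

    import org.tribuo.math.Parameters;
    import org.tribuo.math.la.Tensor;
    import org.tribuo.math.optimisers.AdaGrad;

    public final class AdaGradLifecycle {
        // Applies one AdaGrad-scaled update to the supplied parameters.
        // The Parameters implementation and the raw gradients are assumed
        // to come from elsewhere (e.g. a model's loss function).
        public static void applyStep(Parameters params, Tensor[] gradients) {
            AdaGrad optimiser = new AdaGrad(0.1, 1e-6);       // illustrative values
            optimiser.initialise(params);                     // allocates learning-rate state
            Tensor[] scaled = optimiser.step(gradients, 1.0); // weight 1.0 for this batch
            params.update(scaled);                            // assumed: applies the update
            AdaGrad fresh = optimiser.copy();                 // same config, independent state
            optimiser.reset();                                // ready for a new Parameters
        }
    }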