Class GradientOptimiserOptions

All Implemented Interfaces:
Options

public class GradientOptimiserOptions extends Object implements Options
CLI options for configuring a gradient optimiser.
  • Field Details

    • learningRate

      @Option(longName="sgo-learning-rate", usage="Learning rate for AdaGrad, AdaGradRDA, Adam, Pegasos.") public double learningRate
      Learning rate for AdaGrad, AdaGradRDA, Adam, Pegasos.
    • epsilon

      @Option(longName="sgo-epsilon", usage="Epsilon for AdaDelta, AdaGrad, AdaGradRDA, Adam.") public double epsilon
      Epsilon for AdaDelta, AdaGrad, AdaGradRDA, Adam.
    • rho

      @Option(longName="sgo-rho", usage="Rho for RMSProp, AdaDelta, SGD with Momentum.") public double rho
      Rho for RMSProp, AdaDelta, SGD with Momentum.
    • lambda

      @Option(longName="sgo-lambda", usage="Lambda for Pegasos.") public double lambda
      Lambda for Pegasos.
    • paramAve

      @Option(longName="sgo-parameter-averaging", usage="Use parameter averaging.") public boolean paramAve
      Use parameter averaging.
    • momentum

      @Option(longName="sgo-momentum", usage="Use momentum in SGD.") public SGD.Momentum momentum
      Use momentum in SGD.
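The fields above are bound to command-line flags via their `longName` values. A sketch of passing them to a Tribuo training program follows; the main class name is a placeholder, the values are illustrative rather than recommended, and the `STANDARD` momentum value assumes the `SGD.Momentum` enum defines it.

```shell
# Placeholder entry point; only the --sgo-* flags below come from this class.
java -cp tribuo-all.jar com.example.TrainModel \
    --sgo-learning-rate 0.01 \
    --sgo-epsilon 1e-6 \
    --sgo-rho 0.95 \
    --sgo-momentum STANDARD \
    --sgo-parameter-averaging true
```

Flags for optimisers that are not selected (e.g. `--sgo-lambda` when not using Pegasos) are simply ignored by the constructed optimiser.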
  • Constructor Details

    • GradientOptimiserOptions

      public GradientOptimiserOptions()
  • Method Details

    • getOptimiser

      public StochasticGradientOptimiser getOptimiser()
      Gets the configured gradient optimiser.
      Returns:
      The gradient optimiser.
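A minimal sketch of using this class: the option fields are populated from the command line by OLCUT's ConfigurationManager, after which getOptimiser() builds the configured optimiser. The surrounding class name is an assumption for illustration, as is passing the result to a trainer.

```java
import com.oracle.labs.mlrg.olcut.config.ConfigurationManager;
import org.tribuo.math.StochasticGradientOptimiser;
import org.tribuo.math.optimisers.GradientOptimiserOptions;

public class OptimiserExample {
    public static void main(String[] args) {
        GradientOptimiserOptions options = new GradientOptimiserOptions();
        // Parses the --sgo-* flags from args into the annotated option fields.
        ConfigurationManager cm = new ConfigurationManager(args, options);
        // Builds the optimiser from the parsed options.
        StochasticGradientOptimiser optimiser = options.getOptimiser();
        // The optimiser can now be supplied to a gradient-based trainer.
    }
}
```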