Package org.tribuo.math.optimisers
Class GradientOptimiserOptions
java.lang.Object
org.tribuo.math.optimisers.GradientOptimiserOptions
All Implemented Interfaces:
com.oracle.labs.mlrg.olcut.config.Options
public class GradientOptimiserOptions
extends Object
implements com.oracle.labs.mlrg.olcut.config.Options
CLI options for configuring a gradient optimiser.
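These options are typically populated from command-line flags and then converted into an optimiser with getOptimiser(). Below is a minimal sketch assuming OLCUT's ConfigurationManager is used to parse the arguments; the class name OptimiserCLI and the example flag values are illustrative assumptions, not part of this API.

    import com.oracle.labs.mlrg.olcut.config.ConfigurationManager;
    import org.tribuo.math.StochasticGradientOptimiser;
    import org.tribuo.math.optimisers.GradientOptimiserOptions;

    public class OptimiserCLI {
        public static void main(String[] args) {
            // e.g. args = {"--sgo-learning-rate", "0.1", "--sgo-epsilon", "1e-6"}
            GradientOptimiserOptions options = new GradientOptimiserOptions();
            // Parsing the arguments populates the annotated public fields of 'options'.
            ConfigurationManager cm = new ConfigurationManager(args, options);
            StochasticGradientOptimiser optimiser = options.getOptimiser();
            // ... pass 'optimiser' to a gradient descent based trainer ...
        }
    }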
Nested Class Summary
Modifier and Type    Class                                                        Description
static enum          GradientOptimiserOptions.StochasticGradientOptimiserType    Type of the gradient optimisers available in CLIs.
Field Summary
Modifier and Type    Field           Description
double               epsilon         Epsilon for AdaDelta, AdaGrad, AdaGradRDA, Adam.
double               lambda          Lambda for Pegasos.
double               learningRate    Learning rate for AdaGrad, AdaGradRDA, Adam, Pegasos.
SGD.Momentum         momentum        Use momentum in SGD.
boolean              paramAve        Use parameter averaging.
double               rho             Rho for RMSProp, AdaDelta, SGD with Momentum.
Fields inherited from interface com.oracle.labs.mlrg.olcut.config.Options
header
Constructor Summary
GradientOptimiserOptions()
Method Summary
Modifier and Type              Method            Description
StochasticGradientOptimiser    getOptimiser()    Gets the configured gradient optimiser.
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface com.oracle.labs.mlrg.olcut.config.Options
getOptionsDescription
Field Details

learningRate
@Option(longName="sgo-learning-rate", usage="Learning rate for AdaGrad, AdaGradRDA, Adam, Pegasos.")
public double learningRate
    Learning rate for AdaGrad, AdaGradRDA, Adam, Pegasos.

epsilon
@Option(longName="sgo-epsilon", usage="Epsilon for AdaDelta, AdaGrad, AdaGradRDA, Adam.")
public double epsilon
    Epsilon for AdaDelta, AdaGrad, AdaGradRDA, Adam.

rho
@Option(longName="sgo-rho", usage="Rho for RMSProp, AdaDelta, SGD with Momentum.")
public double rho
    Rho for RMSProp, AdaDelta, SGD with Momentum.

lambda
@Option(longName="sgo-lambda", usage="Lambda for Pegasos.")
public double lambda
    Lambda for Pegasos.

paramAve
@Option(longName="sgo-parameter-averaging", usage="Use parameter averaging.")
public boolean paramAve
    Use parameter averaging.

momentum
    Use momentum in SGD.
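Since the fields above are public, they can also be set directly in code before calling getOptimiser(); each usage string lists the optimisers that read that hyperparameter. The wrapper class, method name, and hyperparameter values below are purely illustrative.

    import org.tribuo.math.StochasticGradientOptimiser;
    import org.tribuo.math.optimisers.GradientOptimiserOptions;

    public class OptimiserSettings {
        public static StochasticGradientOptimiser configure() {
            GradientOptimiserOptions options = new GradientOptimiserOptions();
            options.learningRate = 0.1;  // AdaGrad, AdaGradRDA, Adam, Pegasos
            options.epsilon = 1e-6;      // AdaDelta, AdaGrad, AdaGradRDA, Adam
            options.rho = 0.95;          // RMSProp, AdaDelta, SGD with Momentum
            options.lambda = 0.01;       // Pegasos
            options.paramAve = true;     // use parameter averaging
            return options.getOptimiser();
        }
    }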
Constructor Details
GradientOptimiserOptions
public GradientOptimiserOptions()
Method Details

getOptimiser
public StochasticGradientOptimiser getOptimiser()
    Gets the configured gradient optimiser.
    Returns:
        The gradient optimiser.
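The returned StochasticGradientOptimiser can be handed to a gradient descent based trainer. A sketch assuming Tribuo's classification LinearSGDTrainer with a log-loss objective; the helper class name, epoch count, and RNG seed are illustrative assumptions.

    import org.tribuo.classification.sgd.linear.LinearSGDTrainer;
    import org.tribuo.classification.sgd.objectives.LogMulticlass;
    import org.tribuo.math.StochasticGradientOptimiser;
    import org.tribuo.math.optimisers.GradientOptimiserOptions;

    public class TrainerSetup {
        public static LinearSGDTrainer buildTrainer(GradientOptimiserOptions options) {
            StochasticGradientOptimiser optimiser = options.getOptimiser();
            // objective, optimiser, number of epochs, RNG seed
            return new LinearSGDTrainer(new LogMulticlass(), optimiser, 5, 12345L);
        }
    }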