public class GradientOptimiserOptions extends Object implements com.oracle.labs.mlrg.olcut.config.Options
Modifier and Type | Class and Description |
---|---|
static class | GradientOptimiserOptions.StochasticGradientOptimiserType: Type of the gradient optimisers available in CLIs. |
Modifier and Type | Field and Description |
---|---|
double | epsilon: Epsilon for AdaDelta, AdaGrad, AdaGradRDA, Adam. |
double | lambda: Lambda for Pegasos. |
double | learningRate: Learning rate for AdaGrad, AdaGradRDA, Adam, Pegasos. |
SGD.Momentum | momentum: Use momentum in SGD. |
boolean | paramAve: Use parameter averaging. |
double | rho: Rho for RMSProp, AdaDelta, SGD with Momentum. |
Constructor and Description |
---|
GradientOptimiserOptions() |
Modifier and Type | Method and Description |
---|---|
StochasticGradientOptimiser | getOptimiser(): Gets the configured gradient optimiser. |
@Option(longName="sgo-learning-rate", usage="Learning rate for AdaGrad, AdaGradRDA, Adam, Pegasos.") public double learningRate
@Option(longName="sgo-epsilon", usage="Epsilon for AdaDelta, AdaGrad, AdaGradRDA, Adam.") public double epsilon
@Option(longName="sgo-rho", usage="Rho for RMSProp, AdaDelta, SGD with Momentum.") public double rho
@Option(longName="sgo-lambda", usage="Lambda for Pegasos.") public double lambda
@Option(longName="sgo-parameter-averaging", usage="Use parameter averaging.") public boolean paramAve
@Option(longName="sgo-momentum", usage="Use momentum in SGD.") public SGD.Momentum momentum
public StochasticGradientOptimiser getOptimiser()
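The StochasticGradientOptimiser returned by getOptimiser() is typically handed to a gradient-based trainer. A hedged sketch follows, assuming Tribuo's classification module provides LinearSGDTrainer and the LogMulticlass objective; the epoch count and seed are illustrative values, not defaults of this class.

```java
import org.tribuo.classification.sgd.linear.LinearSGDTrainer;
import org.tribuo.classification.sgd.objectives.LogMulticlass;
import org.tribuo.math.StochasticGradientOptimiser;
import org.tribuo.math.optimisers.GradientOptimiserOptions;

public class TrainerWiringExample {
    // Builds a trainer around whichever optimiser the CLI flags selected.
    public static LinearSGDTrainer buildTrainer(GradientOptimiserOptions options) {
        StochasticGradientOptimiser optimiser = options.getOptimiser();
        // LinearSGDTrainer(objective, optimiser, epochs, seed);
        // 5 epochs and seed 12345L are placeholder choices for this sketch.
        return new LinearSGDTrainer(new LogMulticlass(), optimiser, 5, 12345L);
    }
}
```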