public enum GradientOptimiser extends Enum<GradientOptimiser>
Enum Constant | Description
---|---
ADADELTA | The AdaDelta optimiser.
ADAGRAD | The AdaGrad optimiser.
ADAGRADDA | The AdaGrad Dual Averaging optimiser.
ADAM | The Adam optimiser.
ADAMAX | The Adamax optimiser.
FTRL | The FTRL optimiser.
GRADIENT_DESCENT | A standard gradient descent optimiser with a fixed learning rate.
MOMENTUM | Gradient descent with momentum.
NADAM | The Nadam optimiser.
NESTEROV | Gradient descent with Nesterov momentum.
RMSPROP | The RMSprop optimiser.
Modifier and Type | Method and Description
---|---
<T extends org.tensorflow.types.family.TNumber> org.tensorflow.op.Op | applyOptimiser(org.tensorflow.Graph graph, org.tensorflow.Operand<T> loss, Map<String,Float> optimiserParams): Applies the optimiser to the graph and returns the optimiser step operation.
Set<String> | getParameterNames(): An unmodifiable view of the parameter names used by this gradient optimiser.
boolean | validateParamNames(Set<String> paramNames): Checks that the parameter names in the supplied set are an exact match for the parameter names that this gradient optimiser expects.
static GradientOptimiser | valueOf(String name): Returns the enum constant of this type with the specified name.
static GradientOptimiser[] | values(): Returns an array containing the constants of this enum type, in the order they are declared.
public static final GradientOptimiser ADADELTA
Parameters are:
public static final GradientOptimiser ADAGRAD
Parameters are:
public static final GradientOptimiser ADAGRADDA
Parameters are:
public static final GradientOptimiser ADAM
Parameters are:
public static final GradientOptimiser ADAMAX
Parameters are:
public static final GradientOptimiser FTRL
Parameters are:
public static final GradientOptimiser GRADIENT_DESCENT
Parameters are:
public static final GradientOptimiser MOMENTUM
Parameters are:
public static final GradientOptimiser NESTEROV
Parameters are:
public static final GradientOptimiser NADAM
Parameters are:
public static final GradientOptimiser RMSPROP
Parameters are:
public static GradientOptimiser[] values()

Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows:

    for (GradientOptimiser c : GradientOptimiser.values())
        System.out.println(c);
public static GradientOptimiser valueOf(String name)

Returns the enum constant of this type with the specified name.

Parameters:
name - the name of the enum constant to be returned.

Throws:
IllegalArgumentException - if this enum type has no constant with the specified name
NullPointerException - if the argument is null

public Set<String> getParameterNames()

An unmodifiable view of the parameter names used by this gradient optimiser.
public boolean validateParamNames(Set<String> paramNames)

Checks that the parameter names in the supplied set are an exact match for the parameter names that this gradient optimiser expects.

Parameters:
paramNames - The gradient optimiser parameter names.

public <T extends org.tensorflow.types.family.TNumber> org.tensorflow.op.Op applyOptimiser(org.tensorflow.Graph graph, org.tensorflow.Operand<T> loss, Map<String,Float> optimiserParams)

Applies the optimiser to the graph and returns the optimiser step operation.

Type Parameters:
T - The loss type (most of the time this will be TFloat32).

Parameters:
graph - The graph to optimise.
loss - The loss to minimise.
optimiserParams - The optimiser parameters.

Copyright © 2015–2021 Oracle and/or its affiliates. All rights reserved.