public class Adam extends Object implements StochasticGradientOptimiser
An implementation of the Adam gradient optimiser.
Creates two copies of the parameters to store learning rates.
See:
Kingma, D., and Ba, J. "Adam: A Method for Stochastic Optimization." arXiv preprint arXiv:1412.6980, 2014.
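For orientation, the update rule from the cited paper is sketched below, with α = initialLearningRate, β₁ = betaOne, β₂ = betaTwo, and ε = epsilon; the "two copies of the parameters" are the moment estimates m and v. Note that the paper suggests ε = 1e-8 while this class defaults to 1e-6, and the implementation may differ from the paper in other minor details.

```latex
\begin{aligned}
m_t &= \beta_1\, m_{t-1} + (1-\beta_1)\, g_t   && \text{first moment estimate}\\
v_t &= \beta_2\, v_{t-1} + (1-\beta_2)\, g_t^2 && \text{second moment estimate}\\
\hat{m}_t &= m_t/(1-\beta_1^t), \qquad \hat{v}_t = v_t/(1-\beta_2^t) && \text{bias correction}\\
\theta_t &= \theta_{t-1} - \alpha\, \hat{m}_t / \big(\sqrt{\hat{v}_t} + \epsilon\big) && \text{parameter update}
\end{aligned}
```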
| Constructor and Description |
| --- |
| `Adam()` Sets initialLearningRate to 0.001, betaOne to 0.9, betaTwo to 0.999, epsilon to 1e-6. |
| `Adam(double initialLearningRate, double epsilon)` Sets betaOne to 0.9 and betaTwo to 0.999. |
| `Adam(double initialLearningRate, double betaOne, double betaTwo, double epsilon)` It is highly recommended not to modify these parameters; use one of the other constructors. |
| Modifier and Type | Method and Description |
| --- | --- |
| `Adam` | `copy()` Copies a gradient optimiser with its configuration. |
| `com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance` | `getProvenance()` |
| `void` | `initialise(Parameters parameters)` Initialises the gradient optimiser. |
| `void` | `reset()` Resets the optimiser so it's ready to optimise a new `Parameters`. |
| `Tensor[]` | `step(Tensor[] updates, double weight)` Takes a `Tensor` array of gradients and transforms them according to the current weight and learning rates. |
| `String` | `toString()` |
Methods inherited from class java.lang.Object:
`clone, equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait`
Methods inherited from interface StochasticGradientOptimiser:
`finalise`
public Adam(double initialLearningRate, double betaOne, double betaTwo, double epsilon)
It is highly recommended not to modify these parameters; use one of the other constructors.
Parameters:
initialLearningRate - The initial learning rate.
betaOne - The value of beta-one.
betaTwo - The value of beta-two.
epsilon - The epsilon value.

public Adam(double initialLearningRate, double epsilon)
Sets betaOne to 0.9 and betaTwo to 0.999.
Parameters:
initialLearningRate - The initial learning rate.
epsilon - The epsilon value.

public Adam()
Sets initialLearningRate to 0.001, betaOne to 0.9, betaTwo to 0.999, epsilon to 1e-6.
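A minimal construction sketch using only the constructors documented above; the `org.tribuo.math.optimisers` package path is an assumption based on Tribuo's layout, not stated on this page:

```java
import org.tribuo.math.optimisers.Adam; // package path assumed

public class AdamConstruction {
    public static void main(String[] args) {
        // Paper-recommended defaults: lr = 0.001, betaOne = 0.9, betaTwo = 0.999, epsilon = 1e-6.
        Adam defaults = new Adam();

        // Override the learning rate and epsilon; betaOne and betaTwo keep their defaults.
        Adam tuned = new Adam(0.01, 1e-8);

        // Full control over all four hyperparameters - discouraged by the docs above.
        Adam custom = new Adam(0.001, 0.9, 0.999, 1e-6);

        System.out.println(defaults);
        System.out.println(tuned);
        System.out.println(custom);
    }
}
```

In practice an instance like this is handed to an SGD-based trainer rather than driven by hand.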
public void initialise(Parameters parameters)
Description copied from interface: StochasticGradientOptimiser
Initialises the gradient optimiser. Configures any learning rate parameters.
Specified by:
initialise in interface StochasticGradientOptimiser
Parameters:
parameters - The parameters to optimise.

public Tensor[] step(Tensor[] updates, double weight)
Description copied from interface: StochasticGradientOptimiser
Take a Tensor array of gradients and transform them according to the current weight and learning rates. Can return the same Tensor array or a new one.
Specified by:
step in interface StochasticGradientOptimiser
Parameters:
updates - An array of gradients.
weight - The weight for the current gradients.
Returns:
A Tensor array of gradients.

public void reset()
Description copied from interface: StochasticGradientOptimiser
Resets the optimiser so it's ready to optimise a new Parameters.
Specified by:
reset in interface StochasticGradientOptimiser
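A sketch of the initialise/step/reset lifecycle described above, for illustration only; `LinearParameters`, `Parameters.getEmptyCopy()`, and `Parameters.update(Tensor[])` are assumptions about `org.tribuo.math` (normally a trainer drives this loop internally):

```java
import org.tribuo.math.LinearParameters; // assumed concrete Parameters implementation
import org.tribuo.math.Parameters;
import org.tribuo.math.la.Tensor;        // assumed package for Tensor
import org.tribuo.math.optimisers.Adam;

public class AdamLifecycle {
    public static void main(String[] args) {
        // Assumed (numFeatures, numOutputs) constructor.
        Parameters params = new LinearParameters(10, 3);
        Adam adam = new Adam();

        adam.initialise(params); // allocates the two moment buffers for these parameters

        for (int i = 0; i < 5; i++) {
            // Stand-in for real gradients; a trainer would compute these from a minibatch.
            Tensor[] grads = params.getEmptyCopy();
            // Transform the gradients using the current weight and the Adam moments.
            Tensor[] update = adam.step(grads, 1.0);
            params.update(update); // apply the transformed gradients
        }

        adam.reset(); // clear internal state before optimising a new Parameters
    }
}
```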
public Adam copy()
Description copied from interface: StochasticGradientOptimiser
Copies a gradient optimiser with its configuration.
Specified by:
copy in interface StochasticGradientOptimiser
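A brief sketch of `copy()`; per the description above it duplicates the configuration, giving an independent optimiser with the same hyperparameters:

```java
import org.tribuo.math.optimisers.Adam; // package path assumed

public class AdamCopy {
    public static void main(String[] args) {
        Adam original = new Adam(0.01, 1e-8);
        Adam copied = original.copy(); // same configuration, independent instance

        System.out.println(original);
        System.out.println(copied);
    }
}
```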
public com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance getProvenance()
Specified by:
getProvenance in interface com.oracle.labs.mlrg.olcut.provenance.Provenancable<com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance>
Copyright © 2015–2021 Oracle and/or its affiliates. All rights reserved.