Package org.tribuo.math
Interface StochasticGradientOptimiser
- All Superinterfaces:
  com.oracle.labs.mlrg.olcut.config.Configurable, com.oracle.labs.mlrg.olcut.provenance.Provenancable<com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance>
- All Known Implementing Classes:
  AdaDelta, AdaGrad, AdaGradRDA, Adam, ParameterAveraging, Pegasos, RMSProp, SGD
public interface StochasticGradientOptimiser
extends com.oracle.labs.mlrg.olcut.config.Configurable, com.oracle.labs.mlrg.olcut.provenance.Provenancable<com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance>
Interface for gradient based optimisation methods.

Order of use:
1. initialise(Parameters)
2. many step(Tensor[], double) calls
3. finalise()
4. reset() to prepare for optimising a new Parameters

Deviating from this order will cause unexpected behaviour, as shown in the sketch below.
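A minimal sketch of this lifecycle, assuming the AdaGrad implementation with an initial learning rate of 0.1 and a hypothetical computeGradients helper standing in for whatever produces the loss gradients:

    import org.tribuo.math.Parameters;
    import org.tribuo.math.StochasticGradientOptimiser;
    import org.tribuo.math.la.Tensor;
    import org.tribuo.math.optimisers.AdaGrad;

    public final class OptimiserLifecycle {
        public static void fit(Parameters params, int numSteps) {
            StochasticGradientOptimiser optimiser = new AdaGrad(0.1); // initial learning rate
            optimiser.initialise(params);                             // 1. initialise
            for (int i = 0; i < numSteps; i++) {
                Tensor[] gradients = computeGradients(params);        // hypothetical helper
                Tensor[] updates = optimiser.step(gradients, 1.0);    // 2. many steps
                params.update(updates);                               // apply the transformed gradients
            }
            optimiser.finalise();                                     // 3. finalise
            optimiser.reset();                                        // 4. reset before optimising a new Parameters
        }

        private static Tensor[] computeGradients(Parameters params) {
            // Hypothetical: a real trainer derives these from the loss on a minibatch.
            throw new UnsupportedOperationException("supply real gradients");
        }
    }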
Method Summary

StochasticGradientOptimiser copy()
    Copies a gradient optimiser with its configuration.
default void finalise()
    Finalises the gradient optimisation, setting the parameters to their correct values.
default void initialise(Parameters parameters)
    Initialises the gradient optimiser.
void reset()
    Resets the optimiser so it's ready to optimise a new Parameters.
Tensor[] step(Tensor[] updates, double weight)
    Take a Tensor array of gradients and transform them according to the current weight and learning rates.

Methods inherited from interface com.oracle.labs.mlrg.olcut.config.Configurable:
    postConfig
Methods inherited from interface com.oracle.labs.mlrg.olcut.provenance.Provenancable:
    getProvenance
Method Details
initialise

default void initialise(Parameters parameters)

Initialises the gradient optimiser. Configures any learning rate parameters.

Parameters:
    parameters - The parameters to optimise.
step

Tensor[] step(Tensor[] updates, double weight)

Take a Tensor array of gradients and transform them according to the current weight and learning rates. Can return the same Tensor array or a new one.

Parameters:
    updates - An array of gradients.
    weight - The weight for the current gradients.
Returns:
    A Tensor array of gradients.
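A runnable sketch of a single step call, assuming LinearParameters and plain SGD; getEmptyCopy() stands in for a real gradient array (zero-valued, but with the same shapes as the parameters):

    import org.tribuo.math.LinearParameters;
    import org.tribuo.math.StochasticGradientOptimiser;
    import org.tribuo.math.la.Tensor;
    import org.tribuo.math.optimisers.SGD;

    public final class StepExample {
        public static void main(String[] args) {
            LinearParameters params = new LinearParameters(5, 3);    // 5 features, 3 outputs
            StochasticGradientOptimiser opt = SGD.getSimpleSGD(0.1); // fixed learning rate
            opt.initialise(params);
            Tensor[] gradients = params.getEmptyCopy();  // placeholder gradients
            Tensor[] updates = opt.step(gradients, 1.0); // weight scales this batch's gradients
            params.update(updates);                      // apply the transformed gradients
        }
    }

Stateful optimisers such as Adam or AdaGrad accumulate per-parameter statistics across step calls, which is why the documented call order matters.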
finalise

default void finalise()

Finalises the gradient optimisation, setting the parameters to their correct values. Used for ParameterAveraging amongst others.
reset

void reset()

Resets the optimiser so it's ready to optimise a new Parameters.
copy

StochasticGradientOptimiser copy()

Copies a gradient optimiser with its configuration. Usually calls the copy constructor.

Returns:
    A gradient optimiser with the same configuration, but independent state.
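A short sketch of how copy() can be used, assuming Adam's no-argument constructor with its default settings:

    import org.tribuo.math.StochasticGradientOptimiser;
    import org.tribuo.math.optimisers.Adam;

    public final class CopyExample {
        public static void main(String[] args) {
            StochasticGradientOptimiser template = new Adam();       // default Adam settings
            StochasticGradientOptimiser forModelA = template.copy(); // independent state
            StochasticGradientOptimiser forModelB = template.copy(); // independent state
            // Both copies share the template's configuration but none of its
            // accumulated optimisation state, so each model trains in isolation.
        }
    }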