Interface StochasticGradientOptimiser
- All Superinterfaces:
com.oracle.labs.mlrg.olcut.config.Configurable, com.oracle.labs.mlrg.olcut.provenance.Provenancable<com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance>
- All Known Implementing Classes:
AdaDelta, AdaGrad, AdaGradRDA, Adam, ParameterAveraging, Pegasos, RMSProp, SGD
public interface StochasticGradientOptimiser
extends com.oracle.labs.mlrg.olcut.config.Configurable, com.oracle.labs.mlrg.olcut.provenance.Provenancable<com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance>
Interface for gradient-based optimisation methods.
Order of use:
- initialise(Parameters)
- step(Tensor[], double), called once per gradient computation
- finalise()
- reset(), before reusing the optimiser on a new Parameters
Deviating from this order will cause unexpected behaviour. A sketch of this lifecycle follows.
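As a rough illustration of the lifecycle above, a manual training loop might look like the following sketch. It assumes Tribuo's org.tribuo.math types (LinearParameters, Tensor, and the AdaGrad implementation listed above); computeGradients is a hypothetical helper standing in for the model- and loss-specific gradient computation.

```java
import org.tribuo.math.LinearParameters;
import org.tribuo.math.StochasticGradientOptimiser;
import org.tribuo.math.la.Tensor;
import org.tribuo.math.optimisers.AdaGrad;

public final class OptimiserLifecycle {
    public static void run(int numFeatures, int numLabels, int maxEpochs) {
        StochasticGradientOptimiser optimiser = new AdaGrad(0.1);
        LinearParameters params = new LinearParameters(numFeatures, numLabels);

        // 1. Initialise once per set of Parameters.
        optimiser.initialise(params);

        for (int epoch = 0; epoch < maxEpochs; epoch++) {
            // 2. Step once per gradient computation.
            Tensor[] gradients = computeGradients(params); // hypothetical helper
            Tensor[] updates = optimiser.step(gradients, 1.0);
            params.update(updates);
        }

        // 3. Finalise to write back any deferred parameter values
        //    (e.g. ParameterAveraging).
        optimiser.finalise();

        // 4. Reset before reusing this optimiser on a fresh Parameters instance.
        optimiser.reset();
    }

    // Hypothetical stand-in for the model/loss specific gradient computation.
    private static Tensor[] computeGradients(LinearParameters params) {
        throw new UnsupportedOperationException("model-specific");
    }
}
```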
Method Summary
- StochasticGradientOptimiser copy()
  Copies a gradient optimiser with its configuration.
- default void finalise()
  Finalises the gradient optimisation, setting the parameters to their correct values.
- default void initialise(Parameters parameters)
  Initialises the gradient optimiser.
- void reset()
  Resets the optimiser so it's ready to optimise a new Parameters.
- Tensor[] step(Tensor[] updates, double weight)
  Take a Tensor array of gradients and transform them according to the current weight and learning rates.

Methods inherited from interface com.oracle.labs.mlrg.olcut.config.Configurable:
postConfig
Methods inherited from interface com.oracle.labs.mlrg.olcut.provenance.Provenancable:
getProvenance
Method Details
initialise
default void initialise(Parameters parameters)
Initialises the gradient optimiser. Configures any learning rate parameters.
- Parameters:
parameters - The parameters to optimise.
step
Tensor[] step(Tensor[] updates, double weight)
Take a Tensor array of gradients and transform them according to the current weight and learning rates.
- Parameters:
updates - The array of gradients.
weight - The weight for the current gradients.
- Returns:
- A Tensor array of transformed gradients.
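For a concrete feel of this contract, the simplest fixed-rate scheme just folds the learning rate and the update's weight into each gradient in place. This is a sketch, not Tribuo's documented implementation: Tensor.scaleInPlace is assumed from org.tribuo.math.la, and learningRate is a field of the hypothetical optimiser shown in full at the end of this section.

```java
@Override
public Tensor[] step(Tensor[] updates, double weight) {
    for (Tensor t : updates) {
        // Scale each gradient by the learning rate and this update's weight.
        t.scaleInPlace(learningRate * weight);
    }
    return updates; // the same array, mutated in place
}
```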
finalise
default void finalise()
Finalises the gradient optimisation, setting the parameters to their correct values. Used for ParameterAveraging amongst others.
reset
void reset()
Resets the optimiser so it's ready to optimise a new Parameters.
copy
StochasticGradientOptimiser copy()
Copies a gradient optimiser with its configuration. Usually calls the copy constructor.
- Returns:
- A gradient optimiser with the same configuration, but independent state.
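Pulling the pieces together, here is a hedged sketch of a complete minimal implementation, expanding the step excerpt above. The ConfiguredObjectProvenanceImpl constructor and Tensor.scaleInPlace are assumed from OLCUT and org.tribuo.math.la respectively; a real optimiser would normally expose its configuration through OLCUT @Config fields.

```java
import com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance;
import com.oracle.labs.mlrg.olcut.provenance.impl.ConfiguredObjectProvenanceImpl;
import org.tribuo.math.Parameters;
import org.tribuo.math.StochasticGradientOptimiser;
import org.tribuo.math.la.Tensor;

/** A minimal fixed-learning-rate optimiser, for illustration only. */
public final class FixedRateSGD implements StochasticGradientOptimiser {
    private final double learningRate;

    public FixedRateSGD(double learningRate) {
        this.learningRate = learningRate;
    }

    @Override
    public void initialise(Parameters parameters) {
        // Stateless: no per-parameter buffers to allocate.
    }

    @Override
    public Tensor[] step(Tensor[] updates, double weight) {
        for (Tensor t : updates) {
            t.scaleInPlace(learningRate * weight);
        }
        return updates;
    }

    @Override
    public void reset() {
        // Stateless: nothing to clear before reuse on a new Parameters.
    }

    @Override
    public StochasticGradientOptimiser copy() {
        // Same configuration, independent (here trivially empty) state.
        return new FixedRateSGD(learningRate);
    }

    @Override
    public ConfiguredObjectProvenance getProvenance() {
        return new ConfiguredObjectProvenanceImpl(this, "StochasticGradientOptimiser");
    }
}
```

The independent-state requirement on copy() is what lets one configured optimiser serve several training runs: each copy keeps its own internal buffers (for stateful optimisers such as AdaGrad, the accumulated squared gradients), so no run perturbs another.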