Class Pegasos
java.lang.Object
org.tribuo.math.optimisers.Pegasos
- All Implemented Interfaces:
com.oracle.labs.mlrg.olcut.config.Configurable, com.oracle.labs.mlrg.olcut.provenance.Provenancable<com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance>, StochasticGradientOptimiser
An implementation of the Pegasos gradient optimiser used primarily for solving the SVM problem.
This gradient optimiser rewrites all the Tensors in the Parameters
with ShrinkingTensors. This means the value stored in each Tensor differs
from the one returned by get(), which allows the regularisation to be applied lazily to the parameters.
When finalise() is called it rewrites the Parameters with standard dense Tensors.
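The lazy-shrinkage idea described above can be sketched in plain Java. This is an illustrative stand-in, not Tribuo's actual ShrinkingTensor class: instead of multiplying every element on each regularisation step, a single scale factor is updated in O(1) and only folded in when a value is read or the vector is materialised.

```java
// Illustrative sketch of a lazily-scaled vector, in the spirit of the
// ShrinkingTensor described above (not Tribuo's implementation).
class ShrinkingVector {
    private final double[] values; // raw stored values
    private double scale = 1.0;    // pending multiplicative shrinkage

    ShrinkingVector(double[] initial) {
        this.values = initial.clone();
    }

    // Apply L2-style shrinkage in O(1) by folding it into the scale factor.
    void shrink(double factor) {
        scale *= factor;
    }

    // Additive gradient update; divide by scale so the effective value
    // (values[i] * scale) changes by exactly delta.
    void add(int i, double delta) {
        values[i] += delta / scale;
    }

    // The externally visible value differs from the stored one.
    double get(int i) {
        return values[i] * scale;
    }

    // Materialise back into a plain dense array, as finalise() does.
    double[] toDense() {
        double[] out = new double[values.length];
        for (int i = 0; i < values.length; i++) {
            out[i] = values[i] * scale;
        }
        return out;
    }
}
```

Note that add() compensates for the pending scale, so gradient steps and shrinkage commute correctly even though the stored values are never rescaled.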
Follows the implementation in Factorie.
Pegasos is remarkably touchy about its learning rates. The defaults work on a couple of examples, but it requires tuning to work properly on a specific dataset.
See:
Shalev-Shwartz S, Singer Y, Srebro N, Cotter A "Pegasos: Primal Estimated Sub-Gradient Solver for SVM" Mathematical Programming, 2011.
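For reference, a single Pegasos-style update on a dense weight vector can be sketched as follows. This is an illustration, not Tribuo's code, and the exact way baseRate enters the step size is an assumption; in the canonical algorithm from the paper above, the step size is eta_t = 1/(lambda * t), and the weights are shrunk by (1 - eta_t * lambda) before the gradient step.

```java
// Sketch of one Pegasos-style update (illustrative; not Tribuo's code).
class PegasosStep {
    // lambda: regularisation strength; t: 1-based iteration count.
    static double[] step(double[] w, double[] grad,
                         double baseRate, double lambda, int t) {
        double eta = baseRate / (lambda * t); // assumed baseRate scaling
        double[] next = new double[w.length];
        for (int i = 0; i < w.length; i++) {
            // Shrink for L2 regularisation, then take the gradient step.
            next[i] = (1.0 - eta * lambda) * w[i] - eta * grad[i];
        }
        return next;
    }
}
```

Because eta_t decays like 1/t, the earliest steps are very large (at t = 1 with baseRate = 1 the shrink factor is exactly zero), which helps explain why the optimiser is so sensitive to its learning-rate settings.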
-
Constructor Summary
Constructors
Pegasos(double baseRate, double lambda)
Constructs a Pegasos optimiser with the specified parameters. -
Method Summary
Pegasos copy()
Copies a gradient optimiser with its configuration.
void finalise()
Finalises the gradient optimisation, setting the parameters to their correct values.
com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance getProvenance()
void initialise(Parameters parameters)
Initialises the gradient optimiser.
void reset()
Resets the optimiser so it's ready to optimise a new Parameters.
Tensor[] step(Tensor[] updates, double weight)
Take a Tensor array of gradients and transform them according to the current weight and learning rates.
String toString()
Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait
Methods inherited from interface com.oracle.labs.mlrg.olcut.config.Configurable:
postConfig
-
Constructor Details
-
Pegasos
public Pegasos(double baseRate, double lambda)
Constructs a Pegasos optimiser with the specified parameters.
- Parameters:
baseRate - The base learning rate.
lambda - The regularisation parameter.
-
-
Method Details
-
initialise
Description copied from interface: StochasticGradientOptimiser
Initialises the gradient optimiser. Configures any learning rate parameters.
- Specified by:
initialise in interface StochasticGradientOptimiser
- Parameters:
parameters - The parameters to optimise.
-
step
Description copied from interface: StochasticGradientOptimiser
Take a Tensor array of gradients and transform them according to the current weight and learning rates. Can return the same Tensor array or a new one.
- Specified by:
step in interface StochasticGradientOptimiser
- Parameters:
updates - An array of gradients.
weight - The weight for the current gradients.
- Returns:
A Tensor array of gradients.
-
toString
-
finalise
public void finalise()
Description copied from interface: StochasticGradientOptimiser
Finalises the gradient optimisation, setting the parameters to their correct values. Used for ParameterAveraging amongst others.
- Specified by:
finalise in interface StochasticGradientOptimiser
-
reset
public void reset()
Description copied from interface: StochasticGradientOptimiser
Resets the optimiser so it's ready to optimise a new Parameters.
- Specified by:
reset in interface StochasticGradientOptimiser
-
copy
Description copied from interface: StochasticGradientOptimiser
Copies a gradient optimiser with its configuration. Usually calls the copy constructor.
- Specified by:
copy in interface StochasticGradientOptimiser
- Returns:
A gradient optimiser with the same configuration, but independent state.
-
getProvenance
public com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance getProvenance()
- Specified by:
getProvenance in interface com.oracle.labs.mlrg.olcut.provenance.Provenancable<com.oracle.labs.mlrg.olcut.provenance.ConfiguredObjectProvenance>
-