
Package org.tribuo.math.optimisers

Provides implementations of StochasticGradientOptimiser.

It contains implementations of SGD with a variety of simple learning rate tempering schemes, along with AdaGrad, Adam, AdaDelta, RMSProp and Pegasos.
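
The snippet below is a minimal sketch of constructing a few of these optimisers; the hyperparameter values are illustrative only, not tuned recommendations. The resulting StochasticGradientOptimiser instances are then passed to a gradient descent trainer elsewhere in Tribuo.

    import org.tribuo.math.StochasticGradientOptimiser;
    import org.tribuo.math.optimisers.AdaGrad;
    import org.tribuo.math.optimisers.Adam;
    import org.tribuo.math.optimisers.SGD;

    public class OptimiserSketch {
        public static void main(String[] args) {
            // Plain SGD with a linearly decaying learning rate (illustrative value).
            StochasticGradientOptimiser linearSgd = SGD.getLinearDecaySGD(0.1);

            // AdaGrad with an initial learning rate of 0.1 (illustrative value).
            StochasticGradientOptimiser adagrad = new AdaGrad(0.1);

            // Adam with its default hyperparameters.
            StochasticGradientOptimiser adam = new Adam();

            System.out.println(linearSgd + " " + adagrad + " " + adam);
        }
    }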

Also provides ParameterAveraging, which wraps another StochasticGradientOptimiser and averages the learned parameters across the gradient descent run. This is usually used for convex problems; for non-convex ones your mileage may vary.
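
A minimal sketch of wrapping another optimiser with ParameterAveraging (again, the learning rate is illustrative only):

    import org.tribuo.math.StochasticGradientOptimiser;
    import org.tribuo.math.optimisers.ParameterAveraging;
    import org.tribuo.math.optimisers.SGD;

    public class AveragingSketch {
        public static void main(String[] args) {
            // Constant learning rate SGD, wrapped so the final parameters are
            // the average of the parameters seen across the whole run.
            StochasticGradientOptimiser averaged =
                new ParameterAveraging(SGD.getSimpleSGD(0.01));
            System.out.println(averaged);
        }
    }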


Copyright © 2015–2021 Oracle and/or its affiliates. All rights reserved.