Package | Description |
---|---|
org.tribuo.classification.sgd.crf | Provides an implementation of a linear-chain CRF trained using Stochastic Gradient Descent. |
org.tribuo.classification.sgd.linear | Provides an implementation of a classification linear model using Stochastic Gradient Descent. |
org.tribuo.math | Contains the implementation of Tribuo's math library, its gradient descent optimisers, kernels, and a set of math-related utilities. |
org.tribuo.math.optimisers | Provides implementations of StochasticGradientOptimiser. |
org.tribuo.regression.sgd.linear | Provides an implementation of linear regression using Stochastic Gradient Descent. |
Constructor | Description |
---|---|
CRFTrainer(StochasticGradientOptimiser optimiser, int epochs, int loggingInterval, int minibatchSize, long seed) | Creates a CRFTrainer which uses SGD to learn the parameters. |
CRFTrainer(StochasticGradientOptimiser optimiser, int epochs, int loggingInterval, long seed) | Sets the minibatch size to 1. |
CRFTrainer(StochasticGradientOptimiser optimiser, int epochs, long seed) | Sets the minibatch size to 1 and the logging interval to 100. |
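As a sketch of how these constructors fit together, the snippet below builds a CRFTrainer with the three-argument form, which defaults the minibatch size to 1 and the logging interval to 100. The AdaGrad hyperparameters, epoch count, and seed are illustrative assumptions, not recommended values; running it requires the Tribuo classification-sgd module on the classpath.

```java
import org.tribuo.classification.sgd.crf.CRFTrainer;
import org.tribuo.math.StochasticGradientOptimiser;
import org.tribuo.math.optimisers.AdaGrad;

public class CRFTrainerSketch {
    public static CRFTrainer buildTrainer() {
        // Illustrative hyperparameters: 0.1 initial learning rate, 1e-6 epsilon.
        StochasticGradientOptimiser optimiser = new AdaGrad(0.1, 1e-6);
        // Three-argument constructor: minibatch size 1, logging interval 100.
        return new CRFTrainer(optimiser, 5, 12345L);
    }

    public static void main(String[] args) {
        System.out.println(buildTrainer() != null);
    }
}
```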
Constructor | Description |
---|---|
LinearSGDTrainer(LabelObjective objective, StochasticGradientOptimiser optimiser, int epochs, int loggingInterval, int minibatchSize, long seed) | Constructs an SGD trainer for a linear model. |
LinearSGDTrainer(LabelObjective objective, StochasticGradientOptimiser optimiser, int epochs, int loggingInterval, long seed) | Sets the minibatch size to 1. |
LinearSGDTrainer(LabelObjective objective, StochasticGradientOptimiser optimiser, int epochs, long seed) | Sets the minibatch size to 1 and the logging interval to 1000. |
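A minimal sketch of the classification trainer, assuming the LogMulticlass objective from org.tribuo.classification.sgd.objectives (a log-loss LabelObjective) and a fixed-rate SGD optimiser; the learning rate, epoch count, and seed are illustrative choices, not defaults:

```java
import org.tribuo.classification.sgd.LabelObjective;
import org.tribuo.classification.sgd.linear.LinearSGDTrainer;
import org.tribuo.classification.sgd.objectives.LogMulticlass;
import org.tribuo.math.optimisers.SGD;

public class LinearSGDSketch {
    public static LinearSGDTrainer buildTrainer() {
        // Log loss over the label set; other LabelObjective implementations can be swapped in.
        LabelObjective objective = new LogMulticlass();
        // SGD.getSimpleSGD builds a fixed-learning-rate optimiser; 0.01 is an illustrative value.
        // Four-argument constructor: minibatch size 1, logging interval 1000.
        return new LinearSGDTrainer(objective, SGD.getSimpleSGD(0.01), 10, 1L);
    }

    public static void main(String[] args) {
        System.out.println(buildTrainer() != null);
    }
}
```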
Modifier and Type | Method and Description |
---|---|
StochasticGradientOptimiser | StochasticGradientOptimiser.copy(): Copies a gradient optimiser with its configuration. |
Modifier and Type | Class and Description |
---|---|
class | AdaDelta: An implementation of the AdaDelta gradient optimiser. |
class | AdaGrad: An implementation of the AdaGrad gradient optimiser. |
class | AdaGradRDA: An implementation of the AdaGrad gradient optimiser with regularized dual averaging. |
class | Adam: An implementation of the Adam gradient optimiser. |
class | ParameterAveraging: Averages the parameters across a gradient run. |
class | Pegasos: An implementation of the Pegasos gradient optimiser, used primarily for solving the SVM problem. |
class | RMSProp: An implementation of the RMSProp gradient optimiser. |
class | SGD: An implementation of SGD with a single learning rate and optional momentum. |
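The optimisers above are interchangeable behind the StochasticGradientOptimiser interface. The sketch below constructs two of them and uses copy() to duplicate one with its configuration, so separate training runs can each get an independent instance; the AdaGrad learning rate is an illustrative assumption, while Adam uses its library defaults.

```java
import org.tribuo.math.StochasticGradientOptimiser;
import org.tribuo.math.optimisers.AdaGrad;
import org.tribuo.math.optimisers.Adam;

public class OptimiserSketch {
    public static StochasticGradientOptimiser copiedAdaGrad() {
        StochasticGradientOptimiser adagrad = new AdaGrad(0.1); // initial learning rate (illustrative)
        StochasticGradientOptimiser adam = new Adam();          // library default hyperparameters
        // copy() duplicates the optimiser along with its configuration,
        // leaving the original instance's state untouched.
        return adagrad.copy();
    }

    public static void main(String[] args) {
        System.out.println(copiedAdaGrad() != null);
    }
}
```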
Modifier and Type | Method and Description |
---|---|
StochasticGradientOptimiser | GradientOptimiserOptions.getOptimiser(): Gets the configured gradient optimiser. |
Constructor | Description |
---|---|
ParameterAveraging(StochasticGradientOptimiser optimiser) | Adds parameter averaging around a gradient optimiser. |
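ParameterAveraging is a wrapper rather than a standalone optimiser: it decorates an inner StochasticGradientOptimiser so the final model parameters are the average of those seen across the gradient run. A minimal sketch, with an illustrative learning rate:

```java
import org.tribuo.math.StochasticGradientOptimiser;
import org.tribuo.math.optimisers.ParameterAveraging;
import org.tribuo.math.optimisers.SGD;

public class AveragingSketch {
    public static StochasticGradientOptimiser buildAveraged() {
        // Plain fixed-rate SGD as the inner optimiser; 0.1 is an illustrative value.
        StochasticGradientOptimiser inner = SGD.getSimpleSGD(0.1);
        // The wrapper delegates each step to the inner optimiser and
        // averages the parameters across the run.
        return new ParameterAveraging(inner);
    }

    public static void main(String[] args) {
        System.out.println(buildAveraged() != null);
    }
}
```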
Constructor | Description |
---|---|
LinearSGDTrainer(RegressionObjective objective, StochasticGradientOptimiser optimiser, int epochs, int loggingInterval, int minibatchSize, long seed) | Constructs an SGD trainer for a linear model. |
LinearSGDTrainer(RegressionObjective objective, StochasticGradientOptimiser optimiser, int epochs, int loggingInterval, long seed) | Sets the minibatch size to 1. |
LinearSGDTrainer(RegressionObjective objective, StochasticGradientOptimiser optimiser, int epochs, long seed) | Sets the minibatch size to 1 and the logging interval to 1000. |
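The regression trainer mirrors the classification one but takes a RegressionObjective. This sketch uses the full constructor, assuming the SquaredLoss objective from org.tribuo.regression.sgd.objectives; all numeric arguments (epochs, logging interval, minibatch size, seed) are illustrative:

```java
import org.tribuo.math.optimisers.AdaGrad;
import org.tribuo.regression.sgd.RegressionObjective;
import org.tribuo.regression.sgd.linear.LinearSGDTrainer;
import org.tribuo.regression.sgd.objectives.SquaredLoss;

public class RegressionSketch {
    public static LinearSGDTrainer buildTrainer() {
        // Standard squared-error loss for regression.
        RegressionObjective objective = new SquaredLoss();
        // Full constructor: optimiser, epochs, logging interval, minibatch size, seed.
        return new LinearSGDTrainer(objective, new AdaGrad(0.1), 10, 100, 1, 42L);
    }

    public static void main(String[] args) {
        System.out.println(buildTrainer() != null);
    }
}
```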
Copyright © 2015–2021 Oracle and/or its affiliates. All rights reserved.