Class LabelEvaluator
java.lang.Object
  org.tribuo.evaluation.AbstractEvaluator<Label, LabelMetric.Context, LabelEvaluation, LabelMetric>
    org.tribuo.classification.evaluation.LabelEvaluator
- All Implemented Interfaces:
Evaluator<Label, LabelEvaluation>
public final class LabelEvaluator
extends AbstractEvaluator<Label, LabelMetric.Context, LabelEvaluation, LabelMetric>
An Evaluator for Labels.
The default set of metrics is taken from LabelMetrics. If the supplied
model generates probabilities, then it also calculates LabelMetrics.AUCROC and
LabelMetrics.AVERAGED_PRECISION.
If the dataset contains an unknown Label (as generated by LabelFactory.getUnknownOutput()), or a valid Label which is outside the domain of the Model, then the evaluate methods throw an IllegalArgumentException with an appropriate message.
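For callers, the typical entry point is the inherited evaluate method. A minimal usage sketch (the model and testData parameters are assumed to be a trained classification model and a held-out test dataset supplied by the calling code):

```java
import org.tribuo.Dataset;
import org.tribuo.Model;
import org.tribuo.classification.Label;
import org.tribuo.classification.evaluation.LabelEvaluation;
import org.tribuo.classification.evaluation.LabelEvaluator;

public class EvaluateExample {
    public static void evaluate(Model<Label> model, Dataset<Label> testData) {
        LabelEvaluator evaluator = new LabelEvaluator();
        // Runs the model over the test set and computes the LabelMetrics.
        // If the model generates probabilities, AUCROC and averaged
        // precision are computed as well.
        LabelEvaluation evaluation = evaluator.evaluate(model, testData);
        // LabelEvaluation exposes standard classification metrics.
        System.out.println("Accuracy = " + evaluation.accuracy());
        // toString renders the full per-class metric table.
        System.out.println(evaluation.toString());
    }
}
```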
-
Constructor Summary

Constructors
LabelEvaluator()
Method Summary
Modifier and Type | Method | Description

protected LabelMetric.Context
    createContext(Model<Label> model, List<Prediction<Label>> predictions)
    Create the context needed for evaluation.

protected LabelEvaluation
    createEvaluation(LabelMetric.Context ctx, Map<MetricID<Label>, Double> results, EvaluationProvenance provenance)
    Create an evaluation for the given results.

protected Set<LabelMetric>
    createMetrics(Model<Label> model)
    Creates the appropriate set of metrics for this model, by querying for its OutputInfo.

Methods inherited from class org.tribuo.evaluation.AbstractEvaluator
computeResults, evaluate, evaluate, evaluate

Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.tribuo.evaluation.Evaluator
createOnlineEvaluator, evaluate
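The inherited createOnlineEvaluator method from the Evaluator interface builds an evaluator that accumulates predictions incrementally rather than over a fixed dataset. A hedged sketch, assuming the model and the example stream are supplied by the calling code (the "live-stream" description string is purely illustrative):

```java
import java.util.List;

import org.tribuo.Example;
import org.tribuo.Model;
import org.tribuo.classification.Label;
import org.tribuo.classification.LabelFactory;
import org.tribuo.classification.evaluation.LabelEvaluation;
import org.tribuo.classification.evaluation.LabelEvaluator;
import org.tribuo.evaluation.OnlineEvaluator;
import org.tribuo.provenance.DataProvenance;
import org.tribuo.provenance.SimpleDataSourceProvenance;

public class OnlineExample {
    public static void streamEvaluate(Model<Label> model, List<Example<Label>> stream) {
        LabelEvaluator evaluator = new LabelEvaluator();
        // Provenance describing where the streamed examples came from.
        DataProvenance provenance =
            new SimpleDataSourceProvenance("live-stream", new LabelFactory());
        OnlineEvaluator<Label, LabelEvaluation> online =
            evaluator.createOnlineEvaluator(model, provenance);
        for (Example<Label> example : stream) {
            // Runs the model on the example and records the prediction.
            online.predictAndObserve(example);
        }
        // Snapshot of the metrics over everything observed so far.
        LabelEvaluation evaluation = online.getEvaluation();
        System.out.println("Running accuracy = " + evaluation.accuracy());
    }
}
```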
-
Constructor Details
-
LabelEvaluator
public LabelEvaluator()
-
-
Method Details
-
createMetrics
protected Set<LabelMetric> createMetrics(Model<Label> model)
Description copied from class: AbstractEvaluator
Creates the appropriate set of metrics for this model, by querying for its OutputInfo.
- Specified by:
createMetrics in class AbstractEvaluator<Label, LabelMetric.Context, LabelEvaluation, LabelMetric>
- Parameters:
model - The model to inspect.
- Returns:
The set of metrics.
-
createContext
protected LabelMetric.Context createContext(Model<Label> model, List<Prediction<Label>> predictions)
Description copied from class: AbstractEvaluator
Create the context needed for evaluation. The context might store global properties or cache computation.
- Specified by:
createContext in class AbstractEvaluator<Label, LabelMetric.Context, LabelEvaluation, LabelMetric>
- Parameters:
model - the model that will be evaluated
predictions - the predictions that will be evaluated
- Returns:
the context for this model and its predictions
-
createEvaluation
protected LabelEvaluation createEvaluation(LabelMetric.Context ctx, Map<MetricID<Label>, Double> results, EvaluationProvenance provenance)
Description copied from class: AbstractEvaluator
Create an evaluation for the given results.
- Specified by:
createEvaluation in class AbstractEvaluator<Label, LabelMetric.Context, LabelEvaluation, LabelMetric>
- Parameters:
ctx - the context that was used to compute these results
results - the results
provenance - the provenance of the results (including information about the model and dataset)
- Returns:
the evaluation
-