Class LabelEvaluator
java.lang.Object
org.tribuo.evaluation.AbstractEvaluator<Label, LabelMetric.Context, LabelEvaluation, LabelMetric>
org.tribuo.classification.evaluation.LabelEvaluator
- All Implemented Interfaces:
- Evaluator<Label, LabelEvaluation>
public final class LabelEvaluator
extends AbstractEvaluator<Label, LabelMetric.Context, LabelEvaluation, LabelMetric>
An Evaluator for Labels.
 
 The default set of metrics is taken from LabelMetrics. If the supplied
 model generates probabilities, then it also calculates LabelMetrics.AUCROC and
 LabelMetrics.AVERAGED_PRECISION.
 
 If the dataset contains an unknown Label (as generated by LabelFactory.getUnknownOutput()),
 or a valid Label which is outside the domain of the Model, then the evaluate methods
 throw an IllegalArgumentException with an appropriate message.
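For instance, evaluating a trained classifier might look like the sketch below. This is illustrative only: `model` and `testData` are assumed to be a previously trained Model&lt;Label&gt; and a held-out Dataset&lt;Label&gt;, which are not constructed here.

```java
// Sketch: assumes a trained Model<Label> `model` and a test Dataset<Label> `testData`.
LabelEvaluator evaluator = new LabelEvaluator();
LabelEvaluation evaluation = evaluator.evaluate(model, testData);
System.out.println(evaluation.accuracy());
// If `model` generates probabilities, per-label AUCROC is also computed,
// e.g. evaluation.AUCROC(someLabel).
```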
- Constructor Summary
  - LabelEvaluator()
- Method Summary
  - protected LabelMetric.Context createContext(Model<Label> model, List<Prediction<Label>> predictions)
    Create the context needed for evaluation.
  - protected LabelEvaluation createEvaluation(LabelMetric.Context ctx, Map<MetricID<Label>, Double> results, EvaluationProvenance provenance)
    Create an evaluation for the given results.
  - protected Set<LabelMetric> createMetrics(Model<Label> model)
    Creates the appropriate set of metrics for this model, by querying for its OutputInfo.
  - Methods inherited from class org.tribuo.evaluation.AbstractEvaluator: computeResults, evaluate, evaluate, evaluate
  - Methods inherited from class java.lang.Object: clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
  - Methods inherited from interface org.tribuo.evaluation.Evaluator: createOnlineEvaluator, evaluate
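The three protected methods above are the hooks of a template method: AbstractEvaluator builds a context, computes metric results against it, and bundles them into an evaluation. The following self-contained sketch illustrates that flow with deliberately simplified, hypothetical types (a string-label "context" and a single accuracy metric), not Tribuo's actual classes.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical, simplified analogue of AbstractEvaluator's flow:
// createContext -> computeResults -> createEvaluation.
public class EvaluatorFlowDemo {
    // A "context" caching shared state (here: the total prediction count).
    record Context(List<String> truth, List<String> predicted, int total) {}

    static Context createContext(List<String> truth, List<String> predicted) {
        return new Context(truth, predicted, truth.size());
    }

    // Compute each metric against the shared context.
    static Map<String, Double> computeResults(Context ctx) {
        Map<String, Double> results = new LinkedHashMap<>();
        long correct = 0;
        for (int i = 0; i < ctx.total(); i++) {
            if (ctx.truth().get(i).equals(ctx.predicted().get(i))) {
                correct++;
            }
        }
        results.put("accuracy", correct / (double) ctx.total());
        return results;
    }

    // "createEvaluation": bundle the results into an immutable view.
    static Map<String, Double> createEvaluation(Context ctx, Map<String, Double> results) {
        return Map.copyOf(results);
    }

    public static void main(String[] args) {
        Context ctx = createContext(List.of("a", "b", "a", "c"),
                                    List.of("a", "b", "b", "c"));
        Map<String, Double> eval = createEvaluation(ctx, computeResults(ctx));
        System.out.println(eval.get("accuracy")); // prints 0.75
    }
}
```

In Tribuo itself the subclass (here LabelEvaluator) only overrides these hooks; callers interact with the inherited evaluate methods.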
- Constructor Details
  - LabelEvaluator
    public LabelEvaluator()
 
- Method Details
  - createMetrics
    protected Set<LabelMetric> createMetrics(Model<Label> model)
    Description copied from class: AbstractEvaluator
    Creates the appropriate set of metrics for this model, by querying for its OutputInfo.
    - Specified by:
- createMetrics in class AbstractEvaluator<Label, LabelMetric.Context, LabelEvaluation, LabelMetric>
- Parameters:
- model - The model to inspect.
- Returns:
- The set of metrics.
 
- 
createContext
protected LabelMetric.Context createContext(Model<Label> model, List<Prediction<Label>> predictions)
Description copied from class: AbstractEvaluator
Create the context needed for evaluation. The context might store global properties or cache computation.
- Specified by:
- createContext in class AbstractEvaluator<Label, LabelMetric.Context, LabelEvaluation, LabelMetric>
- Parameters:
- model - the model that will be evaluated
- predictions - the predictions that will be evaluated
- Returns:
- the context for this model and its predictions
 
- 
createEvaluation
protected LabelEvaluation createEvaluation(LabelMetric.Context ctx, Map<MetricID<Label>, Double> results, EvaluationProvenance provenance)
Description copied from class: AbstractEvaluator
Create an evaluation for the given results.
- Specified by:
- createEvaluation in class AbstractEvaluator<Label, LabelMetric.Context, LabelEvaluation, LabelMetric>
- Parameters:
- ctx - the context that was used to compute these results
- results - the results
- provenance - the provenance of the results (including information about the model and dataset)
- Returns:
- the evaluation
 
 