public final class LabelEvaluator extends AbstractEvaluator<Label,LabelMetric.Context,LabelEvaluation,LabelMetric>
An Evaluator for Labels.

The default set of metrics is taken from LabelMetrics. If the supplied model generates probabilities, then it also calculates LabelMetrics.AUCROC and LabelMetrics.AVERAGED_PRECISION.

If the dataset contains an unknown Label (as generated by LabelFactory.getUnknownOutput()), or a valid Label which is outside of the domain of the Model, then the evaluate methods will throw IllegalArgumentException with an appropriate message.
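As a sketch of typical use, the evaluator is constructed with no arguments and passed a trained model plus a test dataset via the inherited evaluate method. The helper below is illustrative only: the names `model` and `testSet` are assumptions standing in for objects built elsewhere, and it requires the Tribuo classification library on the classpath.

```java
import org.tribuo.Dataset;
import org.tribuo.Model;
import org.tribuo.classification.Label;
import org.tribuo.classification.evaluation.LabelEvaluation;
import org.tribuo.classification.evaluation.LabelEvaluator;

public final class EvaluationExample {

    // `model` and `testSet` are assumed to be trained/loaded elsewhere.
    public static LabelEvaluation evaluateModel(Model<Label> model, Dataset<Label> testSet) {
        LabelEvaluator evaluator = new LabelEvaluator();
        // Throws IllegalArgumentException if testSet contains an unknown Label,
        // or a Label outside the model's domain (see the note above).
        LabelEvaluation evaluation = evaluator.evaluate(model, testSet);
        System.out.println("Accuracy: " + evaluation.accuracy());
        return evaluation;
    }
}
```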
Constructor and Description |
---|
LabelEvaluator() |
Modifier and Type | Method and Description |
---|---|
protected LabelMetric.Context | createContext(Model<Label> model, List<Prediction<Label>> predictions) Create the context needed for evaluation. |
protected LabelEvaluation | createEvaluation(LabelMetric.Context ctx, Map<MetricID<Label>,Double> results, EvaluationProvenance provenance) Create an evaluation for the given results. |
protected Set<LabelMetric> | createMetrics(Model<Label> model) Creates the appropriate set of metrics for this model, by querying for its OutputInfo. |
Methods inherited from class AbstractEvaluator: computeResults, evaluate, evaluate, evaluate

Methods inherited from class java.lang.Object: clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface Evaluator: createOnlineEvaluator, evaluate
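The three protected methods detailed below are the hooks this class supplies to its parent. The comment sketch below is an assumed reading of how the inherited evaluate methods likely compose them, based on the method names and descriptions on this page; the actual control flow inside AbstractEvaluator is not stated here.

```java
// Assumed template-method flow inside the inherited evaluate methods (sketch only):
//
//   Set<LabelMetric> metrics = createMetrics(model);
//       // selects the default LabelMetrics, adding AUCROC and
//       // AVERAGED_PRECISION when the model generates probabilities
//
//   LabelMetric.Context ctx = createContext(model, predictions);
//       // builds the shared state each metric reads from
//
//   Map<MetricID<Label>, Double> results = computeResults(ctx, metrics);
//       // inherited from AbstractEvaluator; evaluates each metric
//
//   LabelEvaluation eval = createEvaluation(ctx, results, provenance);
//       // packages the results into the returned LabelEvaluation
```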
protected Set<LabelMetric> createMetrics(Model<Label> model)

Creates the appropriate set of metrics for this model, by querying for its OutputInfo.

Specified by: createMetrics in class AbstractEvaluator<Label,LabelMetric.Context,LabelEvaluation,LabelMetric>
Parameters:
model - The model to inspect.

protected LabelMetric.Context createContext(Model<Label> model, List<Prediction<Label>> predictions)

Create the context needed for evaluation.

Specified by: createContext in class AbstractEvaluator<Label,LabelMetric.Context,LabelEvaluation,LabelMetric>
Parameters:
model - the model that will be evaluated
predictions - the predictions that will be evaluated

protected LabelEvaluation createEvaluation(LabelMetric.Context ctx, Map<MetricID<Label>,Double> results, EvaluationProvenance provenance)

Create an evaluation for the given results.

Specified by: createEvaluation in class AbstractEvaluator<Label,LabelMetric.Context,LabelEvaluation,LabelMetric>
Parameters:
ctx - the context that was used to compute these results
results - the results
provenance - the provenance of the results (including information about the model and dataset)

Copyright © 2015–2021 Oracle and/or its affiliates. All rights reserved.