Uses of Class org.tribuo.evaluation.metrics.MetricID

Packages that use MetricID:

org.tribuo.anomaly.evaluation
    Evaluation classes for anomaly detection.
org.tribuo.classification.evaluation
    Evaluation classes for multi-class classification.
org.tribuo.classification.sequence
    Provides infrastructure for SequenceModels which emit Labels at each step of the sequence.
org.tribuo.clustering.evaluation
    Evaluation classes for clustering.
org.tribuo.evaluation
    Evaluation base classes, along with code for train/test splits and cross validation.
org.tribuo.evaluation.metrics
    This package contains the infrastructure classes for building evaluation metrics.
org.tribuo.multilabel.evaluation
    Evaluation classes for multi-label classification using MultiLabel.
org.tribuo.regression.evaluation
    Evaluation classes for single or multi-dimensional regression.
org.tribuo.sequence
    Provides core classes for working with sequences of Examples.
Uses of MetricID in org.tribuo.anomaly.evaluation

Methods:
  protected AnomalyEvaluation AnomalyEvaluator.createEvaluation(org.tribuo.anomaly.evaluation.AnomalyMetric.Context context, Map<MetricID<Event>, Double> results, EvaluationProvenance provenance)
Uses of MetricID in org.tribuo.classification.evaluation

Methods:
  protected LabelEvaluation LabelEvaluator.createEvaluation(LabelMetric.Context ctx, Map<MetricID<Label>, Double> results, EvaluationProvenance provenance)
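LabelEvaluator builds a LabelEvaluation whose per-metric results are keyed by MetricID<Label>; createEvaluation is the protected factory that packages those results. A minimal sketch of how this is driven from user code, assuming a classification dataset and a LogisticRegressionTrainer (both illustrative choices, not part of this page):

    import org.tribuo.Dataset;
    import org.tribuo.Model;
    import org.tribuo.classification.Label;
    import org.tribuo.classification.evaluation.LabelEvaluation;
    import org.tribuo.classification.evaluation.LabelEvaluator;
    import org.tribuo.classification.sgd.linear.LogisticRegressionTrainer;

    public class LabelEvaluationExample {
        public static void evaluate(Dataset<Label> trainData, Dataset<Label> testData) {
            // Train a simple classifier (any Trainer<Label> would do here).
            Model<Label> model = new LogisticRegressionTrainer().train(trainData);

            // The evaluator computes the metrics and packages them into a LabelEvaluation.
            LabelEvaluator evaluator = new LabelEvaluator();
            LabelEvaluation evaluation = evaluator.evaluate(model, testData);

            // Named accessors sit on top of the MetricID-keyed results.
            System.out.println("accuracy = " + evaluation.accuracy());

            // asMap() exposes every computed metric, keyed by MetricID<Label>.
            evaluation.asMap().forEach((id, value) -> System.out.println(id + " = " + value));
        }
    }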
Uses of MetricID in org.tribuo.classification.sequence

Methods:
  protected LabelSequenceEvaluation LabelSequenceEvaluator.createEvaluation(LabelMetric.Context ctx, Map<MetricID<Label>, Double> results, EvaluationProvenance provenance)

Constructors:
  protected LabelSequenceEvaluation(Map<MetricID<Label>, Double> results, LabelMetric.Context ctx, EvaluationProvenance provenance)
      Constructs a LabelSequenceEvaluation using the supplied parameters.
Uses of MetricID in org.tribuo.clustering.evaluation

Methods:
  protected ClusteringEvaluation ClusteringEvaluator.createEvaluation(org.tribuo.clustering.evaluation.ClusteringMetric.Context context, Map<MetricID<ClusterID>, Double> results, EvaluationProvenance provenance)
Uses of MetricID in org.tribuo.evaluation

Methods:
  Map<MetricID<T>, Double> Evaluation.asMap()
      Get a map of all the metrics stored in this evaluation.
  Map<MetricID<T>, Double> AbstractEvaluator.computeResults(C ctx, Set<? extends EvaluationMetric<T, C>> metrics)
      Computes each metric given the context.
  static <T extends Output<T>, R extends Evaluation<T>> Map<MetricID<T>, DescriptiveStats> EvaluationAggregator.summarize(List<R> evaluations)
      Summarize all fields of a list of evaluations.
  static <T extends Output<T>, R extends Evaluation<T>> Map<MetricID<T>, DescriptiveStats> EvaluationAggregator.summarize(Evaluator<T, R> evaluator, List<? extends Model<T>> models, Dataset<T> dataset)
      Summarize performance using the supplied evaluator across several models on one dataset.
  static <T extends Output<T>, R extends Evaluation<T>> Map<MetricID<T>, DescriptiveStats> EvaluationAggregator.summarize(Evaluator<T, R> evaluator, Model<T> model, List<? extends Dataset<T>> datasets)
      Summarize performance according to evaluator for a single model across several datasets.
  static <T extends Output<T>, R extends Evaluation<T>> Map<MetricID<T>, DescriptiveStats> EvaluationAggregator.summarizeCrossValidation(List<com.oracle.labs.mlrg.olcut.util.Pair<R, Model<T>>> evaluations)
      Summarize all fields of a list of evaluations produced by CrossValidation.
  default double Evaluation.get(MetricID<T> key)
      Gets the value associated with the specific metric.
  protected abstract E AbstractEvaluator.createEvaluation(C context, Map<MetricID<T>, Double> results, EvaluationProvenance provenance)
      Create an evaluation for the given results.
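summarizeCrossValidation collapses the per-fold metric maps into one DescriptiveStats per MetricID. A minimal sketch, assuming a Label classification task, a LogisticRegressionTrainer, and 5 folds (all illustrative choices, not mandated by this API):

    import com.oracle.labs.mlrg.olcut.util.Pair;
    import java.util.List;
    import java.util.Map;
    import org.tribuo.Dataset;
    import org.tribuo.Model;
    import org.tribuo.classification.Label;
    import org.tribuo.classification.evaluation.LabelEvaluation;
    import org.tribuo.classification.evaluation.LabelEvaluator;
    import org.tribuo.classification.sgd.linear.LogisticRegressionTrainer;
    import org.tribuo.evaluation.CrossValidation;
    import org.tribuo.evaluation.DescriptiveStats;
    import org.tribuo.evaluation.EvaluationAggregator;
    import org.tribuo.evaluation.metrics.MetricID;

    public class CrossValidationSummary {
        public static void summarise(Dataset<Label> data) {
            // 5-fold cross-validation over the supplied dataset.
            CrossValidation<Label, LabelEvaluation> cv =
                    new CrossValidation<>(new LogisticRegressionTrainer(), data, new LabelEvaluator(), 5);

            // Each fold produces a (LabelEvaluation, Model) pair.
            List<Pair<LabelEvaluation, Model<Label>>> folds = cv.evaluate();

            // Collapse the per-fold metric maps into per-MetricID descriptive statistics.
            Map<MetricID<Label>, DescriptiveStats> summary =
                    EvaluationAggregator.summarizeCrossValidation(folds);

            summary.forEach((id, stats) -> System.out.println(id + " -> " + stats));
        }
    }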
Uses of MetricID in org.tribuo.evaluation.metrics
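A MetricID pairs a metric target with a metric name, and it is the key type of the map returned by asMap(). A small, output-type-generic sketch of reading results back through their MetricIDs (the class and method names below are hypothetical helpers, not Tribuo API):

    import java.util.Map;
    import org.tribuo.Output;
    import org.tribuo.evaluation.Evaluation;
    import org.tribuo.evaluation.metrics.MetricID;

    public final class MetricIDExample {
        // Works for any output type: Label, Regressor, ClusterID, Event, MultiLabel, ...
        public static <T extends Output<T>> void printMetrics(Evaluation<T> evaluation) {
            Map<MetricID<T>, Double> results = evaluation.asMap();
            for (MetricID<T> id : results.keySet()) {
                // get(MetricID) returns the same value stored in the map under that ID.
                double value = evaluation.get(id);
                System.out.println(id + " = " + value);
            }
        }
    }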
Uses of MetricID in org.tribuo.multilabel.evaluation

Methods:
  protected MultiLabelEvaluation MultiLabelEvaluator.createEvaluation(org.tribuo.multilabel.evaluation.MultiLabelMetric.Context context, Map<MetricID<MultiLabel>, Double> results, EvaluationProvenance provenance)
Uses of MetricID in org.tribuo.regression.evaluation

Methods:
  protected RegressionEvaluation RegressionEvaluator.createEvaluation(org.tribuo.regression.evaluation.RegressionMetric.Context context, Map<MetricID<Regressor>, Double> results, EvaluationProvenance provenance)
Uses of MetricID in org.tribuo.sequence

Methods:
  Map<MetricID<T>, Double> SequenceEvaluation.asMap()
      Get a map of all the metrics stored in this evaluation.
  Map<MetricID<T>, Double> AbstractSequenceEvaluator.computeResults(C ctx, Set<? extends EvaluationMetric<T, C>> metrics)
      Computes each metric given the context.
  default double SequenceEvaluation.get(MetricID<T> key)
      Gets the value associated with the specific metric.
  protected abstract E AbstractSequenceEvaluator.createEvaluation(C context, Map<MetricID<T>, Double> results, EvaluationProvenance provenance)
      Create an evaluation for the given results.
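The sequence API mirrors the non-sequence one: a SequenceEvaluator produces a SequenceEvaluation whose results are keyed by MetricID. A minimal sketch for sequence classification, assuming an already-trained SequenceModel<Label> and a held-out SequenceDataset<Label> (the method and variable names here are illustrative):

    import org.tribuo.classification.Label;
    import org.tribuo.classification.sequence.LabelSequenceEvaluation;
    import org.tribuo.classification.sequence.LabelSequenceEvaluator;
    import org.tribuo.sequence.SequenceDataset;
    import org.tribuo.sequence.SequenceModel;

    public class SequenceEvaluationExample {
        public static void evaluate(SequenceModel<Label> model, SequenceDataset<Label> testData) {
            // The sequence evaluator computes label metrics over the sequence predictions.
            LabelSequenceEvaluator evaluator = new LabelSequenceEvaluator();
            LabelSequenceEvaluation evaluation = evaluator.evaluate(model, testData);

            // As with Evaluation, asMap() is keyed by MetricID<Label>.
            evaluation.asMap().forEach((id, value) -> System.out.println(id + " = " + value));
        }
    }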