public enum LabelMetrics extends Enum<LabelMetrics>

LabelMetrics supported by the multi-class classification evaluation package.

Enum Constant | Description
---|---
ACCURACY | The accuracy.
AUCROC | The area under the receiver-operator curve (ROC).
AVERAGED_PRECISION | The averaged precision.
BALANCED_ERROR_RATE | The balanced error rate, i.e., the mean of the per class recalls.
F1 | The F_1 score, i.e., the harmonic mean of the precision and the recall.
FN | The number of false negatives.
FP | The number of false positives.
PRECISION | The precision, i.e., the number of true positives divided by the number of predicted positives.
RECALL | The recall, i.e., the number of true positives divided by the number of ground truth positives.
TN | The number of true negatives.
TP | The number of true positives.
Modifier and Type | Method | Description
---|---|---
static double | AUCROC(Label label, List<Prediction<Label>> predictions) | Area under the ROC curve.
static double | AUCROC(MetricTarget<Label> tgt, List<Prediction<Label>> predictions) | Area under the ROC curve.
static double | averagedPrecision(Label label, List<Prediction<Label>> predictions) |
static double | averagedPrecision(MetricTarget<Label> tgt, List<Prediction<Label>> predictions) |
LabelMetric | forTarget(MetricTarget<Label> tgt) | Gets the LabelMetric wrapped around the supplied MetricTarget.
ToDoubleBiFunction<MetricTarget<Label>,LabelMetric.Context> | getImpl() | Returns the implementing function for this metric.
static LabelEvaluationUtil.PRCurve | precisionRecallCurve(Label label, List<Prediction<Label>> predictions) |
static LabelMetrics | valueOf(String name) | Returns the enum constant of this type with the specified name.
static LabelMetrics[] | values() | Returns an array containing the constants of this enum type, in the order they are declared.
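As a sketch of calling the static helpers summarized above (the predictions list is assumed to come from a trained probabilistic model, and the "positive" class name is hypothetical; import paths follow Tribuo's package layout):

import java.util.List;
import org.tribuo.Prediction;
import org.tribuo.classification.Label;
import org.tribuo.classification.evaluation.LabelMetrics;

public class LabelMetricsExample {
    // Prints two per-label scores for a hypothetical "positive" class.
    public static void report(List<Prediction<Label>> predictions) {
        Label positive = new Label("positive"); // hypothetical class name
        double auc = LabelMetrics.AUCROC(positive, predictions);
        double ap = LabelMetrics.averagedPrecision(positive, predictions);
        System.out.println("AUCROC = " + auc + ", averaged precision = " + ap);
    }
}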
public static final LabelMetrics TP
public static final LabelMetrics FP
public static final LabelMetrics TN
public static final LabelMetrics FN
public static final LabelMetrics PRECISION
public static final LabelMetrics RECALL
public static final LabelMetrics F1
public static final LabelMetrics ACCURACY
public static final LabelMetrics BALANCED_ERROR_RATE
public static final LabelMetrics AUCROC
public static final LabelMetrics AVERAGED_PRECISION
public static LabelMetrics[] values()

Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows:

for (LabelMetrics c : LabelMetrics.values()) System.out.println(c);
public static LabelMetrics valueOf(String name)

Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type.

Parameters:
name - the name of the enum constant to be returned.
Throws:
IllegalArgumentException - if this enum type has no constant with the specified name
NullPointerException - if the argument is null

public ToDoubleBiFunction<MetricTarget<Label>,LabelMetric.Context> getImpl()

Returns the implementing function for this metric.
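As a fragment tying these two together: one might look up a metric by name and retrieve its implementing function (applying that function needs a LabelMetric.Context, which is not constructed here):

LabelMetrics metric = LabelMetrics.valueOf("F1"); // throws IllegalArgumentException for unknown names
ToDoubleBiFunction<MetricTarget<Label>,LabelMetric.Context> impl = metric.getImpl();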
public LabelMetric forTarget(MetricTarget<Label> tgt)

Gets the LabelMetric wrapped around the supplied MetricTarget.

Parameters:
tgt - The metric target.
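A minimal sketch of wrapping a metric around a specific class, assuming MetricTarget exposes a constructor taking the target Label (as in org.tribuo.evaluation.metrics.MetricTarget); the label name is hypothetical:

MetricTarget<Label> spamTarget = new MetricTarget<>(new Label("spam")); // hypothetical class name
LabelMetric f1ForSpam = LabelMetrics.F1.forTarget(spamTarget);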
public static double averagedPrecision(MetricTarget<Label> tgt, List<Prediction<Label>> predictions)

Parameters:
tgt - The metric target to use.
predictions - The predictions to use.
See Also:
LabelEvaluationUtil.averagedPrecision(boolean[], double[])
public static double averagedPrecision(Label label, List<Prediction<Label>> predictions)

Parameters:
label - The Label to average across.
predictions - The predictions to use.
See Also:
LabelEvaluationUtil.averagedPrecision(boolean[], double[])
public static LabelEvaluationUtil.PRCurve precisionRecallCurve(Label label, List<Prediction<Label>> predictions)

Parameters:
label - The Label to calculate precision and recall for.
predictions - The predictions to use.
See Also:
LabelEvaluationUtil.generatePRCurve(boolean[], double[])
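A sketch of consuming the returned curve; the parallel precision/recall/thresholds arrays assumed here mirror LabelEvaluationUtil.PRCurve's public fields, which is an assumption worth verifying against that class:

LabelEvaluationUtil.PRCurve curve =
    LabelMetrics.precisionRecallCurve(new Label("positive"), predictions); // "positive" is hypothetical
for (int i = 0; i < curve.thresholds.length; i++) { // assumed public array fields
    System.out.println(curve.thresholds[i] + " -> P=" + curve.precision[i] + ", R=" + curve.recall[i]);
}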
public static double AUCROC(Label label, List<Prediction<Label>> predictions)

Area under the ROC curve.

Parameters:
label - the label corresponding to the "positive" class
predictions - the predictions for which we'll compute the score
Throws:
UnsupportedOperationException - if a prediction is missing a probability score, which is required to compute the ROC curve. (See also: Model.generatesProbabilities())

public static double AUCROC(MetricTarget<Label> tgt, List<Prediction<Label>> predictions)

Area under the ROC curve.

Parameters:
tgt - The metric target for the positive class.
predictions - the predictions for which we'll compute the score
Throws:
UnsupportedOperationException - if a prediction is missing a probability score, which is required to compute the ROC curve. (See also: Model.generatesProbabilities())
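Since both overloads require probability scores, a caller might guard the computation on the model, as sketched here (the model and test dataset are assumed to exist, and the "positive" label name is hypothetical):

List<Prediction<Label>> predictions = model.predict(testData);
if (model.generatesProbabilities()) {
    double auc = LabelMetrics.AUCROC(new Label("positive"), predictions);
    System.out.println("AUCROC = " + auc);
} else {
    System.out.println("Model does not produce probabilities; skipping AUCROC.");
}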