Enum Class LabelMetrics
- All Implemented Interfaces:
Serializable, Comparable&lt;LabelMetrics&gt;, Constable
An enum of the default LabelMetrics supported by the multi-class classification evaluation package.
-
Nested Class Summary
Nested classes/interfaces inherited from class java.lang.Enum
Enum.EnumDesc<E extends Enum<E>>
-
Enum Constant Summary
Enum Constant | Description
ACCURACY | The accuracy.
AUCROC | The area under the receiver operating characteristic (ROC) curve.
AVERAGED_PRECISION | The averaged precision.
BALANCED_ERROR_RATE | The balanced error rate, i.e., the mean of the per-class recalls.
F1 | The F_1 score, i.e., the harmonic mean of the precision and the recall.
FN | The number of false negatives.
FP | The number of false positives.
PRECISION | The precision, i.e., the number of true positives divided by the number of predicted positives.
RECALL | The recall, i.e., the number of true positives divided by the number of ground truth positives.
TN | The number of true negatives.
TP | The number of true positives.
-
Method Summary
Modifier and Type | Method | Description
static double | AUCROC(Label label, List&lt;Prediction&lt;Label&gt;&gt; predictions) | Area under the ROC curve.
static double | AUCROC(MetricTarget&lt;Label&gt; tgt, List&lt;Prediction&lt;Label&gt;&gt; predictions) | Area under the ROC curve.
static double | averagedPrecision(Label label, List&lt;Prediction&lt;Label&gt;&gt; predictions) |
static double | averagedPrecision(MetricTarget&lt;Label&gt; tgt, List&lt;Prediction&lt;Label&gt;&gt; predictions) |
LabelMetric | forTarget(MetricTarget&lt;Label&gt; tgt) | Gets the LabelMetric wrapped around the supplied MetricTarget.
| getImpl() | Returns the implementing function for this metric.
static LabelEvaluationUtil.PRCurve | precisionRecallCurve(Label label, List&lt;Prediction&lt;Label&gt;&gt; predictions) |
static LabelMetrics | valueOf(String name) | Returns the enum constant of this class with the specified name.
static LabelMetrics[] | values() | Returns an array containing the constants of this enum class, in the order they are declared.
-
Enum Constant Details
-
TP
The number of true positives. -
FP
The number of false positives. -
TN
The number of true negatives. -
FN
The number of false negatives. -
PRECISION
The precision, i.e., the number of true positives divided by the number of predicted positives. -
RECALL
The recall, i.e., the number of true positives divided by the number of ground truth positives. -
F1
The F_1 score, i.e., the harmonic mean of the precision and the recall. -
ACCURACY
The accuracy. -
BALANCED_ERROR_RATE
The balanced error rate, i.e., the mean of the per-class recalls. -
AUCROC
The area under the receiver operating characteristic (ROC) curve. -
AVERAGED_PRECISION
The averaged precision.
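The count-based constants above combine into the ratio metrics in a standard way. A minimal standalone sketch (plain Java, not Tribuo code; class and method names here are illustrative only) of how TP, FP, TN, and FN yield PRECISION, RECALL, F1, and ACCURACY:

```java
// Illustrative sketch, not Tribuo's implementation: how the count-based
// constants (TP, FP, TN, FN) combine into the ratio metrics.
public class LabelMetricsSketch {
    // Precision: true positives divided by predicted positives.
    static double precision(int tp, int fp) { return tp / (double) (tp + fp); }

    // Recall: true positives divided by ground truth positives.
    static double recall(int tp, int fn) { return tp / (double) (tp + fn); }

    // F_1: harmonic mean of precision and recall.
    static double f1(int tp, int fp, int fn) {
        double p = precision(tp, fp);
        double r = recall(tp, fn);
        return 2.0 * p * r / (p + r);
    }

    // Accuracy: correct predictions over all predictions.
    static double accuracy(int tp, int tn, int fp, int fn) {
        return (tp + tn) / (double) (tp + tn + fp + fn);
    }

    public static void main(String[] args) {
        // 8 true positives, 2 false positives, 5 true negatives, 2 false negatives.
        System.out.println(precision(8, 2));      // 8 / 10 = 0.8
        System.out.println(recall(8, 2));         // 8 / 10 = 0.8
        System.out.println(f1(8, 2, 2));          // harmonic mean of 0.8 and 0.8 = 0.8
        System.out.println(accuracy(8, 5, 2, 2)); // 13 / 17
    }
}
```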
-
-
Method Details
-
values
Returns an array containing the constants of this enum class, in the order they are declared.
- Returns:
an array containing the constants of this enum class, in the order they are declared
-
valueOf
Returns the enum constant of this class with the specified name. The string must match exactly an identifier used to declare an enum constant in this class. (Extraneous whitespace characters are not permitted.)
- Parameters:
name - the name of the enum constant to be returned.
- Returns:
the enum constant with the specified name
- Throws:
IllegalArgumentException - if this enum class has no constant with the specified name
NullPointerException - if the argument is null
-
getImpl
Returns the implementing function for this metric.
- Returns:
The implementing function.
-
forTarget
Gets the LabelMetric wrapped around the supplied MetricTarget.
- Parameters:
tgt - The metric target.
- Returns:
The label metric combining the implementation function with the supplied metric target.
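The getImpl()/forTarget pair describes a common design: each enum constant carries its implementing function, and forTarget binds that function to a target to produce a ready-to-use metric. A hypothetical mini version of this pattern in plain Java (the Target, Metric, and Metrics types here are simplified stand-ins, not Tribuo's):

```java
import java.util.Map;
import java.util.function.ToDoubleBiFunction;

// Hypothetical illustration of the enum-carries-its-implementation pattern.
public class ForTargetSketch {
    // Simplified stand-in for a metric target (e.g. one class label).
    record Target(String label) {}

    // A metric: an implementing function bound to a specific target.
    record Metric(Target target, ToDoubleBiFunction<Target, Map<String, Integer>> impl) {
        double compute(Map<String, Integer> counts) { return impl.applyAsDouble(target, counts); }
    }

    enum Metrics {
        TP((tgt, counts) -> counts.getOrDefault("tp:" + tgt.label(), 0)),
        FP((tgt, counts) -> counts.getOrDefault("fp:" + tgt.label(), 0));

        private final ToDoubleBiFunction<Target, Map<String, Integer>> impl;
        Metrics(ToDoubleBiFunction<Target, Map<String, Integer>> impl) { this.impl = impl; }

        // Analogue of getImpl(): expose the raw implementing function.
        ToDoubleBiFunction<Target, Map<String, Integer>> getImpl() { return impl; }

        // Analogue of forTarget(): pair the implementation with a target.
        Metric forTarget(Target tgt) { return new Metric(tgt, impl); }
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = Map.of("tp:cat", 7, "fp:cat", 3);
        Metric tpForCat = Metrics.TP.forTarget(new Target("cat"));
        System.out.println(tpForCat.compute(counts)); // 7.0
    }
}
```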
-
averagedPrecision
public static double averagedPrecision(MetricTarget&lt;Label&gt; tgt, List&lt;Prediction&lt;Label&gt;&gt; predictions)
- Parameters:
tgt - The metric target to use.
predictions - The predictions to use.
- Returns:
The averaged precision for the supplied target with the supplied predictions.
- See Also:
-
averagedPrecision
- Parameters:
label - The Label to average across.
predictions - The predictions to use.
- Returns:
The averaged precision for the supplied label with the supplied predictions.
- See Also:
-
precisionRecallCurve
public static LabelEvaluationUtil.PRCurve precisionRecallCurve(Label label, List&lt;Prediction&lt;Label&gt;&gt; predictions)
- Parameters:
label - The Label to calculate precision and recall for.
predictions - The predictions to use.
- Returns:
The Precision Recall Curve for the supplied label with the supplied predictions.
- See Also:
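A precision-recall curve is typically built by sorting predictions by their positive-class score and sweeping a threshold from the highest score downward, recording precision and recall at each step. A standalone sketch of that idea (plain Java, not Tribuo's LabelEvaluationUtil implementation; all names here are illustrative):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Illustrative sketch of building a precision-recall curve by threshold sweep.
public class PRCurveSketch {
    record Point(double threshold, double precision, double recall) {}

    static List<Point> prCurve(double[] scores, boolean[] isPositive) {
        // Sort example indices by positive-class score, descending.
        Integer[] order = new Integer[scores.length];
        for (int i = 0; i < order.length; i++) order[i] = i;
        Arrays.sort(order, (a, b) -> Double.compare(scores[b], scores[a]));

        int totalPositives = 0;
        for (boolean b : isPositive) if (b) totalPositives++;

        // Lower the threshold one example at a time, updating TP/FP counts.
        List<Point> curve = new ArrayList<>();
        int tp = 0, fp = 0;
        for (int idx : order) {
            if (isPositive[idx]) tp++; else fp++;
            curve.add(new Point(scores[idx],
                                tp / (double) (tp + fp),   // precision so far
                                tp / (double) totalPositives)); // recall so far
        }
        return curve;
    }

    public static void main(String[] args) {
        double[] scores = {0.9, 0.8, 0.6, 0.3};
        boolean[] labels = {true, true, false, true};
        for (Point p : prCurve(scores, labels)) System.out.println(p);
    }
}
```

Precision need not be monotone along the curve, which is why the averaged-precision summary above integrates it against recall rather than reading off a single point.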
-
AUCROC
Area under the ROC curve.- Parameters:
label
- the label corresponding to the "positive" classpredictions
- the predictions for which we'll compute the score- Returns:
- AUC ROC score
- Throws:
UnsupportedOperationException
- if a prediction with no probability score, which are required to compute the ROC curve. (See also:Model.generatesProbabilities()
)
-
AUCROC
Area under the ROC curve.
- Parameters:
tgt - The metric target for the positive class.
predictions - the predictions for which we'll compute the score
- Returns:
AUC ROC score
- Throws:
UnsupportedOperationException - if a prediction has no probability score, as probability scores are required to compute the ROC curve. (See also: Model.generatesProbabilities())
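AUC-ROC can be understood as the probability that a randomly chosen positive example receives a higher positive-class score than a randomly chosen negative example. A standalone sketch of that rank-based (Mann-Whitney U) formulation, with ties counted as one half (plain Java, not Tribuo's implementation; names are illustrative):

```java
// Illustrative sketch: AUC-ROC as the fraction of positive/negative pairs
// where the positive example is scored higher (ties count as 0.5).
public class AucRocSketch {
    static double aucRoc(double[] scores, boolean[] isPositive) {
        long positives = 0, negatives = 0;
        for (boolean b : isPositive) {
            if (b) positives++; else negatives++;
        }
        // Compare every positive example against every negative example.
        double rankSum = 0.0;
        for (int i = 0; i < scores.length; i++) {
            if (!isPositive[i]) continue;
            for (int j = 0; j < scores.length; j++) {
                if (isPositive[j]) continue;
                if (scores[i] > scores[j]) rankSum += 1.0;
                else if (scores[i] == scores[j]) rankSum += 0.5;
            }
        }
        return rankSum / (positives * negatives);
    }

    public static void main(String[] args) {
        double[] scores = {0.9, 0.8, 0.6, 0.3};
        boolean[] labels = {true, true, false, true};
        // 2 of the 3 positive/negative pairs are ranked correctly.
        System.out.println(aucRoc(scores, labels));
    }
}
```

This pairwise view also makes the documented UnsupportedOperationException intuitive: without per-prediction probability scores there is nothing to rank, so the curve (and its area) cannot be computed.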
-