Enum Class LabelMetrics

java.lang.Object
    java.lang.Enum<LabelMetrics>
        org.tribuo.classification.evaluation.LabelMetrics
All Implemented Interfaces:
Serializable, Comparable<LabelMetrics>, Constable

public enum LabelMetrics extends Enum<LabelMetrics>
An enum of the default LabelMetrics supported by the multi-class classification evaluation package.
  • Enum Constant Details

    • TP

      public static final LabelMetrics TP
      The number of true positives.
    • FP

      public static final LabelMetrics FP
      The number of false positives.
    • TN

      public static final LabelMetrics TN
      The number of true negatives.
    • FN

      public static final LabelMetrics FN
      The number of false negatives.
    • PRECISION

      public static final LabelMetrics PRECISION
      The precision, i.e., the number of true positives divided by the number of predicted positives.
    • RECALL

      public static final LabelMetrics RECALL
      The recall, i.e., the number of true positives divided by the number of ground truth positives.
    • F1

      public static final LabelMetrics F1
      The F_1 score, i.e., the harmonic mean of the precision and the recall.
    • ACCURACY

      public static final LabelMetrics ACCURACY
      The accuracy, i.e., the number of correct predictions divided by the total number of predictions (see the sketch after this list for how the count-based metrics combine).
    • BALANCED_ERROR_RATE

      public static final LabelMetrics BALANCED_ERROR_RATE
      The balanced error rate, i.e., the mean of the per class error rates (equivalently, one minus the mean of the per class recalls).
    • AUCROC

      public static final LabelMetrics AUCROC
      The area under the receiver operating characteristic (ROC) curve.
    • AVERAGED_PRECISION

      public static final LabelMetrics AVERAGED_PRECISION
      The averaged precision, i.e., a single-figure summary of the precision-recall curve computed by averaging the precision across the prediction thresholds.
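    The count-based constants above combine into the ratio metrics. A minimal standalone arithmetic sketch of those relationships (plain Java, independent of the Tribuo implementation):

      public final class MetricFormulas {
          // Precision: true positives over predicted positives.
          public static double precision(double tp, double fp) {
              return tp / (tp + fp);
          }
          // Recall: true positives over ground truth positives.
          public static double recall(double tp, double fn) {
              return tp / (tp + fn);
          }
          // F_1: harmonic mean of precision and recall.
          public static double f1(double precision, double recall) {
              return 2.0 * precision * recall / (precision + recall);
          }
          // Accuracy: correct predictions over all predictions.
          public static double accuracy(double tp, double tn, double fp, double fn) {
              return (tp + tn) / (tp + tn + fp + fn);
          }
      }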
  • Method Details

    • values

      public static LabelMetrics[] values()
      Returns an array containing the constants of this enum class, in the order they are declared.
      Returns:
      an array containing the constants of this enum class, in the order they are declared
    • valueOf

      public static LabelMetrics valueOf(String name)
      Returns the enum constant of this class with the specified name. The string must match exactly an identifier used to declare an enum constant in this class. (Extraneous whitespace characters are not permitted.)
      Parameters:
      name - the name of the enum constant to be returned.
      Returns:
      the enum constant with the specified name
      Throws:
      IllegalArgumentException - if this enum class has no constant with the specified name
      NullPointerException - if the argument is null
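      A minimal sketch of these standard enum accessors; the constant name "F1" is used purely as an illustration:

        import org.tribuo.classification.evaluation.LabelMetrics;

        public class ListLabelMetrics {
            public static void main(String[] args) {
                // Print every metric constant in declaration order.
                for (LabelMetrics metric : LabelMetrics.values()) {
                    System.out.println(metric.name());
                }
                // Look a metric up by its exact constant name; an unknown or
                // misspelled name throws the IllegalArgumentException noted above.
                LabelMetrics f1 = LabelMetrics.valueOf("F1");
                System.out.println("Resolved: " + f1);
            }
        }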
    • getImpl

      Returns the implementing function for this metric.
      Returns:
      The implementing function.
    • forTarget

      public LabelMetric forTarget(MetricTarget<Label> tgt)
      Gets the LabelMetric wrapped around the supplied MetricTarget.
      Parameters:
      tgt - The metric target.
      Returns:
      The label metric combining the implementation function with the supplied metric target.
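      A minimal sketch of wrapping a metric around a target. It assumes MetricTarget (org.tribuo.evaluation.metrics.MetricTarget) can be constructed directly from a Label, and the label name "spam" is purely illustrative:

        import org.tribuo.classification.Label;
        import org.tribuo.classification.evaluation.LabelMetric;
        import org.tribuo.classification.evaluation.LabelMetrics;
        import org.tribuo.evaluation.metrics.MetricTarget;

        public class ForTargetExample {
            public static LabelMetric spamRecall() {
                // Target the (assumed) "spam" class, then bind the RECALL
                // implementation to that target.
                MetricTarget<Label> target = new MetricTarget<>(new Label("spam"));
                return LabelMetrics.RECALL.forTarget(target);
            }
        }

      The returned LabelMetric can then be evaluated alongside other metrics through Tribuo's evaluation machinery.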
    • averagedPrecision

      public static double averagedPrecision(MetricTarget<Label> tgt, List<Prediction<Label>> predictions)
      Parameters:
      tgt - The metric target to use.
      predictions - The predictions to use.
      Returns:
      The averaged precision for the supplied target with the supplied predictions.
    • averagedPrecision

      public static double averagedPrecision(Label label, List<Prediction<Label>> predictions)
      Parameters:
      label - The Label to average across.
      predictions - The predictions to use.
      Returns:
      The averaged precision for the supplied label with the supplied predictions.
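      A minimal sketch that computes the averaged precision for one class using the Label overload. The model and testData parameters are hypothetical placeholders for a trained probabilistic classifier and a matching dataset, and the label name "positive" is assumed:

        import java.util.List;
        import org.tribuo.Dataset;
        import org.tribuo.Model;
        import org.tribuo.Prediction;
        import org.tribuo.classification.Label;
        import org.tribuo.classification.evaluation.LabelMetrics;

        public class AveragedPrecisionExample {
            public static double positiveClassAP(Model<Label> model, Dataset<Label> testData) {
                // Score the test set, then average the precision across the
                // prediction thresholds for the "positive" class.
                List<Prediction<Label>> predictions = model.predict(testData);
                return LabelMetrics.averagedPrecision(new Label("positive"), predictions);
            }
        }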
    • precisionRecallCurve

      public static LabelEvaluationUtil.PRCurve precisionRecallCurve(Label label, List<Prediction<Label>> predictions)
      Parameters:
      label - The Label to calculate precision and recall for.
      predictions - The predictions to use.
      Returns:
      The Precision Recall Curve for the supplied label with the supplied predictions.
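      A minimal sketch that builds the curve for one class; model, testData, and the label name "positive" are placeholders as above, and the precision/recall array members of PRCurve are assumptions rather than details taken from this page:

        import java.util.List;
        import org.tribuo.Dataset;
        import org.tribuo.Model;
        import org.tribuo.Prediction;
        import org.tribuo.classification.Label;
        import org.tribuo.classification.evaluation.LabelEvaluationUtil;
        import org.tribuo.classification.evaluation.LabelMetrics;

        public class PrecisionRecallCurveExample {
            public static void printCurve(Model<Label> model, Dataset<Label> testData) {
                List<Prediction<Label>> predictions = model.predict(testData);
                LabelEvaluationUtil.PRCurve curve =
                        LabelMetrics.precisionRecallCurve(new Label("positive"), predictions);
                // The precision and recall fields are assumed here; check the
                // LabelEvaluationUtil.PRCurve documentation for the exact members.
                for (int i = 0; i < curve.precision.length; i++) {
                    System.out.printf("precision=%.3f recall=%.3f%n",
                            curve.precision[i], curve.recall[i]);
                }
            }
        }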
    • AUCROC

      public static double AUCROC(Label label, List<Prediction<Label>> predictions)
      Area under the ROC curve.
      Parameters:
      label - the label corresponding to the "positive" class
      predictions - the predictions for which we'll compute the score
      Returns:
      AUC ROC score
      Throws:
      UnsupportedOperationException - if a prediction is missing the probability scores required to compute the ROC curve. (See also: Model.generatesProbabilities())
    • AUCROC

      public static double AUCROC(MetricTarget<Label> tgt, List<Prediction<Label>> predictions)
      Area under the ROC curve.
      Parameters:
      tgt - The metric target for the positive class.
      predictions - the predictions for which we'll compute the score
      Returns:
      AUC ROC score
      Throws:
      UnsupportedOperationException - if a prediction is missing the probability scores required to compute the ROC curve. (See also: Model.generatesProbabilities())
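      A minimal sketch exercising both AUCROC overloads; model, testData, and the label name "positive" are placeholders, and constructing a MetricTarget directly from a Label is an assumption. The model must generate probability scores, otherwise these calls throw the UnsupportedOperationException documented above:

        import java.util.List;
        import org.tribuo.Dataset;
        import org.tribuo.Model;
        import org.tribuo.Prediction;
        import org.tribuo.classification.Label;
        import org.tribuo.classification.evaluation.LabelMetrics;
        import org.tribuo.evaluation.metrics.MetricTarget;

        public class AucRocExample {
            public static void report(Model<Label> model, Dataset<Label> testData) {
                List<Prediction<Label>> predictions = model.predict(testData);

                // Label overload: treat "positive" as the positive class directly.
                double auc = LabelMetrics.AUCROC(new Label("positive"), predictions);

                // MetricTarget overload: the same computation with the positive
                // class wrapped in a metric target.
                double aucViaTarget =
                        LabelMetrics.AUCROC(new MetricTarget<>(new Label("positive")), predictions);

                System.out.printf("AUCROC=%.3f (via target: %.3f)%n", auc, aucViaTarget);
            }
        }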