Class LabelEvaluationUtil

java.lang.Object
org.tribuo.classification.evaluation.LabelEvaluationUtil

public final class LabelEvaluationUtil extends Object
Static utility functions for calculating performance metrics on Labels.
  • Nested Class Summary

    Nested Classes
    Modifier and Type
    Class
    Description
    static class
    LabelEvaluationUtil.PRCurve
    Stores the Precision-Recall curve as three arrays: the precisions, the recalls, and the thresholds associated with those values.
    static class
    LabelEvaluationUtil.ROC
    Stores the ROC curve as three arrays: the false positive rate, the true positive rate, and the thresholds associated with those rates.
  • Method Summary

    Modifier and Type
    Method
    Description
    static double
    averagedPrecision(boolean[] yPos, double[] yScore)
    Summarises a Precision-Recall curve by taking the weighted mean of the precisions at each threshold, where the weight is the increase in recall from the previous threshold.
    static double
    binaryAUCROC(boolean[] yPos, double[] yScore)
    Calculates the area under the receiver operating characteristic curve, i.e., the AUC of the ROC curve.
    static LabelEvaluationUtil.PRCurve
    generatePRCurve(boolean[] yPos, double[] yScore)
    Calculates the Precision-Recall curve for a single label.
    static LabelEvaluationUtil.ROC
    generateROCCurve(boolean[] yPos, double[] yScore)
    Calculates the binary ROC for a single label.

    Methods inherited from class java.lang.Object

    clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
  • Method Details

    • averagedPrecision

      public static double averagedPrecision(boolean[] yPos, double[] yScore)
      Summarises a Precision-Recall curve by taking the weighted mean of the precisions at each threshold, where the weight is the increase in recall from the previous threshold. Follows scikit-learn's implementation. In general, prefer the AUC of a Precision-Recall Gain curve, as the area under the precision-recall curve is not properly normalized.
      Parameters:
      yPos - Each element is true if the label was from the positive class.
      yScore - Each element is the score of the positive class.
      Returns:
      The averaged precision.
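      For illustration, a minimal sketch of calling averagedPrecision on a small hand-built binary problem (the label flags and scores below are made-up example data, not from the library):

      import org.tribuo.classification.evaluation.LabelEvaluationUtil;

      public class AveragedPrecisionExample {
          public static void main(String[] args) {
              // true where the example's ground-truth label is the positive class
              boolean[] yPos  = {true, false, true, true, false};
              // the model's score for the positive class on each example
              double[] yScore = {0.9, 0.8, 0.7, 0.3, 0.1};
              double ap = LabelEvaluationUtil.averagedPrecision(yPos, yScore);
              System.out.println("Averaged precision = " + ap);
          }
      }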
    • generatePRCurve

      public static LabelEvaluationUtil.PRCurve generatePRCurve(boolean[] yPos, double[] yScore)
      Calculates the Precision-Recall curve for a single label. In general, prefer Precision-Recall Gain curves.
      Parameters:
      yPos - Each element is true if the label was from the positive class.
      yScore - Each element is the score of the positive class.
      Returns:
      The PRCurve for one label.
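      A sketch of reading back the returned curve on made-up data; the precision, recall, and thresholds field names below are assumptions based on the PRCurve description above, not confirmed signatures:

      import org.tribuo.classification.evaluation.LabelEvaluationUtil;
      import org.tribuo.classification.evaluation.LabelEvaluationUtil.PRCurve;

      public class PRCurveExample {
          public static void main(String[] args) {
              boolean[] yPos  = {true, false, true, true, false};   // made-up positive-class flags
              double[] yScore = {0.9, 0.8, 0.7, 0.3, 0.1};          // made-up positive-class scores
              PRCurve curve = LabelEvaluationUtil.generatePRCurve(yPos, yScore);
              // Walk the thresholds and print the precision and recall achieved at each one.
              // Field names are assumed from the class summary; check the PRCurve javadoc.
              for (int i = 0; i < curve.thresholds.length; i++) {
                  System.out.printf("threshold=%.2f precision=%.3f recall=%.3f%n",
                          curve.thresholds[i], curve.precision[i], curve.recall[i]);
              }
          }
      }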
    • binaryAUCROC

      public static double binaryAUCROC(boolean[] yPos, double[] yScore)
      Calculates the area under the receiver operating characteristic curve, i.e., the AUC of the ROC curve.
      Parameters:
      yPos - Each element is true if the label was from the positive class.
      yScore - The score of the positive class.
      Returns:
      The AUC (a value bounded between 0.0 and 1.0).
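      A minimal usage sketch with made-up data, assuming one positive-class score per example:

      import org.tribuo.classification.evaluation.LabelEvaluationUtil;

      public class BinaryAUCExample {
          public static void main(String[] args) {
              boolean[] yPos  = {true, false, true, true, false, false};  // made-up positive-class flags
              double[] yScore = {0.95, 0.85, 0.6, 0.55, 0.4, 0.1};        // made-up positive-class scores
              double auc = LabelEvaluationUtil.binaryAUCROC(yPos, yScore);
              // A perfect ranking gives 1.0, a random ranking roughly 0.5.
              System.out.println("AUC-ROC = " + auc);
          }
      }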
    • generateROCCurve

      public static LabelEvaluationUtil.ROC generateROCCurve(boolean[] yPos, double[] yScore)
      Calculates the binary ROC for a single label.
      Parameters:
      yPos - Each element is true if the label was from the positive class.
      yScore - Each element is the score of the positive class.
      Returns:
      The ROC for one label.
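      A sketch of inspecting the returned curve on made-up data; the fpr, tpr, and thresholds field names are assumptions based on the ROC class description above, not confirmed signatures:

      import org.tribuo.classification.evaluation.LabelEvaluationUtil;
      import org.tribuo.classification.evaluation.LabelEvaluationUtil.ROC;

      public class ROCCurveExample {
          public static void main(String[] args) {
              boolean[] yPos  = {true, false, true, true, false, false};  // made-up positive-class flags
              double[] yScore = {0.95, 0.85, 0.6, 0.55, 0.4, 0.1};        // made-up positive-class scores
              ROC roc = LabelEvaluationUtil.generateROCCurve(yPos, yScore);
              // Print the false positive rate and true positive rate at each threshold.
              // Field names are assumed from the class summary; check the ROC javadoc.
              for (int i = 0; i < roc.thresholds.length; i++) {
                  System.out.printf("threshold=%.2f fpr=%.3f tpr=%.3f%n",
                          roc.thresholds[i], roc.fpr[i], roc.tpr[i]);
              }
          }
      }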