Class AbstractTrainingNode<T extends Output<T>>

java.lang.Object
org.tribuo.common.tree.AbstractTrainingNode<T>
All Implemented Interfaces:
Serializable, Node<T>
Direct Known Subclasses:
ClassifierTrainingNode, JointRegressorTrainingNode, RegressorTrainingNode

public abstract class AbstractTrainingNode<T extends Output<T>> extends Object implements Node<T>
Base class for decision tree nodes used at training time.
  • Field Details

    • DEFAULT_SIZE

      protected static final int DEFAULT_SIZE
      Default buffer size used in the split operation.
    • depth

      protected final int depth
      The depth of this node in the tree.
    • numExamples

      protected final int numExamples
      The number of examples in this node.
    • leafDeterminer

      protected final AbstractTrainingNode.LeafDeterminer leafDeterminer
      The parameters which determine if the node forms a leaf.
    • split

      protected boolean split
      Whether this node has been split.
    • splitID

      protected int splitID
      The ID of the feature this node splits on.
    • splitValue

      protected double splitValue
      The feature value this node splits at.
    • impurityScore

      protected double impurityScore
      The impurity score of this node.
    • greaterThan

      protected Node<T> greaterThan
      The child node for examples whose feature value is greater than the split value.
    • lessThanOrEqual

      protected Node<T> lessThanOrEqual
      The child node for examples whose feature value is less than or equal to the split value.
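      Taken together, splitID, splitValue, greaterThan, and lessThanOrEqual encode a single binary test on one feature. A minimal JDK-only sketch of that routing logic (SimpleSplit and its members are illustrative stand-ins, not Tribuo's API):

```java
// Illustrative sketch of how the split fields partition examples.
// SimpleSplit is a hypothetical stand-in, not a Tribuo class.
class SimpleSplit {
    int splitID;        // index of the feature the split tests
    double splitValue;  // threshold for that feature

    // Examples with feature value <= splitValue go to the
    // lessThanOrEqual child, the rest to the greaterThan child.
    boolean goesGreaterThan(double[] featureValues) {
        return featureValues[splitID] > splitValue;
    }

    public static void main(String[] args) {
        SimpleSplit s = new SimpleSplit();
        s.splitID = 1;
        s.splitValue = 0.5;
        System.out.println(s.goesGreaterThan(new double[]{0.0, 0.7})); // true
        System.out.println(s.goesGreaterThan(new double[]{0.0, 0.3})); // false
    }
}
```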
  • Constructor Details

    • AbstractTrainingNode

      protected AbstractTrainingNode(int depth, int numExamples, AbstractTrainingNode.LeafDeterminer leafDeterminer)
      Builds an abstract training node.
      Parameters:
      depth - The depth of this node.
      numExamples - The number of examples in this node.
      leafDeterminer - The parameters which determine if the node forms a leaf.
  • Method Details

    • buildTree

      public abstract List<AbstractTrainingNode<T>> buildTree(int[] featureIDs, SplittableRandom rng, boolean useRandomSplitPoints)
      Builds the next level of a tree.
      Parameters:
      featureIDs - Indices of the features available in this split.
      rng - Splittable random number generator.
      useRandomSplitPoints - Whether to choose split points for features at random.
      Returns:
      A possibly empty list of TrainingNodes.
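      A trainer typically drives buildTree with a work queue: it polls a node, asks it to split, and enqueues any returned children until the queue is empty. A hedged JDK-only sketch of that pattern (TrainNode, its depth-limit rule, and the grow helper are hypothetical stand-ins, not Tribuo's trainer):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Hypothetical stand-in for AbstractTrainingNode: buildTree returns the
// next level of children, or an empty list once the node is a leaf.
class TrainNode {
    final int depth;
    TrainNode(int depth) { this.depth = depth; }

    List<TrainNode> buildTree(int maxDepth) {
        List<TrainNode> children = new ArrayList<>();
        if (depth < maxDepth) {           // split until the depth limit
            children.add(new TrainNode(depth + 1));
            children.add(new TrainNode(depth + 1));
        }
        return children;                  // possibly empty, as in Tribuo
    }

    static int grow(TrainNode root, int maxDepth) {
        Deque<TrainNode> queue = new ArrayDeque<>();
        queue.add(root);
        int built = 0;
        while (!queue.isEmpty()) {
            TrainNode node = queue.poll();
            built++;
            queue.addAll(node.buildTree(maxDepth));
        }
        return built;   // number of nodes in the finished tree
    }
}
```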
    • convertTree

      public abstract Node<T> convertTree()
      Converts a tree from a training representation to the final inference time representation.
      Returns:
      The converted subtree.
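      The conversion is a bottom-up copy from the mutable training representation into an immutable inference-time one. A sketch of that idea, using hypothetical types rather than Tribuo's SplitNode and LeafNode:

```java
// Sketch of a convertTree-style transformation: a mutable training node
// is recursively copied into an immutable inference-time record. Both
// types here are hypothetical, not Tribuo's classes.
class MutableTrainNode {
    double value;
    MutableTrainNode left;   // null for a leaf
    MutableTrainNode right;  // null for a leaf

    InferenceNode convertTree() {
        if (left == null && right == null) {
            return new InferenceNode(value, null, null);
        }
        // Children are converted bottom-up, so the finished tree
        // shares no mutable training state.
        return new InferenceNode(value, left.convertTree(), right.convertTree());
    }
}

// Immutable inference-time representation.
record InferenceNode(double value, InferenceNode left, InferenceNode right) {}
```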
    • getWeightSum

      public abstract float getWeightSum()
      The sum of the weights associated with this node's examples.
      Returns:
      the sum of the weights associated with this node's examples.
    • getDepth

      public int getDepth()
      The depth of this node in the tree.
      Returns:
      The depth.
    • shouldMakeLeaf

      public boolean shouldMakeLeaf(double impurityScore, float weightSum)
      Determines whether the node to be created should be a LeafNode.
      Parameters:
      impurityScore - impurity score for the new node.
      weightSum - total example weight for the new node.
      Returns:
      Whether the new node should be a LeafNode.
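      A common form of this decision is: make a leaf when the node carries too little example weight, or is already (nearly) pure. A hedged sketch of such a rule (the threshold names below are illustrative; the real ones live in AbstractTrainingNode.LeafDeterminer):

```java
// Hedged sketch of a shouldMakeLeaf-style test. The field names are
// illustrative assumptions, not Tribuo's exact LeafDeterminer API.
class LeafRule {
    final float minChildWeight;        // minimum total example weight
    final double minImpurityDecrease;  // below this, treat node as pure

    LeafRule(float minChildWeight, double minImpurityDecrease) {
        this.minChildWeight = minChildWeight;
        this.minImpurityDecrease = minImpurityDecrease;
    }

    boolean shouldMakeLeaf(double impurityScore, float weightSum) {
        return weightSum < minChildWeight
            || impurityScore < minImpurityDecrease;
    }
}
```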
    • createSplitNode

      public SplitNode<T> createSplitNode()
      Transforms an AbstractTrainingNode into a SplitNode.
      Returns:
      A SplitNode.
    • getNextNode

      public Node<T> getNextNode(SparseVector example)
      Description copied from interface: Node
      Returns the next node in the tree based on the supplied example, or null if it's a leaf.
      Specified by:
      getNextNode in interface Node<T extends Output<T>>
      Parameters:
      example - The example.
      Returns:
      The next node down in the tree.
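      This null-on-leaf contract supports the standard descent loop: follow getNextNode until it returns null, then read the prediction from the current node. A JDK-only sketch of that loop (PlainNode and its predict helper are hypothetical stand-ins for Node<T>):

```java
// Hypothetical minimal node: a split tests example[splitID] against
// splitValue; a leaf (both children null) returns null from
// getNextNode, mirroring the Node<T> contract described above.
class PlainNode {
    int splitID;
    double splitValue;
    PlainNode lessThanOrEqual;
    PlainNode greaterThan;
    double leafValue;

    PlainNode getNextNode(double[] example) {
        if (lessThanOrEqual == null && greaterThan == null) {
            return null; // leaf: no next node
        }
        return example[splitID] > splitValue ? greaterThan : lessThanOrEqual;
    }

    static double predict(PlainNode root, double[] example) {
        PlainNode cur = root;
        PlainNode next = cur.getNextNode(example);
        while (next != null) {       // descend until we hit a leaf
            cur = next;
            next = cur.getNextNode(example);
        }
        return cur.leafValue;
    }
}
```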
    • getNumExamples

      public int getNumExamples()
      The number of training examples in this node.
      Returns:
      The number of training examples in this node.
    • isLeaf

      public boolean isLeaf()
      Description copied from interface: Node
      Is it a leaf node?
      Specified by:
      isLeaf in interface Node<T extends Output<T>>
      Returns:
      True if it's a leaf node.
    • copy

      public Node<T> copy()
      Description copied from interface: Node
      Copies the node and its children.
      Specified by:
      copy in interface Node<T extends Output<T>>
      Returns:
      A deep copy.