Package ai.onnx.proto

Class OnnxMl.TrainingInfoProto

java.lang.Object
com.google.protobuf.AbstractMessageLite
com.google.protobuf.AbstractMessage
com.google.protobuf.GeneratedMessageV3
ai.onnx.proto.OnnxMl.TrainingInfoProto
All Implemented Interfaces:
OnnxMl.TrainingInfoProtoOrBuilder, com.google.protobuf.Message, com.google.protobuf.MessageLite, com.google.protobuf.MessageLiteOrBuilder, com.google.protobuf.MessageOrBuilder, Serializable
Enclosing class:
OnnxMl

public static final class OnnxMl.TrainingInfoProto extends com.google.protobuf.GeneratedMessageV3 implements OnnxMl.TrainingInfoProtoOrBuilder
 Training information
 TrainingInfoProto stores information for training a model.
 In particular, it defines two functionalities: an initialization step
 and a training-algorithm step. Initialization resets the model
 back to its original state as if no training has been performed.
 The training algorithm improves the model based on input data.
 The semantics of the initialization step are that the initializers
 in ModelProto.graph and in TrainingInfoProto.algorithm are first
 initialized as specified by the initializers in the graph, and then
 updated by the "initialization_binding" in every instance of
 ModelProto.training_info.
 The field "algorithm" defines a computation graph that represents one
 step of a training algorithm. After the execution of a
 TrainingInfoProto.algorithm, the initializers specified by "update_binding"
 may be immediately updated. If the targeted training algorithm contains
 consecutive update steps (such as block coordinate descent methods),
 the user needs to create one TrainingInfoProto for each step.
 
Protobuf type onnx.TrainingInfoProto
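 As a rough illustration of how these pieces fit together, the sketch below assembles a
 TrainingInfoProto whose initialization graph re-randomizes a weight initializer named "W".
 It assumes the conventional protobuf-generated builders of the sibling OnnxMl message classes
 (GraphProto, NodeProto, ValueInfoProto, StringStringEntryProto); the graph is deliberately
 incomplete (e.g., the RandomNormal shape/dtype attributes are omitted) and is not a valid
 ONNX model on its own.

   import ai.onnx.proto.OnnxMl;

   public class TrainingInfoSketch {
       public static void main(String[] args) {
           // Initialization graph: a RandomNormal node whose output "W_init" will be
           // bound back onto the initializer "W" through initialization_binding.
           OnnxMl.GraphProto initGraph = OnnxMl.GraphProto.newBuilder()
                   .setName("initialization")
                   .addNode(OnnxMl.NodeProto.newBuilder()
                           .setOpType("RandomNormal")
                           .addOutput("W_init")
                           .build())
                   .addOutput(OnnxMl.ValueInfoProto.newBuilder()
                           .setName("W_init")
                           .build())
                   .build();

           OnnxMl.TrainingInfoProto trainingInfo = OnnxMl.TrainingInfoProto.newBuilder()
                   .setInitialization(initGraph)
                   .addInitializationBinding(OnnxMl.StringStringEntryProto.newBuilder()
                           .setKey("W")          // initializer to reset
                           .setValue("W_init")   // output of the initialization graph
                           .build())
                   .build();

           System.out.println(trainingInfo);
       }
   }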
  • Field Details

  • Method Details

    • newInstance

      protected Object newInstance(com.google.protobuf.GeneratedMessageV3.UnusedPrivateParameter unused)
      Overrides:
      newInstance in class com.google.protobuf.GeneratedMessageV3
    • getUnknownFields

      public final com.google.protobuf.UnknownFieldSet getUnknownFields()
      Specified by:
      getUnknownFields in interface com.google.protobuf.MessageOrBuilder
      Overrides:
      getUnknownFields in class com.google.protobuf.GeneratedMessageV3
    • getDescriptor

      public static final com.google.protobuf.Descriptors.Descriptor getDescriptor()
    • internalGetFieldAccessorTable

      protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable internalGetFieldAccessorTable()
      Specified by:
      internalGetFieldAccessorTable in class com.google.protobuf.GeneratedMessageV3
    • hasInitialization

      public boolean hasInitialization()
       This field describes a graph that computes the initial tensors
       upon starting the training process. The initialization graph has no input
       and can have multiple outputs. Usually, trainable tensors in neural
       networks are randomly initialized. To achieve that, for each tensor,
       the user can put a random number operator such as RandomNormal or
       RandomUniform in TrainingInfoProto.initialization.node and assign its
       random output to the specific tensor using "initialization_binding".
       This graph can also set the initializers in "algorithm" in the same
       TrainingInfoProto; one use case is resetting the number of training
       iterations to zero.
       By default, this field is an empty graph and its evaluation does not
       produce any output. Thus, no initializer would be changed by default.
       
      optional .onnx.GraphProto initialization = 1;
      Specified by:
      hasInitialization in interface OnnxMl.TrainingInfoProtoOrBuilder
      Returns:
      Whether the initialization field is set.
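      Because getInitialization() returns the default GraphProto instance when this field is
      absent, callers typically guard reads with hasInitialization(). A minimal sketch, assuming
      the conventional generated GraphProto accessors getName() and getNodeCount():

        // Hypothetical helper: report the initialization graph only when it is present.
        static void describeInitialization(OnnxMl.TrainingInfoProto info) {
            if (info.hasInitialization()) {
                OnnxMl.GraphProto g = info.getInitialization();
                System.out.println("initialization graph \"" + g.getName()
                        + "\" with " + g.getNodeCount() + " node(s)");
            } else {
                System.out.println("no initialization graph set; defaults apply");
            }
        }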
    • getInitialization

      public OnnxMl.GraphProto getInitialization()
       This field describes a graph that computes the initial tensors
       upon starting the training process. The initialization graph has no input
       and can have multiple outputs. Usually, trainable tensors in neural
       networks are randomly initialized. To achieve that, for each tensor,
       the user can put a random number operator such as RandomNormal or
       RandomUniform in TrainingInfoProto.initialization.node and assign its
       random output to the specific tensor using "initialization_binding".
       This graph can also set the initializers in "algorithm" in the same
       TrainingInfoProto; one use case is resetting the number of training
       iterations to zero.
       By default, this field is an empty graph and its evaluation does not
       produce any output. Thus, no initializer would be changed by default.
       
      optional .onnx.GraphProto initialization = 1;
      Specified by:
      getInitialization in interface OnnxMl.TrainingInfoProtoOrBuilder
      Returns:
      The initialization.
    • getInitializationOrBuilder

      public OnnxMl.GraphProtoOrBuilder getInitializationOrBuilder()
       This field describes a graph that computes the initial tensors
       upon starting the training process. The initialization graph has no input
       and can have multiple outputs. Usually, trainable tensors in neural
       networks are randomly initialized. To achieve that, for each tensor,
       the user can put a random number operator such as RandomNormal or
       RandomUniform in TrainingInfoProto.initialization.node and assign its
       random output to the specific tensor using "initialization_binding".
       This graph can also set the initializers in "algorithm" in the same
       TrainingInfoProto; one use case is resetting the number of training
       iterations to zero.
       By default, this field is an empty graph and its evaluation does not
       produce any output. Thus, no initializer would be changed by default.
       
      optional .onnx.GraphProto initialization = 1;
      Specified by:
      getInitializationOrBuilder in interface OnnxMl.TrainingInfoProtoOrBuilder
    • hasAlgorithm

      public boolean hasAlgorithm()
       This field represents a training algorithm step. Given the required inputs,
       it computes outputs that update initializers in its own or the inference
       graph's initializer lists. In general, this field contains the loss node,
       gradient node, optimizer node, and the increment of the iteration count.
       An execution of the training algorithm step is performed by executing the
       graph obtained by combining the inference graph (namely "ModelProto.graph")
       and the "algorithm" graph. That is, the actual
       input/initializer/output/node/value_info/sparse_initializer list of
       the training graph is the concatenation of
       "ModelProto.graph.input/initializer/output/node/value_info/sparse_initializer"
       and "algorithm.input/initializer/output/node/value_info/sparse_initializer"
       in that order. This combined graph must satisfy the normal ONNX conditions.
       To visualize the graph combination, let the inference graph
       (i.e., "ModelProto.graph") be
          tensor_a, tensor_b -> MatMul -> tensor_c -> Sigmoid -> tensor_d
       and the "algorithm" graph be
          tensor_d -> Add -> tensor_e
       The combination process results in
          tensor_a, tensor_b -> MatMul -> tensor_c -> Sigmoid -> tensor_d -> Add -> tensor_e
       Notice that an input of a node in the "algorithm" graph may reference the
       output of a node in the inference graph (but not the other way round). Also, an
       inference node cannot reference inputs of "algorithm". With these restrictions,
       the inference graph can always be run independently of training information.
       By default, this field is an empty graph and its evaluation does not
       produce any output. Evaluating the default training step never
       updates any initializers.
       
      optional .onnx.GraphProto algorithm = 2;
      Specified by:
      hasAlgorithm in interface OnnxMl.TrainingInfoProtoOrBuilder
      Returns:
      Whether the algorithm field is set.
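      To mirror the "tensor_d -> Add -> tensor_e" visualization above, the sketch below builds
      such an "algorithm" graph. It assumes the conventional generated builders of
      OnnxMl.NodeProto and OnnxMl.ValueInfoProto; the second Add input "one" is a hypothetical
      constant initializer added so that Add has its two required inputs.

        // Algorithm graph consuming tensor_d, which is produced by ModelProto.graph;
        // the combined training graph concatenates the inference graph's lists first,
        // so this reference resolves after combination.
        OnnxMl.GraphProto algorithm = OnnxMl.GraphProto.newBuilder()
                .setName("algorithm")
                .addNode(OnnxMl.NodeProto.newBuilder()
                        .setOpType("Add")
                        .addInput("tensor_d")   // output of the inference graph
                        .addInput("one")        // hypothetical constant initializer
                        .addOutput("tensor_e")
                        .build())
                .addOutput(OnnxMl.ValueInfoProto.newBuilder()
                        .setName("tensor_e")
                        .build())
                .build();

        OnnxMl.TrainingInfoProto step = OnnxMl.TrainingInfoProto.newBuilder()
                .setAlgorithm(algorithm)
                .build();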
    • getAlgorithm

      public OnnxMl.GraphProto getAlgorithm()
       This field represents a training algorithm step. Given the required inputs,
       it computes outputs that update initializers in its own or the inference
       graph's initializer lists. In general, this field contains the loss node,
       gradient node, optimizer node, and the increment of the iteration count.
       An execution of the training algorithm step is performed by executing the
       graph obtained by combining the inference graph (namely "ModelProto.graph")
       and the "algorithm" graph. That is, the actual
       input/initializer/output/node/value_info/sparse_initializer list of
       the training graph is the concatenation of
       "ModelProto.graph.input/initializer/output/node/value_info/sparse_initializer"
       and "algorithm.input/initializer/output/node/value_info/sparse_initializer"
       in that order. This combined graph must satisfy the normal ONNX conditions.
       To visualize the graph combination, let the inference graph
       (i.e., "ModelProto.graph") be
          tensor_a, tensor_b -> MatMul -> tensor_c -> Sigmoid -> tensor_d
       and the "algorithm" graph be
          tensor_d -> Add -> tensor_e
       The combination process results in
          tensor_a, tensor_b -> MatMul -> tensor_c -> Sigmoid -> tensor_d -> Add -> tensor_e
       Notice that an input of a node in the "algorithm" graph may reference the
       output of a node in the inference graph (but not the other way round). Also, an
       inference node cannot reference inputs of "algorithm". With these restrictions,
       the inference graph can always be run independently of training information.
       By default, this field is an empty graph and its evaluation does not
       produce any output. Evaluating the default training step never
       updates any initializers.
       
      optional .onnx.GraphProto algorithm = 2;
      Specified by:
      getAlgorithm in interface OnnxMl.TrainingInfoProtoOrBuilder
      Returns:
      The algorithm.
    • getAlgorithmOrBuilder

      public OnnxMl.GraphProtoOrBuilder getAlgorithmOrBuilder()
       This field represents a training algorithm step. Given the required inputs,
       it computes outputs that update initializers in its own or the inference
       graph's initializer lists. In general, this field contains the loss node,
       gradient node, optimizer node, and the increment of the iteration count.
       An execution of the training algorithm step is performed by executing the
       graph obtained by combining the inference graph (namely "ModelProto.graph")
       and the "algorithm" graph. That is, the actual
       input/initializer/output/node/value_info/sparse_initializer list of
       the training graph is the concatenation of
       "ModelProto.graph.input/initializer/output/node/value_info/sparse_initializer"
       and "algorithm.input/initializer/output/node/value_info/sparse_initializer"
       in that order. This combined graph must satisfy the normal ONNX conditions.
       To visualize the graph combination, let the inference graph
       (i.e., "ModelProto.graph") be
          tensor_a, tensor_b -> MatMul -> tensor_c -> Sigmoid -> tensor_d
       and the "algorithm" graph be
          tensor_d -> Add -> tensor_e
       The combination process results in
          tensor_a, tensor_b -> MatMul -> tensor_c -> Sigmoid -> tensor_d -> Add -> tensor_e
       Notice that an input of a node in the "algorithm" graph may reference the
       output of a node in the inference graph (but not the other way round). Also, an
       inference node cannot reference inputs of "algorithm". With these restrictions,
       the inference graph can always be run independently of training information.
       By default, this field is an empty graph and its evaluation does not
       produce any output. Evaluating the default training step never
       updates any initializers.
       
      optional .onnx.GraphProto algorithm = 2;
      Specified by:
      getAlgorithmOrBuilder in interface OnnxMl.TrainingInfoProtoOrBuilder
    • getInitializationBindingList

      public List<OnnxMl.StringStringEntryProto> getInitializationBindingList()
       This field specifies the bindings from the outputs of "initialization" to
       some initializers in "ModelProto.graph.initializer" and
       the "algorithm.initializer" in the same TrainingInfoProto.
       See "update_binding" below for details.
       By default, this field is empty and no initializer would be changed
       by the execution of "initialization".
       
      repeated .onnx.StringStringEntryProto initialization_binding = 3;
      Specified by:
      getInitializationBindingList in interface OnnxMl.TrainingInfoProtoOrBuilder
    • getInitializationBindingOrBuilderList

      public List<? extends OnnxMl.StringStringEntryProtoOrBuilder> getInitializationBindingOrBuilderList()
       This field specifies the bindings from the outputs of "initialization" to
       some initializers in "ModelProto.graph.initializer" and
       the "algorithm.initializer" in the same TrainingInfoProto.
       See "update_binding" below for details.
       By default, this field is empty and no initializer would be changed
       by the execution of "initialization".
       
      repeated .onnx.StringStringEntryProto initialization_binding = 3;
      Specified by:
      getInitializationBindingOrBuilderList in interface OnnxMl.TrainingInfoProtoOrBuilder
    • getInitializationBindingCount

      public int getInitializationBindingCount()
       This field specifies the bindings from the outputs of "initialization" to
       some initializers in "ModelProto.graph.initializer" and
       the "algorithm.initializer" in the same TrainingInfoProto.
       See "update_binding" below for details.
       By default, this field is empty and no initializer would be changed
       by the execution of "initialization".
       
      repeated .onnx.StringStringEntryProto initialization_binding = 3;
      Specified by:
      getInitializationBindingCount in interface OnnxMl.TrainingInfoProtoOrBuilder
    • getInitializationBinding

      public OnnxMl.StringStringEntryProto getInitializationBinding(int index)
       This field specifies the bindings from the outputs of "initialization" to
       some initializers in "ModelProto.graph.initializer" and
       the "algorithm.initializer" in the same TrainingInfoProto.
       See "update_binding" below for details.
       By default, this field is empty and no initializer would be changed
       by the execution of "initialization".
       
      repeated .onnx.StringStringEntryProto initialization_binding = 3;
      Specified by:
      getInitializationBinding in interface OnnxMl.TrainingInfoProtoOrBuilder
    • getInitializationBindingOrBuilder

      public OnnxMl.StringStringEntryProtoOrBuilder getInitializationBindingOrBuilder(int index)
       This field specifies the bindings from the outputs of "initialization" to
       some initializers in "ModelProto.graph.initializer" and
       the "algorithm.initializer" in the same TrainingInfoProto.
       See "update_binding" below for details.
       By default, this field is empty and no initializer would be changed
       by the execution of "initialization".
       
      repeated .onnx.StringStringEntryProto initialization_binding = 3;
      Specified by:
      getInitializationBindingOrBuilder in interface OnnxMl.TrainingInfoProtoOrBuilder
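      The count and indexed accessors can be combined in the usual protobuf fashion; a minimal
      sketch, assuming the conventional generated getKey()/getValue() accessors of
      OnnxMl.StringStringEntryProto:

        // Print each initialization_binding as "initializer <- initialization output".
        static void printInitializationBindings(OnnxMl.TrainingInfoProto info) {
            for (int i = 0; i < info.getInitializationBindingCount(); i++) {
                OnnxMl.StringStringEntryProto b = info.getInitializationBinding(i);
                System.out.println(b.getKey() + " <- " + b.getValue());
            }
        }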
    • getUpdateBindingList

      public List<OnnxMl.StringStringEntryProto> getUpdateBindingList()
       Gradient-based training is usually an iterative procedure. In one gradient
       descent iteration, we apply
       x = x - r * g
       where "x" is the optimized tensor, "r" stands for learning rate, and "g" is
       gradient of "x" with respect to a chosen loss. To avoid adding assignments
       into the training graph, we split the update equation into
       y = x - r * g
       x = y
       The user needs to save "y = x - r * g" into TrainingInfoProto.algorithm. To
       tell that "y" should be assigned to "x", the field "update_binding" may
       contain a key-value pair of strings, "x" (key of StringStringEntryProto)
       and "y" (value of StringStringEntryProto).
       For a neural network with multiple trainable (mutable) tensors, there can
       be multiple key-value pairs in "update_binding".
       The initializers that appear as keys in "update_binding" are considered
       mutable variables. This implies the behaviors
       described below.
        1. Keys are unique across all "update_binding"s, so no two
           variables may have the same name. This ensures that each
           variable is assigned at most once.
        2. The keys must appear among the names of "ModelProto.graph.initializer" or
           "TrainingInfoProto.algorithm.initializer".
        3. The values must be output names of "algorithm" or "ModelProto.graph.output".
        4. Mutable variables are initialized to the value specified by the
           corresponding initializer, and then potentially updated by
           "initialization_binding"s and "update_binding"s in "TrainingInfoProto"s.
       This field usually contains the names of trainable tensors
       (in ModelProto.graph), optimizer states such as momentums in advanced
       stochastic gradient methods (in TrainingInfoProto.algorithm),
       and the number of training iterations (in TrainingInfoProto.algorithm).
       By default, this field is empty and no initializer would be changed
       by the execution of "algorithm".
       
      repeated .onnx.StringStringEntryProto update_binding = 4;
      Specified by:
      getUpdateBindingList in interface OnnxMl.TrainingInfoProtoOrBuilder
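      Mirroring the "y = x - r * g" / "x = y" split above, a sketch of a binding that writes the
      algorithm output "y" back into the mutable initializer "x" (the names "x" and "y" are the
      placeholders from the comment):

        OnnxMl.TrainingInfoProto step = OnnxMl.TrainingInfoProto.newBuilder()
                .addUpdateBinding(OnnxMl.StringStringEntryProto.newBuilder()
                        .setKey("x")     // mutable initializer to overwrite
                        .setValue("y")   // output of the "algorithm" graph holding x - r * g
                        .build())
                .build();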
    • getUpdateBindingOrBuilderList

      public List<? extends OnnxMl.StringStringEntryProtoOrBuilder> getUpdateBindingOrBuilderList()
       Gradient-based training is usually an iterative procedure. In one gradient
       descent iteration, we apply
       x = x - r * g
       where "x" is the optimized tensor, "r" stands for learning rate, and "g" is
       gradient of "x" with respect to a chosen loss. To avoid adding assignments
       into the training graph, we split the update equation into
       y = x - r * g
       x = y
       The user needs to save "y = x - r * g" into TrainingInfoProto.algorithm. To
       tell that "y" should be assigned to "x", the field "update_binding" may
       contain a key-value pair of strings, "x" (key of StringStringEntryProto)
       and "y" (value of StringStringEntryProto).
       For a neural network with multiple trainable (mutable) tensors, there can
       be multiple key-value pairs in "update_binding".
       The initializers that appear as keys in "update_binding" are considered
       mutable variables. This implies the behaviors
       described below.
        1. Keys are unique across all "update_binding"s, so no two
           variables may have the same name. This ensures that each
           variable is assigned at most once.
        2. The keys must appear among the names of "ModelProto.graph.initializer" or
           "TrainingInfoProto.algorithm.initializer".
        3. The values must be output names of "algorithm" or "ModelProto.graph.output".
        4. Mutable variables are initialized to the value specified by the
           corresponding initializer, and then potentially updated by
           "initialization_binding"s and "update_binding"s in "TrainingInfoProto"s.
       This field usually contains the names of trainable tensors
       (in ModelProto.graph), optimizer states such as momentums in advanced
       stochastic gradient methods (in TrainingInfoProto.algorithm),
       and the number of training iterations (in TrainingInfoProto.algorithm).
       By default, this field is empty and no initializer would be changed
       by the execution of "algorithm".
       
      repeated .onnx.StringStringEntryProto update_binding = 4;
      Specified by:
      getUpdateBindingOrBuilderList in interface OnnxMl.TrainingInfoProtoOrBuilder
    • getUpdateBindingCount

      public int getUpdateBindingCount()
       Gradient-based training is usually an iterative procedure. In one gradient
       descent iteration, we apply
       x = x - r * g
       where "x" is the optimized tensor, "r" stands for learning rate, and "g" is
       gradient of "x" with respect to a chosen loss. To avoid adding assignments
       into the training graph, we split the update equation into
       y = x - r * g
       x = y
       The user needs to save "y = x - r * g" into TrainingInfoProto.algorithm. To
       tell that "y" should be assigned to "x", the field "update_binding" may
       contain a key-value pair of strings, "x" (key of StringStringEntryProto)
       and "y" (value of StringStringEntryProto).
       For a neural network with multiple trainable (mutable) tensors, there can
       be multiple key-value pairs in "update_binding".
       The initializers that appear as keys in "update_binding" are considered
       mutable variables. This implies the behaviors
       described below.
        1. Keys are unique across all "update_binding"s, so no two
           variables may have the same name. This ensures that each
           variable is assigned at most once.
        2. The keys must appear among the names of "ModelProto.graph.initializer" or
           "TrainingInfoProto.algorithm.initializer".
        3. The values must be output names of "algorithm" or "ModelProto.graph.output".
        4. Mutable variables are initialized to the value specified by the
           corresponding initializer, and then potentially updated by
           "initialization_binding"s and "update_binding"s in "TrainingInfoProto"s.
       This field usually contains the names of trainable tensors
       (in ModelProto.graph), optimizer states such as momentums in advanced
       stochastic gradient methods (in TrainingInfoProto.algorithm),
       and the number of training iterations (in TrainingInfoProto.algorithm).
       By default, this field is empty and no initializer would be changed
       by the execution of "algorithm".
       
      repeated .onnx.StringStringEntryProto update_binding = 4;
      Specified by:
      getUpdateBindingCount in interface OnnxMl.TrainingInfoProtoOrBuilder
    • getUpdateBinding

      public OnnxMl.StringStringEntryProto getUpdateBinding(int index)
       Gradient-based training is usually an iterative procedure. In one gradient
       descent iteration, we apply
       x = x - r * g
       where "x" is the optimized tensor, "r" stands for learning rate, and "g" is
       gradient of "x" with respect to a chosen loss. To avoid adding assignments
       into the training graph, we split the update equation into
       y = x - r * g
       x = y
       The user needs to save "y = x - r * g" into TrainingInfoProto.algorithm. To
       tell that "y" should be assigned to "x", the field "update_binding" may
       contain a key-value pair of strings, "x" (key of StringStringEntryProto)
       and "y" (value of StringStringEntryProto).
       For a neural network with multiple trainable (mutable) tensors, there can
       be multiple key-value pairs in "update_binding".
       The initializers that appear as keys in "update_binding" are considered
       mutable variables. This implies the behaviors
       described below.
        1. Keys are unique across all "update_binding"s, so no two
           variables may have the same name. This ensures that each
           variable is assigned at most once.
        2. The keys must appear among the names of "ModelProto.graph.initializer" or
           "TrainingInfoProto.algorithm.initializer".
        3. The values must be output names of "algorithm" or "ModelProto.graph.output".
        4. Mutable variables are initialized to the value specified by the
           corresponding initializer, and then potentially updated by
           "initialization_binding"s and "update_binding"s in "TrainingInfoProto"s.
       This field usually contains the names of trainable tensors
       (in ModelProto.graph), optimizer states such as momentums in advanced
       stochastic gradient methods (in TrainingInfoProto.algorithm),
       and the number of training iterations (in TrainingInfoProto.algorithm).
       By default, this field is empty and no initializer would be changed
       by the execution of "algorithm".
       
      repeated .onnx.StringStringEntryProto update_binding = 4;
      Specified by:
      getUpdateBinding in interface OnnxMl.TrainingInfoProtoOrBuilder
    • getUpdateBindingOrBuilder

      public OnnxMl.StringStringEntryProtoOrBuilder getUpdateBindingOrBuilder(int index)
       Gradient-based training is usually an iterative procedure. In one gradient
       descent iteration, we apply
       x = x - r * g
       where "x" is the optimized tensor, "r" stands for learning rate, and "g" is
       gradient of "x" with respect to a chosen loss. To avoid adding assignments
       into the training graph, we split the update equation into
       y = x - r * g
       x = y
       The user needs to save "y = x - r * g" into TrainingInfoProto.algorithm. To
       tell that "y" should be assigned to "x", the field "update_binding" may
       contain a key-value pair of strings, "x" (key of StringStringEntryProto)
       and "y" (value of StringStringEntryProto).
       For a neural network with multiple trainable (mutable) tensors, there can
       be multiple key-value pairs in "update_binding".
       The initializers that appear as keys in "update_binding" are considered
       mutable variables. This implies the behaviors
       described below.
        1. Keys are unique across all "update_binding"s, so no two
           variables may have the same name. This ensures that each
           variable is assigned at most once.
        2. The keys must appear among the names of "ModelProto.graph.initializer" or
           "TrainingInfoProto.algorithm.initializer".
        3. The values must be output names of "algorithm" or "ModelProto.graph.output".
        4. Mutable variables are initialized to the value specified by the
           corresponding initializer, and then potentially updated by
           "initialization_binding"s and "update_binding"s in "TrainingInfoProto"s.
       This field usually contains the names of trainable tensors
       (in ModelProto.graph), optimizer states such as momentums in advanced
       stochastic gradient methods (in TrainingInfoProto.algorithm),
       and the number of training iterations (in TrainingInfoProto.algorithm).
       By default, this field is empty and no initializer would be changed
       by the execution of "algorithm".
       
      repeated .onnx.StringStringEntryProto update_binding = 4;
      Specified by:
      getUpdateBindingOrBuilder in interface OnnxMl.TrainingInfoProtoOrBuilder
    • isInitialized

      public final boolean isInitialized()
      Specified by:
      isInitialized in interface com.google.protobuf.MessageLiteOrBuilder
      Overrides:
      isInitialized in class com.google.protobuf.GeneratedMessageV3
    • writeTo

      public void writeTo(com.google.protobuf.CodedOutputStream output) throws IOException
      Specified by:
      writeTo in interface com.google.protobuf.MessageLite
      Overrides:
      writeTo in class com.google.protobuf.GeneratedMessageV3
      Throws:
      IOException
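      Besides the CodedOutputStream overload above, the MessageLite convenience overload
      writeTo(OutputStream) can be used directly; a minimal sketch, where "step" is a
      hypothetical TrainingInfoProto instance and the file name is arbitrary:

        // Hypothetical: persist "step" to a file; writeTo(OutputStream) comes from MessageLite.
        try (java.io.FileOutputStream out = new java.io.FileOutputStream("training_info.pb")) {
            step.writeTo(out);
        } catch (java.io.IOException e) {
            e.printStackTrace();
        }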
    • getSerializedSize

      public int getSerializedSize()
      Specified by:
      getSerializedSize in interface com.google.protobuf.MessageLite
      Overrides:
      getSerializedSize in class com.google.protobuf.GeneratedMessageV3
    • equals

      public boolean equals(Object obj)
      Specified by:
      equals in interface com.google.protobuf.Message
      Overrides:
      equals in class com.google.protobuf.AbstractMessage
    • hashCode

      public int hashCode()
      Specified by:
      hashCode in interface com.google.protobuf.Message
      Overrides:
      hashCode in class com.google.protobuf.AbstractMessage
    • parseFrom

      public static OnnxMl.TrainingInfoProto parseFrom(ByteBuffer data) throws com.google.protobuf.InvalidProtocolBufferException
      Throws:
      com.google.protobuf.InvalidProtocolBufferException
    • parseFrom

      public static OnnxMl.TrainingInfoProto parseFrom(ByteBuffer data, com.google.protobuf.ExtensionRegistryLite extensionRegistry) throws com.google.protobuf.InvalidProtocolBufferException
      Throws:
      com.google.protobuf.InvalidProtocolBufferException
    • parseFrom

      public static OnnxMl.TrainingInfoProto parseFrom(com.google.protobuf.ByteString data) throws com.google.protobuf.InvalidProtocolBufferException
      Throws:
      com.google.protobuf.InvalidProtocolBufferException
    • parseFrom

      public static OnnxMl.TrainingInfoProto parseFrom(com.google.protobuf.ByteString data, com.google.protobuf.ExtensionRegistryLite extensionRegistry) throws com.google.protobuf.InvalidProtocolBufferException
      Throws:
      com.google.protobuf.InvalidProtocolBufferException
    • parseFrom

      public static OnnxMl.TrainingInfoProto parseFrom(byte[] data) throws com.google.protobuf.InvalidProtocolBufferException
      Throws:
      com.google.protobuf.InvalidProtocolBufferException
    • parseFrom

      public static OnnxMl.TrainingInfoProto parseFrom(byte[] data, com.google.protobuf.ExtensionRegistryLite extensionRegistry) throws com.google.protobuf.InvalidProtocolBufferException
      Throws:
      com.google.protobuf.InvalidProtocolBufferException
    • parseFrom

      public static OnnxMl.TrainingInfoProto parseFrom(InputStream input) throws IOException
      Throws:
      IOException
    • parseFrom

      public static OnnxMl.TrainingInfoProto parseFrom(InputStream input, com.google.protobuf.ExtensionRegistryLite extensionRegistry) throws IOException
      Throws:
      IOException
    • parseDelimitedFrom

      public static OnnxMl.TrainingInfoProto parseDelimitedFrom(InputStream input) throws IOException
      Throws:
      IOException
    • parseDelimitedFrom

      public static OnnxMl.TrainingInfoProto parseDelimitedFrom(InputStream input, com.google.protobuf.ExtensionRegistryLite extensionRegistry) throws IOException
      Throws:
      IOException
    • parseFrom

      public static OnnxMl.TrainingInfoProto parseFrom(com.google.protobuf.CodedInputStream input) throws IOException
      Throws:
      IOException
    • parseFrom

      public static OnnxMl.TrainingInfoProto parseFrom(com.google.protobuf.CodedInputStream input, com.google.protobuf.ExtensionRegistryLite extensionRegistry) throws IOException
      Throws:
      IOException
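      A round-trip sketch combining the byte[] overload with toByteArray(), which is inherited
      from MessageLite; malformed input raises InvalidProtocolBufferException:

        static OnnxMl.TrainingInfoProto roundTrip(OnnxMl.TrainingInfoProto step)
                throws com.google.protobuf.InvalidProtocolBufferException {
            byte[] bytes = step.toByteArray();                 // serialize
            return OnnxMl.TrainingInfoProto.parseFrom(bytes);  // parse back
        }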
    • newBuilderForType

      public OnnxMl.TrainingInfoProto.Builder newBuilderForType()
      Specified by:
      newBuilderForType in interface com.google.protobuf.Message
      Specified by:
      newBuilderForType in interface com.google.protobuf.MessageLite
    • newBuilder

      public static OnnxMl.TrainingInfoProto.Builder newBuilder()
    • newBuilder

      public static OnnxMl.TrainingInfoProto.Builder newBuilder(OnnxMl.TrainingInfoProto prototype)
    • toBuilder

      public OnnxMl.TrainingInfoProto.Builder toBuilder()
      Specified by:
      toBuilder in interface com.google.protobuf.Message
      Specified by:
      toBuilder in interface com.google.protobuf.MessageLite
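      Since messages are immutable, modifications go through a builder obtained from toBuilder();
      a minimal sketch that replaces the update bindings of a hypothetical existing instance
      "existing" (clearUpdateBinding() is the conventional generated method for the repeated
      field):

        OnnxMl.TrainingInfoProto updated = existing.toBuilder()
                .clearUpdateBinding()
                .addUpdateBinding(OnnxMl.StringStringEntryProto.newBuilder()
                        .setKey("x")
                        .setValue("y")
                        .build())
                .build();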
    • newBuilderForType

      protected OnnxMl.TrainingInfoProto.Builder newBuilderForType(com.google.protobuf.GeneratedMessageV3.BuilderParent parent)
      Specified by:
      newBuilderForType in class com.google.protobuf.GeneratedMessageV3
    • getDefaultInstance

      public static OnnxMl.TrainingInfoProto getDefaultInstance()
    • parser

      public static com.google.protobuf.Parser<OnnxMl.TrainingInfoProto> parser()
    • getParserForType

      public com.google.protobuf.Parser<OnnxMl.TrainingInfoProto> getParserForType()
      Specified by:
      getParserForType in interface com.google.protobuf.Message
      Specified by:
      getParserForType in interface com.google.protobuf.MessageLite
      Overrides:
      getParserForType in class com.google.protobuf.GeneratedMessageV3
    • getDefaultInstanceForType

      public OnnxMl.TrainingInfoProto getDefaultInstanceForType()
      Specified by:
      getDefaultInstanceForType in interface com.google.protobuf.MessageLiteOrBuilder
      Specified by:
      getDefaultInstanceForType in interface com.google.protobuf.MessageOrBuilder