
ORT fails with axes_tensor != nullptr was false #18338

Closed
justinchuby opened this issue Nov 8, 2023 · 11 comments
Labels
converter:dynamo (issues related to supporting the PyTorch Dynamo exporter), core runtime (issues related to core runtime), stale (issues that have not been addressed in a while; categorized by a bot)

Comments

justinchuby (Contributor) commented Nov 8, 2023

Summary

ORT fails with the error axes_tensor != nullptr was false for the following model, even though the axes input to ReduceSum is optional and is legitimately omitted here.

Reproduction report

Summary

ONNX Runtime raises the following error when executing the test ops_test.TestOutputConsistencyFullGraphCPU.test_output_match_opinfo__linalg_vector_norm_cpu_float16 in ONNX Script TorchLib:

[ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running If node. Name:'_inline__aten_linalg_vector_norm_no_dim_onnxn10' Status Message: Non-zero status code returned while running If node. Name:'_inline__aten_linalg_vector_norm_no_dim_onnxn1' Status Message: Non-zero status code returned while running If node. Name:'_inline__aten_linalg_vector_norm_no_dim_onnxn3' Status Message: Non-zero status code returned while running If node. Name:'_inline__aten_linalg_vector_norm_no_dim_onnxn3' Status Message: Non-zero status code returned while running If node. Name:'_inline__aten_linalg_vector_norm_no_dim_onnxn3' Status Message: Non-zero status code returned while running ReduceSum node. Name:'' Status Message: /onnxruntime_src/onnxruntime/core/providers/cpu/reduction/reduction_ops.cc:692 void onnxruntime::ValidateCommonFastReduce(const onnxruntime::Tensor*) axes_tensor != nullptr was false. Axes input is null

To recreate this report, use

CREATE_REPRODUCTION_REPORT=1 python -m pytest onnxscript/tests/function_libs/torch_lib/ops_test.py -k test_output_match_opinfo__linalg_vector_norm_cpu_float16

ONNX Script function

@torch_op("aten::linalg_vector_norm", private=True)
def _aten_linalg_vector_norm_no_dim_onnx(self: TFloat, ord: float, keepdim: bool) -> TFloat:
    self_rank = op.Size(op.Shape(self))
    if self_rank == 0:
        self = op.Unsqueeze(self, axes=[0])

    self = op.Abs(self)
    ord = op.Cast(ord, to=FLOAT.dtype)  # Must be FLOAT because op.IsInf() requires a FLOAT input
    if op.IsInf(ord, detect_negative=0, detect_positive=1):
        result = op.ReduceMax(self, keepdims=keepdim)
    elif op.IsInf(ord, detect_negative=1, detect_positive=0):
        result = op.ReduceMin(self, keepdims=keepdim)
    elif ord == 0.0:  # sum(x!=0) means count non-zero elements
        self_bool = op.Cast(self, to=BOOL.dtype)
        self_0_1 = op.CastLike(self_bool, self)
        result = op.ReduceSum(self_0_1, keepdims=False)
    elif ord == 1.0:
        result = op.ReduceL1(self, keepdims=keepdim)
    elif ord == 2.0:
        result = op.ReduceL2(self, keepdims=keepdim)
    else:
        ord_float = op.CastLike(ord, self)
        self_pow = op.Pow(self, ord_float)
        result = op.Pow(op.ReduceSum(self_pow, keepdims=keepdim), op.Div(1.0, ord_float))

    if self_rank == 0:
        result = op.Squeeze(result)

    return result
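For reference, the value this function should produce on the scalar repro input used later (0.8965, ord=2, keepdim=False) can be sanity-checked with NumPy; this is a minimal sketch, not part of the original test:

import numpy as np

# Reference value: the L2 norm of a scalar is just its absolute value.
x = np.array(0.8965, dtype=np.float16)
expected = np.linalg.norm(np.atleast_1d(x).astype(np.float32), ord=2)
print(np.float16(expected))  # ~0.8965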

To reproduce

import google.protobuf.text_format
import numpy as np
from numpy import array, float16, float32, float64, int32, int64
import onnx
import onnxruntime as ort

# Run n times
N = 1

onnx_model_text = """
ir_version: 8
producer_name: "pytorch"
producer_version: "2.2.0"
graph {
  node {
    output: "_val_1"
    name: "Constant_0"
    op_type: "Constant"
    attribute {
      name: "value_ints"
      ints: -1
      type: INTS
    }
    doc_string: ""
  }
  node {
    input: "input_0"
    input: "_val_1"
    output: "_val_2"
    name: "Reshape_1"
    op_type: "Reshape"
    attribute {
      name: "allowzero"
      i: 0
      type: INT
    }
    doc_string: ""
  }
  node {
    input: "_val_2"
    output: "_val_3"
    name: "_aten_linalg_vector_norm_no_dim_onnx_2"
    op_type: "_aten_linalg_vector_norm_no_dim_onnx"
    attribute {
      name: "keepdim"
      i: 0
      type: INT
    }
    attribute {
      name: "ord"
      f: 2.0
      type: FLOAT
    }
    doc_string: ""
    domain: "pkg.onnxscript.torch_lib"
  }
  name: "main_graph"
  input {
    name: "input_0"
    type {
      tensor_type {
        elem_type: 10
        shape {
        }
      }
    }
  }
  output {
    name: "_val_3"
    type {
      tensor_type {
        elem_type: 10
        shape {
        }
      }
    }
  }
  value_info {
    name: "_val_1"
    type {
      tensor_type {
        elem_type: 7
        shape {
          dim {
            dim_value: 1
          }
        }
      }
    }
  }
  value_info {
    name: "_val_2"
    type {
      tensor_type {
        elem_type: 10
        shape {
          dim {
            dim_value: 1
          }
        }
      }
    }
  }
}
opset_import {
  domain: "pkg.onnxscript.torch_lib"
  version: 1
}
opset_import {
  domain: ""
  version: 18
}
opset_import {
  domain: "pkg.onnxscript.torch_lib.common"
  version: 1
}
functions {
  name: "_aten_linalg_vector_norm_no_dim_onnx"
  input: "self"
  output: "result_29"
  attribute: "ord"
  attribute: "keepdim"
  node {
    input: "self"
    output: "tmp"
    name: "n0"
    op_type: "Shape"
    domain: ""
  }
  node {
    input: "tmp"
    output: "self_rank"
    name: "n1"
    op_type: "Size"
    domain: ""
  }
  node {
    output: "int64_0"
    name: "n2"
    op_type: "Constant"
    attribute {
      name: "value"
      t {
        data_type: 7
        int64_data: 0
        name: "int64_0"
      }
      type: TENSOR
    }
    domain: ""
  }
  node {
    input: "int64_0"
    input: "self_rank"
    output: "int64_0_cast"
    name: "n3"
    op_type: "CastLike"
    domain: ""
  }
  node {
    input: "self_rank"
    input: "int64_0_cast"
    output: "cond"
    name: "n4"
    op_type: "Equal"
    domain: ""
  }
  node {
    input: "cond"
    output: "self_2"
    name: "n5"
    op_type: "If"
    attribute {
      name: "then_branch"
      g {
        node {
          output: "int64_0_1d"
          name: "n0"
          op_type: "Constant"
          attribute {
            name: "value"
            t {
              dims: 1
              data_type: 7
              int64_data: 0
              name: "int64_0_1d"
            }
            type: TENSOR
          }
          domain: ""
        }
        node {
          input: "self"
          input: "int64_0_1d"
          output: "self_0"
          name: "n1"
          op_type: "Unsqueeze"
          domain: ""
        }
        name: "thenGraph_4"
        output {
          name: "self_0"
          type {
          }
        }
      }
      type: GRAPH
    }
    attribute {
      name: "else_branch"
      g {
        node {
          input: "self"
          output: "self_1"
          name: "n0"
          op_type: "Identity"
          domain: ""
        }
        name: "elseGraph_4"
        output {
          name: "self_1"
          type {
          }
        }
      }
      type: GRAPH
    }
    domain: ""
  }
  node {
    input: "self_2"
    output: "self_3"
    name: "n6"
    op_type: "Abs"
    domain: ""
  }
  node {
    output: "ord"
    name: "n7"
    op_type: "Constant"
    attribute {
      name: "value_float"
      type: FLOAT
      ref_attr_name: "ord"
    }
    domain: ""
  }
  node {
    input: "ord"
    output: "ord_4"
    name: "n8"
    op_type: "Cast"
    attribute {
      name: "to"
      i: 1
      type: INT
    }
    domain: ""
  }
  node {
    input: "ord_4"
    output: "cond_5"
    name: "n9"
    op_type: "IsInf"
    attribute {
      name: "detect_negative"
      i: 0
      type: INT
    }
    attribute {
      name: "detect_positive"
      i: 1
      type: INT
    }
    domain: ""
  }
  node {
    input: "cond_5"
    output: "result_24"
    name: "n10"
    op_type: "If"
    attribute {
      name: "then_branch"
      g {
        node {
          input: "self_3"
          output: "result"
          name: "n0"
          op_type: "ReduceMax"
          attribute {
            name: "keepdims"
            type: INT
            ref_attr_name: "keepdim"
          }
          domain: ""
        }
        name: "thenGraph_9"
        output {
          name: "result"
          type {
          }
        }
      }
      type: GRAPH
    }
    attribute {
      name: "else_branch"
      g {
        node {
          input: "ord_4"
          output: "cond_6"
          name: "n0"
          op_type: "IsInf"
          attribute {
            name: "detect_negative"
            i: 1
            type: INT
          }
          attribute {
            name: "detect_positive"
            i: 0
            type: INT
          }
          domain: ""
        }
        node {
          input: "cond_6"
          output: "result_23"
          name: "n1"
          op_type: "If"
          attribute {
            name: "then_branch"
            g {
              node {
                input: "self_3"
                output: "result_7"
                name: "n0"
                op_type: "ReduceMin"
                attribute {
                  name: "keepdims"
                  type: INT
                  ref_attr_name: "keepdim"
                }
                domain: ""
              }
              name: "thenGraph_11"
              output {
                name: "result_7"
                type {
                }
              }
            }
            type: GRAPH
          }
          attribute {
            name: "else_branch"
            g {
              node {
                output: "const"
                name: "n0"
                op_type: "Constant"
                attribute {
                  name: "value"
                  t {
                    data_type: 1
                    float_data: 0.0
                    name: "const"
                  }
                  type: TENSOR
                }
                domain: ""
              }
              node {
                input: "const"
                input: "ord_4"
                output: "const_cast"
                name: "n1"
                op_type: "CastLike"
                domain: ""
              }
              node {
                input: "ord_4"
                input: "const_cast"
                output: "cond_8"
                name: "n2"
                op_type: "Equal"
                domain: ""
              }
              node {
                input: "cond_8"
                output: "result_22"
                name: "n3"
                op_type: "If"
                attribute {
                  name: "then_branch"
                  g {
                    node {
                      input: "self_3"
                      output: "self_bool"
                      name: "n0"
                      op_type: "Cast"
                      attribute {
                        name: "to"
                        i: 9
                        type: INT
                      }
                      domain: ""
                    }
                    node {
                      input: "self_bool"
                      input: "self_3"
                      output: "self_0_1"
                      name: "n1"
                      op_type: "CastLike"
                      domain: ""
                    }
                    node {
                      input: "self_0_1"
                      output: "result_9"
                      name: "n2"
                      op_type: "ReduceSum"
                      attribute {
                        name: "keepdims"
                        i: 0
                        type: INT
                      }
                      domain: ""
                    }
                    name: "thenGraph_13"
                    output {
                      name: "result_9"
                      type {
                      }
                    }
                  }
                  type: GRAPH
                }
                attribute {
                  name: "else_branch"
                  g {
                    node {
                      output: "const_10"
                      name: "n0"
                      op_type: "Constant"
                      attribute {
                        name: "value"
                        t {
                          data_type: 1
                          float_data: 1.0
                          name: "const_10"
                        }
                        type: TENSOR
                      }
                      domain: ""
                    }
                    node {
                      input: "const_10"
                      input: "ord_4"
                      output: "const_10_cast"
                      name: "n1"
                      op_type: "CastLike"
                      domain: ""
                    }
                    node {
                      input: "ord_4"
                      input: "const_10_cast"
                      output: "cond_11"
                      name: "n2"
                      op_type: "Equal"
                      domain: ""
                    }
                    node {
                      input: "cond_11"
                      output: "result_21"
                      name: "n3"
                      op_type: "If"
                      attribute {
                        name: "then_branch"
                        g {
                          node {
                            input: "self_3"
                            output: "result_12"
                            name: "n0"
                            op_type: "ReduceL1"
                            attribute {
                              name: "keepdims"
                              type: INT
                              ref_attr_name: "keepdim"
                            }
                            domain: ""
                          }
                          name: "thenGraph_18"
                          output {
                            name: "result_12"
                            type {
                            }
                          }
                        }
                        type: GRAPH
                      }
                      attribute {
                        name: "else_branch"
                        g {
                          node {
                            output: "const_13"
                            name: "n0"
                            op_type: "Constant"
                            attribute {
                              name: "value"
                              t {
                                data_type: 1
                                float_data: 2.0
                                name: "const_13"
                              }
                              type: TENSOR
                            }
                            domain: ""
                          }
                          node {
                            input: "const_13"
                            input: "ord_4"
                            output: "const_13_cast"
                            name: "n1"
                            op_type: "CastLike"
                            domain: ""
                          }
                          node {
                            input: "ord_4"
                            input: "const_13_cast"
                            output: "cond_14"
                            name: "n2"
                            op_type: "Equal"
                            domain: ""
                          }
                          node {
                            input: "cond_14"
                            output: "result_20"
                            name: "n3"
                            op_type: "If"
                            attribute {
                              name: "then_branch"
                              g {
                                node {
                                  input: "self_3"
                                  output: "result_15"
                                  name: "n0"
                                  op_type: "ReduceL2"
                                  attribute {
                                    name: "keepdims"
                                    type: INT
                                    ref_attr_name: "keepdim"
                                  }
                                  domain: ""
                                }
                                name: "thenGraph_20"
                                output {
                                  name: "result_15"
                                  type {
                                  }
                                }
                              }
                              type: GRAPH
                            }
                            attribute {
                              name: "else_branch"
                              g {
                                node {
                                  input: "ord_4"
                                  input: "self_3"
                                  output: "ord_float"
                                  name: "n0"
                                  op_type: "CastLike"
                                  domain: ""
                                }
                                node {
                                  input: "self_3"
                                  input: "ord_float"
                                  output: "self_pow"
                                  name: "n1"
                                  op_type: "Pow"
                                  domain: ""
                                }
                                node {
                                  input: "self_pow"
                                  output: "tmp_16"
                                  name: "n2"
                                  op_type: "ReduceSum"
                                  attribute {
                                    name: "keepdims"
                                    type: INT
                                    ref_attr_name: "keepdim"
                                  }
                                  domain: ""
                                }
                                node {
                                  output: "const_17"
                                  name: "n3"
                                  op_type: "Constant"
                                  attribute {
                                    name: "value"
                                    t {
                                      data_type: 1
                                      float_data: 1.0
                                      name: "const_17"
                                    }
                                    type: TENSOR
                                  }
                                  domain: ""
                                }
                                node {
                                  input: "const_17"
                                  input: "ord_float"
                                  output: "const_17_cast"
                                  name: "n4"
                                  op_type: "CastLike"
                                  domain: ""
                                }
                                node {
                                  input: "const_17_cast"
                                  input: "ord_float"
                                  output: "tmp_18"
                                  name: "n5"
                                  op_type: "Div"
                                  domain: ""
                                }
                                node {
                                  input: "tmp_16"
                                  input: "tmp_18"
                                  output: "result_19"
                                  name: "n6"
                                  op_type: "Pow"
                                  domain: ""
                                }
                                name: "elseGraph_20"
                                output {
                                  name: "result_19"
                                  type {
                                  }
                                }
                              }
                              type: GRAPH
                            }
                            domain: ""
                          }
                          name: "elseGraph_18"
                          output {
                            name: "result_20"
                            type {
                            }
                          }
                        }
                        type: GRAPH
                      }
                      domain: ""
                    }
                    name: "elseGraph_13"
                    output {
                      name: "result_21"
                      type {
                      }
                    }
                  }
                  type: GRAPH
                }
                domain: ""
              }
              name: "elseGraph_11"
              output {
                name: "result_22"
                type {
                }
              }
            }
            type: GRAPH
          }
          domain: ""
        }
        name: "elseGraph_9"
        output {
          name: "result_23"
          type {
          }
        }
      }
      type: GRAPH
    }
    domain: ""
  }
  node {
    output: "int64_0_25"
    name: "n11"
    op_type: "Constant"
    attribute {
      name: "value"
      t {
        data_type: 7
        int64_data: 0
        name: "int64_0_25"
      }
      type: TENSOR
    }
    domain: ""
  }
  node {
    input: "int64_0_25"
    input: "self_rank"
    output: "int64_0_25_cast"
    name: "n12"
    op_type: "CastLike"
    domain: ""
  }
  node {
    input: "self_rank"
    input: "int64_0_25_cast"
    output: "cond_26"
    name: "n13"
    op_type: "Equal"
    domain: ""
  }
  node {
    input: "cond_26"
    output: "result_29"
    name: "n14"
    op_type: "If"
    attribute {
      name: "then_branch"
      g {
        node {
          input: "result_24"
          output: "result_27"
          name: "n0"
          op_type: "Squeeze"
          domain: ""
        }
        name: "thenGraph_27"
        output {
          name: "result_27"
          type {
          }
        }
      }
      type: GRAPH
    }
    attribute {
      name: "else_branch"
      g {
        node {
          input: "result_24"
          output: "result_28"
          name: "n0"
          op_type: "Identity"
          domain: ""
        }
        name: "elseGraph_27"
        output {
          name: "result_28"
          type {
          }
        }
      }
      type: GRAPH
    }
    domain: ""
  }
  opset_import {
    domain: ""
    version: 18
  }
  domain: "pkg.onnxscript.torch_lib"
}
functions {
  name: "Rank"
  input: "input"
  output: "return_val"
  node {
    input: "input"
    output: "tmp"
    name: "n0"
    op_type: "Shape"
    domain: ""
  }
  node {
    input: "tmp"
    output: "return_val"
    name: "n1"
    op_type: "Size"
    domain: ""
  }
  doc_string: "Take the rank of the input tensor."
  opset_import {
    domain: ""
    version: 18
  }
  domain: "pkg.onnxscript.torch_lib.common"
}
functions {
  name: "IsScalar"
  input: "input"
  output: "return_val"
  node {
    input: "input"
    output: "tmp"
    name: "n0"
    op_type: "Shape"
    domain: ""
  }
  node {
    input: "tmp"
    output: "tmp_0"
    name: "n1"
    op_type: "Size"
    domain: ""
  }
  node {
    output: "tmp_1"
    name: "n2"
    op_type: "Constant"
    attribute {
      name: "value_int"
      i: 0
      type: INT
    }
    domain: ""
  }
  node {
    input: "tmp_0"
    input: "tmp_1"
    output: "return_val"
    name: "n3"
    op_type: "Equal"
    domain: ""
  }
  doc_string: "Return whether the input has rank 0, or is a scalar."
  opset_import {
    domain: ""
    version: 18
  }
  domain: "pkg.onnxscript.torch_lib.common"
}

"""

ort_inputs = {'input_0': array(0.8965, dtype=float16)}

# Set up the inference session
session_options = ort.SessionOptions()
session_options.graph_optimization_level = ort.GraphOptimizationLevel.ORT_DISABLE_ALL
onnx_model = onnx.ModelProto()
google.protobuf.text_format.Parse(onnx_model_text, onnx_model)

# Uncomment this line to save the model to a file for examination
# onnx.save_model(onnx_model, "test_output_match_opinfo__linalg_vector_norm_cpu_float16.onnx")

onnx.checker.check_model(onnx_model)
session = ort.InferenceSession(onnx_model.SerializeToString(), session_options, providers=("CPUExecutionProvider",))

# Run the model
for _ in range(N):
    ort_outputs = session.run(None, ort_inputs)

Full error stack

[ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running If node. Name:'_inline__aten_linalg_vector_norm_no_dim_onnxn10' Status Message: Non-zero status code returned while running If node. Name:'_inline__aten_linalg_vector_norm_no_dim_onnxn1' Status Message: Non-zero status code returned while running If node. Name:'_inline__aten_linalg_vector_norm_no_dim_onnxn3' Status Message: Non-zero status code returned while running If node. Name:'_inline__aten_linalg_vector_norm_no_dim_onnxn3' Status Message: Non-zero status code returned while running If node. Name:'_inline__aten_linalg_vector_norm_no_dim_onnxn3' Status Message: Non-zero status code returned while running ReduceSum node. Name:'' Status Message: /onnxruntime_src/onnxruntime/core/providers/cpu/reduction/reduction_ops.cc:692 void onnxruntime::ValidateCommonFastReduce(const onnxruntime::Tensor*) axes_tensor != nullptr was false. Axes input is null

  File "/home/justinchu/dev/onnx-script/onnxscript/tests/function_libs/torch_lib/ops_test_common.py", line 534, in _capture_graph_and_evaluate_torch_script_evaluator
    return _safe_ort_session_run(onnx_model.SerializeToString(), ort_inputs)
  File "/home/justinchu/dev/onnx-script/onnxscript/tests/function_libs/torch_lib/ops_test_common.py", line 349, in _safe_ort_session_run
    raise return_dict["error"]

The ONNX model text for visualization

<
   ir_version: 8,
   opset_import: ["pkg.onnxscript.torch_lib" : 1, "" : 18, "pkg.onnxscript.torch_lib.common" : 1],
   producer_name: "pytorch",
   producer_version: "2.2.0"
>
main_graph (float16 input_0) => (float16 _val_3) 
   <int64[1] _val_1, float16[1] _val_2>
{
   _val_1 = Constant <value_ints: ints = [-1]> ()
   _val_2 = Reshape <allowzero: int = 0> (input_0, _val_1)
   _val_3 = pkg.onnxscript.torch_lib._aten_linalg_vector_norm_no_dim_onnx <keepdim: int = 0, ord: float = 2> (_val_2)
}
<
  domain: "pkg.onnxscript.torch_lib",
  opset_import: ["" : 18]
>
_aten_linalg_vector_norm_no_dim_onnx <ord,keepdim>(self) => (result_29)
{
   tmp = Shape (self)
   self_rank = Size (tmp)
   int64_0 = Constant <value: tensor = int64 int64_0 {0}> ()
   int64_0_cast = CastLike (int64_0, self_rank)
   cond = Equal (self_rank, int64_0_cast)
   self_2 = If (cond) <then_branch: graph = thenGraph_4 () => ( self_0) {
      int64_0_1d = Constant <value: tensor = int64[1] int64_0_1d {0}> ()
      self_0 = Unsqueeze (self, int64_0_1d)
   }, else_branch: graph = elseGraph_4 () => ( self_1) {
      self_1 = Identity (self)
   }>
   self_3 = Abs (self_2)
   ord = Constant <value_float: float = @ord> ()
   ord_4 = Cast <to: int = 1> (ord)
   cond_5 = IsInf <detect_negative: int = 0, detect_positive: int = 1> (ord_4)
   result_24 = If (cond_5) <then_branch: graph = thenGraph_9 () => ( result) {
      result = ReduceMax <keepdims: int = @keepdim> (self_3)
   }, else_branch: graph = elseGraph_9 () => ( result_23) {
      cond_6 = IsInf <detect_negative: int = 1, detect_positive: int = 0> (ord_4)
      result_23 = If (cond_6) <then_branch: graph = thenGraph_11 () => ( result_7) {
         result_7 = ReduceMin <keepdims: int = @keepdim> (self_3)
      }, else_branch: graph = elseGraph_11 () => ( result_22) {
         const = Constant <value: tensor = float const {0}> ()
         const_cast = CastLike (const, ord_4)
         cond_8 = Equal (ord_4, const_cast)
         result_22 = If (cond_8) <then_branch: graph = thenGraph_13 () => ( result_9) {
            self_bool = Cast <to: int = 9> (self_3)
            self_0_1 = CastLike (self_bool, self_3)
            result_9 = ReduceSum <keepdims: int = 0> (self_0_1)
         }, else_branch: graph = elseGraph_13 () => ( result_21) {
            const_10 = Constant <value: tensor = float const_10 {1}> ()
            const_10_cast = CastLike (const_10, ord_4)
            cond_11 = Equal (ord_4, const_10_cast)
            result_21 = If (cond_11) <then_branch: graph = thenGraph_18 () => ( result_12) {
               result_12 = ReduceL1 <keepdims: int = @keepdim> (self_3)
            }, else_branch: graph = elseGraph_18 () => ( result_20) {
               const_13 = Constant <value: tensor = float const_13 {2}> ()
               const_13_cast = CastLike (const_13, ord_4)
               cond_14 = Equal (ord_4, const_13_cast)
               result_20 = If (cond_14) <then_branch: graph = thenGraph_20 () => ( result_15) {
                  result_15 = ReduceL2 <keepdims: int = @keepdim> (self_3)
               }, else_branch: graph = elseGraph_20 () => ( result_19) {
                  ord_float = CastLike (ord_4, self_3)
                  self_pow = Pow (self_3, ord_float)
                  tmp_16 = ReduceSum <keepdims: int = @keepdim> (self_pow)
                  const_17 = Constant <value: tensor = float const_17 {1}> ()
                  const_17_cast = CastLike (const_17, ord_float)
                  tmp_18 = Div (const_17_cast, ord_float)
                  result_19 = Pow (tmp_16, tmp_18)
               }>
            }>
         }>
      }>
   }>
   int64_0_25 = Constant <value: tensor = int64 int64_0_25 {0}> ()
   int64_0_25_cast = CastLike (int64_0_25, self_rank)
   cond_26 = Equal (self_rank, int64_0_25_cast)
   result_29 = If (cond_26) <then_branch: graph = thenGraph_27 () => ( result_27) {
      result_27 = Squeeze (result_24)
   }, else_branch: graph = elseGraph_27 () => ( result_28) {
      result_28 = Identity (result_24)
   }>
}
<
  domain: "pkg.onnxscript.torch_lib.common",
  opset_import: ["" : 18]
>
Rank (input) => (return_val)
{
   tmp = Shape (input)
   return_val = Size (tmp)
}
<
  domain: "pkg.onnxscript.torch_lib.common",
  opset_import: ["" : 18]
>
IsScalar (input) => (return_val)
{
   tmp = Shape (input)
   tmp_0 = Size (tmp)
   tmp_1 = Constant <value_int: int = 0> ()
   return_val = Equal (tmp_0, tmp_1)
}

Environment

OS: Linux-6.2.0-1016-azure-x86_64-with-glibc2.35
Python version: 3.10.9 (main, Jan 11 2023, 15:21:40) [GCC 11.2.0]
onnx==1.15.0
onnxruntime==1.16.1
numpy==1.25.1
torch==2.2.0.dev20231011+cpu

cc @BowenBao @gramalingam @thiagocrepaldi

justinchuby added the core runtime and converter:dynamo labels on Nov 8, 2023
yuslepukhin (Member)

This is what I am seeing when trying to load the model with the latest ORT

onnxruntime.capi.onnxruntime_pybind11_state.InvalidGraph: [ONNXRuntimeError] : 10 : INVALID_GRAPH : Load model from Model.onnx failed:This is an invalid model. In Node, ("", _aten_linalg_vector_norm_no_dim_onnx, "pkg.onnxscript.torch_lib", -1) : ("_val_2": tensor(float16),) -> ("_val_3": tensor(float16),) , Error type field and data field mismatch in attribute ord.

yuslepukhin (Member)

Correction: this was my branch with If constant folding.

justinchuby (Contributor, Author)

    attribute {
      name: "ord"
      f: 2.0
      type: FLOAT
    }

It looks OK to me (the type is FLOAT)?

yuslepukhin (Member)

I debugged it and it is onnx.checker that fails while checking this attribute.

The used_field count is 2: while the attribute type is set to FLOAT as expected, the attribute somehow ends up with two value fields populated. The first is f, which passes the check; the second is the int field i, which is set to 2.

So the int field is present while the declared type is FLOAT, and that causes the mismatch.
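A minimal way to confirm which fields are populated from Python is protobuf's ListFields() on the saved model (a sketch; it assumes the model was written out with the commented-out onnx.save_model line in the repro script):

import onnx

model = onnx.load("test_output_match_opinfo__linalg_vector_norm_cpu_float16.onnx")
node = next(n for n in model.graph.node
            if n.op_type == "_aten_linalg_vector_norm_no_dim_onnx")
ord_attr = next(a for a in node.attribute if a.name == "ord")
# ListFields() reports every populated protobuf field; a well-formed FLOAT
# attribute should only list "name", "type", and "f". An extra "i" entry is
# exactly the mismatch the checker complains about.
for field, value in ord_attr.ListFields():
    print(field.name, value)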


yuslepukhin (Member) commented Nov 8, 2023

This takes place before any optimizations, when the model proto gets loaded.

justinchuby (comment marked as outdated)

justinchuby (Contributor, Author) commented Nov 8, 2023

I searched the model proto text. Strangely, it doesn't look like the i field was set for ord; there is only f: 2.0.
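One way to double-check this programmatically (a sketch; onnx_model_text is assumed to be the proto text string from the "To reproduce" script above):

import google.protobuf.text_format
import onnx

model = onnx.ModelProto()
google.protobuf.text_format.Parse(onnx_model_text, model)
# Print the "ord" attribute as parsed from the protobuf text form; only the
# f field (plus name and type) should show up here.
for node in model.graph.node:
    for attr in node.attribute:
        if attr.name == "ord":
            print(attr)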

justinchuby added a commit to microsoft/onnxscript that referenced this issue Nov 8, 2023
Comparing float values `1.0` and `2.0` can be fragile. Should we do it?

Related: microsoft/onnxruntime#18338
yuslepukhin (Member) commented Nov 8, 2023

It is not in the text. I fed it to the parser:

Name: onnx-weekly
Version: 1.15.0

import os

import onnx
import onnx.parser


def save_as_model(model_script_file):
    # Parse the ONNX textual format and save it as a binary .onnx model.
    with open(model_script_file, "r") as f:
        script = f.read()
    model_proto = onnx.parser.parse_model(script)
    pre, _ext = os.path.splitext(model_script_file)
    onnx.save_model(model_proto, pre + ".onnx")

And it produced the invalid model. It looks the same with the latest onnx-weekly.
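For what it's worth, the mismatch can also be surfaced directly with the checker on the parsed proto (a sketch; Model.onnx is the hypothetical file written by save_as_model above):

import onnx

model_proto = onnx.load("Model.onnx")  # hypothetical output of save_as_model
# check_model should raise the "type field and data field mismatch in
# attribute ord" ValidationError if the parser populated an extra value field.
onnx.checker.check_model(model_proto, full_check=True)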

yuslepukhin (Member)

Same with the tip of main. I called the parser directly from C++; same result.

justinchuby (Contributor, Author) commented Nov 10, 2023

Discussed in Teams with @yuslepukhin. The model can be obtained with the code under the "To reproduce" section. I am creating an issue in ONNX for the parser bug.

yuslepukhin added a commit that referenced this issue Nov 16, 2023
### Description
Our function inliner converts call nodes to a proto. The `Node::ToProto()`
function recreates optional NodeArgs in the `NodeProto`. While handling
missing input parameters, our inliner simply renames them to empty strings.
`Graph::InlineFunctionProto()` then recreates the missing NodeArgs even though
the original call node did not have them.

This results in the issue mentioned below. The inlined model contains the
following entry; notice that the second argument of the `ReduceSum` call is
present but has no value (from a Dynamo-exported model).

> InsertedPrecisionFreeCast__inlfunc__aten_linalg_vector_norm_no_dim_onnx_result_12 = ReduceSum <keepdims: int = 0, noop_with_empty_axes: int = 0> (InsertedPrecisionFreeCast__inlfunc_ReduceL1_data_abs, )

We now allow the second input to ReduceSum to be nullptr and ignore it, as it
is optional.

### Motivation and Context
This seeks to address #18338.
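For context on the encoding side, a minimal sketch (illustration only, not the inliner code itself): an omitted optional input is represented as an empty string in NodeProto.input, and a ReduceSum node built that way is a valid opset-18 model, so ORT needs to treat the empty-string axes input the same as a missing one.

import onnx
from onnx import TensorProto, helper

# ReduceSum-18 takes axes as an optional input; "" in its place marks the
# optional input as absent, which matches what the inliner emits.
node = helper.make_node(
    "ReduceSum",
    inputs=["data", ""],
    outputs=["reduced"],
    keepdims=0,
    noop_with_empty_axes=0,
)
graph = helper.make_graph(
    [node],
    "reduce_sum_optional_axes",
    [helper.make_tensor_value_info("data", TensorProto.FLOAT, [3])],
    [helper.make_tensor_value_info("reduced", TensorProto.FLOAT, [])],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 18)])
onnx.checker.check_model(model)  # passes: the empty-string axes input is legal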
github-actions (bot)

This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

github-actions bot added the stale label on Dec 10, 2023
kleiti pushed a commit to kleiti/onnxruntime that referenced this issue Mar 22, 2024
…osoft#18423)
