
[torchlib] Fix aten::diagonal #7646

GitHub Actions / Test Results failed Jul 25, 2024 in 0s

38 fail, 1 538 skipped, 11 810 pass in 3h 27m 55s

     24 files      24 suites   3h 27m 55s ⏱️
 13 386 tests: 11 810 ✅ passed,   1 538 💤 skipped,    38 ❌ failed
480 088 runs:  99 962 ✅ passed, 376 978 💤 skipped, 3 148 ❌ failed

Results for commit de821f4.

Annotations

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU


6 out of 24 runs failed: test_output_match_opinfo__div_mode_floor_rounding_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
Failed: Unexpected success
Unexpected success (the test is marked as an expected failure, but it now passes on these runs, so pytest reports the unexpected pass as a failure)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU


All 24 runs failed: test_output_match_opinfo__diagonal_cpu_int32 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py310-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py310-ubuntu-latest)/pytest.xml [took 5s]
artifacts/Test Results (py310-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-experimental-torchlib-tracing-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-experimental-torchlib-tracing-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-experimental-torchlib-tracing-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py311-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-ort-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-ort-nightly-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 8s]
artifacts/Test Results (py311-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py39-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py39-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py39-windows-latest)/pytest.xml [took 8s]
Raw output
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
(The same EagerModeError block is reported for every failing sample of this test; only the EyeLike attribute k differs between the repeated blocks, taking the values 0, 1, 2, and -2.)
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }

(This chained traceback is repeated verbatim for each failing sample; only the k attribute in the final EagerModeError block changes, cycling through 0, 1, 2, and -2.)
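For context, the ShapeInferenceError above comes from the ONNX EyeLike operator, whose shape-inference rule requires a 2-D input. The sketch below is a minimal, hypothetical reproduction, not the torchlib test harness: it builds a one-node EyeLike model whose declared input is 1-D and shows that onnxruntime rejects it at InferenceSession creation with the same message. The 1-D shape is an illustrative stand-in; the eager-mode graphs printed above omit the input shape entirely.

# Minimal, illustrative reproduction of the [ShapeInferenceError] seen above.
# Assumes onnx and onnxruntime are installed; this is not the CI test code.
import onnx
from onnx import TensorProto, helper
import onnxruntime as ort

node = helper.make_node("EyeLike", inputs=["input0"], outputs=["output0"], k=2)
graph = helper.make_graph(
    [node],
    "node_graph",
    # Declaring a 1-D input violates EyeLike's "input must be 2-dimensional" rule.
    [helper.make_tensor_value_info("input0", TensorProto.FLOAT, [6])],
    [helper.make_tensor_value_info("output0", TensorProto.FLOAT, None)],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 9)])

try:
    ort.InferenceSession(model.SerializeToString(), providers=["CPUExecutionProvider"])
except Exception as e:
    # Expected: ... [ShapeInferenceError] Input tensor must be 2-dimensional
    print(type(e).__name__, e)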

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU


All 24 runs failed: test_output_match_opinfo__diagonal_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py310-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py310-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py310-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-experimental-torchlib-tracing-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-experimental-torchlib-tracing-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-experimental-torchlib-tracing-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py311-ort-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-ort-nightly-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 4s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-ubuntu-latest)/pytest.xml [took 8s]
artifacts/Test Results (py311-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 8s]
artifacts/Test Results (py39-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py39-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py39-windows-latest)/pytest.xml [took 8s]
Raw output
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
(As in the int32 variant above, the same EagerModeError block is reported for every failing sample; only the EyeLike attribute k differs, taking the values 0, 1, 2, and -2.)
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }

(This chained traceback is repeated verbatim for each failing sample, with k cycling through 0, 1, 2, and -2; the captured log is truncated partway through the final repetition.)
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = -2> (input0)
E   }
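
The ShapeInferenceError repeated above comes from ONNX's EyeLike operator, which only accepts a 2-D input: when the eager evaluator wraps the single EyeLike call from aten_diagonal in a one-node model, onnxruntime runs shape inference at session-creation time and rejects any input whose declared rank is not 2. A minimal standalone reproduction (a sketch written for this report, not taken from the CI logs; the 3-D input shape is an arbitrary choice to trigger the check):

import onnxruntime as ort
from onnx import TensorProto, helper

# One EyeLike(k=2) node at opset 9, mirroring the model printed in the traceback,
# but with an explicitly declared non-2-D input so the failure is deterministic.
node = helper.make_node("EyeLike", inputs=["input0"], outputs=["output0"], k=2)
graph = helper.make_graph(
    [node],
    "node_graph",
    inputs=[helper.make_tensor_value_info("input0", TensorProto.FLOAT, [3, 4, 5])],
    outputs=[helper.make_tensor_value_info("output0", TensorProto.FLOAT, None)],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 9)])

# Session creation fails with the same error seen in the log:
#   [ShapeInferenceError] Input tensor must be 2-dimensional
ort.InferenceSession(model.SerializeToString(), providers=["CPUExecutionProvider"])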

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

6 out of 24 runs failed: test_output_match_opinfo__addmv_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 5 (20.0%)
Greatest absolute difference: 0.046875 at index (2,) (up to 0.01 allowed)
Greatest relative difference: 0.002635955810546875 at index (2,) (up to 0.001 allowed)
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 5 (20.0%)
Greatest absolute difference: 0.0234375 at index (2,) (up to 0.01 allowed)
Greatest relative difference: 0.0024318695068359375 at index (2,) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 5 (20.0%)
E   Greatest absolute difference: 0.046875 at index (2,) (up to 0.01 allowed)
E   Greatest relative difference: 0.002635955810546875 at index (2,) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 5 (20.0%)
E   Greatest absolute difference: 0.0234375 at index (2,) (up to 0.01 allowed)
E   Greatest relative difference: 0.0024318695068359375 at index (2,) (up to 0.001 allowed)
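
The float16 addmv mismatches above are on the order of a few half-precision ULPs (one fp16 ULP is roughly 2**-11 ≈ 4.9e-4 in relative terms), which is the size of difference produced when the same beta*self + alpha*(mat @ vec) is rounded to float16 at different points. A small illustration of that rounding-order sensitivity (an illustration only, not the kernel either side actually runs):

import torch

torch.manual_seed(0)
bias = torch.randn(5, dtype=torch.float16)
mat = torch.randn(5, 8, dtype=torch.float16)
vec = torch.randn(8, dtype=torch.float16)

# addmv(bias, mat, vec) = bias + mat @ vec, computed with two rounding orders:
early_round = bias + (mat.float() @ vec.float()).half()         # round the matvec to fp16, then add in fp16
late_round = (bias.float() + mat.float() @ vec.float()).half()  # add in fp32, round once at the end

# The two results can differ by several fp16 ULPs even though both are "correct".
print((early_round.float() - late_round.float()).abs().max())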

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

6 out of 24 runs failed: test_output_match_opinfo__var_mean_correction_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Output 0 mismatch
AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 25 (8.0%)
E   Greatest absolute difference: 0.03125 at index (2, 3) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0017795562744140625 at index (2, 3) (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 25 (8.0%)
E   Greatest absolute difference: 0.0625 at index (2, 2) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0014429092407226562 at index (2, 3) (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

6 out of 24 runs failed: test_output_match_opinfo__var_mean_unbiased_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected 25.765625 but got 25.796875.
E   Absolute difference: 0.03125 (up to 1e-05 allowed)
E   Relative difference: 0.001212856276531231 (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
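
Both var_mean failures above are the same float16 tolerance story: var_mean with correction=c divides the sum of squared deviations by N - c, and unbiased=True is simply the correction=1 case, so a half-precision rounding difference in the sum of squares surfaces in both. A sketch of the definition being compared (an illustration, not the test's own code):

import torch

x = torch.randn(5, 5)  # float32 here; the CI failures are the float16 variant

# correction=1 (Bessel's correction) divides by N - 1; unbiased=True means the same thing.
var, mean = torch.var_mean(x, dim=1, correction=1)

mean_ref = x.mean(dim=1, keepdim=True)
var_ref = (x - mean_ref).pow(2).sum(dim=1) / (x.shape[1] - 1)

# Matches in float32; in float16 the two evaluation orders can round differently,
# which is the ~0.001-0.002 relative gap reported above.
print(torch.allclose(var, var_ref), torch.allclose(mean, mean_ref.squeeze(1)))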

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

All 24 runs failed: test_output_match_opinfo__diagonal_cpu_float32 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py310-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py310-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py310-windows-latest)/pytest.xml [took 8s]
artifacts/Test Results (py311-experimental-torchlib-tracing-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-experimental-torchlib-tracing-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-experimental-torchlib-tracing-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py311-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-ort-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-ort-nightly-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py311-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 8s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py39-macos-latest)/pytest.xml [took 4s]
artifacts/Test Results (py39-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py39-windows-latest)/pytest.xml [took 8s]
Raw output
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = -2> (input0)
}
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = -2> (input0)
E   }
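
Note: every EyeLike failure above reduces to the same constraint: onnxruntime's shape inference rejects EyeLike unless its input is 2-D, and the eager evaluator hits exactly that when aten_diagonal feeds it the output of ConstantOfShape(mask_shape). A minimal reproduction of the constraint, with made-up shapes and assuming only onnx and onnxruntime are installed:

    import onnxruntime as ort
    from onnx import TensorProto, helper

    def try_eyelike(input_shape):
        node = helper.make_node("EyeLike", ["input0"], ["output0"], k=1)
        graph = helper.make_graph(
            [node],
            "node_graph",
            [helper.make_tensor_value_info("input0", TensorProto.FLOAT, input_shape)],
            [helper.make_tensor_value_info("output0", TensorProto.FLOAT, None)],
        )
        model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 9)])
        try:
            ort.InferenceSession(model.SerializeToString(), providers=["CPUExecutionProvider"])
            print(input_shape, "-> session created")
        except Exception as exc:
            # Mirrors the failure above: "[ShapeInferenceError] Input tensor must be 2-dimensional"
            print(input_shape, "->", exc)

    try_eyelike([3])     # 1-D input: session creation fails, as in this test
    try_eyelike([3, 4])  # 2-D input: accepted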

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

6 out of 24 runs failed: test_output_match_opinfo__index_put_bool_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 10 / 25 (40.0%)
Greatest absolute difference: 13.84375 at index (3, 0) (up to 1e-05 allowed)
Greatest relative difference: 7.5 at index (1, 4) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 10 / 25 (40.0%)
E   Greatest absolute difference: 13.84375 at index (3, 0) (up to 1e-05 allowed)
E   Greatest relative difference: 7.5 at index (1, 4) (up to 0.001 allowed)
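
Note: aten::index_put with a boolean index is a masked scatter, and 40% of elements being off by values around 14 points at writes landing in the wrong slots rather than float16 rounding. A small illustration of the op's contract, with made-up shapes and values rather than the OpInfo's actual samples:

    import torch

    x = torch.arange(25, dtype=torch.float16).reshape(5, 5)
    mask = (x % 3) == 0                                # boolean index selects scattered slots
    values = torch.full((int(mask.sum()),), -1.0, dtype=torch.float16)

    out = x.index_put((mask,), values)                 # out-of-place aten::index_put
    torch.testing.assert_close(out[mask], values)      # every masked slot takes the new value
    torch.testing.assert_close(out[~mask], x[~mask])   # everything else is untouched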

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

6 out of 24 runs failed: test_output_match_opinfo__var_correction_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 2 / 25 (8.0%)
Greatest absolute difference: 0.03125 at index (2, 3) (up to 1e-05 allowed)
Greatest relative difference: 0.0017795562744140625 at index (2, 3) (up to 0.001 allowed)
AssertionError: Tensor-likes are not close!

Mismatched elements: 2 / 25 (8.0%)
Greatest absolute difference: 0.0625 at index (2, 2) (up to 1e-05 allowed)
Greatest relative difference: 0.0014429092407226562 at index (2, 3) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 25 (8.0%)
E   Greatest absolute difference: 0.03125 at index (2, 3) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0017795562744140625 at index (2, 3) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 25 (8.0%)
E   Greatest absolute difference: 0.0625 at index (2, 2) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0014429092407226562 at index (2, 3) (up to 0.001 allowed)
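
Note: the relative differences reported here (about 0.0014 to 0.0018) are within a couple of float16 steps (one step near 1.0 is 2**-10, roughly 0.00098), so this reads as dtype precision brushing against the 0.001 rtol rather than a wrong correction term. A hedged sketch of the kind of comparison involved, using made-up data and a float32 reference path:

    import torch

    x = torch.randn(5, 5, dtype=torch.float16)
    got = torch.var(x, dim=1, correction=1)                 # accumulated in float16
    ref = torch.var(x.float(), dim=1, correction=1).half()  # float32 reference, cast back
    rel = ((got.float() - ref.float()).abs() / ref.float().abs()).max()
    print(rel)  # typically on the order of 2**-10, right where the 0.001 rtol sits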

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

2 out of 24 runs failed: test_output_match_opinfo__topk_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Output 1 mismatch
AssertionError: Output 1 mismatch
AssertionError: Output 1 mismatch
AssertionError: Output 1 mismatch
AssertionError: Output 1 mismatch
AssertionError: Output 1 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not equal!
E   
E   Mismatched elements: 3 / 75 (4.0%)
E   Greatest absolute difference: 8 at index (4, 2, 0)
E   Greatest relative difference: 3.0 at index (4, 1, 4)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 1 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not equal!
E   
E   Mismatched elements: 3 / 75 (4.0%)
E   Greatest absolute difference: 8 at index (4, 2, 0)
E   Greatest relative difference: 3.0 at index (4, 1, 4)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 1 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not equal!
E   
E   Mismatched elements: 3 / 75 (4.0%)
E   Greatest absolute difference: 8 at index (4, 2, 0)
E   Greatest relative difference: 3.0 at index (4, 1, 4)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 1 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not equal!
E   
E   Mismatched elements: 3 / 75 (4.0%)
E   Greatest absolute difference: 8 at index (4, 2, 0)
E   Greatest relative difference: 3.0 at index (4, 1, 4)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 1 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not equal!
E   
E   Mismatched elements: 3 / 75 (4.0%)
E   Greatest absolute difference: 8 at index (4, 2, 0)
E   Greatest relative difference: 3.0 at index (4, 1, 4)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 1 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not equal!
E   
E   Mismatched elements: 3 / 75 (4.0%)
E   Greatest absolute difference: 8 at index (4, 2, 0)
E   Greatest relative difference: 3.0 at index (4, 1, 4)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 1 mismatch
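
Note: output 1 of topk is the indices tensor, and integer outputs are compared exactly. When the cast to float16 collapses nearby values into ties, the returned values can still match while the tie-broken indices differ, which is enough to fail this check. A contrived illustration:

    import torch

    x = torch.tensor([1.0, 1.0002, 2.0, 0.5])    # float32: 1.0002 > 1.0
    xh = x.half()                                # float16: both round to 1.0, creating a tie
    print(torch.topk(x, k=2).indices.tolist())   # [2, 1]
    print(torch.topk(xh, k=2).indices.tolist())  # tie-breaking may pick index 0 instead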

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

6 out of 24 runs failed: test_output_match_opinfo__var_mean_dim_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 1s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 1s]
Raw output
AssertionError: Output 0 mismatch
AssertionError: Output 0 mismatch
AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 25 (4.0%)
E   Greatest absolute difference: 0.0078125 at index (3, 3) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0010890960693359375 at index (3, 3) (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 25 (4.0%)
E   Greatest absolute difference: 0.0078125 at index (3, 0, 3) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0010890960693359375 at index (3, 0, 3) (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 4 / 20 (20.0%)
E   Greatest absolute difference: 0.0625 at index (2, 0) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0014753341674804688 at index (0, 1) (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
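
Note: var_mean returns two outputs, and it is output 0 (the variance) that drifts by a float16 step or two here, the same pattern as the var tests. A sketch comparing against a float32-accumulated reference, with made-up data:

    import torch

    x = torch.randn(5, 5, dtype=torch.float16)
    var16, mean16 = torch.var_mean(x, dim=0, correction=1)
    var32, mean32 = torch.var_mean(x.float(), dim=0, correction=1)
    print((var16.float() - var32).abs().max())    # typically a float16 step or two of the result
    print((mean16.float() - mean32).abs().max())  # the mean output usually agrees far more tightly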

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

4 out of 24 runs failed: test_output_match_opinfo__matmul_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 25 (4.0%)
Greatest absolute difference: 0.03515625 at index (4, 4) (up to 0.02 allowed)
Greatest relative difference: 0.0084075927734375 at index (4, 4) (up to 0.002 allowed)
AssertionError: Tensor-likes are not close!

Mismatched elements: 2 / 250 (0.8%)
Greatest absolute difference: 0.029296875 at index (0, 1, 7) (up to 0.02 allowed)
Greatest relative difference: 0.019683837890625 at index (4, 0, 7) (up to 0.002 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 25 (4.0%)
E   Greatest absolute difference: 0.03515625 at index (4, 4) (up to 0.02 allowed)
E   Greatest relative difference: 0.0084075927734375 at index (4, 4) (up to 0.002 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 250 (0.8%)
E   Greatest absolute difference: 0.029296875 at index (0, 1, 7) (up to 0.02 allowed)
E   Greatest relative difference: 0.019683837890625 at index (4, 0, 7) (up to 0.002 allowed)
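
Note: this comparison already runs with widened float16 tolerances (0.02 absolute, 0.002 relative, per the message above), and the overshoot is several float16 steps coming from reduction-order differences in the dot products. An illustrative sketch with made-up operands:

    import torch

    a = torch.randn(5, 5, dtype=torch.float16)
    b = torch.randn(5, 5, dtype=torch.float16)
    got = a @ b                                     # accumulated in float16
    ref = (a.float() @ b.float()).half()            # float32 accumulation, cast back
    print((got.float() - ref.float()).abs().max())  # a handful of float16 steps is normal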

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

6 out of 24 runs failed: test_output_match_opinfo__ops_aten_embedding_bag_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 1s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 25s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 1s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 8s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 22s]
Raw output
AssertionError: Output 0 mismatch
AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 15 (6.7%)
E   Greatest absolute difference: 0.0546875 at index (2, 0) (up to 0.01 allowed)
E   Greatest relative difference: 0.0124359130859375 at index (2, 0) (up to 0.01 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 15 (6.7%)
E   Greatest absolute difference: 0.0546875 at index (2, 0) (up to 0.01 allowed)
E   Greatest relative difference: 0.0124359130859375 at index (2, 0) (up to 0.01 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
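
Note: _embedding_bag in "mean" mode reduces a bag of embedding rows, so it shares the float16 accumulation sensitivity of the other reductions here. A sketch of the mean-mode pooling in isolation, with hypothetical weights and bag indices:

    import torch

    weight = torch.randn(10, 3, dtype=torch.float16)
    bag = torch.tensor([1, 2, 4, 5, 4, 3])              # indices pooled into one bag
    got = weight[bag].mean(dim=0)                       # "mean" pooling done in float16
    ref = weight[bag].float().mean(dim=0).half()        # float32 accumulation, cast back
    print((got.float() - ref.float()).abs().max())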

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

6 out of 24 runs failed: test_output_match_opinfo__index_put_bool_cpu_int64 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not equal!

Mismatched elements: 9 / 25 (36.0%)
Greatest absolute difference: 15 at index (1, 0)
Greatest relative difference: 6.0 at index (3, 4)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not equal!
E   
E   Mismatched elements: 9 / 25 (36.0%)
E   Greatest absolute difference: 15 at index (1, 0)
E   Greatest relative difference: 6.0 at index (3, 4)
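
Note: for integer dtypes torch.testing.assert_close compares exactly (rtol and atol default to 0), which is why this int64 variant reports "not equal" rather than "not close"; a single misplaced write is enough to fail it. Minimal illustration:

    import torch

    a = torch.arange(25, dtype=torch.int64).reshape(5, 5)
    b = a.clone()
    b[1, 0] += 15                          # one wrong element
    try:
        torch.testing.assert_close(b, a)   # integer comparison is exact by default
    except AssertionError as err:
        print(err)                         # "Tensor-likes are not equal!"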

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

6 out of 24 runs failed: test_output_match_opinfo__var_unbiased_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Scalars are not close!

Expected 25.765625 but got 25.796875.
Absolute difference: 0.03125 (up to 1e-05 allowed)
Relative difference: 0.001212856276531231 (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected 25.765625 but got 25.796875.
E   Absolute difference: 0.03125 (up to 1e-05 allowed)
E   Relative difference: 0.001212856276531231 (up to 0.001 allowed)
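
Note: var with unbiased=True divides the squared deviations by n - 1 (Bessel's correction); the 0.03125 absolute difference on a result near 25.8 is about two float16 steps of that output. The estimator spelled out, on made-up float32 data so only float32 rounding is in play:

    import torch

    x = torch.randn(25)
    n = x.numel()
    mean = x.mean()
    unbiased = ((x - mean) ** 2).sum() / (n - 1)   # Bessel-corrected sample variance
    print(unbiased, torch.var(x, correction=1))    # the two agree to float32 rounding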

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

All 24 runs failed: test_output_match_opinfo__diagonal_cpu_int64 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py310-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py310-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py310-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-experimental-torchlib-tracing-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-experimental-torchlib-tracing-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-experimental-torchlib-tracing-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py311-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 8s]
artifacts/Test Results (py311-ort-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-ort-nightly-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 8s]
artifacts/Test Results (py39-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py39-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py39-windows-latest)/pytest.xml [took 8s]
Raw output
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = -2> (input0)
}
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2557: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = -2> (input0)
E   }
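
Note: both diagonal dtypes fail on the same line of aten_diagonal: EyeLike is handed ConstantOfShape(mask_shape), and whenever that tensor is not 2-D, onnxruntime rejects the node at session creation (see the reproduction after the int32 variant above). One way to keep the eye construction strictly 2-D and broadcast afterwards, sketched in NumPy purely for illustration and not necessarily what this PR does:

    import numpy as np

    def diagonal_mask(shape, dim1, dim2, offset):
        # Build the offset diagonal as a plain 2-D eye (so an EyeLike-style op only
        # ever sees a 2-D input), then broadcast it over the remaining dimensions.
        rows, cols = shape[dim1], shape[dim2]
        eye2d = np.eye(rows, cols, k=offset, dtype=bool)
        mask = np.zeros(shape, dtype=bool)
        moved = np.moveaxis(mask, (dim1, dim2), (-2, -1))  # view with the two dims last
        moved[...] = eye2d                                 # broadcast the 2-D mask in place
        return mask

    x = np.arange(24).reshape(2, 3, 4)
    m = diagonal_mask(x.shape, dim1=1, dim2=2, offset=1)
    print(np.array_equal(x[m].reshape(2, -1), np.diagonal(x, offset=1, axis1=1, axis2=2)))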

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

6 out of 24 runs failed: test_output_match_opinfo__index_put_bool_cpu_float32 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 10 / 25 (40.0%)
Greatest absolute difference: 12.913021087646484 at index (1, 1) (up to 1e-05 allowed)
Greatest relative difference: 3.1326277256011963 at index (1, 0) (up to 1.3e-06 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 10 / 25 (40.0%)
E   Greatest absolute difference: 12.913021087646484 at index (1, 1) (up to 1e-05 allowed)
E   Greatest relative difference: 3.1326277256011963 at index (1, 0) (up to 1.3e-06 allowed)
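
Note: the thresholds quoted here (1e-05 absolute, 1.3e-06 relative) are torch.testing.assert_close's float32 defaults, so this variant is held to near-exact agreement; with elements off by as much as ~13, it is the same misplaced-write symptom as the float16 and int64 variants, not a tolerance issue. For reference:

    import torch

    a = torch.randn(5, 5)
    b = a.clone()
    b[1, 1] += 12.9
    try:
        # float32 defaults, matching the "up to 1e-05 / 1.3e-06 allowed" lines above
        torch.testing.assert_close(b, a, rtol=1.3e-6, atol=1e-5)
    except AssertionError as err:
        print(err)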

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

6 out of 24 runs failed: test_output_match_opinfo__erfc_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 20 (5.0%)
Greatest absolute difference: 0.00023031234741210938 at index (2,) (up to 0.0002 allowed)
Greatest relative difference: 0.892578125 at index (2,) (up to 0.01 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 20 (5.0%)
E   Greatest absolute difference: 0.00023031234741210938 at index (2,) (up to 0.0002 allowed)
E   Greatest relative difference: 0.892578125 at index (2,) (up to 0.01 allowed)
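
Note: erfc decays toward zero quickly, so a fixed absolute error of about 2e-4 on a tail value of the same magnitude becomes a relative error near 0.9, which is what trips the 0.01 rtol here. If erfc is lowered to 1 - Erf (ONNX defines Erf but no Erfc), the subtraction inherits float16's ~5e-4 spacing near 1.0. A contrived illustration:

    import torch

    x = torch.tensor([2.6], dtype=torch.float16)
    reference = torch.erfc(x.double())     # high-precision reference, about 2.4e-4
    erf16 = torch.erf(x.float()).half()    # erf rounded to float16 (spacing ~5e-4 near 1.0)
    via_sub = 1.0 - erf16                  # cancellation leaves ~2e-4 absolute error
    print(reference.item(), via_sub.item())
    # The small result can be off by roughly 2e-4 absolutely, i.e. by nearly 100% relatively.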

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

6 out of 24 runs failed: test_output_match_opinfo__var_dim_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 1s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 25 (4.0%)
Greatest absolute difference: 0.0078125 at index (3, 3) (up to 1e-05 allowed)
Greatest relative difference: 0.0010890960693359375 at index (3, 3) (up to 0.001 allowed)
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 25 (4.0%)
Greatest absolute difference: 0.0078125 at index (3, 0, 3) (up to 1e-05 allowed)
Greatest relative difference: 0.0010890960693359375 at index (3, 0, 3) (up to 0.001 allowed)
AssertionError: Tensor-likes are not close!

Mismatched elements: 4 / 20 (20.0%)
Greatest absolute difference: 0.0625 at index (2, 0) (up to 1e-05 allowed)
Greatest relative difference: 0.0014753341674804688 at index (0, 1) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 25 (4.0%)
E   Greatest absolute difference: 0.0078125 at index (3, 3) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0010890960693359375 at index (3, 3) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 25 (4.0%)
E   Greatest absolute difference: 0.0078125 at index (3, 0, 3) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0010890960693359375 at index (3, 0, 3) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 4 / 20 (20.0%)
E   Greatest absolute difference: 0.0625 at index (2, 0) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0014753341674804688 at index (0, 1) (up to 0.001 allowed)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

@github-actions github-actions / Test Results

6 out of 24 runs failed: test_output_match_opinfo__floor_divide_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
Failed: Unexpected success
Unexpected success
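
"Unexpected success" means a test that the suite marks as expected-to-fail passed on these torch-nightly runs, so the expectation is now stale rather than the op being broken. As a generic illustration only (the torch_lib harness has its own xfail bookkeeping, not shown here), unittest reports the same condition when an expectedFailure-decorated test passes:

    import unittest

    class Example(unittest.TestCase):
        @unittest.expectedFailure
        def test_known_float16_mismatch(self):
            # If the underlying op starts matching within tolerance, this
            # assertion passes and the runner flags an unexpected success.
            self.assertEqual(7 // 2, 3)

    if __name__ == "__main__":
        unittest.main()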

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

@github-actions github-actions / Test Results

6 out of 24 runs failed: test_output_match_opinfo__index_put_bool_cpu_int32 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not equal!

Mismatched elements: 9 / 25 (36.0%)
Greatest absolute difference: 15 at index (1, 0)
Greatest relative difference: 6.0 at index (3, 4)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not equal!
E   
E   Mismatched elements: 9 / 25 (36.0%)
E   Greatest absolute difference: 15 at index (1, 0)
E   Greatest relative difference: 6.0 at index (3, 4)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

@github-actions github-actions / Test Results

All 24 runs failed: test_output_match_opinfo__diagonal_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py310-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py310-ubuntu-latest)/pytest.xml [took 9s]
artifacts/Test Results (py310-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py311-experimental-torchlib-tracing-macos-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-experimental-torchlib-tracing-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-experimental-torchlib-tracing-windows-latest)/pytest.xml [took 11s]
artifacts/Test Results (py311-macos-latest)/pytest.xml [took 4s]
artifacts/Test Results (py311-onnx-weekly-macos-latest)/pytest.xml [took 4s]
artifacts/Test Results (py311-onnx-weekly-ubuntu-latest)/pytest.xml [took 8s]
artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 11s]
artifacts/Test Results (py311-ort-nightly-macos-latest)/pytest.xml [took 4s]
artifacts/Test Results (py311-ort-nightly-ubuntu-latest)/pytest.xml [took 8s]
artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 12s]
artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 4s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 8s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 11s]
artifacts/Test Results (py311-ubuntu-latest)/pytest.xml [took 8s]
artifacts/Test Results (py311-windows-latest)/pytest.xml [took 11s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 8s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py39-macos-latest)/pytest.xml [took 4s]
artifacts/Test Results (py39-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py39-windows-latest)/pytest.xml [took 9s]
Raw output
TypeError: Unsupported sequence type '<class 'list'>' for attribute 'value_ints'
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:515: in _capture_graph_and_evaluate_torch_script_evaluator
    symbolic_outputs = function(*onnxscript_args, **onnxscript_kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2596: in aten_diagonal
    result, op.Constant(value_ints=[start]), op.Constant(value_ints=[end]), axes=axes
onnxscript/onnx_opset/_impl/opset13.py:453: in Constant
    return op(
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:346: in eval
    return self._graph.add_op_call(schema, inputs, attributes)
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph.add_op_call) at 0x7f3b8115bce0>:111: in add_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:878: in add_op_call
    result = self._add_torchscript_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph._add_torchscript_op_call) at 0x7f3b8115b880>:129: in _add_torchscript_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:758: in _add_torchscript_op_call
    result = _create_op_call_in_torch_graph(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch._create_op_call_in_torch_graph) at 0x7f3b8115afc0>:113: in _create_op_call_in_torch_graph
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:498: in _create_op_call_in_torch_graph
    _add_attribute_to_torchscript_node(node, key, value)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:462: in _add_attribute_to_torchscript_node
    raise TypeError(f"Unsupported sequence type '{type(value)}' for attribute '{key}'")
E   TypeError: Unsupported sequence type '<class 'list'>' for attribute 'value_ints'
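
The TypeError above is raised by _add_attribute_to_torchscript_node when an attribute value is a Python sequence it cannot lower to an ONNX attribute. A plausible reading of the trace (not verified here) is that start and end at core.py:2596 are traced graph values rather than plain ints, so op.Constant(value_ints=[start]) hands the graph builder a list whose element is not an int. The snippet below is a hypothetical, simplified stand-in for that check, not the onnxscript implementation:

    # Hypothetical, simplified version of the attribute check: a sequence of
    # plain Python ints is accepted; a sequence holding anything else (for
    # example a traced/symbolic value) is rejected with the message above.
    def add_int_list_attribute(key, value):
        if isinstance(value, (list, tuple)) and all(isinstance(v, int) for v in value):
            return list(value)  # would be lowered to an ONNX ints attribute
        raise TypeError(f"Unsupported sequence type '{type(value)}' for attribute '{key}'")

    class SymbolicValue:  # stand-in for a traced graph value
        pass

    add_int_list_attribute("value_ints", [0, 1])  # accepted
    try:
        add_int_list_attribute("value_ints", [SymbolicValue()])
    except TypeError as exc:
        print(exc)  # Unsupported sequence type '<class 'list'>' for attribute 'value_ints'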

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

@github-actions github-actions / Test Results

6 out of 24 runs failed: test_output_match_opinfo__sub_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 1s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 2 / 25 (8.0%)
Greatest absolute difference: 0.001953125 at index (3, 3) (up to 1e-05 allowed)
Greatest relative difference: 0.0028400421142578125 at index (3, 3) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 25 (8.0%)
E   Greatest absolute difference: 0.001953125 at index (3, 3) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0028400421142578125 at index (3, 3) (up to 0.001 allowed)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

@github-actions github-actions / Test Results

4 out of 24 runs failed: test_output_match_opinfo__matmul_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 1s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 1s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 25 (4.0%)
Greatest absolute difference: 0.03515625 at index (4, 4) (up to 0.02 allowed)
Greatest relative difference: 0.0084075927734375 at index (4, 4) (up to 0.002 allowed)
AssertionError: Tensor-likes are not close!

Mismatched elements: 2 / 250 (0.8%)
Greatest absolute difference: 0.029296875 at index (0, 1, 7) (up to 0.02 allowed)
Greatest relative difference: 0.019683837890625 at index (4, 0, 7) (up to 0.002 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 25 (4.0%)
E   Greatest absolute difference: 0.03515625 at index (4, 4) (up to 0.02 allowed)
E   Greatest relative difference: 0.0084075927734375 at index (4, 4) (up to 0.002 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 250 (0.8%)
E   Greatest absolute difference: 0.029296875 at index (0, 1, 7) (up to 0.02 allowed)
E   Greatest relative difference: 0.019683837890625 at index (4, 0, 7) (up to 0.002 allowed)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

@github-actions github-actions / Test Results

6 out of 24 runs failed: test_output_match_opinfo__index_put_bool_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 10 / 25 (40.0%)
Greatest absolute difference: 13.84375 at index (3, 0) (up to 1e-05 allowed)
Greatest relative difference: 7.5 at index (1, 4) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 10 / 25 (40.0%)
E   Greatest absolute difference: 13.84375 at index (3, 0) (up to 1e-05 allowed)
E   Greatest relative difference: 7.5 at index (1, 4) (up to 0.001 allowed)