Skip to content

[torchlib] Fix aten::diagonal #7502

[torchlib] Fix aten::diagonal

[torchlib] Fix aten::diagonal #7502

GitHub Actions / Test Results failed Jul 26, 2024 in 0s

5 errors, 45 fail, 2 739 skipped, 12 594 pass in 1h 48m 40s

     24 files  +     1      24 suites  +1   1h 48m 40s ⏱️ - 30m 27s
 15 383 tests  -  1 287  12 594 ✅  - 1 295    2 739 💤 +     1     45 ❌ +  7   5 🔥 ± 0 
317 190 runs   - 53 247  75 935 ✅  - 7 759  239 883 💤  - 45 255  1 297 ❌  - 218  75 🔥  - 15 

Results for commit adc1f6a. ± Comparison against earlier commit 7ab2005.

Annotations

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

See this annotation in the file changed.

@github-actions github-actions / Test Results

1 out of 3 runs failed: test_export2python_produces_correct_onnx_script_model_0963_test_resize_upsample_scales_cubic (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_resize_upsample_scales_cubic' (e=No module named 'tests.onnx_backend_test_code.test_resize_upsample_scales_cubic') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_resize_upsample_scales_cubic.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_resize_upsample_scales_cubic.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset19

@script()
def bck_test_resize_upsample_scales_cubic(X: FLOAT[1,1,4,4], scales: FLOAT[4]) -> (FLOAT[1,1,8,8]):
    Y = opset19.Resize(X, None, scales, mode='cubic')
    return Y
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_resize_upsample_scales_cubic'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_resize_upsample_scales_cubic' (e=No module named 'tests.onnx_backend_test_code.test_resize_upsample_scales_cubic') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_resize_upsample_scales_cubic.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_resize_upsample_scales_cubic.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset19
E   
E   @script()
E   def bck_test_resize_upsample_scales_cubic(X: FLOAT[1,1,4,4], scales: FLOAT[4]) -> (FLOAT[1,1,8,8]):
E       Y = opset19.Resize(X, None, scales, mode='cubic')
E       return Y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

See this annotation in the file changed.

@github-actions github-actions / Test Results

1 out of 3 runs failed: test_export2python_produces_correct_onnx_script_model_1120_test_slice_end_out_of_bounds (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_slice_end_out_of_bounds' (e=No module named 'tests.onnx_backend_test_code.test_slice_end_out_of_bounds') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_slice_end_out_of_bounds.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_slice_end_out_of_bounds.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT, INT64
from onnxscript.onnx_opset import opset13

@script()
def bck_test_slice_end_out_of_bounds(x: FLOAT[20,10,5], starts: INT64[1], ends: INT64[1], axes: INT64[1], steps: INT64[1]) -> (FLOAT[20,9,5]):
    y = opset13.Slice(x, starts, ends, axes, steps)
    return y
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_slice_end_out_of_bounds'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_slice_end_out_of_bounds' (e=No module named 'tests.onnx_backend_test_code.test_slice_end_out_of_bounds') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_slice_end_out_of_bounds.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_slice_end_out_of_bounds.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_slice_end_out_of_bounds(x: FLOAT[20,10,5], starts: INT64[1], ends: INT64[1], axes: INT64[1], steps: INT64[1]) -> (FLOAT[20,9,5]):
E       y = opset13.Slice(x, starts, ends, axes, steps)
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

See this annotation in the file changed.

@github-actions github-actions / Test Results

1 out of 3 runs failed: test_export2python_produces_correct_onnx_script_model_0400_test_gemm_all_attributes (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_gemm_all_attributes' (e=No module named 'tests.onnx_backend_test_code.test_gemm_all_attributes') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_gemm_all_attributes.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_gemm_all_attributes.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_gemm_all_attributes(a: FLOAT[4,3], b: FLOAT[5,4], c: FLOAT[1,5]) -> (FLOAT[3,5]):
    y = opset13.Gemm(a, b, c, alpha=0.25, beta=0.3499999940395355, transA=1, transB=1)
    return y
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_gemm_all_attributes'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_gemm_all_attributes' (e=No module named 'tests.onnx_backend_test_code.test_gemm_all_attributes') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_gemm_all_attributes.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_gemm_all_attributes.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_gemm_all_attributes(a: FLOAT[4,3], b: FLOAT[5,4], c: FLOAT[1,5]) -> (FLOAT[3,5]):
E       y = opset13.Gemm(a, b, c, alpha=0.25, beta=0.3499999940395355, transA=1, transB=1)
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

See this annotation in the file changed.

@github-actions github-actions / Test Results

1 out of 3 runs failed: test_export2python_produces_correct_onnx_script_model_0013_test_affine_grid_2d_align_corners (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_affine_grid_2d_align_corners' (e=No module named 'tests.onnx_backend_test_code.test_affine_grid_2d_align_corners') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_affine_grid_2d_align_corners.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_affine_grid_2d_align_corners.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT, INT64
from onnxscript.onnx_opset import opset20

@script()
def bck_test_affine_grid_2d_align_corners(theta: FLOAT[2,2,3], size: INT64[4]) -> (FLOAT[2,5,6,2]):
    grid = opset20.AffineGrid(theta, size, align_corners=1)
    return grid
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_affine_grid_2d_align_corners'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_affine_grid_2d_align_corners' (e=No module named 'tests.onnx_backend_test_code.test_affine_grid_2d_align_corners') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_affine_grid_2d_align_corners.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_affine_grid_2d_align_corners.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset20
E   
E   @script()
E   def bck_test_affine_grid_2d_align_corners(theta: FLOAT[2,2,3], size: INT64[4]) -> (FLOAT[2,5,6,2]):
E       grid = opset20.AffineGrid(theta, size, align_corners=1)
E       return grid

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

See this annotation in the file changed.

@github-actions github-actions / Test Results

1 out of 3 runs failed: test_export2python_produces_correct_onnx_script_model_0404_test_gemm_default_no_bias (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_gemm_default_no_bias' (e=No module named 'tests.onnx_backend_test_code.test_gemm_default_no_bias') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_gemm_default_no_bias.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_gemm_default_no_bias.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_gemm_default_no_bias(a: FLOAT[2,10], b: FLOAT[10,3]) -> (FLOAT[2,3]):
    y = opset13.Gemm(a, b)
    return y
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_gemm_default_no_bias'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_gemm_default_no_bias' (e=No module named 'tests.onnx_backend_test_code.test_gemm_default_no_bias') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_gemm_default_no_bias.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_gemm_default_no_bias.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_gemm_default_no_bias(a: FLOAT[2,10], b: FLOAT[10,3]) -> (FLOAT[2,3]):
E       y = opset13.Gemm(a, b)
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

See this annotation in the file changed.

@github-actions github-actions / Test Results

1 out of 3 runs failed: test_export2python_produces_correct_onnx_script_model_0805_test_reduce_l1_do_not_keepdims_random (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_l1_do_not_keepdims_random' (e=No module named 'tests.onnx_backend_test_code.test_reduce_l1_do_not_keepdims_random') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_l1_do_not_keepdims_random.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_l1_do_not_keepdims_random.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT, INT64
from onnxscript.onnx_opset import opset18

@script()
def bck_test_reduce_l1_do_not_keepdims_random(data: FLOAT[3,2,2], axes: INT64[1]) -> (FLOAT[3,2]):
    reduced = opset18.ReduceL1(data, axes, keepdims=0)
    return reduced
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_reduce_l1_do_not_keepdims_random'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_l1_do_not_keepdims_random' (e=No module named 'tests.onnx_backend_test_code.test_reduce_l1_do_not_keepdims_random') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_l1_do_not_keepdims_random.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_l1_do_not_keepdims_random.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset18
E   
E   @script()
E   def bck_test_reduce_l1_do_not_keepdims_random(data: FLOAT[3,2,2], axes: INT64[1]) -> (FLOAT[3,2]):
E       reduced = opset18.ReduceL1(data, axes, keepdims=0)
E       return reduced

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

See this annotation in the file changed.

@github-actions github-actions / Test Results

1 out of 3 runs failed: test_export2python_produces_correct_onnx_script_model_0867_test_reduce_max_do_not_keepdims_random (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_max_do_not_keepdims_random' (e=No module named 'tests.onnx_backend_test_code.test_reduce_max_do_not_keepdims_random') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_max_do_not_keepdims_random.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_max_do_not_keepdims_random.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT, INT64
from onnxscript.onnx_opset import opset18

@script()
def bck_test_reduce_max_do_not_keepdims_random(data: FLOAT[3,2,2], axes: INT64[1]) -> (FLOAT[3,2]):
    reduced = opset18.ReduceMax(data, axes, keepdims=0)
    return reduced
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_reduce_max_do_not_keepdims_random'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_max_do_not_keepdims_random' (e=No module named 'tests.onnx_backend_test_code.test_reduce_max_do_not_keepdims_random') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_max_do_not_keepdims_random.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_max_do_not_keepdims_random.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset18
E   
E   @script()
E   def bck_test_reduce_max_do_not_keepdims_random(data: FLOAT[3,2,2], axes: INT64[1]) -> (FLOAT[3,2]):
E       reduced = opset18.ReduceMax(data, axes, keepdims=0)
E       return reduced

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

See this annotation in the file changed.

@github-actions github-actions / Test Results

1 out of 3 runs failed: test_export2python_produces_correct_onnx_script_model_0517_test_layer_normalization_3d_axis_negative_2_epsilon (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_layer_normalization_3d_axis_negative_2_epsilon' (e=No module named 'tests.onnx_backend_test_code.test_layer_normalization_3d_axis_negative_2_epsilon') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_layer_normalization_3d_axis_negative_2_epsilon.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_layer_normalization_3d_axis_negative_2_epsilon.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset17

@script()
def bck_test_layer_normalization_3d_axis_negative_2_epsilon(X: FLOAT[2,3,5], W: FLOAT[3,5], B: FLOAT[3,5]) -> (FLOAT[2,3,5], FLOAT[2,1,1], FLOAT[2,1,1]):
    Y, Mean, InvStdDev = opset17.LayerNormalization(X, W, B, axis=-2, epsilon=0.10000000149011612)
    return Y, Mean, InvStdDev
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_layer_normalization_3d_axis_negative_2_epsilon'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_layer_normalization_3d_axis_negative_2_epsilon' (e=No module named 'tests.onnx_backend_test_code.test_layer_normalization_3d_axis_negative_2_epsilon') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_layer_normalization_3d_axis_negative_2_epsilon.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_layer_normalization_3d_axis_negative_2_epsilon.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset17
E   
E   @script()
E   def bck_test_layer_normalization_3d_axis_negative_2_epsilon(X: FLOAT[2,3,5], W: FLOAT[3,5], B: FLOAT[3,5]) -> (FLOAT[2,3,5], FLOAT[2,1,1], FLOAT[2,1,1]):
E       Y, Mean, InvStdDev = opset17.LayerNormalization(X, W, B, axis=-2, epsilon=0.10000000149011612)
E       return Y, Mean, InvStdDev

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

See this annotation in the file changed.

@github-actions github-actions / Test Results

1 out of 3 runs failed: test_export2python_produces_correct_onnx_script_model_0605_test_matmul_4d (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_matmul_4d' (e=No module named 'tests.onnx_backend_test_code.test_matmul_4d') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_matmul_4d.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_matmul_4d.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_matmul_4d(a: FLOAT[1,2,3,4], b: FLOAT[1,2,4,3]) -> (FLOAT[1,2,3,3]):
    c = opset13.MatMul(a, b)
    return c
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_matmul_4d'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_matmul_4d' (e=No module named 'tests.onnx_backend_test_code.test_matmul_4d') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_matmul_4d.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_matmul_4d.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_matmul_4d(a: FLOAT[1,2,3,4], b: FLOAT[1,2,4,3]) -> (FLOAT[1,2,3,3]):
E       c = opset13.MatMul(a, b)
E       return c

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

See this annotation in the file changed.

@github-actions github-actions / Test Results

1 out of 3 runs failed: test_export2python_produces_correct_onnx_script_model_0082_test_averagepool_2d_precomputed_pads (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_averagepool_2d_precomputed_pads' (e=No module named 'tests.onnx_backend_test_code.test_averagepool_2d_precomputed_pads') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_averagepool_2d_precomputed_pads.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_averagepool_2d_precomputed_pads.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset19

@script()
def bck_test_averagepool_2d_precomputed_pads(x: FLOAT[1,1,5,5]) -> (FLOAT[1,1,5,5]):
    y = opset19.AveragePool(x, kernel_shape=[5, 5], pads=[2, 2, 2, 2])
    return y
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_averagepool_2d_precomputed_pads'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_averagepool_2d_precomputed_pads' (e=No module named 'tests.onnx_backend_test_code.test_averagepool_2d_precomputed_pads') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_averagepool_2d_precomputed_pads.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_averagepool_2d_precomputed_pads.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset19
E   
E   @script()
E   def bck_test_averagepool_2d_precomputed_pads(x: FLOAT[1,1,5,5]) -> (FLOAT[1,1,5,5]):
E       y = opset19.AveragePool(x, kernel_shape=[5, 5], pads=[2, 2, 2, 2])
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

See this annotation in the file changed.

@github-actions github-actions / Test Results

1 out of 3 runs failed: test_export2python_produces_correct_onnx_script_model_1107_test_sign (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_sign' (e=No module named 'tests.onnx_backend_test_code.test_sign') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_sign.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_sign.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_sign(x: FLOAT[11]) -> (FLOAT[11]):
    y = opset13.Sign(x)
    return y
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_sign'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_sign' (e=No module named 'tests.onnx_backend_test_code.test_sign') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_sign.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_sign.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_sign(x: FLOAT[11]) -> (FLOAT[11]):
E       y = opset13.Sign(x)
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

See this annotation in the file changed.

@github-actions github-actions / Test Results

1 out of 3 runs failed: test_export2python_produces_correct_onnx_script_model_0459_test_hardmax_example (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py310-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_hardmax_example' (e=No module named 'tests.onnx_backend_test_code.test_hardmax_example') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_hardmax_example.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_hardmax_example.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_hardmax_example(x: FLOAT[4,4]) -> (FLOAT[4,4]):
    y = opset13.Hardmax(x)
    return y
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_hardmax_example'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_hardmax_example' (e=No module named 'tests.onnx_backend_test_code.test_hardmax_example') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_hardmax_example.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_hardmax_example.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_hardmax_example(x: FLOAT[4,4]) -> (FLOAT[4,4]):
E       y = opset13.Hardmax(x)
E       return y

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

4 out of 15 runs failed: test_output_match_opinfo__ops_aten_embedding_bag_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 1s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 30s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 1s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 28s]
Raw output
AssertionError: Output 0 mismatch
AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 15 (6.7%)
E   Greatest absolute difference: 0.0546875 at index (2, 0) (up to 0.01 allowed)
E   Greatest relative difference: 0.0124359130859375 at index (2, 0) (up to 0.01 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 15 (6.7%)
E   Greatest absolute difference: 0.0546875 at index (2, 0) (up to 0.01 allowed)
E   Greatest relative difference: 0.0124359130859375 at index (2, 0) (up to 0.01 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

All 15 runs failed: test_output_match_opinfo__diagonal_cpu_int32 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py310-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-experimental-torchlib-tracing-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-experimental-torchlib-tracing-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-experimental-torchlib-tracing-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-macos-latest)/pytest.xml [took 5s]
artifacts/Test Results (py311-onnx-weekly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-ubuntu-latest)/pytest.xml [took 8s]
artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-ort-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 4s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 15s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py39-macos-latest)/pytest.xml [took 3s]
Raw output
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = -2> (input0)
}
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = -2> (input0)
E   }

Check warning on line 0 in tests.function_libs.torch_lib.quantization_test.QuantizedModelExportTest

See this annotation in the file changed.

@github-actions github-actions / Test Results

4 out of 12 runs failed: test_simple_quantized_model (tests.function_libs.torch_lib.quantization_test.QuantizedModelExportTest)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 3s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 2s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 3s]
Raw output
onnx.onnx_cpp2py_export.checker.ValidationError: Required attribute 'to' is missing.

==> Context: Bad node spec for node. Name: Cast_18 OpType: Cast
tests/function_libs/torch_lib/quantization_test.py:50: in test_simple_quantized_model
    onnx.checker.check_model(program.model_proto, full_check=True)
.nox/test_torch_nightly/lib/python3.11/site-packages/onnx/checker.py:179: in check_model
    C.check_model(
E   onnx.onnx_cpp2py_export.checker.ValidationError: Required attribute 'to' is missing.
E   
E   ==> Context: Bad node spec for node. Name: Cast_18 OpType: Cast

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

2 out of 15 runs failed: test_output_match_opinfo__matmul_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 25 (4.0%)
Greatest absolute difference: 0.03515625 at index (4, 4) (up to 0.02 allowed)
Greatest relative difference: 0.0084075927734375 at index (4, 4) (up to 0.002 allowed)
AssertionError: Tensor-likes are not close!

Mismatched elements: 2 / 250 (0.8%)
Greatest absolute difference: 0.029296875 at index (0, 1, 7) (up to 0.02 allowed)
Greatest relative difference: 0.019683837890625 at index (4, 0, 7) (up to 0.002 allowed)
tests\function_libs\torch_lib\ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 25 (4.0%)
E   Greatest absolute difference: 0.03515625 at index (4, 4) (up to 0.02 allowed)
E   Greatest relative difference: 0.0084075927734375 at index (4, 4) (up to 0.002 allowed)
tests\function_libs\torch_lib\ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 250 (0.8%)
E   Greatest absolute difference: 0.029296875 at index (0, 1, 7) (up to 0.02 allowed)
E   Greatest relative difference: 0.019683837890625 at index (4, 0, 7) (up to 0.002 allowed)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

4 out of 15 runs failed: test_output_match_opinfo__var_mean_unbiased_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected 25.765625 but got 25.796875.
E   Absolute difference: 0.03125 (up to 1e-05 allowed)
E   Relative difference: 0.001212856276531231 (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

4 out of 15 runs failed: test_output_match_opinfo__var_mean_correction_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Output 0 mismatch
AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 25 (8.0%)
E   Greatest absolute difference: 0.03125 at index (2, 3) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0017795562744140625 at index (2, 3) (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 25 (8.0%)
E   Greatest absolute difference: 0.0625 at index (2, 2) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0014429092407226562 at index (2, 3) (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

All 15 runs failed: test_output_match_opinfo__diagonal_bool_cpu_bool (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py310-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-experimental-torchlib-tracing-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-experimental-torchlib-tracing-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-experimental-torchlib-tracing-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-ort-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 14s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py39-macos-latest)/pytest.xml [took 3s]
Raw output
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = -2> (input0)
}
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2629: in aten_diagonal_bool
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = -2> (input0)
E   }

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

All 15 runs failed: test_output_match_opinfo__diagonal_cpu_float32 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py310-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-experimental-torchlib-tracing-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-experimental-torchlib-tracing-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-experimental-torchlib-tracing-windows-latest)/pytest.xml [took 10s]
artifacts/Test Results (py311-macos-latest)/pytest.xml [took 5s]
artifacts/Test Results (py311-onnx-weekly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-ort-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 4s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 14s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py39-macos-latest)/pytest.xml [took 3s]
Raw output
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = -2> (input0)
}
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = -2> (input0)
E   }

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

4 out of 15 runs failed: test_output_match_opinfo__addmv_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 5 (20.0%)
Greatest absolute difference: 0.046875 at index (2,) (up to 0.01 allowed)
Greatest relative difference: 0.002635955810546875 at index (2,) (up to 0.001 allowed)
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 5 (20.0%)
Greatest absolute difference: 0.0234375 at index (2,) (up to 0.01 allowed)
Greatest relative difference: 0.0024318695068359375 at index (2,) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 5 (20.0%)
E   Greatest absolute difference: 0.046875 at index (2,) (up to 0.01 allowed)
E   Greatest relative difference: 0.002635955810546875 at index (2,) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 5 (20.0%)
E   Greatest absolute difference: 0.0234375 at index (2,) (up to 0.01 allowed)
E   Greatest relative difference: 0.0024318695068359375 at index (2,) (up to 0.001 allowed)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

4 out of 15 runs failed: test_output_match_opinfo__floor_divide_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
Failed: Unexpected success
Unexpected success

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

4 out of 15 runs failed: test_output_match_opinfo__index_put_bool_cpu_int32 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not equal!

Mismatched elements: 9 / 25 (36.0%)
Greatest absolute difference: 15 at index (1, 0)
Greatest relative difference: 6.0 at index (3, 4)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not equal!
E   
E   Mismatched elements: 9 / 25 (36.0%)
E   Greatest absolute difference: 15 at index (1, 0)
E   Greatest relative difference: 6.0 at index (3, 4)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

All 15 runs failed: test_output_match_opinfo__diagonal_cpu_int64 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py310-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-experimental-torchlib-tracing-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-experimental-torchlib-tracing-ubuntu-latest)/pytest.xml [took 6s]
artifacts/Test Results (py311-experimental-torchlib-tracing-windows-latest)/pytest.xml [took 8s]
artifacts/Test Results (py311-macos-latest)/pytest.xml [took 4s]
artifacts/Test Results (py311-onnx-weekly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-onnx-weekly-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-ort-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 4s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 12s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 9s]
artifacts/Test Results (py39-macos-latest)/pytest.xml [took 3s]
Raw output
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 0> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 1> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = 2> (input0)
}
onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
<
   ir_version: 4,
   opset_import: ["" : 9]
>
node_graph (float input0) => ( output0) {
   output0 = EyeLike <k: int = -2> (input0)
}
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 0> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 1> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = 2> (input0)
E   }
onnxscript/evaluator.py:478: in _call_ort
    session = ort.InferenceSession(
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:419: in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
.nox/test_onnx_weekly/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:474: in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
E   onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (EyeLike) [ShapeInferenceError] Input tensor must be 2-dimensional

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:601: in executor
    return function(*args, **kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:2556: in aten_diagonal
    mask = op.EyeLike(op.ConstantOfShape(mask_shape), k=offset)
onnxscript/onnx_opset/_impl/opset9.py:438: in EyeLike
    return op(*self._prepare_inputs(schema, input), dtype=dtype, k=k)
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:512: in _eval
    return _call_ort(schema, inputs, attributes, closure)
onnxscript/evaluator.py:482: in _call_ort
    raise EagerModeError(
E   onnxscript.evaluator.EagerModeError: Unable to create onnxruntime InferenceSession for executing .EyeLike op with onnx model
E   <
E      ir_version: 4,
E      opset_import: ["" : 9]
E   >
E   node_graph (float input0) => ( output0) {
E      output0 = EyeLike <k: int = -2> (input0)
E   }

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

See this annotation in the file changed.

@github-actions github-actions / Test Results

4 out of 15 runs failed: test_output_match_opinfo__div_mode_floor_rounding_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
Failed: Unexpected success
Unexpected success