Fix weekly CI pipeline errors #7599

GitHub Actions / Test Results failed Jul 22, 2024 in 0s

35 fail, 3 706 skipped, 9 763 pass in 27m 33s

     22 files   +22          22 suites  +22         27m 33s ⏱️ +27m 33s
 13 504 tests   +13 504    9 763 ✅ +9 763     3 706 💤 +3 706      35 ❌ +35
117 477 runs    +117 477  46 116 ✅ +46 116   71 265 💤 +71 265     96 ❌ +96

Results for commit 88a62b1. Comparison against earlier commit 492096e.

Annotations

Check warning on line 0 in onnxscript.ir.serde_test.TensorProtoTensorTest

12 out of 18 runs failed: test_tensor_proto_tensor_bfloat16 (onnxscript.ir.serde_test.TensorProtoTensorTest)

artifacts/Test Results (py310-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py310-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py310-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-ort-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-ort-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py39-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py39-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: 
Arrays are not equal

(shapes (1, 18), (1, 9) mismatch)
 x: array([[0, -3, 0, -1, 0, -0.5, 0, -0, 0, 0, 0, 0.5, 0, 1, 0, 42, 0, 2]],
      dtype=bfloat16)
 y: array([[-3, -1, -0.5, -0, 0, 0.5, 1, 42, 2]], dtype=bfloat16)
onnxscript\ir\serde_test.py:106: in test_tensor_proto_tensor_bfloat16
    np.testing.assert_array_equal(array_from_raw_data.view(ml_dtypes.bfloat16), expected_array)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\contextlib.py:79: in inner
    return func(*args, **kwds)
E   AssertionError: 
E   Arrays are not equal
E   
E   (shapes (1, 18), (1, 9) mismatch)
E    x: array([[0, -3, 0, -1, 0, -0.5, 0, -0, 0, 0, 0, 0.5, 0, 1, 0, 42, 0, 2]],
E         dtype=bfloat16)
E    y: array([[-3, -1, -0.5, -0, 0, 0.5, 1, 42, 2]], dtype=bfloat16)
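
Note on the failure above: x contains exactly twice as many elements as y, with the extra entries interleaved as zeros. That pattern is consistent with the raw buffer holding four bytes per value (e.g. float32) while being viewed as two-byte bfloat16, since on a little-endian machine the low-order half of each float32 decodes to (near) zero. A minimal sketch of that hypothesis follows; ml_dtypes is the test's real dependency, but the four-byte serialization is an assumption, not something the log confirms.

    import ml_dtypes
    import numpy as np

    # Expected values from the failing assertion (the y array).
    expected = np.array([[-3, -1, -0.5, -0.0, 0, 0.5, 1, 42, 2]], dtype=ml_dtypes.bfloat16)

    # Hypothesis: the raw data was written with 4 bytes per element (float32)
    # instead of 2. Viewing those bytes as bfloat16 doubles the element count
    # and interleaves the (zero-valued) low-order halves.
    raw = expected.astype(np.float32).tobytes()
    reinterpreted = np.frombuffer(raw, dtype=ml_dtypes.bfloat16).reshape(1, -1)
    print(reinterpreted.shape)  # (1, 18) -- the shape of x in the failure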

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 4 runs failed: test_export2python_produces_correct_onnx_script_model_0460_test_hardmax_negative_axis (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_hardmax_negative_axis' (e=No module named 'tests.onnx_backend_test_code.test_hardmax_negative_axis') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_hardmax_negative_axis.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_hardmax_negative_axis.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_hardmax_negative_axis(x: FLOAT[3,4,5]) -> (FLOAT[3,4,5]):
    y = opset13.Hardmax(x, axis=-1)
    return y
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_hardmax_negative_axis'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_hardmax_negative_axis' (e=No module named 'tests.onnx_backend_test_code.test_hardmax_negative_axis') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_hardmax_negative_axis.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_hardmax_negative_axis.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_hardmax_negative_axis(x: FLOAT[3,4,5]) -> (FLOAT[3,4,5]):
E       y = opset13.Hardmax(x, axis=-1)
E       return y
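
This failure and the seven other ModuleNotFoundError failures in this run share one shape: onnx_export_test.py writes the CONTENT shown above to tests/onnx_backend_test_code/<name>.py and then loads it with importlib.import_module, which intermittently fails on the Windows runners even though the file was just written. One common mitigation is to import the generated file by path rather than by dotted module name, so the load no longer depends on tests.onnx_backend_test_code being resolvable on sys.path at that moment. A sketch of that approach, offered as an assumption about a possible fix rather than the repository's current code:

    import importlib.util
    import sys

    def import_generated_module(name: str, path: str):
        # Build a spec directly from the file location; no package lookup involved.
        spec = importlib.util.spec_from_file_location(name, path)
        if spec is None or spec.loader is None:
            raise ImportError(f"cannot load {name} from {path}")
        module = importlib.util.module_from_spec(spec)
        sys.modules[name] = module       # register before executing, as importlib does
        spec.loader.exec_module(module)  # runs the generated @script() code
        return module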

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 4 runs failed: test_export2python_produces_correct_onnx_script_model_0310_test_depthtospace_crd_mode_example (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_depthtospace_crd_mode_example' (e=No module named 'tests.onnx_backend_test_code.test_depthtospace_crd_mode_example') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_depthtospace_crd_mode_example.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_depthtospace_crd_mode_example.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_depthtospace_crd_mode_example(x: FLOAT[1,8,2,3]) -> (FLOAT[1,2,4,6]):
    y = opset13.DepthToSpace(x, blocksize=2, mode='CRD')
    return y
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_depthtospace_crd_mode_example'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_depthtospace_crd_mode_example' (e=No module named 'tests.onnx_backend_test_code.test_depthtospace_crd_mode_example') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_depthtospace_crd_mode_example.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_depthtospace_crd_mode_example.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_depthtospace_crd_mode_example(x: FLOAT[1,8,2,3]) -> (FLOAT[1,2,4,6]):
E       y = opset13.DepthToSpace(x, blocksize=2, mode='CRD')
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 7 runs failed: test_export2python_produces_correct_onnx_script_model_0088_test_averagepool_2d_strides (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_averagepool_2d_strides' (e=No module named 'tests.onnx_backend_test_code.test_averagepool_2d_strides') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_averagepool_2d_strides.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_averagepool_2d_strides.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset19

@script()
def bck_test_averagepool_2d_strides(x: FLOAT[1,3,32,32]) -> (FLOAT[1,3,10,10]):
    y = opset19.AveragePool(x, kernel_shape=[5, 5], strides=[3, 3])
    return y
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_averagepool_2d_strides'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_averagepool_2d_strides' (e=No module named 'tests.onnx_backend_test_code.test_averagepool_2d_strides') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_averagepool_2d_strides.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_averagepool_2d_strides.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset19
E   
E   @script()
E   def bck_test_averagepool_2d_strides(x: FLOAT[1,3,32,32]) -> (FLOAT[1,3,10,10]):
E       y = opset19.AveragePool(x, kernel_shape=[5, 5], strides=[3, 3])
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 4 runs failed: test_export2python_produces_correct_onnx_script_model_0642_test_mean_one_input (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_mean_one_input' (e=No module named 'tests.onnx_backend_test_code.test_mean_one_input') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_mean_one_input.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_mean_one_input.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_mean_one_input(data_0: FLOAT[3]) -> (FLOAT[3]):
    result = opset13.Mean(data_0)
    return result
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_mean_one_input'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_mean_one_input' (e=No module named 'tests.onnx_backend_test_code.test_mean_one_input') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_mean_one_input.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_mean_one_input.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_mean_one_input(data_0: FLOAT[3]) -> (FLOAT[3]):
E       result = opset13.Mean(data_0)
E       return result

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 4 runs failed: test_export2python_produces_correct_onnx_script_model_0812_test_reduce_l1_keep_dims_random_expanded (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py310-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_l1_keep_dims_random_expanded' (e=No module named 'tests.onnx_backend_test_code.test_reduce_l1_keep_dims_random_expanded') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_l1_keep_dims_random_expanded.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_l1_keep_dims_random_expanded.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT, INT64
from onnxscript.onnx_opset import opset18

@script()
def bck_test_reduce_l1_keep_dims_random_expanded(data: FLOAT[3,2,2], axes: INT64[1]) -> (FLOAT[3,2,1]):
    ReduceL1_test_reduce_l1_keep_dims_random_expanded_function_data_abs = opset18.Abs(data)
    reduced = opset18.ReduceSum(ReduceL1_test_reduce_l1_keep_dims_random_expanded_function_data_abs, axes, keepdims=1)
    return reduced
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_reduce_l1_keep_dims_random_expanded'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_l1_keep_dims_random_expanded' (e=No module named 'tests.onnx_backend_test_code.test_reduce_l1_keep_dims_random_expanded') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_l1_keep_dims_random_expanded.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_l1_keep_dims_random_expanded.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset18
E   
E   @script()
E   def bck_test_reduce_l1_keep_dims_random_expanded(data: FLOAT[3,2,2], axes: INT64[1]) -> (FLOAT[3,2,1]):
E       ReduceL1_test_reduce_l1_keep_dims_random_expanded_function_data_abs = opset18.Abs(data)
E       reduced = opset18.ReduceSum(ReduceL1_test_reduce_l1_keep_dims_random_expanded_function_data_abs, axes, keepdims=1)
E       return reduced

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 7 runs failed: test_export2python_produces_correct_onnx_script_model_0095_test_basic_conv_without_padding (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py310-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_basic_conv_without_padding' (e=No module named 'tests.onnx_backend_test_code.test_basic_conv_without_padding') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_basic_conv_without_padding.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_basic_conv_without_padding.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset11

@script()
def bck_test_basic_conv_without_padding(x: FLOAT[1,1,5,5], W: FLOAT[1,1,3,3]) -> (FLOAT[1,1,3,3]):
    y = opset11.Conv(x, W, kernel_shape=[3, 3], pads=[0, 0, 0, 0])
    return y
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_basic_conv_without_padding'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_basic_conv_without_padding' (e=No module named 'tests.onnx_backend_test_code.test_basic_conv_without_padding') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_basic_conv_without_padding.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_basic_conv_without_padding.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset11
E   
E   @script()
E   def bck_test_basic_conv_without_padding(x: FLOAT[1,1,5,5], W: FLOAT[1,1,3,3]) -> (FLOAT[1,1,3,3]):
E       y = opset11.Conv(x, W, kernel_shape=[3, 3], pads=[0, 0, 0, 0])
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 4 runs failed: test_export2python_produces_correct_onnx_script_model_0757_test_or_bcast4v4d (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py310-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_or_bcast4v4d' (e=No module named 'tests.onnx_backend_test_code.test_or_bcast4v4d') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_or_bcast4v4d.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_or_bcast4v4d.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import BOOL
from onnxscript.onnx_opset import opset7

@script()
def bck_test_or_bcast4v4d(x: BOOL[1,4,1,6], y: BOOL[3,1,5,6]) -> (BOOL[3,4,5,6]):
    r_or = opset7.Or(x, y)
    return r_or
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_or_bcast4v4d'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_or_bcast4v4d' (e=No module named 'tests.onnx_backend_test_code.test_or_bcast4v4d') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_or_bcast4v4d.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_or_bcast4v4d.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import BOOL
E   from onnxscript.onnx_opset import opset7
E   
E   @script()
E   def bck_test_or_bcast4v4d(x: BOOL[1,4,1,6], y: BOOL[3,1,5,6]) -> (BOOL[3,4,5,6]):
E       r_or = opset7.Or(x, y)
E       return r_or

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 4 runs failed: test_export2python_produces_correct_onnx_script_model_0650_test_min_int32 (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_min_int32' (e=No module named 'tests.onnx_backend_test_code.test_min_int32') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_min_int32.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_min_int32.py', current folder: D:\a\onnxscript\onnxscript
---- CONTENT --
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import INT32
from onnxscript.onnx_opset import opset13

@script()
def bck_test_min_int32(data_0: INT32[3], data_1: INT32[3]) -> (INT32[3]):
    result = opset13.Min(data_0, data_1)
    return result
onnxscript\backend\onnx_export_test.py:133: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_min_int32'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:267: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:135: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_min_int32' (e=No module named 'tests.onnx_backend_test_code.test_min_int32') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_min_int32.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_min_int32.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import INT32
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_min_int32(data_0: INT32[3], data_1: INT32[3]) -> (INT32[3]):
E       result = opset13.Min(data_0, data_1)
E       return result

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

2 out of 4 runs failed: test_output_match_opinfo__clamp_cpu_float32 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
TypeError: Unsupported attribute type '<class 'torch._C._onnx.TensorProtoDataType'>' for attribute 'to'
TypeError: Unsupported attribute type '<class 'torch._C._onnx.TensorProtoDataType'>' for attribute 'to'
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:515: in _capture_graph_and_evaluate_torch_script_evaluator
    symbolic_outputs = function(*onnxscript_args, **onnxscript_kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:1648: in aten_clamp
    min_clamp = op.CastLike(min, self)
onnxscript/onnx_opset/_impl/opset15.py:257: in CastLike
    return op(*self._prepare_inputs(schema, input, target_type))
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:341: in eval
    return self._graph.add_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph.add_op_call) at 0x110f3ad40>:111: in add_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:878: in add_op_call
    result = self._add_torchscript_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph._add_torchscript_op_call) at 0x110f3a840>:129: in _add_torchscript_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:758: in _add_torchscript_op_call
    result = _create_op_call_in_torch_graph(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch._create_op_call_in_torch_graph) at 0x110f020c0>:113: in _create_op_call_in_torch_graph
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:498: in _create_op_call_in_torch_graph
    _add_attribute_to_torchscript_node(node, key, value)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:463: in _add_attribute_to_torchscript_node
    raise TypeError(f"Unsupported attribute type '{type(value)}' for attribute '{key}'")
E   TypeError: Unsupported attribute type '<class 'torch._C._onnx.TensorProtoDataType'>' for attribute 'to'
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:515: in _capture_graph_and_evaluate_torch_script_evaluator
    symbolic_outputs = function(*onnxscript_args, **onnxscript_kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:1652: in aten_clamp
    max_clamp = op.CastLike(max, self)
onnxscript/onnx_opset/_impl/opset15.py:257: in CastLike
    return op(*self._prepare_inputs(schema, input, target_type))
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:341: in eval
    return self._graph.add_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph.add_op_call) at 0x110f3ad40>:111: in add_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:878: in add_op_call
    result = self._add_torchscript_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph._add_torchscript_op_call) at 0x110f3a840>:129: in _add_torchscript_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:758: in _add_torchscript_op_call
    result = _create_op_call_in_torch_graph(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch._create_op_call_in_torch_graph) at 0x110f020c0>:113: in _create_op_call_in_torch_graph
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:498: in _create_op_call_in_torch_graph
    _add_attribute_to_torchscript_node(node, key, value)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:463: in _add_attribute_to_torchscript_node
    raise TypeError(f"Unsupported attribute type '{type(value)}' for attribute '{key}'")
E   TypeError: Unsupported attribute type '<class 'torch._C._onnx.TensorProtoDataType'>' for attribute 'to'
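
The three clamp failures (float32 here, int64 and float16 below) die at the same check: aten_clamp calls op.CastLike, and under the torch-nightly builds the resulting 'to' attribute reaches _add_attribute_to_torchscript_node as a torch._C._onnx.TensorProtoDataType enum, which the builder's type switch rejects. Since ONNX encodes tensor data types as plain integers, one plausible remedy is to lower enum-like values to int before attaching them. A hypothetical sketch of that coercion (the helper name and structure are illustrative, not the library's actual code):

    def coerce_attribute(value):
        """Lower enum-like attribute values to types the graph builder accepts."""
        if isinstance(value, (bool, int, float, str, bytes)):
            return value
        try:
            # pybind11 enums such as TensorProtoDataType.FLOAT16 support int().
            return int(value)
        except (TypeError, ValueError):
            raise TypeError(f"Unsupported attribute type '{type(value)}'")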

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

2 out of 4 runs failed: test_output_match_opinfo__clamp_cpu_int64 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
TypeError: Unsupported attribute type '<class 'torch._C._onnx.TensorProtoDataType'>' for attribute 'to'
TypeError: Unsupported attribute type '<class 'torch._C._onnx.TensorProtoDataType'>' for attribute 'to'
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:515: in _capture_graph_and_evaluate_torch_script_evaluator
    symbolic_outputs = function(*onnxscript_args, **onnxscript_kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:1648: in aten_clamp
    min_clamp = op.CastLike(min, self)
onnxscript/onnx_opset/_impl/opset15.py:257: in CastLike
    return op(*self._prepare_inputs(schema, input, target_type))
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:341: in eval
    return self._graph.add_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph.add_op_call) at 0x11ea76d40>:111: in add_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:878: in add_op_call
    result = self._add_torchscript_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph._add_torchscript_op_call) at 0x11ea76840>:129: in _add_torchscript_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:758: in _add_torchscript_op_call
    result = _create_op_call_in_torch_graph(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch._create_op_call_in_torch_graph) at 0x11ea75bc0>:113: in _create_op_call_in_torch_graph
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:498: in _create_op_call_in_torch_graph
    _add_attribute_to_torchscript_node(node, key, value)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:463: in _add_attribute_to_torchscript_node
    raise TypeError(f"Unsupported attribute type '{type(value)}' for attribute '{key}'")
E   TypeError: Unsupported attribute type '<class 'torch._C._onnx.TensorProtoDataType'>' for attribute 'to'
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:515: in _capture_graph_and_evaluate_torch_script_evaluator
    symbolic_outputs = function(*onnxscript_args, **onnxscript_kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:1652: in aten_clamp
    max_clamp = op.CastLike(max, self)
onnxscript/onnx_opset/_impl/opset15.py:257: in CastLike
    return op(*self._prepare_inputs(schema, input, target_type))
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:341: in eval
    return self._graph.add_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph.add_op_call) at 0x11ea76d40>:111: in add_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:878: in add_op_call
    result = self._add_torchscript_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph._add_torchscript_op_call) at 0x11ea76840>:129: in _add_torchscript_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:758: in _add_torchscript_op_call
    result = _create_op_call_in_torch_graph(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch._create_op_call_in_torch_graph) at 0x11ea75bc0>:113: in _create_op_call_in_torch_graph
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:498: in _create_op_call_in_torch_graph
    _add_attribute_to_torchscript_node(node, key, value)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:463: in _add_attribute_to_torchscript_node
    raise TypeError(f"Unsupported attribute type '{type(value)}' for attribute '{key}'")
E   TypeError: Unsupported attribute type '<class 'torch._C._onnx.TensorProtoDataType'>' for attribute 'to'

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

2 out of 4 runs failed: test_output_match_opinfo__rsub_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 2 / 25 (8.0%)
Greatest absolute difference: 0.00390625 at index (0, 2) (up to 1e-05 allowed)
Greatest relative difference: 0.0017986297607421875 at index (0, 4) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 25 (8.0%)
E   Greatest absolute difference: 0.00390625 at index (0, 2) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0017986297607421875 at index (0, 4) (up to 0.001 allowed)
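
This failure, together with the addmv, sub, var_mean, and embedding_bag float16 failures elsewhere in this run, is a tolerance-level mismatch rather than a wrong result: the greatest relative error (about 1.8e-3 here) sits just above the rtol=1e-3 / atol=1e-5 defaults that torch.testing.assert_close applies to float16. If the extra half-precision rounding on the ONNX path is acceptable, the usual remedy is a slightly looser per-op tolerance. A self-contained illustration with stand-in tensors and assumed bounds (not the test's actual inputs):

    import torch

    actual = torch.tensor([1.002], dtype=torch.float16)   # stand-in ONNX output
    expected = torch.tensor([1.0], dtype=torch.float16)   # stand-in torch reference

    # The default float16 tolerances (rtol=1e-3, atol=1e-5) reject a ~2e-3
    # relative error; a modest override accepts it without hiding real bugs.
    torch.testing.assert_close(actual, expected, rtol=3e-3, atol=5e-3)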

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

2 out of 4 runs failed: test_output_match_opinfo__addmv_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 5 (20.0%)
Greatest absolute difference: 0.046875 at index (2,) (up to 0.01 allowed)
Greatest relative difference: 0.002635955810546875 at index (2,) (up to 0.001 allowed)
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 5 (20.0%)
Greatest absolute difference: 0.0234375 at index (2,) (up to 0.01 allowed)
Greatest relative difference: 0.0024318695068359375 at index (2,) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 5 (20.0%)
E   Greatest absolute difference: 0.046875 at index (2,) (up to 0.01 allowed)
E   Greatest relative difference: 0.002635955810546875 at index (2,) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 5 (20.0%)
E   Greatest absolute difference: 0.0234375 at index (2,) (up to 0.01 allowed)
E   Greatest relative difference: 0.0024318695068359375 at index (2,) (up to 0.001 allowed)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

2 out of 4 runs failed: test_output_match_opinfo__clamp_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
TypeError: Unsupported attribute type '<class 'torch._C._onnx.TensorProtoDataType'>' for attribute 'to'
TypeError: Unsupported attribute type '<class 'torch._C._onnx.TensorProtoDataType'>' for attribute 'to'
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:515: in _capture_graph_and_evaluate_torch_script_evaluator
    symbolic_outputs = function(*onnxscript_args, **onnxscript_kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:1648: in aten_clamp
    min_clamp = op.CastLike(min, self)
onnxscript/onnx_opset/_impl/opset15.py:257: in CastLike
    return op(*self._prepare_inputs(schema, input, target_type))
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:341: in eval
    return self._graph.add_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph.add_op_call) at 0x110f3ad40>:111: in add_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:878: in add_op_call
    result = self._add_torchscript_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph._add_torchscript_op_call) at 0x110f3a840>:129: in _add_torchscript_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:758: in _add_torchscript_op_call
    result = _create_op_call_in_torch_graph(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch._create_op_call_in_torch_graph) at 0x110f020c0>:113: in _create_op_call_in_torch_graph
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:498: in _create_op_call_in_torch_graph
    _add_attribute_to_torchscript_node(node, key, value)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:463: in _add_attribute_to_torchscript_node
    raise TypeError(f"Unsupported attribute type '{type(value)}' for attribute '{key}'")
E   TypeError: Unsupported attribute type '<class 'torch._C._onnx.TensorProtoDataType'>' for attribute 'to'
tests/function_libs/torch_lib/ops_test.py:215: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:515: in _capture_graph_and_evaluate_torch_script_evaluator
    symbolic_outputs = function(*onnxscript_args, **onnxscript_kwargs)
onnxscript/values.py:583: in __call__
    return self.func(*args, **kwargs)
onnxscript/function_libs/torch_lib/ops/core.py:1652: in aten_clamp
    max_clamp = op.CastLike(max, self)
onnxscript/onnx_opset/_impl/opset15.py:257: in CastLike
    return op(*self._prepare_inputs(schema, input, target_type))
onnxscript/values.py:301: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:341: in eval
    return self._graph.add_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph.add_op_call) at 0x110f3ad40>:111: in add_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:878: in add_op_call
    result = self._add_torchscript_op_call(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch.TorchScriptGraph._add_torchscript_op_call) at 0x110f3a840>:129: in _add_torchscript_op_call
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:758: in _add_torchscript_op_call
    result = _create_op_call_in_torch_graph(
<@beartype(onnxscript.function_libs.torch_lib.graph_building._graph_building_torch._create_op_call_in_torch_graph) at 0x110f020c0>:113: in _create_op_call_in_torch_graph
    ???
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:498: in _create_op_call_in_torch_graph
    _add_attribute_to_torchscript_node(node, key, value)
onnxscript/function_libs/torch_lib/graph_building/_graph_building_torch.py:463: in _add_attribute_to_torchscript_node
    raise TypeError(f"Unsupported attribute type '{type(value)}' for attribute '{key}'")
E   TypeError: Unsupported attribute type '<class 'torch._C._onnx.TensorProtoDataType'>' for attribute 'to'

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

2 out of 4 runs failed: test_output_match_opinfo__index_put_bool_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 10 / 25 (40.0%)
Greatest absolute difference: 13.84375 at index (3, 0) (up to 1e-05 allowed)
Greatest relative difference: 7.5 at index (1, 4) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 10 / 25 (40.0%)
E   Greatest absolute difference: 13.84375 at index (3, 0) (up to 1e-05 allowed)
E   Greatest relative difference: 7.5 at index (1, 4) (up to 0.001 allowed)
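
The index_put_bool failures (float16 here; int32, float32, and int64 variants follow) show large differences on roughly a third of the elements, which points at the masked scatter itself rather than precision. The behavior the test compares against is PyTorch's: a boolean mask selects positions in the base tensor, and values fills them in row-major order. A small reference check of those semantics, with shapes assumed for illustration:

    import torch

    x = torch.zeros(5, 5, dtype=torch.float16)
    mask = torch.rand(5, 5) > 0.5                 # boolean index
    values = torch.arange(int(mask.sum()), dtype=torch.float16)

    out = torch.ops.aten.index_put(x, [mask], values)
    # Masked positions receive `values` in row-major order; the rest are untouched.
    assert torch.equal(out[mask], values)
    assert torch.equal(out[~mask], x[~mask])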

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

2 out of 4 runs failed: test_output_match_opinfo__index_put_bool_cpu_int32 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not equal!

Mismatched elements: 9 / 25 (36.0%)
Greatest absolute difference: 15 at index (1, 0)
Greatest relative difference: 6.0 at index (3, 4)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not equal!
E   
E   Mismatched elements: 9 / 25 (36.0%)
E   Greatest absolute difference: 15 at index (1, 0)
E   Greatest relative difference: 6.0 at index (3, 4)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

2 out of 4 runs failed: test_output_match_opinfo__index_put_bool_cpu_float32 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 10 / 25 (40.0%)
Greatest absolute difference: 12.913021087646484 at index (1, 1) (up to 1e-05 allowed)
Greatest relative difference: 3.1326277256011963 at index (1, 0) (up to 1.3e-06 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 10 / 25 (40.0%)
E   Greatest absolute difference: 12.913021087646484 at index (1, 1) (up to 1e-05 allowed)
E   Greatest relative difference: 3.1326277256011963 at index (1, 0) (up to 1.3e-06 allowed)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

2 out of 4 runs failed: test_output_match_opinfo__index_put_bool_cpu_int64 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not equal!

Mismatched elements: 9 / 25 (36.0%)
Greatest absolute difference: 15 at index (1, 0)
Greatest relative difference: 6.0 at index (3, 4)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not equal!
E   
E   Mismatched elements: 9 / 25 (36.0%)
E   Greatest absolute difference: 15 at index (1, 0)
E   Greatest relative difference: 6.0 at index (3, 4)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

2 out of 4 runs failed: test_output_match_opinfo__var_mean_dim_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Output 0 mismatch
AssertionError: Output 0 mismatch
AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 25 (4.0%)
E   Greatest absolute difference: 0.0078125 at index (3, 3) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0010890960693359375 at index (3, 3) (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 25 (4.0%)
E   Greatest absolute difference: 0.0078125 at index (3, 0, 3) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0010890960693359375 at index (3, 0, 3) (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 4 / 20 (20.0%)
E   Greatest absolute difference: 0.0625 at index (2, 0) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0014753341674804688 at index (0, 1) (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

2 out of 4 runs failed: test_output_match_opinfo__sub_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 2 / 25 (8.0%)
Greatest absolute difference: 0.001953125 at index (3, 3) (up to 1e-05 allowed)
Greatest relative difference: 0.0028400421142578125 at index (3, 3) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 25 (8.0%)
E   Greatest absolute difference: 0.001953125 at index (3, 3) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0028400421142578125 at index (3, 3) (up to 0.001 allowed)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

2 out of 4 runs failed: test_output_match_opinfo__addmv_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 5 (20.0%)
Greatest absolute difference: 0.046875 at index (2,) (up to 0.01 allowed)
Greatest relative difference: 0.002635955810546875 at index (2,) (up to 0.001 allowed)
AssertionError: Tensor-likes are not close!

Mismatched elements: 1 / 5 (20.0%)
Greatest absolute difference: 0.0234375 at index (2,) (up to 0.01 allowed)
Greatest relative difference: 0.0024318695068359375 at index (2,) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 5 (20.0%)
E   Greatest absolute difference: 0.046875 at index (2,) (up to 0.01 allowed)
E   Greatest relative difference: 0.002635955810546875 at index (2,) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 5 (20.0%)
E   Greatest absolute difference: 0.0234375 at index (2,) (up to 0.01 allowed)
E   Greatest relative difference: 0.0024318695068359375 at index (2,) (up to 0.001 allowed)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

2 out of 4 runs failed: test_output_match_opinfo__var_mean_correction_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Output 0 mismatch
AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 25 (8.0%)
E   Greatest absolute difference: 0.03125 at index (2, 3) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0017795562744140625 at index (2, 3) (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 2 / 25 (8.0%)
E   Greatest absolute difference: 0.0625 at index (2, 2) (up to 1e-05 allowed)
E   Greatest relative difference: 0.0014429092407226562 at index (2, 3) (up to 0.001 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

2 out of 4 runs failed: test_output_match_opinfo__index_put_bool_cpu_int64 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not equal!

Mismatched elements: 9 / 25 (36.0%)
Greatest absolute difference: 15 at index (1, 0)
Greatest relative difference: 6.0 at index (3, 4)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not equal!
E   
E   Mismatched elements: 9 / 25 (36.0%)
E   Greatest absolute difference: 15 at index (1, 0)
E   Greatest relative difference: 6.0 at index (3, 4)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

2 out of 4 runs failed: test_output_match_opinfo__ops_aten_embedding_bag_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 1s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 1s]
Raw output
AssertionError: Output 0 mismatch
AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 15 (6.7%)
E   Greatest absolute difference: 0.0546875 at index (2, 0) (up to 0.01 allowed)
E   Greatest relative difference: 0.0124359130859375 at index (2, 0) (up to 0.01 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 1 / 15 (6.7%)
E   Greatest absolute difference: 0.0546875 at index (2, 0) (up to 0.01 allowed)
E   Greatest relative difference: 0.0124359130859375 at index (2, 0) (up to 0.01 allowed)

The above exception was the direct cause of the following exception:
tests/function_libs/torch_lib/ops_test.py:266: in run_test_output_match
    raise AssertionError(f"Output {j} mismatch") from e
E   AssertionError: Output 0 mismatch

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

2 out of 4 runs failed: test_output_match_opinfo__index_put_bool_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Tensor-likes are not close!

Mismatched elements: 10 / 25 (40.0%)
Greatest absolute difference: 13.84375 at index (3, 0) (up to 1e-05 allowed)
Greatest relative difference: 7.5 at index (1, 4) (up to 0.001 allowed)
tests/function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E   
E   Mismatched elements: 10 / 25 (40.0%)
E   Greatest absolute difference: 13.84375 at index (3, 0) (up to 1e-05 allowed)
E   Greatest relative difference: 7.5 at index (1, 4) (up to 0.001 allowed)