
[luci] CircleReshape's dtype unknown after SubstitutePackToReshapePass #14096

Open · qsunki opened this issue Sep 26, 2024 · 7 comments

@qsunki (Contributor) commented Sep 26, 2024

Problem

After SubstitutePackToReshapePass, Reshape's dtype is unknown.

CircleNode  Pack(dtype: S32)     CircleNode    Reshape(dtype: unknown)
     \       /            --->         \        /
      \     /                           \      /
        Pad                               Pad

In CircleShapeInferencePass, an assertion then fails while checking the dtype of paddings in the Pad operation:

LUCI_ASSERT(paddings->dtype() == S32 || paddings->dtype() == S64, "Support int 32/64 for now");

How to reproduce

yamnet.zip
onecc -C yamnet.cfg -O1

Log

/usr/local/bin/circle2circle lite-model_yamnet_tflite_1.circle lite-model_yamnet_tflite_1.opt2.circle --common_subexpression_elimination --fold_add_v2 --fold_cast --fold_densify --fold_dequantize --fold_dwconv --fold_fully_connected --fold_gather --fold_mul --fold_reshape --fold_shape --fold_sparse_to_dense --fold_squeeze --forward_reshape_to_unaryop --forward_transpose_op --fuse_add_to_fullyconnected_bias --fuse_add_with_conv --fuse_add_with_tconv --fuse_add_with_fully_connected --fuse_batchnorm_with_conv --fuse_batchnorm_with_dwconv --fuse_batchnorm_with_tconv --fuse_mul_to_fullyconnected_weights --fuse_slice_with_tconv --fuse_mean_with_mean --fuse_mul_with_conv --fuse_mul_with_div --fuse_mul_with_fullyconnected --fuse_transpose_with_mean --fuse_horizontal_fc_layers --fuse_activation_function --fuse_instnorm --fuse_prelu --fuse_gelu --fuse_rsqrt --remove_redundant_reshape --remove_redundant_transpose --remove_unnecessary_add --remove_unnecessary_cast --remove_unnecessary_reshape --remove_unnecessary_slice --remove_unnecessary_strided_slice --remove_unnecessary_split --remove_unnecessary_transpose --replace_non_const_fc_with_batch_matmul --replace_with_fc_gelu_fc --resolve_customop_add --resolve_customop_batchmatmul --resolve_customop_matmul --resolve_customop_max_pool_with_argmax --resolve_customop_splitv --substitute_pack_to_reshape --substitute_padv2_to_pad --substitute_splitv_to_split --substitute_squeeze_to_reshape --substitute_strided_slice_to_reshape --substitute_transpose_to_reshape --transform_min_max_to_relu6 --transform_min_relu_to_relu6 --transform_sqrt_div_to_rsqrt_mul
circle2circle: [assert failed] Support int 32/64 for now. 
circle2circle: circle2circle: /home/ssafy/one/compiler/luci/service/src/CircleShapeInferenceHelper.cpp:170: loco::TensorShape luci::sinf::pad_shape(const loco::TensorShape&, const luci::CircleNode*): Assertion `(paddings->dtype() == S32 || paddings->dtype() == S64)' failed.


@qsunki (Contributor, Author) commented Sep 26, 2024

Suggestions

1. Add dtype to the node in SubstitutePackToReshapePass

https://github.com/Samsung/ONE/blob/master/compiler/luci/pass/src/SubstitutePackToReshapePass.cpp#L53-L57

  • Before

  auto graph = target_node->graph();
  auto reshape_node = graph->nodes()->create<luci::CircleReshape>();
  reshape_node->tensor(value_node);
  reshape_node->name(name + "/Reshape");
  luci::add_origin(reshape_node, luci::get_origin(node));

  • After

  auto graph = target_node->graph();
  auto reshape_node = graph->nodes()->create<luci::CircleReshape>();
  reshape_node->tensor(value_node);
  reshape_node->dtype(value_node->dtype());
  reshape_node->name(name + "/Reshape");
  luci::add_origin(reshape_node, luci::get_origin(node));

2. Switch the order of CircleShapeInferencePass and CircleTypeInferencePass

When the CircleTypeInferencePass runs first, the dtype is determined, allowing the CircleShapeInferencePass to function correctly.
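
For reference, a minimal sketch of what suggestion 2 would look like, assuming the two passes are registered back-to-back in a logo::Phase (e.g., in compiler/luci/pass/src/CircleOptimizer.cpp; the exact registration site is an assumption here):

  logo::Phase phase;

  // Run type inference before shape inference, so the dtype of a freshly
  // created node (e.g., the Reshape from SubstitutePackToReshapePass) is
  // already determined when pad_shape() inspects its paddings input.
  phase.emplace_back(std::make_unique<luci::CircleTypeInferencePass>());
  phase.emplace_back(std::make_unique<luci::CircleShapeInferencePass>());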


Each method works well when applied.
Note: with the first method, the same change might be needed in the other *ToReshapePass passes as well (e.g., SubstituteSqueezeToReshapePass, SubstituteStridedSliceToReshapePass, SubstituteTransposeToReshapePass).

qsunki changed the title from "[luci] reshape's dtype unknown after SubstitutePackToReshapePass" to "[luci] CircleReshape's dtype unknown after SubstitutePackToReshapePass" on Sep 26, 2024
@shs-park (Contributor)

> 2. Switch the order of CircleShapeInferencePass and CircleTypeInferencePass

From my perspective, the second approach seems more appropriate, as it appears to be more generally applicable to other operations as well.

However, I do have some concerns about potential issues that might arise from changing the order.

@jinevening, @seanshpark,
Do you have any thoughts or considerations on this?

@jinevening (Contributor)

+1 for the second approach.

@seanshpark (Contributor)

I'm not sure the second approach is OK :)
Well ... each pass shouldn't have any side effects, but it seems Pack does (and maybe others too?).
If this is the case, switching the shape/dtype inference passes may cause other problems for our models.
We need full testing.

For the meantime, my proposal is to leave this issue open and go with (1), with a comment:

  reshape_node->tensor(value_node);
  // NOTE shape inference has an issue without this. Check issue #14096 for details.
  reshape_node->dtype(value_node->dtype());
  reshape_node->name(name + "/Reshape");

@icodo98 (Contributor) commented Sep 26, 2024

> each pass shouldn't have any side effects, but it seems Pack does (and maybe others too?)

As far as I know, the roles of TypeInferencePass and ShapeInferencePass are to determine the dtype and shape of the output based on the input and args.
However, for some ops, shape inference is restricted to inputs of specific dtypes (e.g., Pad, Reshape).
In that case, wouldn’t it be more natural for the results of TypeInferencePass to be passed on to the subsequent ShapeInferencePass?

@seanshpark (Contributor)

> In that case, wouldn’t it be more natural for the results of ...

As I wrote, the switch may cause other problems, which I don't know about yet, and we need full testing.

@seanshpark (Contributor)

I remembered what we usually do with Passes.

If the condition is not met, just return.

That is, if shape inference fails because the dtype is not determined, just don't do anything and return.
There will be a next call, and when the condition is met, do whatever it should.

This will be (3).
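
A minimal sketch of what (3) could look like in the shape-inference rule that calls pad_shape (hypothetical; it assumes the rule can report "not inferred yet" by returning false, so the phase revisits the node on a later call):

  // Proposal (3): guard instead of assert. If the dtype of paddings is not
  // determined yet, do nothing and return; once CircleTypeInferencePass has
  // set the dtype, a later call will perform the actual shape inference.
  auto paddings = loco::must_cast<luci::CircleNode *>(pad->paddings());
  if (paddings->dtype() != loco::DataType::S32 &&
      paddings->dtype() != loco::DataType::S64)
    return false; // condition not met yet; retry on the next call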
