Bug fix for gather fusion with on-device training (#20891)
### Description
Update the initializer that's added in GatherSliceToSplitFusion to use
the GenerateNodeArgName function, rather than the GenerateNodeName
function.

GenerateNodeName scans all the *node* names in the graph to see if the
given name is already in use and generates a unique one if it is.
GenerateNodeArgName does the same, but against all the *node arg* names
in the graph. Since an initializer is a NodeArg rather than a node, its
name must be made unique with GenerateNodeArgName.
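The uniquification pattern the two functions share can be sketched as follows. This is a minimal illustration, not the actual onnxruntime implementation; the `_<n>` suffix scheme is an assumption for illustration. The only difference between the two real functions is which set of names is consulted (node names vs. node arg names).

```cpp
#include <string>
#include <unordered_set>

// Minimal sketch of suffix-based unique-name generation. Both
// GenerateNodeName and GenerateNodeArgName follow this shape; they
// differ only in which name set `used_names` is drawn from.
// The "_<n>" suffix format is assumed for illustration.
std::string GenerateUniqueName(const std::unordered_set<std::string>& used_names,
                               const std::string& base) {
  // No collision: the base name is fine as-is.
  if (used_names.find(base) == used_names.end()) return base;
  // Otherwise append an increasing suffix until the name is unused.
  int suffix = 0;
  std::string candidate;
  do {
    candidate = base + "_" + std::to_string(++suffix);
  } while (used_names.find(candidate) != used_names.end());
  return candidate;
}
```

If the "splits" name is checked against a set that does not contain the existing initializer, the collision goes undetected, which is exactly the bug this commit fixes.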

### Motivation and Context
* On-device training goes through a generate-artifacts step in which
optimizations are applied; then, when the training artifact is loaded,
additional optimizations are applied. In the first round of
optimizations, a "splits" initializer is added for phi-3. In the
second round, another "splits" initializer with different dimensions
and data is added. Because the GenerateNodeName function only checks
node names, the first "splits" initializer is not detected as a
collision, causing a type error claiming that the shape of "splits"
does not match the TensorProto shape.
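The failure mode can be modeled with a toy graph (hypothetical code, not onnxruntime; `ToyGraph`, `NameFromNodeNamespace`, and `NameFromNodeArgNamespace` are invented names for illustration): initializers live in the NodeArg namespace, so a uniqueness check against node names never sees the "splits" initializer added by the first optimization round.

```cpp
#include <cstdint>
#include <map>
#include <set>
#include <string>
#include <vector>

// Toy model of the relevant graph state (hypothetical, for illustration).
struct ToyGraph {
  std::set<std::string> node_names;  // node names only
  std::map<std::string, std::vector<int64_t>> initializers;  // NodeArg name -> dims
};

// Mirrors the buggy pattern: uniquify against node names only,
// so an existing initializer with the same name is invisible.
std::string NameFromNodeNamespace(const ToyGraph& g, const std::string& base) {
  return g.node_names.count(base) ? base + "_1" : base;
}

// Mirrors the fix: uniquify against initializer (NodeArg) names,
// so the collision with the first-round initializer is detected.
std::string NameFromNodeArgNamespace(const ToyGraph& g, const std::string& base) {
  return g.initializers.count(base) ? base + "_1" : base;
}
```

With a "splits" initializer already present from the first round, the node-name check reuses "splits" and collides with the existing tensor of different dimensions, while the node-arg check produces a fresh name.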
carzh authored Jun 3, 2024
1 parent 456ab09 commit 94ce120
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion in onnxruntime/core/optimizer/gather_fusion.cc

    @@ -268,7 +268,7 @@ Status GatherSliceToSplitFusion::ApplyImpl(Graph& graph, bool& modified, int gra
      }

      ONNX_NAMESPACE::TensorProto split_initializer_proto;
    - split_initializer_proto.set_name(graph.GenerateNodeName("splits"));
    + split_initializer_proto.set_name(graph.GenerateNodeArgName("splits"));
      split_initializer_proto.set_data_type(ONNX_NAMESPACE::TensorProto_DataType_INT64);
      split_initializer_proto.add_dims(static_cast<int64_t>(split_values.size()));
      split_initializer_proto.mutable_int64_data()->Add(split_values.begin(), split_values.end());
