
Compile pipeline module layers individually with torch.compile
We have encountered an issue with torch.compile and the pipeline
module: modifying a member of the module during the run causes
torch.compile to restart its analysis and treat the module as dynamic.
This happens because the forward function modifies the micro_offset
attribute of the pipeline module.
To bypass this issue without significantly changing the way the
pipeline module works, we propose compiling only the layers in the
pipeline module instead of the pipeline module itself.
This sidesteps the recompilation while still retaining most of the
benefit of torch.compiling the pipeline module.
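
For illustration, a minimal sketch (not DeepSpeed code) of the failure
mode described above: an integer attribute mutated inside forward()
invalidates torch.compile's guards on each call, so the compiled graph
is re-analyzed instead of reused. The class and attribute names here
are illustrative stand-ins for PipelineModule and its counter.

    import torch

    class Stage(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.micro_offset = 0  # mirrors the counter PipelineModule bumps per microbatch

        def forward(self, x):
            self.micro_offset += 1  # attribute mutation: the guard on its value fails next call
            return x * 2

    compiled = torch.compile(Stage())
    for _ in range(3):
        compiled(torch.ones(4))  # triggers re-analysis on each call instead of a cached graph

Compiling each layer separately keeps this mutation in eager Python,
outside any compiled graph, which is the approach the diff below takes.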
NirSonnenschein committed Aug 28, 2024
1 parent eb37cac commit 9e3339d
Showing 1 changed file with 5 additions and 0 deletions.
5 changes: 5 additions & 0 deletions deepspeed/runtime/pipe/module.py
@@ -662,3 +662,8 @@ def get_additional_losses(self):
Return a dictionary of {"loss name": loss_value} or None if no additional losses.
"""
return None

def compile(self, *args, **kwargs):
for idx, layer in enumerate(self.forward_funcs):
new_layer = torch.compile(layer, *args, **kwargs)
self.forward_funcs[idx] = new_layer
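
A hypothetical usage sketch of the new method (the constructor
arguments and layer list are illustrative, and PipelineModule normally
also requires distributed/pipeline initialization):

    net = PipelineModule(layers=layers, num_stages=2)
    net.compile(backend="inductor")  # replaces each entry of forward_funcs with its compiled form

Because the arguments are forwarded straight to torch.compile, callers
can pass any of its usual options (backend, mode, etc.) unchanged.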
