🚀 Feature
Support a method equivalent to torcheval's merge_state to allow explicitly reducing metrics when not used under DDP. This is an updated request from #2063.
Motivation
When used within DDP, torchmetrics objects support automatic syncing and reduction across ranks. However, there doesn't seem to be any support for reduction outside DDP. This would be a useful feature because it would allow using torchmetrics for distributed evaluation with frameworks other than DDP.
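To illustrate the requested semantics, here is a minimal, dependency-free sketch of a torcheval-style merge_state on a toy metric. The `MeanAccuracy` class and its fields are hypothetical, not part of torchmetrics; the point is the shape of the API: each worker updates its own metric instance, and one instance explicitly folds in the states of the others before compute, with no DDP process group involved.

```python
class MeanAccuracy:
    """Toy metric tracking correct/total counts; one instance per worker.
    Hypothetical example, not an actual torchmetrics class."""

    def __init__(self):
        self.correct = 0
        self.total = 0

    def update(self, preds, targets):
        # Accumulate local state on this worker.
        self.correct += sum(p == t for p, t in zip(preds, targets))
        self.total += len(targets)

    def merge_state(self, metrics):
        # Explicit reduction: fold in state from metric instances owned by
        # other workers (e.g. separate actors), mirroring torcheval's
        # merge_state shape instead of relying on DDP all-gather.
        for m in metrics:
            self.correct += m.correct
            self.total += m.total
        return self

    def compute(self):
        return self.correct / self.total if self.total else 0.0


# Two "workers" each accumulate locally, then one merges and computes.
m1 = MeanAccuracy()
m1.update([1, 0, 1], [1, 1, 1])  # 2 correct out of 3
m2 = MeanAccuracy()
m2.update([0, 0], [0, 1])        # 1 correct out of 2
m1.merge_state([m2])
print(m1.compute())              # 3/5 = 0.6
```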
Pitch
Enabling manual reduction would make torchmetrics more widely applicable, since it could then be used with distributed frameworks other than DDP, such as ray.
Alternatives
Additional context