I'm benchmarking and performance tuning some ML code, and garbage collection seems to be the largest drag on performance. Is there a way to measure time spent in GC?
Not within Scalene. My intuition for GC in Python is that it's not usually a problem because most of the time I'd expect reference counting to handle most of the garbage (except for cycles, which I wouldn't expect in a case where you are using an ML library). I'd be interested to see what's going on! Anyway, looks like py-spy might be helpful here: see benfred/py-spy#389.
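For what it's worth, CPython itself exposes a hook that can answer this directly: `gc.callbacks` (available since 3.3) fires before and after each cyclic-GC collection. A minimal sketch that accumulates wall-clock time spent in the cycle collector — note it does not cover ordinary reference-counting deallocation, only the cyclic collector:

```python
import gc
import time

# Accumulated seconds inside the cycle collector, and how many
# collections were observed. Names here are illustrative.
gc_time = 0.0
gc_runs = 0
_start = 0.0

def _on_gc(phase, info):
    global gc_time, gc_runs, _start
    if phase == "start":
        _start = time.perf_counter()
    elif phase == "stop":
        gc_time += time.perf_counter() - _start
        gc_runs += 1

gc.callbacks.append(_on_gc)

# Demo: create cyclic garbage and force a collection.
for _ in range(1000):
    cycle = []
    cycle.append(cycle)

gc.collect()
print(f"{gc_runs} collection(s), {gc_time * 1e3:.3f} ms in GC")
```

Wrapping the workload between `gc.callbacks.append(...)` and a final read of the accumulator would give total GC pause time for a benchmark run; `py-spy` remains the better choice for attaching to an already-running process.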