
PERF401 new preview fixes invalidly hoists extend to list compre #14362

Open
Skylion007 opened this issue Nov 15, 2024 · 3 comments · May be fixed by #14369
Labels: bug (Something isn't working) · fixes (Related to suggested fixes for violations)

Comments

@Skylion007
Contributor

Skylion007 commented Nov 15, 2024

I tried running the new autofixes on the PyTorch codebase and was mostly impressed, but found one really annoying (and handleable) edge case.

Here is an example of a bad fix:

     @dist_init
     def test_wait_all_with_exception(self):
-        futs = []
+        futs = [rpc.rpc_async(dst, raise_func) for _ in range(10)]
         dst = worker_name((self.rank + 1) % self.world_size)
-        for _ in range(10):
-            futs.append(rpc.rpc_async(dst, raise_func))

         with self.assertRaisesRegex(ValueError, "Expected error"):
             torch.futures.wait_all(futs)

The diff above was generated by ruff. Note that the list comprehension uses dst even though dst is first defined on the line underneath the list comprehension. Ruff can already detect this: it immediately raised a bunch of F821 errors as soon as the fixes were applied. It would be good not to hoist the for loop from an extend to a list comprehension if any variables the list comprehension needs are defined or mutated in any way in between. I was kind of surprised, given that it did properly not hoist the function if there were any comments in between the list definition and the loop.
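A minimal standalone reproducer of the hazard (hypothetical names, not the PyTorch code above): once the loop is hoisted above the assignment, the comprehension references a name before it is bound, so Python raises a NameError at runtime.

```python
def make_request(dst, i):
    # Stand-in for rpc.rpc_async(dst, raise_func).
    return (dst, i)

def broken():
    # Shape of the bad fix: futs is built before dst exists in this scope,
    # so the comprehension fails with a NameError at runtime.
    futs = [make_request(dst, i) for i in range(10)]
    dst = "worker1"
    return futs

def fixed():
    # Correct ordering: define dst first, then build the list.
    dst = "worker1"
    futs = [make_request(dst, i) for i in range(10)]
    return futs

try:
    broken()
except NameError as exc:
    print("broken() failed:", exc)

print("fixed() produced", len(fixed()), "items")
```

This is also why F821 (undefined name) fires on the fixed code: static analysis can see that dst is referenced before its binding.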

ruff 0.7.4
ruff check --select=PERF401 --fix --unsafe-fixes --preview

FYI @w0nder1ng

@Skylion007 Skylion007 changed the title PERF401 new fixes generates invalid code PERF401 new preview fixes invalidly hoists extend to list compre Nov 15, 2024
@w0nder1ng
Contributor

I didn't consider that case when I was writing the fix, so the question now is how it should be handled. A possible fix could look like:

     @dist_init
     def test_wait_all_with_exception(self):
-        futs = []
         dst = worker_name((self.rank + 1) % self.world_size)
-        for _ in range(10):
-            futs.append(rpc.rpc_async(dst, raise_func))
+        futs = [rpc.rpc_async(dst, raise_func) for _ in range(10)]
         with self.assertRaisesRegex(ValueError, "Expected error"):
             torch.futures.wait_all(futs)

As long as we check that the futs variable isn't used between futs = [] and the for loop, it should be fine.
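As an illustrative sketch of that check (ruff itself is written in Rust; this is a hypothetical Python/ast version, not ruff's actual implementation): scan the statements between the empty-list assignment and the for loop, and only allow the move if the list variable is not referenced anywhere in between.

```python
import ast

SRC = """
futs = []
dst = worker_name((rank + 1) % world_size)
for _ in range(10):
    futs.append(rpc.rpc_async(dst, raise_func))
"""

def names_used(node):
    # All identifiers appearing anywhere inside `node`.
    return {n.id for n in ast.walk(node) if isinstance(n, ast.Name)}

def can_move_list_down(body, list_name):
    """True if `list_name = []` can safely be moved to just above the
    for loop, i.e. `list_name` is not referenced by any statement
    between the empty-list assignment and the loop."""
    start = end = None
    for i, stmt in enumerate(body):
        if (start is None and isinstance(stmt, ast.Assign)
                and any(isinstance(t, ast.Name) and t.id == list_name
                        for t in stmt.targets)):
            start = i
        elif start is not None and isinstance(stmt, ast.For):
            end = i
            break
    if start is None or end is None:
        return False
    between = body[start + 1:end]
    return all(list_name not in names_used(stmt) for stmt in between)

module = ast.parse(SRC)
print(can_move_list_down(module.body, "futs"))  # True for this snippet
```

For the snippet above the check passes, because the intervening dst assignment never touches futs; a print(futs) or futs.append(...) in between would make it fail.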

I was kind of surprised given that it did properly not hoist the function if there were any comments in between the list definition and the loop.

If you have an example of this, I'd be happy to take a look.

@dylwil3 dylwil3 added bug Something isn't working fixes Related to suggested fixes for violations labels Nov 15, 2024
@Skylion007
Contributor Author

Skylion007 commented Nov 15, 2024

Sorry, typo: it properly did NOT hoist the function when there were comments in between (to preserve the comments). That is, it skipped the list-comprehension hoist if a comment was in the way, but it still performed the hoist if code was in the way, which was surprising.

@w0nder1ng w0nder1ng linked a pull request Nov 15, 2024 that will close this issue
@Skylion007
Contributor Author

Also, FYI: once these fixes have landed, we should look into PERF403, as its fixes will share a lot of similar logic. :)
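PERF403 covers the dict-building analogue of this pattern, so the same ordering hazard applies. A small sketch with hypothetical names: hoisting this loop into a dict comprehension above the prefix assignment would be just as invalid as the futs case.

```python
result = {}
prefix = "worker_"            # defined between the dict and the loop
for i in range(3):
    result[i] = prefix + str(i)

# Equivalent dict comprehension; only valid placed after `prefix`
# is defined, never hoisted above it.
result2 = {i: prefix + str(i) for i in range(3)}
print(result == result2)
```

So the "no references to intervening bindings" check discussed above would need to apply to the dict fix as well.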
