JWST issues with numpy 2.0 #8580
Comments
Comment by Kenneth MacDonald on JIRA: There don't appear to be any problems in STCAL with numpy 2.0. [Attachment: ruff_checks_numpy_20.png]
Comment by Kenneth MacDonald on JIRA: There doesn't appear to be a problem in JWST ramp fitting with numpy 2.0. [Attachment: Screenshot 2024-08-29 at 1.08.36 PM.png]
Comment by Kenneth MacDonald on JIRA: Tyler Pauly has an STCAL PR open to handle numpy 2.0, which necessitates a change in how a byte-order method is called (see the sketch below). There are still regression test differences that need to be investigated; the stcal PR partially addressed this issue.
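For context, numpy 2.0 removed `ndarray.newbyteorder()`; the method now exists only on `dtype`. A minimal sketch of the kind of call-site change involved (the actual STCAL code is in the PR, not shown here):

```python
import numpy as np

# Hypothetical big-endian data, e.g. as read from a FITS file.
data = np.arange(5, dtype=">f4")

# numpy 1.x (removed in numpy 2.0):
#   native = data.byteswap().newbyteorder()
# numpy 2.0: call newbyteorder() on the dtype instead, and reinterpret
# the swapped bytes through a view.
native = data.byteswap().view(data.dtype.newbyteorder())

assert native.dtype.isnative  # true on a little-endian machine
```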
Comment by Robert Jedrzejewski on JIRA: I checked all the steps in the miri detector1 pipeline. The only ones that gave non-zero differences between numpy 1.26 and numpy 2.1.1 were emicorr and ramp_fit. The differences were all below 1 part in 10^6, except for 8 pixels in the ramp_fit step that were closer to 1 part in 10^5.
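A sketch of the kind of per-step comparison described above; the file names are hypothetical stand-ins for the outputs of the two runs:

```python
import numpy as np
from astropy.io import fits

# Hypothetical file names: the same step run under numpy 1.26 and
# numpy 2.1.1 on identical input.
old = fits.getdata("ramp_fit_np126_rate.fits")
new = fits.getdata("ramp_fit_np211_rate.fits")

# Count pixels whose relative difference exceeds each threshold.
for rtol in (1e-7, 1e-6, 1e-5):
    bad = ~np.isclose(new, old, rtol=rtol, atol=0.0, equal_nan=True)
    print(f"rtol={rtol:g}: {bad.sum()} discrepant pixels")
```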
Comment by Robert Jedrzejewski on JIRA: I also checked the refpix irs2 step. I ran the step starting with the same _superbias.fits input file, once using numpy 1.26 and once using numpy 2.1.1. The distribution of discrepant pixels vs. threshold looks like this: [table: rtol vs. #pixels]. These differences propagated down to the rate file: [table: rtol vs. #pixels]. There were a few statements that behave differently under numpy 2.1.1 compared to 1.26; I tried forcing the same behaviour by casting variables in arithmetic expressions, but was unable to reconcile the differences. More time would be needed to determine whether the numpy 2 behaviour is preferable to the numpy 1 behaviour.
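One plausible (though unconfirmed here) source of such differences is NEP 50, which changed how numpy 2 promotes dtypes when Python scalars are mixed into arithmetic. A minimal sketch of the behaviour change, and an explicit cast that pins the result dtype on both versions:

```python
import numpy as np

x = np.float32(0.1)

# numpy 1.26: a numpy scalar mixed with a Python float promotes to
# float64. numpy 2 (NEP 50): the Python float is "weak", so the result
# stays float32 and rounds differently in the last digits.
y = x + 0.2
print(y.dtype)    # float64 on numpy 1.26, float32 on numpy 2

# Casting explicitly forces the same result dtype everywhere.
y64 = np.float64(x) + 0.2
print(y64.dtype)  # float64 on both versions
```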
Comment by Robert Jedrzejewski on JIRA: I checked all the files that were reported as discrepant in the regression test run and made a note of the largest percentage of discrepant pixels in the comparison (there will be more than one value for multi-extension products). In some cases the failures were because the data in an extension was of a different size between result and truth; some files had a different number of extensions, or of table rows in an extension. The last dozen or so tests with a very large number of discrepant pixels also had the RSCD and FIRSTFRAME steps performed in the test, while these steps were skipped in the truth files.

There are about 4000 file comparisons done in the regression tests. A few are skipped, a few are XFAILed, and 112 failed. For example:

jwst.regtest.test_nirspec_fs_spec2 [stable-deps] test_nirspec_fs_spec2[jw02072-o002_20221206t143745_spec2_00001_asn.json-nsclean] 0.71%

The test modules that produce failures include test_nirspec_fs_spec2.py; 20/69 test modules fail, which means 49/69 are passing. Some of these are very small differences and can safely be disregarded, but some are more significant.
Comment by Melanie Clarke on JIRA: Looking at the NIRSpec spec2 differences for MOS, FS, and IFU, they appear to be at least partly because the FFT fit in NSClean is a little different at the edges for full frame data. I attached an example from the jw02072-o002_20221206t143745_spec2_00001_asn.json-nsclean test. For this one, the edge effects look a little worse with numpy 2; for the MOS test, they look a little better with numpy 2 than with numpy 1. I don't think there's any reason to hold up transitioning to numpy 2 for this, but we should consider fixing edge effects more robustly while refactoring NSClean for full frame arrays in JP-3740. |
Comment by Maria Pena-Guerrero on JIRA: I did a run with numpy 2.0 and numpy 1.26.4 on MIRI data that I had locally to test the emicorr, dark, and saturation steps. I used file jw01386007001_04101_00006_mirimage_uncal.fits. I find that the differences are negligible; plots attached. [Attachments: dark_current_np2vs126.png, emicorr_np2_vs126.png, saturation_np2_vs126.png] I obtained similar results for jw05594035001_02101_00005_nrcb1_uncal.fits.
test_engdb_mast.py also fails with numpy 2: jwst/jwst/lib/tests/test_engdb_mast.py, line 85 (at commit 57f6148), compares string-converted numbers, whose formatting changed in numpy 2: https://github.com/spacetelescope/stdatamodels/actions/runs/11921284593/job/33225085285#step:10:308
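For reference (not taken from the failing test itself), numpy 2 changed the repr of numpy scalars, which is a common way string comparisons of numbers break. A minimal sketch:

```python
import numpy as np

x = np.float32(3.0)

# numpy 1.26: repr(x) == '3.0'
# numpy 2.x:  repr(x) == 'np.float32(3.0)'
print(repr(x))

# str() of a scalar is unchanged across versions, so comparisons of
# string-converted numbers are more stable via str() or a float() cast.
assert str(x) == "3.0"
assert str(float(x)) == "3.0"
```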
Issue JP-3664 was created on JIRA by Brett Graham:
Using numpy 2.0 results in failures in:
test_coron failures with numpy 2.0 (#8579)

Regression tests here: https://plwishmaster.stsci.edu:8081/job/RT/job/JWST-Developers-Pull-Requests/1664/#showFailuresLink

More recent regression test results here: https://plwishmaster.stsci.edu:8081/job/RT/job/JWST-Developers-Pull-Requests/1737/#showFailuresLink
These were built with PRs to jwst, stdatamodels and stcal:

- jwst: #8718
- stcal: tapastro/stcal@1bb106f (EDIT: updated commit hash)
- stdatamodels: tapastro/stdatamodels@e3b53d6

Hashes are provided for stcal/stdatamodels to specify a custom installation procedure for future regression test runs, as needed.
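For such runs, installing those pinned commits could look like this (standard pip VCS syntax; the repository paths are the ones given above): `pip install git+https://github.com/tapastro/stcal.git@1bb106f git+https://github.com/tapastro/stdatamodels.git@e3b53d6`.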
The regression tests show many failures, most of which on cursory inspection show small numerical differences. We'll need to check them off one by one to ensure there aren't any unreasonable changes as part of this migration.
Helpful link to numpy 2.0 migration guide: https://numpy.org/devdocs/numpy_2_0_migration_guide.html
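As the guide notes, most uses of APIs that were removed or changed in numpy 2.0 can be flagged automatically with ruff's NPY201 rule, e.g. `ruff check --select NPY201 .` (presumably the check behind the ruff screenshot attached above).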