Review if we need to refactor for numpy >=2.1.0 #2483
By monitoring releases, creating a robust testing strategy, and potentially pinning compatible versions, you can maintain compatibility with your dependencies as they update their pinning for NumPy. This proactive approach will help avoid issues when new versions are released.
We do have a scheduled test to detect when incompatible API changes happen. If you look at the pixi.lock file of a recent installation and grep for numpy, you can see the work done by the solver. Some of our dependencies have already set a pinning. When basemap-data changes its pinning, we will find out whether others also need one.
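As a rough sketch of that inspection (assuming a `pixi.lock` file in the repository root), the following lists every lock-file line that mentions numpy, i.e. the pinnings the solver resolved:

```python
# Minimal sketch, assuming a pixi.lock file in the repository root:
# print every line that mentions numpy to see the pinnings the solver resolved.
from pathlib import Path

lock_file = Path("pixi.lock")
for line_no, line in enumerate(lock_file.read_text().splitlines(), start=1):
    if "numpy" in line.lower():
        print(f"{line_no}: {line.strip()}")
```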
These version constraints are quite important because they affect the stability of our application. As you mentioned, there are multiple version constraints for NumPy across different dependencies. Do you think we should summarize these constraints in a compatibility matrix? This would help us understand how to handle other dependencies when basemap-data changes its pinning. And yes, the URL for NumPy is also helpful, as it ensures that we are downloading the correct version. Have you faced any issues when updating the pinning for a specific package? |
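As a rough illustration of such a matrix (the package names are only examples, and this reads pip-style metadata of the current environment, which can differ from the conda-forge pinnings), one could collect the numpy constraints declared by a few installed dependencies and test a candidate numpy version against them:

```python
# Minimal sketch, assumed package names: summarize the numpy constraints of a
# few installed dependencies and test a candidate numpy version against them.
from importlib.metadata import requires
from packaging.requirements import Requirement
from packaging.version import Version

candidates = ["basemap", "matplotlib", "scipy"]  # hypothetical selection
candidate_numpy = Version("2.1.0")

for dist in candidates:
    for req_str in requires(dist) or []:
        req = Requirement(req_str)
        if req.name.lower() != "numpy" or req.marker:  # skip extras/markers
            continue
        spec = str(req.specifier) or "(any)"
        ok = req.specifier.contains(candidate_numpy)
        print(f"{dist:12s} numpy {spec:15s} -> "
              f"{'ok' if ok else 'conflict'} with numpy {candidate_numpy}")
```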
Sometimes we have to pin a module until some other package has fixed its issues; I expect that here too. Copying the whole section: it is basemap, and basemap-data is a subpackage of it. I did not read that correctly before.
I am one of the maintainers of that package on conda-forge. NumPy 2.0 was released in June 2024; the current basemap release is from 7 months and 26 days ago, and its pinning comes from the NumPy version on conda-forge at that time. It would be helpful for MSS to see the release dates of all dependencies and whether there are newer versions than the ones we can currently use. I have not seen anything that does this in an automated way, and I would expect such a feature on conda-forge or a similar platform, because you then also want to look into the dependencies of each dependency.
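One way to approximate that overview, as a sketch only (it queries PyPI rather than conda-forge, and the package names are just examples), would be to ask the PyPI JSON API for the latest release and its upload date:

```python
# Minimal sketch, PyPI only and assumed package names: print the latest release
# and its upload date for a few dependencies. conda-forge builds may lag behind.
import json
from urllib.request import urlopen

for name in ["numpy", "basemap", "matplotlib"]:
    with urlopen(f"https://pypi.org/pypi/{name}/json") as resp:
        data = json.load(resp)
    latest = data["info"]["version"]
    files = data["releases"].get(latest, [])
    date = files[0]["upload_time"][:10] if files else "unknown"
    print(f"{name:12s} latest {latest:10s} released {date}")
```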
Whether or not we set a pinning on our own, we should also check whether our usage is compatible with the most recent numpy version. Where changes are needed, they can be done in a way that keeps us prepared while still allowing the old version; look for `import numpy as np` usage. In case we need a pinning, we would do this in stable.
Absolutely, I agree! It’s crucial that we ensure compatibility with the latest NumPy version, even if we decide to set our own pinning. This proactive approach will help us prepare for any necessary changes while still allowing us to use the older version without issues. I’ll make sure to check the usage of `import numpy as np` throughout our codebase to identify any potential areas that might require updates when we move to the newer version. Setting the pinning in a stable manner sounds like a solid plan. Do you have any specific versioning strategies in mind for this process? Looking forward to your thoughts!
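A possible first pass for that scan, sketched under assumptions (the alias list is not exhaustive and the `mslib` source directory is assumed; tools such as ruff's NPY201 rule cover this more thoroughly):

```python
# Minimal sketch, non-exhaustive list: flag a few attributes that NumPy 2.0
# removed, as a quick scan of the code base before raising the pinning.
import re
from pathlib import Path

# Some aliases removed in NumPy 2.0 (not a complete list).
removed = ["np.float_", "np.complex_", "np.unicode_", "np.Inf", "np.NaN", "np.infty"]
pattern = re.compile("|".join(re.escape(name) for name in removed))

for path in Path("mslib").rglob("*.py"):  # assumed source directory
    for line_no, line in enumerate(path.read_text(encoding="utf-8").splitlines(), 1):
        if pattern.search(line):
            print(f"{path}:{line_no}: {line.strip()}")
```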
Will you select me for GSoC 2024 for your organisation?
That depends on your proposal, if you mean 2025. GSoC 2024 is finished, and the selection of organizations has not started yet.
Thank you... So what should I do to get selected for 2025?
Currently we have not set a pinning for numpy, but our dependencies have.
We need to check whether we are compatible when they update their pinning.
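A scheduled check of that kind could be sketched as a simple smoke test (the upper bound below is only a placeholder, not an actual project decision):

```python
# Minimal sketch, hypothetical bound: fail a scheduled CI run as soon as the
# resolved numpy is newer than the last version that was reviewed for MSS.
import numpy as np
from packaging.version import Version

LAST_REVIEWED = Version("2.1.0")  # placeholder for the last reviewed version

def test_numpy_version_was_reviewed():
    assert Version(np.__version__) <= LAST_REVIEWED, (
        f"numpy {np.__version__} is newer than the last reviewed version; "
        "re-check compatibility before accepting it."
    )
```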