Upgrade python and node versions, use pip-tools for dependency freezing #1682
Description
After giving the talk at the general meeting yesterday I realised that we're using Python 3.7, which will be unsupported in a month! (After all of @Bomme's hard work upgrading us from a different unsupported Python version...)
I saw two options to upgrade the Python version:

In the end I went for the most recent version of Python, in order to take advantage of potential code speedups. We should try to measure container performance before and after the release (although I guess this is more difficult now without grafana...).

It turns out that Python 3.11 removed a bunch of previously deprecated things and our version of celery doesn't work with it, requiring a non-released beta version to work properly. What's more, some syntax for defining celery tasks has changed between 4.x and 5.x, meaning that we would also need changes to fs code in order to upgrade celery. In light of that, let's just stick with Python 3.10 until a new version of celery comes out, and then we can do that upgrade in one go.
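Since the target interpreter is changing, a fail-fast version check at container startup can catch a stale image early. A minimal sketch (a hypothetical helper, not part of this PR):

```python
import sys

# Minimum (major, minor) interpreter version the codebase now targets.
REQUIRED = (3, 10)

def interpreter_ok(version_info=None, required=REQUIRED):
    """Return True if the given (or currently running) interpreter
    meets the required (major, minor) minimum."""
    if version_info is None:
        version_info = sys.version_info
    return tuple(version_info[:2]) >= required

# e.g. at startup: if not interpreter_ok(): sys.exit("Python 3.10+ required")
```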
In addition, I added the use of `pip-compile` to take a list of top-level dependencies and generate a file of all transitive dependencies, as we had previously discussed with @Bomme. I noticed that our original requirements.txt file was already pretty "light", with what appears to be only top-level dependencies anyway, which is great! We can double-check that these dependencies are accurate.
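As a rough sketch of the workflow (file names and packages here are illustrative, not necessarily what this PR uses), `pip-compile` reads a hand-maintained list of top-level dependencies and writes a fully pinned output file:

```
# requirements.in — hand-maintained top-level dependencies (illustrative)
django
celery

# Generate the pinned file with:
#   pip-compile requirements.in --output-file requirements.txt
# The output pins every transitive dependency and annotates its origin, e.g.:
#   billiard==3.6.4.0
#       # via celery
```

Re-running `pip-compile` after editing the `.in` file keeps the pinned file reproducible without hand-managing transitive pins.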
I swapped out the use of `pip install --no-cache-dir` for a BuildKit cache mount (`RUN --mount=type=cache`) in the Dockerfile. This means that docker will cache the pip wheel dir on your computer, making rebuilds faster as it doesn't need to download the wheels again (this was mostly done because doing multiple builds on my current internet connection takes ages due to slow download speeds!).

This requires docker "buildkit" to be enabled. It's enabled by default on Docker for Mac and Docker versions > 23.0 (which is the case for sonic). Let me know if you think we should keep it or remove it.
edit: In retrospect this was really annoying to get going on github actions, and I'm not too against going back to the original system
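For reference, a minimal sketch of the BuildKit cache-mount pattern (base image and paths illustrative, not necessarily what the PR's Dockerfile looks like):

```
# syntax=docker/dockerfile:1
FROM python:3.10

COPY requirements.txt .
# Persist pip's download/wheel cache between builds (requires BuildKit).
# Note --no-cache-dir is dropped here so the cache mount is actually used.
RUN --mount=type=cache,target=/root/.cache/pip \
    pip install -r requirements.txt
```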
While I was here, I upgraded Node to a more recent LTS (v14 -> v18; v20 is the current latest LTS).
There are still a few pending items that we should look at:
- Configure pip-compile for similarity/clustering/tagrec requirements files and test (as similarity is still py2, I'd vote to skip this one).
- Look at upgrading python dependencies on the freesound-audio-analyzers repo. I believe that these containers are completely isolated and so shouldn't be affected by this upgrade, so perhaps we can ignore that for now.

Deployment steps: