Allow to configure cgroupsv1 per nodepool #367
Conversation
Force-pushed from 7d18c14 to c5a3a89
There were differences in the rendered Helm template, please check!
please ignore me, wrong browser tab 🤦‍♂️
I changed the PR's title as there is, IMHO, no need to prefix it this way. Apart from that: LGTM! I guess you only cherry-picked, right? It's currently hard, if not impossible, to test stuff like this by just running cluster test suites, so we might need to come up with a test release that uses the test version of
```diff
 apiVersion: v1
 kind: Secret
 metadata:
-  name: {{ include "cluster.resource.name" $ }}-containerd-{{ include "cluster.data.hash" (dict "data" (tpl ($.Files.Get "files/etc/containerd/config.toml") $) "salt" $.Values.providerIntegration.hashSalt) }}
+  name: {{ include "cluster.resource.name" $ }}-{{ $nodePoolName }}-containerd-{{ include "cluster.data.hash" (dict "data" (tpl ($.Files.Get "files/etc/containerd/workers-config.toml") $) "salt" $.Values.providerIntegration.hashSalt) }}
```
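To put the reviewed line in context, here is a minimal sketch of the kind of per-node-pool loop that could produce such a Secret name. Only the `name:` line is taken from the PR diff; the surrounding loop, the `.Values.nodePools` key, and the Secret contents are assumptions for illustration, not the chart's actual structure:

```yaml
# Hypothetical template sketch. Only the name: line is from the PR diff;
# the range loop and data layout are assumed for illustration.
{{- range $nodePoolName, $nodePool := $.Values.nodePools }}
apiVersion: v1
kind: Secret
metadata:
  # One containerd config Secret per node pool, content-hashed so that a
  # config change produces a new Secret name and triggers a rollout.
  name: {{ include "cluster.resource.name" $ }}-{{ $nodePoolName }}-containerd-{{ include "cluster.data.hash" (dict "data" (tpl ($.Files.Get "files/etc/containerd/workers-config.toml") $) "salt" $.Values.providerIntegration.hashSalt) }}
stringData:
  config.toml: {{ tpl ($.Files.Get "files/etc/containerd/workers-config.toml") $ | quote }}
---
{{- end }}
```

This naming scheme is presumably also why a node pool literally named `controlplane` could collide with the control plane's own containerd Secret, as joked about below.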
Let's hope nobody calls their node pool controlplane 😆
lol I didn't think of that 😄 I don't think it will happen though (infamous last words)
I migrated a test cluster that was using cgroupsv1 in one of the node pools, and the configuration was migrated successfully.
What does this PR do?
Backport of #348
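For illustration only, per-node-pool cgroups v1 configuration could look roughly like this on the values side; the key names (`nodePools`, `cgroupsv1`) are hypothetical and may differ from the chart's actual schema:

```yaml
# Hypothetical values.yaml fragment; real key names may differ.
nodePools:
  pool0:
    replicas: 3
    cgroupsv1: true   # opt only this node pool into cgroups v1
  pool1:
    replicas: 2       # stays on the default cgroups version
```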