Knative v1.16 changed behavior of duplicate environment variables #15608

Open · SaschaSchwarze0 (Contributor) opened this issue Nov 11, 2024 · 0 comments
Labels: kind/bug
In what area(s)?

/area API

What version of Knative?

v1.16

Expected Behavior

Knative v1.16 changed behavior when it comes to environment variables defined in a KService. The change was not documented.

The origin of the change is the bump of the Kubernetes client libraries from v0.29 to v0.30 and the regeneration of the custom resource definition YAMLs.

This caused the addition of these three lines for the env field:

https://github.com/knative/serving/blob/v0.43.0/config/core/300-resources/service.yaml#L282-L284

                                x-kubernetes-list-map-keys:
                                  - name
                                x-kubernetes-list-type: map

With those definitions in place, Kubernetes rejects Services, Configurations, and Revisions that contain duplicate environment variables.
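
For illustration, here is a minimal KService that the API server now rejects; the name, image, and variable names are invented for the example:

apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: duplicate-env-demo
spec:
  template:
    spec:
      containers:
        - image: docker.io/library/nginx
          env:
            - name: LOG_LEVEL
              value: info
            - name: LOG_LEVEL # duplicate key, rejected by the list-map validation
              value: debug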

One impact, for example: a KService with duplicate environment variables can still exist in the cluster from before the upgrade. One can still update it as long as one does not touch the env (for example, by just updating the image). Knative can then still successfully update the Configuration, but fails to create the new Revision (presumably because update validation tolerates pre-existing duplicates, while the Revision is a newly created object and therefore fully validated).

Actual Behavior

Having duplicate environment variables makes no sense. Period. So, I generally like that validation. But I think the transition into that validation should be smoother.

This change should be documented in the release notes, I think. What might also help users is a way to determine whether they have this problem in their cluster. The following query lists the affected KServices:

kubectl get ksvc -A -o json | jq -r '.items[] | select(.spec.template.spec.containers[0].env != null and (.spec.template.spec.containers[0].env | group_by(.name) | map(select(length > 1)) | select(length > 0)) != null) | [ .metadata.namespace, .metadata.name ] | @tsv'
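
Note that this query only inspects the first container of each KService; if you use Knative's multi-container support, the remaining containers would need the same check.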

We happen to already have a mutating webhook in place, to which we added logic that mutates the problem away during KService operations. Not sure if Knative would want similar logic:

// RemoveDuplicateEnvironmentVariables removes duplicate entries from container.Env and uses the last value in case of such duplicates
func (ksvcm *KServiceMutator) RemoveDuplicateEnvironmentVariables(container *corev1.Container) {
	allEnvVars := map[string]int{}
	j := 0 // output index
	for i := range container.Env {
		name := container.Env[i].Name
		if _, found := allEnvVars[name]; found {
			ksvcm.Logger.Warn(fmt.Sprintf("Removing duplicate environment variable %q. Value is different: %v", name, !reflect.DeepEqual(container.Env[i], container.Env[allEnvVars[name]])))

			// duplicate env var: overwrite the earlier entry with the same name with this one
			container.Env[allEnvVars[name]] = container.Env[i]
		} else {
			// new env var
			allEnvVars[name] = j
			container.Env[j] = container.Env[i]
			j++
		}
	}
	container.Env = container.Env[:j]
}
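
For context, here is a self-contained sketch showing the method above in action. The KServiceMutator shape and the zap logger are simplified assumptions (the real webhook type carries more dependencies); the method body is repeated verbatim so the example compiles on its own:

package main

import (
	"fmt"
	"reflect"

	"go.uber.org/zap"
	corev1 "k8s.io/api/core/v1"
)

// KServiceMutator is a hypothetical minimal stand-in for the webhook type;
// only the Logger field is needed for this example.
type KServiceMutator struct {
	Logger *zap.SugaredLogger
}

// RemoveDuplicateEnvironmentVariables is the method from above, repeated here
// so the example is runnable standalone.
func (ksvcm *KServiceMutator) RemoveDuplicateEnvironmentVariables(container *corev1.Container) {
	allEnvVars := map[string]int{}
	j := 0 // output index
	for i := range container.Env {
		name := container.Env[i].Name
		if idx, found := allEnvVars[name]; found {
			ksvcm.Logger.Warn(fmt.Sprintf("Removing duplicate environment variable %q. Value is different: %v", name, !reflect.DeepEqual(container.Env[i], container.Env[idx])))
			// last value wins; the position of the first occurrence is kept
			container.Env[idx] = container.Env[i]
		} else {
			allEnvVars[name] = j
			container.Env[j] = container.Env[i]
			j++
		}
	}
	container.Env = container.Env[:j]
}

func main() {
	ksvcm := &KServiceMutator{Logger: zap.NewExample().Sugar()}
	container := &corev1.Container{Env: []corev1.EnvVar{
		{Name: "LOG_LEVEL", Value: "info"},
		{Name: "PORT", Value: "8080"},
		{Name: "LOG_LEVEL", Value: "debug"}, // duplicate, the later value wins
	}}
	ksvcm.RemoveDuplicateEnvironmentVariables(container)
	for _, e := range container.Env {
		fmt.Printf("%s=%s\n", e.Name, e.Value) // prints LOG_LEVEL=debug, PORT=8080
	}
}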

Steps to Reproduce the Problem

  1. If you are already on v1.16, remove the three lines from the Configuration, Revision, and Service custom resource definitions.
  2. Create a KService with multiple environment variable entries that share the same name.
  3. Install v1.16 (or add back the lines you removed in step 1).
  4. Try to update the KService (or create the same one again).
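
With the validation in place, step 4 should fail with an API server error along these lines (the exact wording may vary by Kubernetes version):

The Service "duplicate-env-demo" is invalid: spec.template.spec.containers[0].env[1]: Duplicate value: map[string]interface {}{"name":"LOG_LEVEL"}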