Run on Universal Container Runtime #264
Other Universe services such as Cassandra and Kafka run on the Universal Container Runtime (UCR), but Spark is still using the Docker runtime. Are there any plans to support running Spark on the UCR as well?

Comments
I see that the current beta release can run on the UCR. I'll give it a try!
I can't get it working with a private registry, as there seems to be no way to configure the secret used to pull the image.
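For context, a minimal sketch of how a Docker pull secret is typically wired up for a workload running on the Mesos containerizer (UCR) in DC/OS. The secret path, app id, and image below are placeholders, the exact `dcos security secrets create` flags should be checked against your DC/OS version, and whether the Spark package exposes an equivalent option is exactly the open question in this thread:

```sh
# Store the contents of a Docker config.json (credentials for the private registry)
# as a DC/OS secret. "spark/docker-config" is a placeholder path; check
# `dcos security secrets create --help` for the flags available in your DC/OS version.
dcos security secrets create --value="$(cat ./docker-config.json)" spark/docker-config

# A Marathon app on the Mesos containerizer (UCR) can then reference that secret
# through container.docker.pullConfig:
cat > app.json <<'EOF'
{
  "id": "/private-registry-test",
  "cpus": 0.1,
  "mem": 128,
  "container": {
    "type": "MESOS",
    "docker": {
      "image": "registry.example.com/myorg/myimage:latest",
      "pullConfig": { "secret": "pullConfigSecret" }
    }
  },
  "secrets": {
    "pullConfigSecret": { "source": "spark/docker-config" }
  }
}
EOF
dcos marathon app add app.json
```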
@dirkjonker See the instructions starting at Step 30 here: https://docs.mesosphere.com/1.10/administering-clusters/deploying-a-local-dcos-universe/
@dirkjonker We've made the latest release of Spark 2.5.0-2.2.1 work well with the UCR on both RHEL/CentOS and CoreOS (and other distros). In addition, with the introduction of the Package Registry in DC/OS 1.12, we've made the operator experience for air-gapped environments a lot easier with pre-built packages. Ref: https://docs.mesosphere.com/1.12/administering-clusters/repo/package-registry/
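A rough sketch of that air-gapped flow with the Package Registry; the bundle filename is a placeholder, and the `dcos registry` subcommand and flags should be verified against the DC/OS 1.12 docs linked above:

```sh
# Add a pre-built .dcos bundle to the local Package Registry.
# "spark-2.5.0-2.2.1.dcos" is a placeholder filename; see the package-registry docs
# for how to obtain or build the bundle for your target version.
dcos registry add --dcos-file=spark-2.5.0-2.2.1.dcos

# Once the bundle is registered, the service installs from the local registry
# like any other Universe package:
dcos package install spark
```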