dcos Spark doesn’t run jobs #70
Comments
Also facing the same issue with the same example.
Looks like I have been able to fix this issue. In my case I noticed that I had uninstalled Spark as ..
Reinstalled Spark, and now the submit works and the job finishes.
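A minimal sketch of the reinstall path described above, assuming the standard DC/OS CLI and the default `spark` package name (the exact cleanup steps between uninstall and install vary by DC/OS version, so check the uninstall docs linked further down):

```sh
# Remove the existing Spark package (the dispatcher framework).
dcos package uninstall spark

# Once the uninstall has fully completed and any leftover framework state
# has been cleaned up, install Spark again from the Universe.
dcos package install spark
```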
I am having this exact same issue, but in AWS with a fresh install, and Spark is the only service installed.
Hi @ignacio-dc. Please ensure that your Spark Dispatcher is properly registered by verifying that it appears in the active frameworks listed in /mesos/state.json. If it doesn't, it's likely that you did not fully uninstall Spark from a previous install, and must do that first: https://docs.mesosphere.com/1.8/usage/service-guides/spark/uninstall/ If you continue to have problems, please open a new issue. This issue has been closed.
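A hedged sketch of that check, assuming CLI access to the cluster and `jq` installed locally; `<dcos-master>` is a placeholder for your master/admin address, and clusters with authentication enabled would also need an auth token passed to curl:

```sh
# List the names of active frameworks known to the Mesos master;
# the Spark Dispatcher should appear here (typically as "spark").
curl -sk https://<dcos-master>/mesos/state.json | jq -r '.frameworks[].name'

# Cross-check which services DC/OS itself reports as running.
dcos service
```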
@ArtRand @susanxhuynh let's close this.
I'm experiencing the same error now, but it's about the installation of Spark, not about running jobs. We may close this issue. Edit: My new issue is "Spark package fails to install with permission errors" #208
Please answer the following questions before submitting your issue. Thanks!
What version of DC/OS + DC/OS CLI are you using (`dcos --version`)?
What operating system and version are you using?
Ubuntu 16.04 LTS
What did you do?
What did you expect to see?
Spark should run jobs.
What did you see instead?
The job stays in the queued state all the time.
Spark is listed in packages but not as a service.
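For context, a sketch of commands one might use to reproduce and narrow down this state, assuming the default package name; the example jar URL is a hypothetical placeholder:

```sh
# Confirm the Spark package is installed ...
dcos package list

# ... and whether the dispatcher actually shows up as a running service.
dcos service

# Submit a trivial job; with no registered dispatcher, the driver sits in the queued state.
dcos spark run --submit-args="--class org.apache.spark.examples.SparkPi https://example.com/spark-examples.jar 100"
```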
System stats
(from the related dcos-cli issue)