
High availability with Zookeeper #11

Open
ptriboll opened this issue Mar 16, 2017 · 1 comment

@ptriboll

Is there a way to handle a Spark master URL of the form spark://host1:port1,host2:port2?
The cluster configuration is based on the settings spark.deploy.recoveryMode and spark.deploy.zookeeper.url.
What is the best approach for the REST client API?
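For reference, a minimal sketch of the HA setup being described, assuming a Zookeeper ensemble at zk1:2181,zk2:2181 and two standalone masters on port 7077 (the host names, ports, and class name are placeholders; the property names come from Spark's standalone high-availability documentation):

```java
import java.util.HashMap;
import java.util.Map;

public class ZookeeperHaConfig {
    public static void main(String[] args) {
        // Spark properties that enable Zookeeper-based master recovery (standalone HA).
        // Normally these are set on the master processes, not per submission.
        Map<String, String> haProperties = new HashMap<>();
        haProperties.put("spark.deploy.recoveryMode", "ZOOKEEPER");
        haProperties.put("spark.deploy.zookeeper.url", "zk1:2181,zk2:2181");

        // Clients reference every master in a single URL; whichever master is
        // currently active accepts the work, the others are tried on failover.
        String masterUrl = "spark://host1:7077,host2:7077";
        System.out.println(masterUrl + " -> " + haProperties);
    }
}
```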

@ywilkof
Owner

ywilkof commented Mar 31, 2017

Hi @ptriboll,
I am not so familiar with the Zookeeper setup.
In the Spark properties map of the request, all keys must be valid properties recognized by Spark, as described in the configuration documentation. According to that documentation, the spark.master parameter does not seem to accept a list of host:port pairs in the Standalone cluster setup.

If it turns out that it is possible to give a list of hosts in Standalone mode, then of course we should support it.
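To make the request shape concrete, here is a rough sketch of how the Spark properties map of a submission could carry such a URL, if the Standalone REST endpoint did turn out to accept it; the class name and property values are illustrative placeholders, not part of this library's API:

```java
import java.util.HashMap;
import java.util.Map;

public class MultiMasterSubmitSketch {
    public static void main(String[] args) {
        // Hypothetical Spark properties map for a submission, assuming the
        // standalone REST endpoint accepted a comma-separated master list.
        Map<String, String> sparkProperties = new HashMap<>();
        sparkProperties.put("spark.master", "spark://host1:7077,host2:7077");
        sparkProperties.put("spark.app.name", "my-app");

        // A REST client would forward this map as-is in the JSON body of the
        // submission request it POSTs to the currently active master.
        System.out.println(sparkProperties);
    }
}
```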
