Is there a way to handle a Spark master URL with multiple hosts, like `spark://host1:port1,host2:port2`? This cluster configuration is based on the settings `spark.deploy.recoveryMode` and `spark.deploy.zookeeper.url`.
What is the best approach for the REST client API?
Hi @ptriboll,
I am not very familiar with the ZooKeeper setup.
In the Spark properties map of the request, all keys must be valid properties recognized by Spark, as described in the configuration documentation. According to that documentation, the `spark.master` parameter does not seem to accept a list of `host:port` entries in the Standalone cluster setup.
If it turns out that a list of hosts is possible in Standalone mode, then of course we should support it.
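If multi-master URLs do need to be supported, one possible approach on the client side is to split the comma-separated URL into individual masters and try each in turn until one accepts the request (in a ZooKeeper HA setup, only the active master responds). This is just an illustrative sketch, not part of the library; the function name is hypothetical:

```python
def parse_spark_masters(master_url):
    """Split a Spark standalone HA master URL such as
    spark://host1:port1,host2:port2 into (host, port) pairs.

    A REST client could iterate over these pairs and fall back to the
    next master when the current one is unreachable or in standby.
    """
    prefix = "spark://"
    if not master_url.startswith(prefix):
        raise ValueError("expected a spark:// URL, got: %r" % master_url)
    pairs = []
    for hostport in master_url[len(prefix):].split(","):
        host, _, port = hostport.partition(":")
        if not host or not port.isdigit():
            raise ValueError("bad host:port entry: %r" % hostport)
        pairs.append((host, int(port)))
    return pairs
```

A client loop would then try each pair, submitting to the first master that answers, rather than requiring a single `host:port` in `spark.master`.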