Details
- Type: Bug
- Status: Resolved
- Priority: Trivial
- Resolution: Duplicate
- Affects Version/s: 1.4.0
- Fix Version/s: None
- Component/s: None
Description
In Spark versions prior to 1.4, start-slave.sh accepted two parameters: a worker number (worker#) and a list of master addresses.
In Spark 1.4 the worker# parameter was removed from start-slave.sh, which broke our custom standalone cluster setup.
The Spark 1.4 documentation was also updated to mention start-slave.sh (not previously documented), but it describes the old API.
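A minimal sketch of the invocation change described above. The master URLs and worker number are illustrative placeholders, not values from the report:

```shell
# Spark <= 1.3: start-slave.sh took a worker number followed by
# the master address list (illustrative hostnames/ports).
./sbin/start-slave.sh 1 spark://master1:7077,master2:7077

# Spark 1.4: the worker# parameter was removed, so scripts that
# still pass it (as above) break against the new signature.
./sbin/start-slave.sh spark://master1:7077
```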
Issue Links
- duplicates
  - SPARK-8395 spark-submit documentation is incorrect (Closed)
- is duplicated by
  - SPARK-8941 Standalone cluster worker does not accept multiple masters on launch (Resolved)