Details
- Type: Improvement
- Status: Closed
- Priority: Minor
- Resolution: Fixed
- Affects Version: 1.4.0
- Fix Version: None
Description
Using a fresh checkout of 1.4.0-bin-hadoop2.6, if you run
./start-slave.sh 1 spark://localhost:7077
you get
failed to launch org.apache.spark.deploy.worker.Worker:
Default is conf/spark-defaults.conf.
15/06/16 13:11:08 INFO Utils: Shutdown hook called
It seems the worker number is not being accepted as described here:
https://spark.apache.org/docs/latest/spark-standalone.html
The documentation says:
./sbin/start-slave.sh <worker#> <master-spark-URL>
but the start-slave.sh script itself states:
usage="Usage: start-slave.sh <spark-master-URL> where <spark-master-URL> is like spark://localhost:7077"
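The mismatch can be sketched with a minimal stand-in for the script's argument check (hypothetical; the real start-slave.sh does more, but the 1.4.0 usage string shows it now expects the master URL first, not a worker number):

```shell
#!/usr/bin/env sh
# Hypothetical sketch of the 1.4.0-style argument handling:
# the first argument must be a master URL, so the old-style
# leading worker number is rejected with a usage message.
usage="Usage: start-slave.sh <spark-master-URL>"

check_args() {
  case "$1" in
    spark://*) echo "ok: master URL $1" ;;
    *)         echo "$usage"; return 1 ;;
  esac
}

check_args "1"                       # old documented form: prints the usage message
check_args "spark://localhost:7077"  # new form: accepted
```

Under this reading, `./start-slave.sh 1 spark://localhost:7077` fails because "1" is parsed as the master URL, which matches the failure seen above.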
I have checked for similar issues using:
https://issues.apache.org/jira/browse/SPARK-6552?jql=text%20~%20%22start-slave%22
and found nothing similar, so I am raising this as an issue.
Issue Links
- is duplicated by SPARK-9007: start-slave.sh changed API in 1.4 and the documentation got updated to mention the old API (Resolved)