Description
spark-shell.cmd can't start. Running

bin\spark-shell.cmd --master local

fails with

Failed to find Spark assembly JAR.
You need to build Spark before running this program.

even though Spark has already been built.
This happens because the environment variable SPARK_SCALA_VERSION, which spark-class2.cmd relies on, is never set. In the Linux scripts, load-spark-env.sh sets it to 2.10 or 2.11 by default, but there is no equivalent script on Windows.
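
A minimal Windows counterpart, modeled on the version-detection logic in load-spark-env.sh, could look like the sketch below. This is only an illustration: the file name (a hypothetical load-spark-env.cmd), the use of %SPARK_HOME%, and the assembly\target paths are assumptions based on how the Linux script probes the build output.

rem Hypothetical load-spark-env.cmd sketch: pick SPARK_SCALA_VERSION the way
rem load-spark-env.sh does, by checking which assembly output directory exists.
rem %SPARK_HOME% is assumed to point at the Spark root directory.

if not "%SPARK_SCALA_VERSION%" == "" goto :done

set ASSEMBLY_DIR1=%SPARK_HOME%\assembly\target\scala-2.10
set ASSEMBLY_DIR2=%SPARK_HOME%\assembly\target\scala-2.11

rem If both builds are present, we cannot guess; ask the user to choose.
if exist "%ASSEMBLY_DIR1%" if exist "%ASSEMBLY_DIR2%" (
  echo Builds for both Scala 2.10 and 2.11 detected.
  echo Remove one of them, or set SPARK_SCALA_VERSION explicitly.
  exit /b 1
)

rem Prefer 2.11 if that build exists, otherwise default to 2.10.
if exist "%ASSEMBLY_DIR2%" (
  set SPARK_SCALA_VERSION=2.11
) else (
  set SPARK_SCALA_VERSION=2.10
)

:done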
As a workaround, executing

set SPARK_SCALA_VERSION=2.10

before running spark-shell.cmd lets it start successfully.
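
For example, a full workaround session from the Spark root directory might be (2.10 assumed here; use 2.11 instead if Spark was built against Scala 2.11):

rem Set the variable for the current cmd window only, then launch the shell.
set SPARK_SCALA_VERSION=2.10
bin\spark-shell.cmd --master local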