Spark / SPARK-6673

spark-shell.cmd can't start even when spark was built in Windows


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version: 1.3.0
    • Fix Version: 1.4.0
    • Component: Windows
    • Labels: None

    Description

      spark-shell.cmd can't start.

      bin\spark-shell.cmd --master local
      

      fails with the error

      Failed to find Spark assembly JAR.
      You need to build Spark before running this program.
      

      even when Spark has already been built.

      This happens because the environment variable SPARK_SCALA_VERSION, which spark-class2.cmd relies on, is never set.
      In the Linux scripts, load-spark-env.sh sets this value to 2.10 or 2.11 by default, but there is no equivalent script for Windows.

      As a workaround, executing

      set SPARK_SCALA_VERSION=2.10
      

      before running spark-shell.cmd lets the shell start successfully.
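
      A proper fix would mirror the defaulting logic of load-spark-env.sh on the Windows side. The sketch below is illustrative only (the file name load-spark-env.cmd and the detection of a 2.11 build are assumptions, not the committed patch); it defaults SPARK_SCALA_VERSION when the caller has not set it, so spark-class2.cmd can locate the assembly JAR:

      ```bat
      @echo off
      rem Hypothetical load-spark-env.cmd sketch -- a Windows counterpart to
      rem load-spark-env.sh. Only set SPARK_SCALA_VERSION if the user has not
      rem already exported it.
      if "%SPARK_SCALA_VERSION%"=="" (
        rem Assumption: prefer 2.11 if a 2.11 assembly directory exists,
        rem otherwise fall back to the 1.3.x default of 2.10.
        if exist "%SPARK_HOME%\assembly\target\scala-2.11" (
          set SPARK_SCALA_VERSION=2.11
        ) else (
          set SPARK_SCALA_VERSION=2.10
        )
      )
      ```

      With such a script called from spark-class2.cmd, the manual set command above would no longer be needed.
      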


          People

            Assignee: tsudukim Masayoshi Tsuzuki
            Reporter: tsudukim Masayoshi Tsuzuki
