Details
Type: Bug
Status: Open
Priority: Major
Resolution: Unresolved
Affects Version: 0.8.1
Description
Using the default setup results in an exception when starting the local PySpark remote interpreter because the pyspark module cannot be found.
In interpreter.cmd there is a set command that adds %ZEPPELIN_HOME%\interpreter\spark\pyspark\pyspark.zip and the matching %ZEPPELIN_HOME%\interpreter\spark\pyspark\py4j-*.zip to PYTHONPATH. This does take effect inside the script (verified by debug output), but the spawned Python process somehow does not inherit it.
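For reference, the relevant line in interpreter.cmd has roughly this shape (a sketch only; the exact paths and the py4j zip file name depend on the bundled version):

    rem Prepend the bundled pyspark and py4j zips to PYTHONPATH before spawning Python
    set PYTHONPATH=%ZEPPELIN_HOME%\interpreter\spark\pyspark\pyspark.zip;%ZEPPELIN_HOME%\interpreter\spark\pyspark\py4j-0.x-src.zip;%PYTHONPATH%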
Workaround: set PYTHONPATH in the Windows environment variables so that it includes both zip files.
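For example, the variable can be persisted from a command prompt with setx (a sketch; replace the py4j file name with the zip actually shipped under %ZEPPELIN_HOME%\interpreter\spark\pyspark, since wildcards are not expanded in environment variables):

    rem Persist PYTHONPATH for the current user; %ZEPPELIN_HOME% must be set in this shell so it is expanded before setx stores the value
    setx PYTHONPATH "%ZEPPELIN_HOME%\interpreter\spark\pyspark\pyspark.zip;%ZEPPELIN_HOME%\interpreter\spark\pyspark\py4j-0.x-src.zip"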