Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 0.10.1
- Fix Version/s: None
Description
I am facing a problem where Spark 3.3.1 is compatible with Zeppelin via Scala, but not via PySpark.
Can you help me fix it, please?
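As a rough illustration of the failing path, a minimal %pyspark paragraph (a sketch only, assuming the default interpreter binding and the spark/sc variables that Zeppelin injects, not the exact note used here) would be something like:

%pyspark
# Minimal check of the PySpark path: relies on the SparkSession (spark)
# and SparkContext (sc) that Zeppelin provides in a %pyspark paragraph.
print(spark.version)                    # expected to report 3.3.1
print(sc.parallelize(range(10)).sum())  # small job to confirm the Python gateway works

The equivalent check in a %spark (Scala) paragraph runs fine; only the PySpark paragraph fails.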
Interpreters' output:
Spark interpreter settings: