Description
Spark 3.2.0 turned on py4j pinned thread mode by default (SPARK-35303). However, in a Jupyter notebook, after I cancel (interrupt) a long-running Spark job, the next Spark command fails with py4j errors. See the attached notebook for a repro.
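For reference, a minimal sketch of the kind of repro in the attached notebook (the session setup, job, and sleep duration here are illustrative, not the exact attached code):

{code:python}
import time
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").getOrCreate()

# Notebook cell 1: a deliberately slow job.
# Interrupt the kernel (Kernel -> Interrupt) while it is running.
spark.range(100).rdd.foreach(lambda _: time.sleep(10))

# Notebook cell 2: run after the interrupt. With pinned thread mode on
# (the Spark 3.2.0 default) this fails with py4j errors instead of
# completing normally.
spark.range(10).count()
{code}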
I cannot reproduce the issue after turning off pinned thread mode.
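One way to turn pinned thread mode off (assuming the standard PYSPARK_PIN_THREAD switch, which is the flag SPARK-35303 flipped on by default) is sketched below; it has to take effect before the SparkContext and its JVM gateway are created:

{code:python}
import os

# Disable py4j pinned thread mode. This must be set before the
# SparkContext / JVM gateway starts, e.g. in the first cell of a fresh
# kernel or in the kernel's environment configuration.
os.environ["PYSPARK_PIN_THREAD"] = "false"

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").getOrCreate()
{code}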
Attachments
Issue Links
- is caused by SPARK-35303: Enable pinned thread mode by default (Resolved)