Description
Problem description:
I created a custom jar file, my-test-1.0-SNAPSHOT.jar, containing the following Java code:
package com.mycompany.app;

import java.util.LinkedList;
import java.util.List;

public class Stack {
    public void my_open() {
        System.out.println("calling open file");
    }

    public void my_close() {
        System.out.println("calling close file");
    }
}
When I use the pyspark shell, it works well:
pyspark --driver-class-path my-test-1.0-SNAPSHOT.jar
But when I use pyspark in Zeppelin, it works only the first time, printing the expected output "calling open file". After that, the method 'my_open' fails to execute and only 'None' is returned.
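For reference, here is a minimal sketch of how such a class would typically be called from a pyspark paragraph in Zeppelin through the py4j gateway. The actual calling code is not included in this report, so the snippet below is an assumption based on the standard SparkContext._jvm access path:

# Hypothetical %pyspark paragraph (assumes my-test-1.0-SNAPSHOT.jar is on the
# driver classpath of the Zeppelin Spark interpreter)
stack = sc._jvm.com.mycompany.app.Stack()   # instantiate the custom Java class via py4j
result = stack.my_open()                    # expected to print "calling open file" on the driver
print(result)                               # a void Java method maps to None in Python

Note that 'None' by itself is the normal Python return value for a void Java method; the reported problem is that the println output stops appearing after the first run.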
More actions:
1. When I tried to do the same with Scala in Zeppelin, it worked well.
2. I restarted the Spark interpreter; it then worked again on the first run, but subsequent runs failed.