Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version: 1.2.0
- Component: None
Description
Files added with SparkContext#addFile are downloaded with Utils#fetchFile before a task starts. However, Utils#fetchFile places all files directly under the Spark root directory on the worker node, discarding any directory structure from the original path. We should have an option to preserve the folder information.
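A minimal sketch of the behavior described above, using a local-mode SparkContext. The file paths here are hypothetical; the point is that `SparkFiles.get` resolves files by base name only, so the parent directories passed to `addFile` are lost on the workers:

```scala
import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

object AddFileSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("addFile-sketch").setMaster("local[2]"))

    // Both files end up directly under the Spark root dir on each worker;
    // the conf/ and data/ path components are dropped.
    sc.addFile("/tmp/conf/app.properties") // hypothetical path
    sc.addFile("/tmp/data/lookup.csv")     // hypothetical path

    sc.parallelize(1 to 2).foreach { _ =>
      // Lookup is by base name only, so two files that differ only in
      // their parent directory would collide under the Spark root.
      val props = SparkFiles.get("app.properties")
      val csv   = SparkFiles.get("lookup.csv")
      println(s"$props / $csv")
    }
    sc.stop()
  }
}
```

With an option to keep folder information, a caller could instead retrieve something like `conf/app.properties` relative to the Spark root, avoiding base-name collisions.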
Issue Links
- breaks
  - SPARK-6144 When in cluster mode using ADD JAR with a hdfs:// sourced jar will fail (Closed)
- is depended upon by
  - HIVE-8851 Broadcast files for small tables via SparkContext.addFile() and SparkFiles.get() [Spark Branch] (Open)
  - SPARK-3145 Hive on Spark umbrella (Resolved)