Details
- Type: Test
- Status: Closed
- Priority: Major
- Resolution: Won't Fix
- Affects Version: 1.1.0
- Fix Version: None
- Component: None
Description
Most Hadoop components, such as MR, DFS, Tez, and YARN, provide a mini cluster that can be used to test external systems relying on those frameworks, such as Pig and Hive. While Spark's local mode can be used for such testing and is friendly for debugging, its behavior differs too much from a real Spark cluster, so many problems cannot be discovered. Thus, an equivalent of the Hadoop MR mini cluster in Spark would be very helpful for testing systems such as Hive/Pig on Spark.
Spark's local-cluster mode was considered for this purpose, but it doesn't fit well: it requires a Spark installation on the machine where the tests run, and local-cluster isn't publicly exposed as a supported API.
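For context, a minimal sketch of the local-cluster usage discussed above (assuming Spark is on the test classpath; `local-cluster[N, cores, memoryMB]` is Spark's existing master-URL syntax for this mode):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: spin up an in-process 2-worker cluster for a test.
// local-cluster[2, 1, 1024] = 2 workers, 1 core each, 1024 MB each.
// Unlike plain "local" mode, this launches separate executor processes,
// which is why it needs a Spark installation (SPARK_HOME) on the box.
val conf = new SparkConf()
  .setMaster("local-cluster[2, 1, 1024]")
  .setAppName("mini-cluster-test")
val sc = new SparkContext(conf)
try {
  // A trivial job that actually exercises the distributed executors.
  val sum = sc.parallelize(1 to 100, 4).reduce(_ + _)
  assert(sum == 5050)
} finally {
  sc.stop()
}
```

This illustrates why local-cluster is closer to a real deployment than local mode, and also why the installation requirement makes it awkward for the testing framework this issue requests.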
Issue Links
- blocks: HIVE-7382 Create a MiniSparkCluster and set up a testing framework [Spark Branch] (Resolved)
- is depended upon by: SPARK-3145 Hive on Spark umbrella (Resolved)