Details
- Type: Improvement
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 1.1.0
- Labels: None
Description
With SPARK-3174, one can configure a minimum and maximum number of executors for a Spark application on YARN. However, the application always starts with the maximum. It seems more reasonable, at least for Hive on Spark, to start from the minimum and scale up as needed, up to the maximum.
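
For reference, the behaviour described above is governed by the spark.dynamicAllocation.* properties introduced by SPARK-3174. The sketch below shows how an application could bound the executor count; the application name and numeric values are placeholders, and spark.dynamicAllocation.initialExecutors is shown only as the kind of knob this issue asks for, not as a setting guaranteed to exist in the affected version.

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch only: enable dynamic allocation on YARN and bound the executor count.
    // The external shuffle service must be enabled for dynamic allocation to work.
    val conf = new SparkConf()
      .setAppName("hive-on-spark-example")                    // hypothetical app name
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true")
      .set("spark.dynamicAllocation.minExecutors", "2")
      .set("spark.dynamicAllocation.maxExecutors", "50")
      // Assumed knob: start from the minimum rather than the maximum,
      // which is the behaviour this issue requests.
      .set("spark.dynamicAllocation.initialExecutors", "2")

    val sc = new SparkContext(conf)

The same properties can also be supplied at launch time through spark-submit --conf instead of being set in code.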
Attachments
Issue Links
- is depended upon by: SPARK-3145 Hive on Spark umbrella (Resolved)
- links to