[SPARK-4585] Spark dynamic executor allocation shouldn't use maxExecutors as initial number


Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.1.0
    • Fix Version/s: 1.3.0
    • Component/s: Spark Core, YARN
    • Labels: None

    Description

      With SPARK-3174, one can configure a minimum and maximum number of executors for a Spark application on YARN. However, the application always starts with the maximum. It seems more reasonable, at least for Hive on Spark, to start from the minimum and scale up toward the maximum as needed.
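
      As a rough illustration, the sketch below shows the configuration surface involved, assuming the property names introduced by SPARK-3174 (spark.dynamicAllocation.minExecutors/maxExecutors) plus the spark.dynamicAllocation.initialExecutors setting that arrived around the 1.3.0 fix to control the starting count; the numeric values are placeholders, not recommendations.

          import org.apache.spark.SparkConf

          // Dynamic allocation on YARN. Before this fix, an application
          // started with maxExecutors; after it, the starting count follows
          // initialExecutors (assumed to default to minExecutors), and the
          // executor count then scales between the floor and the ceiling.
          val conf = new SparkConf()
            .set("spark.dynamicAllocation.enabled", "true")
            // The external shuffle service is required so executors can be
            // removed without losing shuffle data.
            .set("spark.shuffle.service.enabled", "true")
            .set("spark.dynamicAllocation.minExecutors", "2")       // floor
            .set("spark.dynamicAllocation.initialExecutors", "2")   // starting count
            .set("spark.dynamicAllocation.maxExecutors", "50")      // ceiling

      Starting at the minimum rather than the maximum keeps short Hive on Spark queries from reserving cluster capacity they never use, at the cost of a brief ramp-up for large jobs.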


People

    • Assignee: Sandy Ryza (sandyr)
    • Reporter: Chengxiang Li (chengxiang li)
    • Votes: 0
    • Watchers: 7

Dates

    • Created:
    • Updated:
    • Resolved: