Description
The Spark exclude node functionality for Spark on YARN, introduced in SPARK-26688, allows users to specify a list of node names to be excluded from resource allocation. This is done via the configuration parameter spark.yarn.exclude.nodes.
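For example, nodes can be excluded at submission time by passing the parameter with --conf (the hostnames and application jar below are placeholders, not values from this ticket):

```shell
# Exclude two YARN nodes from executor allocation.
# "bad-host1.example.com,bad-host2.example.com" and my-app.jar are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.exclude.nodes=bad-host1.example.com,bad-host2.example.com \
  my-app.jar
```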
The feature currently works only for executors allocated via dynamic allocation. To use the feature on Spark 3.3.1, for example, one may set the configurations spark.dynamicAllocation.enabled=true, spark.dynamicAllocation.minExecutors=0 and spark.executor.instances=0, so that Spark spawns executors only via dynamic allocation.
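A sketch of the workaround described above, combining the exclude list with the dynamic allocation settings (hostnames and jar name are placeholders; note that dynamic allocation also typically requires an external shuffle service or spark.dynamicAllocation.shuffleTracking.enabled=true):

```shell
# Force all executors through dynamic allocation so the exclude list takes effect.
# bad-host.example.com and my-app.jar are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.exclude.nodes=bad-host.example.com \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.shuffleTracking.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=0 \
  --conf spark.executor.instances=0 \
  my-app.jar
```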
This ticket proposes to document this behavior for the current Spark release, and also to improve the feature by extending the scope of the exclude node functionality for YARN beyond dynamic allocation, which I believe would make it more generally useful.