Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Won't Fix
- Affects Version/s: None
- Fix Version/s: None
- Component/s: None
Description
It would be useful if users could get detailed cluster resource information, e.g. granted/allocated executors, memory, and CPU.
Such information is available via the Web UI, but SparkContext doesn't seem to expose APIs for it.
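A partial workaround does exist, though it falls short of the API requested here: SparkContext.getExecutorMemoryStatus reports per-block-manager memory, and counting its entries gives a rough executor count. The sketch below is illustrative, not a proposal; the object name ClusterResourceInfo is made up, and it assumes a Scala application with a live SparkContext. Granted/allocated cores are still not exposed, which is exactly the gap described above.
{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative driver program; any application with a live SparkContext works.
object ClusterResourceInfo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("cluster-resource-info"))

    // getExecutorMemoryStatus maps each block manager's "host:port" to
    // (max memory available for caching, remaining free memory), in bytes.
    // The driver's own block manager is included in the map.
    sc.getExecutorMemoryStatus.foreach { case (location, (maxMem, freeMem)) =>
      println(s"$location: max=${maxMem / (1024 * 1024)} MB, " +
        s"free=${freeMem / (1024 * 1024)} MB")
    }

    // Rough executor count: subtract one entry for the driver.
    val executorCount = sc.getExecutorMemoryStatus.size - 1
    println(s"Approximate executor count: $executorCount")

    sc.stop()
  }
}
{code}
Note that this covers storage memory only; it says nothing about granted CPU cores, and newer Spark releases expose some further executor details via sc.statusTracker.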
Attachments
Issue Links
- is depended upon by
  - HIVE-9542 SparkSessionImpl calcualte wrong cores number in TestSparkCliDriver [Spark Branch] (Open)
  - SPARK-3145 Hive on Spark umbrella (Resolved)
- relates to
  - HIVE-9251 SetSparkReducerParallelism is likely to set too small number of reducers [Spark Branch] (Resolved)