[SPARK-2895] Support mapPartitionsWithContext in Spark Java API


Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.2.0
    • Component/s: Java API

    Description

      This is a requirement from Hive on Spark: mapPartitionsWithContext currently exists only in the Spark Scala API, and we would like to be able to access the same functionality from the Spark Java API.
      For HIVE-7627 and HIVE-7843, Hive operators that are invoked inside a mapPartitions closure need to obtain the task ID.
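
      For reference, a minimal sketch (not taken from the Hive patches) of how the existing Scala-only API exposes the TaskContext to a partition closure; the sample data, app name, and the choice of attemptId as the "task ID" are illustrative assumptions:

          import org.apache.spark.{SparkConf, SparkContext, TaskContext}

          object MapPartitionsWithContextExample {
            def main(args: Array[String]): Unit = {
              val sc = new SparkContext(
                new SparkConf().setAppName("ctx-example").setMaster("local[2]"))

              val rdd = sc.parallelize(1 to 10, numSlices = 2)

              // Scala API only today: the closure receives the TaskContext together
              // with the partition iterator, so each record can be tagged with the
              // id of the task that produced it.
              val tagged = rdd.mapPartitionsWithContext { (ctx: TaskContext, it: Iterator[Int]) =>
                val taskId = ctx.attemptId  // per-task-attempt id, the "task ID" Hive operators need
                it.map(x => (taskId, x))
              }

              tagged.collect().foreach(println)
              sc.stop()
            }
          }

      A Java counterpart would need an equivalent entry point (or some other way to reach the TaskContext) so that the same task ID is available from a mapPartitions-style closure in the Java API.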

            People

              Assignee: Chengxiang Li
              Reporter: Chengxiang Li
              Votes: 0
              Watchers: 5

              Dates

                Created:
                Updated:
                Resolved: