Details

    • Type: Sub-task
    • Status: To Do
    • Priority: Minor
    • Resolution: Unresolved
    • Component: s2jobs

    Description

      Spark Structured Streaming provides simple metrics for each micro-batch, which make it possible to monitor a streaming job.

      https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html#monitoring-streaming-queries


      We can already report metrics through the Dropwizard library via the metricsEnabled setting, but I think it would be convenient to implement a StreamingQueryListener so that only the necessary events are delivered to the callback.
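      A StreamingQueryListener has only three callbacks, so filtering down to the events we care about is straightforward. A minimal sketch (the class name ProgressLoggingListener is illustrative, not an existing class):

```scala
import org.apache.spark.sql.streaming.StreamingQueryListener
import org.apache.spark.sql.streaming.StreamingQueryListener.{QueryProgressEvent, QueryStartedEvent, QueryTerminatedEvent}

// Minimal sketch: react only to progress events and log per-batch metrics.
class ProgressLoggingListener extends StreamingQueryListener {
  override def onQueryStarted(event: QueryStartedEvent): Unit = ()

  override def onQueryProgress(event: QueryProgressEvent): Unit = {
    val p = event.progress
    // StreamingQueryProgress carries per-batch metrics such as batchId
    // and inputRowsPerSecond.
    println(s"query=${p.name} batch=${p.batchId} inputRows/s=${p.inputRowsPerSecond}")
  }

  override def onQueryTerminated(event: QueryTerminatedEvent): Unit = ()
}

// Registration on an existing SparkSession:
// spark.streams.addListener(new ProgressLoggingListener)
```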


      Personally, if we implement a KafkaStreamingListener that sends metrics to Kafka, it should be easy to persist them in other storage or to build dashboards and alerts on top of them.
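      One possible shape for that listener, forwarding each progress event's JSON payload to a Kafka topic. This is a sketch under assumed configuration (bootstrapServers and topic would come from job settings), not the actual implementation:

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.spark.sql.streaming.StreamingQueryListener
import org.apache.spark.sql.streaming.StreamingQueryListener.{QueryProgressEvent, QueryStartedEvent, QueryTerminatedEvent}

// Sketch of the proposed KafkaStreamingListener: publish each batch's
// progress JSON to a Kafka topic for downstream storage or alerting.
class KafkaStreamingListener(bootstrapServers: String, topic: String)
    extends StreamingQueryListener {

  private val producer = {
    val props = new Properties()
    props.put("bootstrap.servers", bootstrapServers)
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    new KafkaProducer[String, String](props)
  }

  override def onQueryStarted(event: QueryStartedEvent): Unit = ()

  override def onQueryProgress(event: QueryProgressEvent): Unit = {
    // progress.json serializes the full batch metrics as a JSON string.
    producer.send(new ProducerRecord(topic, event.progress.name, event.progress.json))
  }

  override def onQueryTerminated(event: QueryTerminatedEvent): Unit =
    producer.close()
}
```

      Keying records by query name keeps each query's metrics in order within a partition, which simplifies building dashboards per query downstream.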


          People

            Assignee: Chul Kang
            Reporter: Chul Kang