Description
Currently, there is one spark-scala-parent module which is shared by multiple scala modules. This reduces code duplication, but brings difficulties when developing the spark module: it is not easy for new developers to configure it properly in an IDE and run unit tests there.
So this ticket proposes restructuring the spark module: move the code in spark-scala-parent into each scala module. This would cause some code duplication, but would ease the development phase.
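As a rough sketch of what the restructure could look like (the module name spark-scala-2.12 and the paths below are illustrative assumptions, not taken from this ticket), each scala module would carry its own source tree and build configuration instead of inheriting them from spark-scala-parent:

```xml
<!-- Hypothetical pom.xml for one scala module after the restructure.
     Module name, groupId, and directory layout are assumptions. -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.apache.zeppelin</groupId>
  <artifactId>spark-scala-2.12</artifactId>
  <version>0.1.0-SNAPSHOT</version>

  <build>
    <!-- Sources live inside this module (duplicated across scala modules)
         rather than being pulled in from spark-scala-parent, so an IDE
         can import and test the module on its own. -->
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
  </build>
</project>
```

The trade-off is explicit: each scala-version module becomes self-contained and IDE-friendly at the cost of keeping near-identical sources in sync by hand.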