Description
In a pipeline, we don't save additional params specified in `fit()` to the transformers, because we should not modify them. The current solution stores those training parameters in the pipeline model and applies them at `transform()` time. A better solution would be to make transformers copyable: calling `.copy()` on a transformer produces a new transformer with a different UID but the same parameters. We could then use the copied transformers in the pipeline model, with the additional params stored on the copies.
`copy` may not be a good name because it is not an exact copy.
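A minimal sketch of the idea (not the actual Spark API; the names `SimpleTransformer` and `copyWithExtra` are hypothetical): the copy gets a fresh UID, inherits the parent's params, and merges in any extra params supplied at `fit()` time, so the original transformer is never mutated.

```scala
import java.util.UUID

// Hypothetical stand-in for a Transformer: a UID plus a param map.
case class SimpleTransformer(uid: String, params: Map[String, Any]) {

  // Produce a new transformer with a different UID but the same params,
  // plus any additional params supplied at fit() time.
  def copyWithExtra(extra: Map[String, Any]): SimpleTransformer =
    SimpleTransformer(
      uid = "transformer_" + UUID.randomUUID().toString.take(8),
      params = params ++ extra // extra params override or extend the originals
    )
}

object CopyExample extends App {
  val original = SimpleTransformer("transformer_1", Map("threshold" -> 0.5))

  // Params passed to fit() are applied to a copy, never to the original.
  val copied = original.copyWithExtra(Map("threshold" -> 0.8))

  println(original) // unchanged
  println(copied)   // new UID, merged params
}
```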
Issue Links
- relates to: SPARK-4784 Model.fittingParamMap should store all Params (Resolved)
- links to