Details
Type: Bug
Status: Resolved
Priority: Major
Resolution: Fixed
Affects Version/s: 1.6.0, 1.7.0, 1.8.0
Component/s: None
Description
Compressors are not being recycled while writing Parquet files. This causes a native (off-heap) memory leak in my Spark application, which is Parquet-write intensive, because new compressors are created every time a Parquet file is written.
The underlying code issue is that the getRecordWriter method of ParquetOutputFormat.java creates a codecFactory but never calls codecFactory.release(), which is the call responsible for returning compressors to the pool for reuse.
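A minimal sketch of the kind of fix implied above: tie the CodecFactory created in getRecordWriter to the writer's lifecycle so that release() runs when the writer is closed. The wrapper class here is hypothetical and only illustrates the pattern; new CodecFactory(conf) and codecFactory.release() are taken from the parquet-mr 1.8.x API.

{code:java}
// Hypothetical wrapper sketching the fix; declared in this package because
// CodecFactory is not public in the 1.8.x line.
package org.apache.parquet.hadoop;

import org.apache.hadoop.conf.Configuration;

class CodecReleasingWriter implements AutoCloseable {
  private final CodecFactory codecFactory;

  CodecReleasingWriter(Configuration conf) {
    // getRecordWriter currently creates a fresh factory per writer and never
    // releases it, so each compressor's native buffers are leaked.
    this.codecFactory = new CodecFactory(conf);
  }

  @Override
  public void close() {
    // The missing call: returns compressors to the codec pool so their native
    // memory is reused instead of re-allocated on every Parquet write.
    codecFactory.release();
  }
}
{code}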
Attachments
Issue Links
- blocks PARQUET-392 Release Parquet-mr 1.9.0 (Resolved)