Description
There is a problem with the ALS factorizer when running in a distributed environment under Oozie.
Steps:
1) Built Mahout 1.0 jars and picked the mahout-mrlegacy jar.
2) Created a Java class that calls ParallelALSFactorizationJob with the respective inputs.
3) Submitted the job; a series of MapReduce jobs were launched to perform the factorization.
4) The job failed at MultithreadedSharingMapper with the error Unable to read sequence file "<ourprogram>.jar", pointing at org.apache.mahout.cf.taste.hadoop.als.ALS and its readMatrixByRowsFromDistributedCache method.
Cause: The ALS class reads its input (sequence files) from the distributed cache via the readMatrixByRowsFromDistributedCache method. In an Oozie environment, however, the program jar is also copied to the distributed cache alongside the input files. Because the ALS class tries to read every file in the cache, it fails when it encounters the jar.
A remedy would be to add a condition that skips any cached file that is not a data file, i.e. ignore jars.
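The proposed condition could be sketched as follows. This is a minimal illustration, not the actual Mahout patch: the class name CacheFileFilter, the method excludeJars, and the file names used in main are all hypothetical; the real fix would apply an equivalent check to the paths returned from the distributed cache before readMatrixByRowsFromDistributedCache opens them as sequence files.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical helper illustrating the proposed remedy: before reading
// cached files as sequence files, drop anything that is a jar (such as
// the program jar that Oozie adds to the distributed cache).
public final class CacheFileFilter {

  private CacheFileFilter() {}

  /** Returns only the cached paths that should be read as sequence files. */
  public static List<String> excludeJars(List<String> cachedPaths) {
    List<String> dataFiles = new ArrayList<>();
    for (String path : cachedPaths) {
      if (!path.endsWith(".jar")) { // skip jars copied into the cache
        dataFiles.add(path);
      }
    }
    return dataFiles;
  }

  public static void main(String[] args) {
    // Hypothetical cache contents: two sequence-file parts plus a jar.
    List<String> cached = Arrays.asList("part-r-00000", "example.jar", "part-r-00001");
    System.out.println(excludeJars(cached)); // the jar is filtered out
  }
}
```

With such a filter in place, the jar Oozie ships with the workflow would no longer be passed to the sequence-file reader, avoiding the failure in MultithreadedSharingMapper.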
Issue Links
- Dependent: MAHOUT-1634 "ALS don't work when it adds new files in Distributed Cache" (Closed)