Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
Description
Since we want users to use the "Hadoop provided" builds, we should explain how to use them; it's much less obvious than just downloading Spark with the built-in Hadoop libraries.
My thought is to have a section in the docs for different environments (Apache Hadoop, MapR, HDP, Cloudera, etc.) that shows how to modify the classpath to include the Hadoop client libraries. A sketch of what such a section might show appears below.
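As a minimal sketch of the kind of instructions the docs could give, assuming the SPARK_DIST_CLASSPATH mechanism that "hadoop provided" builds read at startup (the installation path shown is illustrative, not from this issue):

```
# conf/spark-env.sh
# Point a "Hadoop provided" Spark build at the cluster's Hadoop client jars.

# If the `hadoop` launcher script is on PATH, let it compute the classpath:
export SPARK_DIST_CLASSPATH=$(hadoop classpath)

# Or reference an explicit Hadoop installation (path is hypothetical):
# export SPARK_DIST_CLASSPATH=$(/opt/hadoop/bin/hadoop classpath)
```

A per-distro section (Apache Hadoop, MapR, HDP, Cloudera) would then only need to vary where the `hadoop` script or client jars live.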
Issue Links
- duplicates SPARK-6511: Publish "hadoop provided" build with instructions for different distros (Resolved)