Details
- Type: Documentation
- Status: Closed
- Priority: Minor
- Resolution: Fixed
- Affects Version: 1.2.0
- Labels: None
Description
In the current latest version of Spark (1.2.0), the Python API docs for the RDD class contain no entry for rdd.randomSplit(): http://spark.apache.org/docs/latest/api/python/pyspark.html#pyspark.RDD
Nevertheless, the method is used in an example in the 1.2.0 MLlib documentation: http://spark.apache.org/docs/latest/mllib-ensembles.html#regression
(It's in the Python code tab; you can Ctrl+F and search for "randomSplit".)
Looking at the code, the method does appear to be implemented: https://github.com/apache/spark/blob/branch-1.2/python/pyspark/rdd.py#L322
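For context, randomSplit takes a list of weights and partitions an RDD into disjoint subsets, with each element landing in one subset with probability proportional to its weight. Below is a minimal pure-Python sketch of that splitting logic, written to illustrate the documented semantics only; it is not Spark's implementation, and the function name `random_split` is just an illustrative stand-in:

```python
import random

def random_split(items, weights, seed=None):
    # Sketch of weighted random splitting: each element is assigned to
    # exactly one output partition, chosen with probability proportional
    # to that partition's weight. Illustrative only, not Spark's code.
    rng = random.Random(seed)
    total = float(sum(weights))
    # Cumulative upper bounds in [0, 1], one per partition.
    bounds = []
    acc = 0.0
    for w in weights:
        acc += w / total
        bounds.append(acc)
    parts = [[] for _ in weights]
    for item in items:
        x = rng.random()
        for i, b in enumerate(bounds):
            if x < b:
                parts[i].append(item)
                break
        else:
            # Guard against floating-point rounding at the top boundary.
            parts[-1].append(item)
    return parts

# A 70/30 split of 1000 elements, as in the MLlib regression example.
train, test = random_split(range(1000), [0.7, 0.3], seed=42)
print(len(train), len(test))
```

Note that, like the RDD method, the split sizes are only approximately proportional to the weights, since assignment is per-element random sampling rather than exact partitioning.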