Details
- Type: New Feature
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Fix Version: 0.9.0
- Component: None
Description
As discussed here, I think it would be nice if there were a way to programmatically determine which version of Spark you are running.
The potential use cases are not critical, but they include:
- Branching your code based on what version of Spark is running.
- Checking your version without having to quit and restart the Spark shell.
Right now in PySpark, I believe the only way to determine your version is to fire up the Spark shell and read the startup banner.
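The version-branching use case above could look something like the sketch below. It assumes the requested feature exposes the running version as a string such as "0.9.0" (for example, via a hypothetical `sc.version` attribute on the SparkContext); the comparison helper itself is plain Python.

```python
def version_at_least(version_str, minimum):
    """Return True if version_str (e.g. "0.9.0") is at least minimum."""
    def parse(v):
        # Compare numeric components tuple-wise: (0, 9, 0) >= (0, 8, 1)
        return tuple(int(part) for part in v.split("."))
    return parse(version_str) >= parse(minimum)

# Branch code paths on the detected version. The string here stands in
# for whatever the proposed API would return at runtime.
spark_version = "0.9.0"
if version_at_least(spark_version, "0.8.1"):
    pass  # use the newer code path
else:
    pass  # fall back to behavior compatible with older releases
```

This keeps the branching logic independent of how the version is obtained, so the same helper works whether the value comes from an API call or is parsed from the shell banner.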