Description
Issue: One can usually take an RDD of case classes and create a SchemaRDD, with Spark SQL inferring the schema from the case class metadata. However, if the case class has multiple constructors, ScalaReflection.schemaFor gets confused.
Motivation: In spark.ml, I would like to create a case class with the following signature:
```
import org.apache.spark.mllib.linalg.Vector

// The auxiliary constructor defaults the instance weight to 1.0.
case class LabeledPoint(label: Double, features: Vector, weight: Double) {
  def this(label: Double, features: Vector) = this(label, features, 1.0)
}
```
Proposed fix: Change ScalaReflection.schemaFor so that it checks whether the case class has multiple constructors and, if so, uses the primary constructor. This will not change the behavior of existing code, since schemaFor currently supports only case classes with a single constructor.
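A minimal sketch of the selection logic, using Scala 2.11 runtime reflection; `primaryConstructorParams` is a hypothetical standalone helper, not the actual schemaFor code:

```
import scala.reflect.runtime.universe._

// Hypothetical helper illustrating the proposed fix: when a case class
// declares several constructors, the constructor member is overloaded,
// so pick the primary constructor explicitly instead of assuming the
// symbol is unambiguous.
def primaryConstructorParams(tpe: Type): List[Symbol] = {
  val ctors = tpe.member(termNames.CONSTRUCTOR).asTerm.alternatives
  val primary = ctors
    .map(_.asMethod)
    .find(_.isPrimaryConstructor)
    .getOrElse(sys.error(s"No primary constructor found for $tpe"))
  primary.paramLists.head
}

// e.g. primaryConstructorParams(typeOf[LabeledPoint]) returns the params
// of the primary constructor: label, features, weight.
```

For a case class with a single constructor, `alternatives` returns just that constructor and `find(_.isPrimaryConstructor)` selects it, which is why this approach should preserve today's behavior while resolving the ambiguity for classes like LabeledPoint above.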