Details
- Type: Sub-task
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 2.4.0
- Fix Version/s: None
- Component/s: None
Description
Spark should support SELECT <column> INTO <table> FROM <table> WHERE <condition>, as PostgreSQL does:
create table dup(id int);
insert into dup values(1);
insert into dup values(2);
select id into test_dup from dup where id=1;
select * from test_dup;
Result: Success in PostgreSQL
But select id into test_dup from dup where id=1; raises a ParseException in Spark:
scala> sql("show tables").show()
+--------+---------+-----------+
|database|tableName|isTemporary|
+--------+---------+-----------+
|    func|      dup|      false|
+--------+---------+-----------+
scala> sql("select id into test_dup from dup where id=1").show()
org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input 'test_dup' expecting <EOF>(line 1, pos 15)

== SQL ==
select id into test_dup from dup where id=1
---------------^^^

  at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:241)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:117)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:69)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
  ... 49 elided
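As a workaround until SELECT INTO is parsed, the same result can be obtained in Spark SQL with a CREATE TABLE ... AS SELECT (CTAS) statement, which Spark already supports. A minimal sketch using the tables from the description above:

```sql
-- CTAS: creates test_dup and populates it with the query result,
-- equivalent to PostgreSQL's SELECT id INTO test_dup FROM dup WHERE id=1;
CREATE TABLE test_dup AS
SELECT id FROM dup WHERE id = 1;

SELECT * FROM test_dup;
```

This is a rewrite of the statement rather than support for the SELECT INTO syntax itself; the point of the issue is the missing syntax.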
Issue Links
- duplicates SPARK-28329 "SELECT INTO syntax" (Open)
Duplicate of SPARK-28329?