ERROR: Writing to Unity Catalog from Remote Spark using JDBC

DumbBeaver
New Contributor II
Here is my code:
 
df = spark.createDataFrame([[1, 1, 2]], schema=['id', 'first_name', 'last_name'])

(df.write.format("jdbc")
    .option("url", <jdbc-url>)
    .option("dbtable", "hive_metastore.default.test")
    .option("driver", "com.databricks.client.jdbc.Driver")
    .mode("overwrite")
    # .saveAsTable("test")
    .save()
)

The Error

An error occurred while calling o273.save.
: java.sql.SQLException: [Databricks][JDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: 42601,
Query: CREATE TAB***, Error message from Server: org.apache.hive.service.cli.HiveSQLException:
Error running query: [PARSE_SYNTAX_ERROR]
org.apache.spark.sql.catalyst.parser.ParseException:
[PARSE_SYNTAX_ERROR] Syntax error at or near '\"id\"'.(line 1, pos 19)

== SQL ==
CREATE TABLE test (\"id\" BIGINT , \"first_name\" BIGINT , \"last_name\" BIGINT )
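
For context on why this fails: the JDBC writer builds the CREATE TABLE statement using the dialect's quoteIdentifier, and since no built-in dialect matches a jdbc:databricks: URL, Spark falls back to the base JdbcDialect, which wraps column names in ANSI double quotes. The Spark SQL parser on the Databricks side only accepts unquoted or backtick-quoted identifiers, hence the PARSE_SYNTAX_ERROR. A minimal sketch of that default behaviour (assuming an open-source Spark client with no Databricks-specific dialect registered):

%scala
import org.apache.spark.sql.jdbc.JdbcDialect

// The base JdbcDialect quotes identifiers with double quotes by default.
// This anonymous subclass only overrides the abstract canHandle so the
// class can be instantiated for illustration.
val baseDialect = new JdbcDialect {
  override def canHandle(url: String): Boolean = true
}
println(baseDialect.quoteIdentifier("id"))          // prints "id" (double-quoted)
println(baseDialect.quoteIdentifier("first_name"))  // prints "first_name"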

3 REPLIES

feiyun0112
Contributor (accepted solution)

 

%scala
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

// Register a dialect for Databricks JDBC URLs that leaves column names
// unquoted, so the generated CREATE TABLE no longer contains "id", "first_name", ...
JdbcDialects.registerDialect(new JdbcDialect() {
    override def canHandle(url: String): Boolean = url.toLowerCase.startsWith("jdbc:databricks:")
    override def quoteIdentifier(column: String): String = column
})

 

You need to register a custom Spark JDBC dialect to stop Spark from quoting column names: the default dialect wraps them in double quotes, which the SQL parser on the Databricks side rejects.
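
A possible variation (not from the original answer, just a sketch building on the same idea): instead of removing quoting entirely, have the dialect quote identifiers with backticks, which the Spark SQL parser accepts and which still protects column names containing spaces or reserved words:

%scala
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

// Variation on the accepted answer: quote identifiers with backticks
// (doubling any embedded backtick) instead of leaving them unquoted.
JdbcDialects.registerDialect(new JdbcDialect() {
    override def canHandle(url: String): Boolean = url.toLowerCase.startsWith("jdbc:databricks:")
    override def quoteIdentifier(column: String): String = s"`${column.replace("`", "``")}`"
})

Either way, the dialect has to be registered in the same Spark application, before the df.write call, so the JDBC writer picks it up when it generates the CREATE TABLE statement.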

 

DumbBeaver
New Contributor II

Thanks, it worked!

Kaniz
Community Manager

I want to express my gratitude for your effort in selecting the most suitable solution. It's great to hear that your query has been successfully resolved. Thank you for your contribution.




 
