3 weeks ago
Hi Community,
I have a free demo version and can create a JDBC connection and get metadata (schema, table, and column structure info).
Everything works as described in the docs, but the same code isn't working for someone with a paid version of Databricks. Their admin can see the JDBC connection in their logs, and it shows up as a successful connection.
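For context, the metadata calls look roughly like this (host, HTTP path, and token below are placeholders, not our real values):
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class MetadataProbe {
    public static void main(String[] args) throws Exception {
        // Databricks JDBC driver jar needs to be on the classpath.
        String url = "jdbc:databricks://<workspace-host>:443/default"
                + ";transportMode=http;ssl=1"
                + ";httpPath=/sql/1.0/warehouses/<warehouse-id>"
                + ";AuthMech=3;UID=token;PWD=<personal-access-token>";

        try (Connection conn = DriverManager.getConnection(url)) {
            DatabaseMetaData md = conn.getMetaData();
            // This is the call that blows up with the GetSchemas error below.
            try (ResultSet schemas = md.getSchemas()) {
                while (schemas.next()) {
                    System.out.println(schemas.getString("TABLE_SCHEM"));
                }
            }
        }
    }
}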
This is the error trace:
[Databricks][JDBCDriver](500594) Error calling GetSchemas API call. Error code from server: 0. Error message from server: TStatus(statusCode:ERROR_STATUS, infoMessages:[*org.apache.hive.service.cli.HiveSQLException:Error
operating GET_SCHEMAS The TCP/IP connection to the host 10.202.34.40, port 1433 has failed. Error: "Connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".:168:167, org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$:hiveOperatingError:HiveThriftServerErrors.scala:67, org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$:hiveOperatingError:HiveThriftServerErrors.scala:61, org.apache.spark.sql.hive.thriftserver.SparkAsyncOperation$$anonfun$onError$1:applyOrElse:SparkAsyncOperation.scala:210, org.apache.spark.sql.hive.thriftserver.SparkAsyncOperation$$anonfun$onError$1...
To me that looks like Databricks can't access its own Hive metastore. I know this other person uses Unity Catalog though, so maybe the Hive message isn't related?
I've tried every JDBC URL parameter I could find, in every combination of options I could think of.
Anyone know what's going on?
Thanks!
3 weeks ago
Root cause of error: different metastores.
Corrective solution: connect via a Databricks SQL Warehouse.

1. Connect via a Databricks SQL Warehouse
Use the JDBC endpoint from a SQL Warehouse (not the legacy Hive thrift server).
Example URL:
jdbc:databricks://<workspace-host>:443/default;transportMode=http;ssl=1;httpPath=/sql/1.0/warehouses/<warehouse-id>;AuthMech=3;UID=token;PWD=<personal-access-token>
2. Use the official Databricks JDBC driver
Download from Databricks docs.
Don't use generic Hive/Spark JDBC drivers.
3. Verify Unity Catalog permissions
The user running JDBC must have:
GRANT USE CATALOG ON CATALOG <catalog> TO `<user>`;
GRANT USE SCHEMA ON SCHEMA <catalog>.<schema> TO `<user>`;
Without these, getSchemas() will fail even on the right endpoint.
4. Test with basic queries
Run:
SHOW CATALOGS;
SHOW SCHEMAS IN <catalog>;
SHOW TABLES IN <catalog>.<schema>;
If these succeed, JDBC + Unity Catalog is configured correctly.
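To tie steps 1, 2, and 4 together, here is a minimal smoke-test sketch. It assumes the driver class name of the current Databricks JDBC driver (com.databricks.client.jdbc.Driver) and uses placeholder workspace values throughout:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class WarehouseSmokeTest {
    public static void main(String[] args) throws Exception {
        // Recent drivers self-register, but loading the class explicitly
        // confirms the official driver is actually on the classpath.
        Class.forName("com.databricks.client.jdbc.Driver");

        String url = "jdbc:databricks://<workspace-host>:443/default"
                + ";transportMode=http;ssl=1"
                + ";httpPath=/sql/1.0/warehouses/<warehouse-id>"
                + ";AuthMech=3;UID=token;PWD=<personal-access-token>";

        // Replace <catalog> and <schema> with real names before running.
        String[] checks = {
                "SHOW CATALOGS",
                "SHOW SCHEMAS IN <catalog>",
                "SHOW TABLES IN <catalog>.<schema>"
        };

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            for (String sql : checks) {
                try (ResultSet rs = stmt.executeQuery(sql)) {
                    System.out.println("-- " + sql);
                    while (rs.next()) {
                        System.out.println(rs.getString(1));
                    }
                }
            }
        }
    }
}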
Let me know if the above works, and if you find this useful, please mark it as an accepted solution.
3 weeks ago
Thank you for the quick reply, but I've done all the mentioned steps, including the grants. I got the URL base from the SQL Warehouses menu -> Connection details. We are using token auth (AuthMech=3), SSL is on, and transport mode is HTTP.
3 weeks ago - last edited 3 weeks ago
Next Steps
Confirm and switch your compute (SQL Warehouse)
Upgrade the Databricks SQL Warehouse to at least Databricks Runtime 13.3 LTS
Restart the SQL Warehouse after any major configuration changes, especially after attaching to Unity Catalog.
Test with basic queries after these changes:
SHOW CATALOGS;
SHOW SCHEMAS IN <catalog>;
SHOW TABLES IN <catalog>.<schema>;
If these steps do not resolve the issue, the next things to look at are detailed driver logs and edge cases with different user accounts.
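If you do end up needing driver logs, the Databricks JDBC driver documents LogLevel and LogPath connection properties; double-check the exact names against your driver version, but the URL would look something like:
jdbc:databricks://<workspace-host>:443/default;transportMode=http;ssl=1;httpPath=/sql/1.0/warehouses/<warehouse-id>;AuthMech=3;UID=token;PWD=<personal-access-token>;LogLevel=6;LogPath=/tmp/databricks-jdbc-logs
LogLevel=6 is the most verbose setting, and LogPath should point at a writable directory.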
3 weeks ago
Tried these as well:
3 weeks ago
The posting rules are axing out the whole strings, so here's a short view of the parameters at the end:
Tried these as well:
Tuesday
Hey @JeffSeaman , I did some digging in our internal docs and found a few things for you to consider/research:
The most likely root cause is a SQL Server connectivity issue. The central error describes a TCP timeout to 10.202.34.40:1433, which is the classic port for SQL Server.
This aligns with the general troubleshooting guidance for JDBC connection timeouts that the error itself suggests: verify the connection properties, make sure a SQL Server instance is actually listening on that host and port, and make sure a firewall isn't blocking TCP connections to it.
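If you want to rule out basic network reachability from wherever the query actually executes, a bare TCP probe along these lines (host and port copied from your error message) will quickly tell you whether this is a routing/firewall issue:
import java.net.InetSocketAddress;
import java.net.Socket;

public class TcpProbe {
    public static void main(String[] args) {
        // Host and port taken from the error message in the stack trace.
        String host = "10.202.34.40";
        int port = 1433;
        try (Socket socket = new Socket()) {
            // Generous connect timeout; a hang here points at routing/firewall.
            socket.connect(new InetSocketAddress(host, port), 15_000);
            System.out.println("TCP connect OK");
        } catch (Exception e) {
            System.out.println("TCP connect failed: " + e);
        }
    }
}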
Are you trying to connect from Free Edition to SQL Server or SQL Server to Free Edition?
Let me know, Louis.
yesterday
The problem was an offline custom endpoint. It wasn't part of our metadata scan, since we were specifying a specific catalog and schema, but if any part of your instance is down, the JDBC driver will fail the metadata call. The solution was to create a service account that wasn't assigned to that endpoint. I know I'm getting some of the terminology mixed up, but that's the general idea.
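For anyone who hits this later, this is roughly how we scope the scan (catalog, schema, and connection values are placeholders), and even this scoped call failed while the unrelated endpoint was offline:
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class ScopedScan {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:databricks://<workspace-host>:443/default"
                + ";transportMode=http;ssl=1;httpPath=/sql/1.0/warehouses/<warehouse-id>"
                + ";AuthMech=3;UID=token;PWD=<personal-access-token>";

        try (Connection conn = DriverManager.getConnection(url)) {
            DatabaseMetaData md = conn.getMetaData();
            // Scoped to a single catalog/schema, yet it still failed while
            // the unrelated endpoint was offline.
            try (ResultSet rs = md.getSchemas("<catalog>", "<schema>")) {
                while (rs.next()) {
                    System.out.println(rs.getString("TABLE_SCHEM"));
                }
            }
        }
    }
}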
yesterday
@JeffSeaman , please let us know if any of my suggestions help get you on the right track. If they do, kindly mark the post as "Accepted Solution" so others can benefit as well.
Cheers, Louis.