3 weeks ago
Hi Community,
I have a free demo version and can create a JDBC connection and retrieve metadata (schema, table, and column structure info).
Everything works as described in the docs, but the same code fails for someone who has a paid version of Databricks. Their admin can see the JDBC connection in the logs, and it shows up as successful.
This is the error trace:
[Databricks][JDBCDriver](500594) Error calling GetSchemas API call. Error code from server: 0. Error message from server: TStatus(statusCode:ERROR_STATUS, infoMessages:[*org.apache.hive.service.cli.HiveSQLException:Error
operating GET_SCHEMAS The TCP/IP connection to the host 10.202.34.40, port 1433 has failed. Error: "Connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".:168:167, org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$:hiveOperatingError:HiveThriftServerErrors.scala:67, org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$:hiveOperatingError:HiveThriftServerErrors.scala:61, org.apache.spark.sql.hive.thriftserver.SparkAsyncOperation$$anonfun$onError$1:applyOrElse:SparkAsyncOperation.scala:210, org.apache.spark.sql.hive.thriftserver.SparkAsyncOperation$$anonfun$onError$1...
To me that looks like Databricks can't access its own Hive metastore. I know this other person uses Unity Catalog, though. Maybe the Hive message isn't related?
I've tried every JDBC URL parameter I could find, in every combination of options I could think of.
Anyone know what's going on?
Thanks!
3 weeks ago
Root Cause of Error
Different Metastores
The trace shows the cluster timing out while trying to reach a SQL Server database at 10.202.34.40:1433 (1433 is SQL Server's default port) — most likely an external Hive metastore configured on the paid workspace. Since that workspace's metadata actually lives in Unity Catalog, the legacy metastore path has to be bypassed.
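The unreachable host and port are embedded in the trace's message text; a quick sketch pulling them out (the snippet is abbreviated from the trace above):

```python
import re

# Abbreviated copy of the message text from the trace above.
trace = ('The TCP/IP connection to the host 10.202.34.40, port 1433 has failed. '
         'Error: "Connect timed out."')

m = re.search(r"host ([\d.]+), port (\d+)", trace)
host, port = m.group(1), int(m.group(2))
# Port 1433 is SQL Server's default, i.e. the cluster is trying to reach an
# external Hive metastore database rather than Unity Catalog.
print(host, port)
```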
Corrective Solution
Connect via Databricks SQL Warehouse
Use the JDBC endpoint from a SQL Warehouse (not the legacy Hive thrift server).
Example URL:
jdbc:databricks://<workspace-host>:443/default;transportMode=http;ssl=1;httpPath=/sql/1.0/warehouses/<warehouse-id>;AuthMech=3;UID=token;PWD=<personal-access-token>
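To rule out a malformed URL, it can help to assemble it from its parts so no parameter gets dropped. A minimal sketch — the hostname, warehouse id, and token below are placeholders:

```python
def databricks_jdbc_url(host: str, warehouse_id: str, token: str) -> str:
    """Build a Databricks SQL Warehouse JDBC URL with token auth (AuthMech=3)."""
    params = {
        "transportMode": "http",
        "ssl": "1",
        "httpPath": f"/sql/1.0/warehouses/{warehouse_id}",
        "AuthMech": "3",
        "UID": "token",   # literal string "token" when using a PAT
        "PWD": token,
    }
    tail = ";".join(f"{k}={v}" for k, v in params.items())
    return f"jdbc:databricks://{host}:443/default;{tail}"

# Placeholder values — substitute your own workspace host, warehouse id, and PAT.
url = databricks_jdbc_url("dbc-example.cloud.databricks.com", "1234567890abcdef", "dapiXXXX")
```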
Use the Official Databricks JDBC Driver
Download from Databricks docs.
Don’t use generic Hive/Spark JDBC drivers.
Verify Unity Catalog Permissions
The user running JDBC must have:
GRANT USE CATALOG ON CATALOG <catalog> TO `<user>`;
GRANT USE SCHEMA ON SCHEMA <catalog>.<schema> TO `<user>`;
Without these, getSchemas() will fail even on the right endpoint.
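If several users need access, a small helper can emit the pair of grants per user — the function name and example values here are illustrative:

```python
def uc_metadata_grants(catalog: str, schema: str, user: str) -> list:
    """Return the minimum Unity Catalog grants that JDBC metadata calls need."""
    return [
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{user}`;",
        f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO `{user}`;",
    ]

# Example: print the statements for one user (values are placeholders).
for stmt in uc_metadata_grants("main", "default", "jdbc_user@example.com"):
    print(stmt)
```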
Test with Basic Queries
Run:
SHOW CATALOGS;
SHOW SCHEMAS IN <catalog>;
SHOW TABLES IN <catalog>.<schema>;
If these succeed, JDBC + Unity Catalog is configured correctly.
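The three checks are easy to wrap into one smoke test that works with any DB-API cursor (for example one from the databricks-sql-connector package); the helper name is my own:

```python
def smoke_test(cursor, catalog: str, schema: str) -> dict:
    """Run the three metadata checks and return a row count per statement."""
    checks = [
        "SHOW CATALOGS",
        f"SHOW SCHEMAS IN {catalog}",
        f"SHOW TABLES IN {catalog}.{schema}",
    ]
    counts = {}
    for sql in checks:
        cursor.execute(sql)        # raises here if endpoint or permissions are wrong
        counts[sql] = len(cursor.fetchall())
    return counts
```

Pass it the cursor from your JDBC/DB-API connection; any statement that fails pinpoints the missing grant or the wrong endpoint.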
Let me know if the above works, and if you find this useful please mark it as an accepted solution.
3 weeks ago
Thank you for the quick reply, but I've done all the mentioned steps, including the grants. I got the URL base from the SQL Warehouses menu -> Connection details. We are using token auth (AuthMech=3), SSL is on, and transport mode is HTTP.
3 weeks ago - last edited 3 weeks ago
Next Steps
Confirm which compute you're connecting to, and switch to a SQL Warehouse if you aren't on one already.
Make sure the warehouse runs Databricks Runtime 13.3 LTS or later.
Restart the SQL Warehouse after any major configuration change, especially after attaching it to Unity Catalog.
Test with basic queries after these changes:
SHOW CATALOGS;
SHOW SCHEMAS IN <catalog>;
SHOW TABLES IN <catalog>.<schema>;
If these steps do not resolve the issue, the next things to try are capturing detailed driver logs and checking edge cases with different user accounts.
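For the driver logs: as far as I know, the Databricks (Simba-based) JDBC driver accepts LogLevel and LogPath as URL properties, with LogLevel=6 being TRACE — check the driver docs to confirm. A small sketch appending them (the log directory is a placeholder):

```python
def with_driver_logging(jdbc_url: str, log_dir: str, level: int = 6) -> str:
    """Append the driver's LogLevel/LogPath properties to an existing JDBC URL."""
    return f"{jdbc_url};LogLevel={level};LogPath={log_dir}"

# Placeholder base URL and log directory.
url = with_driver_logging(
    "jdbc:databricks://host:443/default;transportMode=http;ssl=1",
    "/tmp/jdbc-logs",
)
```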
3 weeks ago
Tried these as well:
3 weeks ago
The posting rules are axing out the whole strings, so here's a short view of the parameters at the end of the URL:
Tried these as well: