12-15-2021 07:07 AM
Hi all, how are you?
How can I access tables registered in the Databricks Hive metastore from outside Databricks?
I have a Python script running locally, and I need to query tables stored in Databricks (Hive metastore). Any ideas how to do this?
Thank you!
12-15-2021 07:20 AM
You can use the JDBC/ODBC drivers: https://docs.databricks.com/integrations/bi/jdbc-odbc-bi.html. The disadvantage is that, in the standard setup, a general-purpose compute cluster has to be running to serve the queries.
There is also a serverless SQL endpoint in public preview - you need to ask Databricks to enable it for your workspace.
Also, when your tables are stored on mounted cloud storage (Azure Blob, S3, ADLS), many tools (Power BI, Data Factory) can load them directly as a dataset.
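To make the JDBC/ODBC route concrete, here is a minimal sketch using the Databricks SQL Connector for Python (`pip install databricks-sql-connector`), which speaks the same protocol without needing a separate ODBC driver install. The hostname, HTTP path, token, and table name below are placeholders - copy the real values from your cluster's or SQL endpoint's "Connection details" tab:

```python
# Sketch: query a Databricks table from a local Python script.
# Requires: pip install databricks-sql-connector
try:
    from databricks import sql
except ImportError:
    sql = None  # connector not installed; the demo below is skipped

def fetch_rows(connection, query):
    """Execute a query over the connection and return all result rows."""
    with connection.cursor() as cursor:
        cursor.execute(query)
        return cursor.fetchall()

if __name__ == "__main__" and sql is not None:
    # Placeholder connection details: copy the real hostname, HTTP path,
    # and a personal access token from the "Connection details" tab of
    # your cluster or SQL endpoint in the Databricks UI.
    with sql.connect(
        server_hostname="<workspace-host>.azuredatabricks.net",
        http_path="/sql/1.0/endpoints/<endpoint-id>",
        access_token="<personal-access-token>",
    ) as connection:
        for row in fetch_rows(connection, "SELECT * FROM my_db.my_table LIMIT 10"):
            print(row)
```

Note that with a general-purpose cluster the query will only succeed while the cluster is running, which is the disadvantage mentioned above.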
12-16-2021 03:51 AM
@Hubert Dudek the serverless SQL endpoint seems like a good choice to me
12-16-2021 12:17 AM
Using ODBC, as Hubert mentioned, is the easiest way.
Besides Databricks and Databricks SQL, there are also other options like Azure Synapse Serverless, Presto, etc.
They all serve data as tables.
Python can also read the underlying Parquet files directly, e.g. with pyarrow.
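As a sketch of that last option: if the Parquet files behind a table live on cloud storage you can read them locally with pyarrow and skip the cluster entirely (for cloud paths you would additionally pass a filesystem from e.g. adlfs or s3fs; the paths below are illustrative, and the demo round-trips a local file):

```python
# Sketch: read a table's Parquet files directly with pyarrow.
# Requires: pip install pyarrow (plus adlfs/s3fs for cloud storage)
try:
    import pyarrow.parquet as pq
except ImportError:
    pq = None  # pyarrow not installed; the demo below is skipped

def read_parquet_table(path, filesystem=None):
    """Read a Parquet file or directory of Parquet files into a pyarrow Table."""
    return pq.read_table(path, filesystem=filesystem)

if __name__ == "__main__" and pq is not None:
    # Local demo: round-trip a small table through a Parquet file.
    import os
    import tempfile
    import pyarrow as pa

    tbl = pa.table({"id": [1, 2, 3], "name": ["a", "b", "c"]})
    with tempfile.TemporaryDirectory() as d:
        path = os.path.join(d, "part-0.parquet")
        pq.write_table(tbl, path)
        print(read_parquet_table(path).num_rows)
```

One caveat: reading files directly bypasses the metastore, so you need to know the storage path of the table yourself, and for Delta tables you would want a Delta-aware reader rather than plain Parquet scanning.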
12-16-2021 03:52 AM
thank you @Werner Stinckens