12-15-2021 07:07 AM
Hi guys, how are you?
How can I access tables in the Databricks Hive metastore from outside Databricks?
I have a Python script in my local environment, but I need to access tables stored in Databricks (Hive metastore). Any idea how?
Thank you guys
12-15-2021 07:20 AM
You can use the JDBC/ODBC drivers https://docs.databricks.com/integrations/bi/jdbc-odbc-bi.html - the disadvantage is that, in the standard setup, a general-purpose compute cluster has to be running to serve the queries.
There is also a serverless SQL endpoint in public preview - you need to ask Databricks to enable it.
Also, when your tables are stored on storage mounts (Azure Blob, S3, ADLS), many tools (Power BI, Data Factory) can load them as a dataset.
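From a local Python script, the usual way to use that JDBC/ODBC connectivity is the `databricks-sql-connector` package, which speaks to a cluster or SQL endpoint over its HTTP path. A minimal sketch, assuming you have a workspace hostname, an HTTP path, and a personal access token (all placeholders below):

```python
def fetch_rows(server_hostname, http_path, access_token, query):
    """Run a SQL query against a Databricks cluster or SQL endpoint and
    return the result rows. Requires: pip install databricks-sql-connector
    """
    # Imported inside the function so the module can be loaded even
    # where the connector package is not installed.
    from databricks import sql

    with sql.connect(
        server_hostname=server_hostname,  # e.g. "adb-123...azuredatabricks.net" (placeholder)
        http_path=http_path,              # from the cluster/endpoint's Connection Details (placeholder)
        access_token=access_token,        # a personal access token (placeholder)
    ) as connection:
        with connection.cursor() as cursor:
            cursor.execute(query)
            return cursor.fetchall()

# Usage sketch (needs real credentials):
# rows = fetch_rows(
#     "adb-123.azuredatabricks.net",
#     "/sql/1.0/endpoints/abc123",
#     "dapi...",
#     "SELECT * FROM my_db.my_table LIMIT 10",
# )
```

The hostname, HTTP path, and token values here are illustrative; you find the real ones under the cluster or SQL endpoint's "Connection Details" tab in the workspace.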
12-16-2021 03:51 AM
@Hubert Dudek the serverless SQL endpoint seems like a good choice to me
12-16-2021 12:17 AM
Using ODBC, as Hubert mentioned, is the easiest way.
Besides Databricks and Databricks SQL, there are also other options like Azure Synapse Serverless, Presto, etc.
They all serve data as tables.
Python can also read the underlying Parquet files directly, e.g. with pyarrow.
12-16-2021 03:52 AM
thank you @Werner Stinckens