12-15-2021 07:07 AM
Hi guys, how are you?
How can I access tables outside of Databricks (Hive metastore)?
I have a Python script in my local environment, but I need to access tables stored in Databricks (Hive metastore). How can I do that? Any ideas?
Thank you guys
12-15-2021 07:20 AM
You can use the JDBC/ODBC drivers: https://docs.databricks.com/integrations/bi/jdbc-odbc-bi.html. The disadvantage is that, in the normal version, a general-purpose compute cluster has to be running.
In public preview there is also the serverless SQL endpoint - you need to ask Databricks to enable it.
Also, when your tables are stored on storage mounts (Azure Blob, S3, ADLS), many tools (Power BI, Data Factory) can load them as a dataset.
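A minimal sketch of the JDBC/ODBC route from a local Python script, using the Databricks SQL Connector for Python (`pip install databricks-sql-connector`). The hostname, HTTP path, token, and table name below are placeholders, not real values - substitute your workspace's settings:

```python
def sample_query(table: str, limit: int = 10) -> str:
    """Build a simple SELECT against a metastore table."""
    return f"SELECT * FROM {table} LIMIT {limit}"


def fetch_sample(host: str, http_path: str, token: str, table: str) -> list:
    """Fetch a few rows from a Hive-metastore table over a SQL endpoint or cluster.

    Requires the databricks-sql-connector package; the import is kept local so
    the rest of this module works without it installed.
    """
    from databricks import sql

    with sql.connect(
        server_hostname=host,   # e.g. the workspace host from the JDBC/ODBC tab
        http_path=http_path,    # the cluster's or SQL endpoint's HTTP path
        access_token=token,     # a Databricks personal access token
    ) as conn:
        with conn.cursor() as cursor:
            cursor.execute(sample_query(table))
            return cursor.fetchall()


# Usage (all arguments are hypothetical placeholders):
# rows = fetch_sample("<workspace-host>", "<http-path>", "<token>", "default.my_table")
```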
12-16-2021 03:51 AM
@Hubert Dudek the serverless SQL endpoint seems like a good choice to me
12-16-2021 12:17 AM
Using ODBC, as Hubert mentioned, is the easiest way.
Besides Databricks and Databricks SQL, there are also other options like Azure Synapse Serverless, Presto, etc.
They all serve data as tables.
Python also has a Parquet reader, e.g. via pyarrow.
12-16-2021 03:52 AM
thank you @Werner Stinckens