12-15-2021 07:07 AM
Hi guys, how are you?
How can I access tables stored in Databricks (hive metastore) from outside Databricks?
I have a Python script on my local machine, but it needs to read tables stored in Databricks (hive metastore). How can I do that? Any ideas?
Thank you guys
12-15-2021 07:20 AM
You can use the JDBC/ODBC drivers https://docs.databricks.com/integrations/bi/jdbc-odbc-bi.html - the disadvantage is that, in the standard setup, a general-purpose compute cluster has to be running.
There is also a serverless SQL endpoint in public preview - you need to ask Databricks to enable it.
Also, when your table is stored on a storage mount (Azure Blob, S3, ADLS), many tools (Power BI, Data Factory) can load it as a dataset.
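For example, here is a minimal sketch of querying a hive metastore table from a local Python script with the Databricks SQL Connector for Python (pip install databricks-sql-connector). The hostname, HTTP path, token, and table name are placeholders - take the real values from your cluster's or SQL endpoint's "Connection details" tab and a personal access token.

```python
from databricks import sql

# Placeholder connection details - replace with your own workspace values
with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/endpoints/abcdef1234567890",
    access_token="dapiXXXXXXXXXXXXXXXX",
) as connection:
    with connection.cursor() as cursor:
        # Query a table registered in the hive metastore
        cursor.execute("SELECT * FROM my_database.my_table LIMIT 10")
        for row in cursor.fetchall():
            print(row)
```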
12-16-2021 03:51 AM
@Hubert Dudek the serverless SQL endpoint seems like a good choice to me
12-16-2021 12:17 AM
Using ODBC as Hubert mentioned is the easiest way.
Besides Databricks and Databricks SQL, there are also other options like Azure Synapse Serverless, Presto, etc.
They all serve data as tables.
Python also has a Parquet reader, e.g. using pyarrow (see the sketch below).
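A minimal sketch of the pyarrow route, assuming the table is backed by Parquet files you can reach from your machine (the path is a placeholder; for cloud storage you would pair this with the matching fsspec/adlfs/s3fs filesystem and credentials):

```python
import pyarrow.dataset as ds

# Point at the directory of Parquet files that backs the table (placeholder path)
dataset = ds.dataset("/local/copy/of/table", format="parquet")

# Read into an Arrow table, then convert to pandas for local processing
df = dataset.to_table().to_pandas()
print(df.head())
```

Note that reading the files directly bypasses the hive metastore entirely, so you only get the raw data, not the table definitions or permissions.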
12-16-2021 03:52 AM
Thank you @Werner Stinckens