Hi guys, how are you? How can I access tables outside of Databricks (hive metastore)? I have a Python script on my local machine, but I need to access tables stored in Databricks (hive metastore). How can I do that? Any ideas? Thank you!
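One common way to reach hive_metastore tables from a local script is the Databricks SQL Connector for Python, pointed at a SQL warehouse or cluster. A minimal sketch, assuming you have a server hostname, HTTP path, and personal access token from your workspace (every value below is a placeholder, not a real endpoint):

```python
def table_query(catalog, schema, table, limit=10):
    """Build a fully qualified SELECT against a metastore table."""
    return f"SELECT * FROM {catalog}.{schema}.{table} LIMIT {limit}"


def fetch_rows(host, http_path, token, query):
    """Run `query` against a Databricks workspace from a local machine.

    Requires `pip install databricks-sql-connector`; host, http_path, and
    token come from your own workspace (placeholders only -- supply yours).
    """
    from databricks import sql  # imported lazily: only needed when connecting
    with sql.connect(server_hostname=host, http_path=http_path,
                     access_token=token) as conn:
        with conn.cursor() as cur:
            cur.execute(query)
            return cur.fetchall()


# Example query you would pass to fetch_rows(...) with your credentials:
print(table_query("hive_metastore", "default", "my_table"))
```

`fetch_rows` is only defined here, not called, since it needs real workspace credentials.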
I am trying to assign my databricks_metastore in Terraform and I get the following error as output: `Error: cannot create metastore assignment: Workspace is not in the same region as metastore. Workspace region: Unknown, Metastore region: us-e...
The error message you received indicates that the Databricks workspace region is different from the region where the metastore is located. The message shows that the workspace region is "Unknown", which means that Databricks is unable to determine th...
Yes, Delta Sharing does support AWS PrivateLink. According to the official documentation, using AWS PrivateLink is a recommended way to enable private connectivity between clusters on the data plane and core services on the control plane within the D...
Hi everyone, I configured my workspace with Unity Catalog; however, ever since, I have been unable to add users to my workspace. The issue is not with the process of adding a user itself, but that the added user does not receive the invitation email ...
Hi @pandas, I faced a similar issue and, after raising a support ticket with Databricks, got to know the root cause (based on the response received). In case your email address has issues receiving the email when the invite is sent, the ema...
If I have multiple cluster-scoped init scripts, can I guarantee the order that these scripts are run in? I have to create a folder on the workers and other scripts copy files into that folder.
During the UC upgrade, we are required to migrate from workspace-local groups to account groups. We are unable to add an account group without first deleting the workspace-local group (as they both have the same name). And, during this process, the ...
Hi @NOOR_BASHASHAIK ,
Suppose you are enabling identity federation on an existing workspace. In that case, you can use account groups and workspace-local groups side by side. Still, it is recommended to convert workspace-local groups into account grou...
Is anyone able to see Unity Catalog objects in DataGrip? I can only view hive_metastore objects. Nothing is mentioned here: https://docs.databricks.com/dev-tools/datagrip.html
Working together with someone else, we have been able to get Unity Catalog objects to appear in DataGrip. I am only able to connect to one catalog at a time using separate Data Sources, as I have not been successful loading them all from one. Under Data ...
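For anyone wiring up the same per-catalog setup: the Databricks JDBC driver accepts `ConnCatalog` and `ConnSchema` parameters that pin the session to one catalog and schema, which is effectively what a per-catalog DataGrip Data Source does. A small sketch that assembles such a URL (hostname and HTTP path below are made up; double-check parameter names against your driver version):

```python
def datagrip_jdbc_url(host, http_path, catalog, schema="default"):
    """Assemble a Databricks JDBC URL pinned to one Unity Catalog catalog.

    ConnCatalog / ConnSchema set the session's default catalog and schema.
    Host and http_path are placeholders -- use your workspace's values.
    """
    return (
        f"jdbc:databricks://{host}:443/default;transportMode=http;ssl=1;"
        f"AuthMech=3;httpPath={http_path};"
        f"ConnCatalog={catalog};ConnSchema={schema}"
    )


print(datagrip_jdbc_url("dbc-example.cloud.databricks.com",
                        "/sql/1.0/warehouses/abc123", "main"))
```

Creating one Data Source per catalog, each with its own `ConnCatalog`, matches the workaround described above.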
According to the docs, we should be able to use SYNC TABLE to 'upgrade' a table from a Hive metastore to Unity Catalog. We are using AWS Glue as our Hive Metastore, but tables created in Glue do not seem to be set up in a format that Databricks likes...
Hi @JameDavi_51481, Unity Catalog does not support tables created in AWS Glue using Hive SerDe and cannot be upgraded using the SYNC command. The recommended solution is to change the tables into Delta format and then issue the SYNC command to upgrad...
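To make that recommended path concrete, here is a sketch that builds the statements for one table: convert it to Delta, then SYNC it into Unity Catalog. Note that `CONVERT TO DELTA` only works in place for Parquet-backed tables; Hive SerDe tables (CSV, Avro, ...) have to be rewritten as Delta instead. All catalog/schema/table names below are hypothetical:

```python
def upgrade_statements(hms_schema, table, uc_catalog, uc_schema):
    """Statements to convert an HMS table to Delta and mirror it into UC.

    CONVERT TO DELTA works in place only for Parquet-backed tables;
    SerDe-based tables need a rewrite to Delta first.
    """
    src = f"hive_metastore.{hms_schema}.{table}"
    dst = f"{uc_catalog}.{uc_schema}.{table}"
    return [
        f"CONVERT TO DELTA {src}",
        f"SYNC TABLE {dst} FROM {src} DRY RUN",  # preview the upgrade first
        f"SYNC TABLE {dst} FROM {src}",
    ]


# On a Databricks cluster you would run each with spark.sql(stmt);
# here we just print them.
for stmt in upgrade_statements("default", "events", "main", "default"):
    print(stmt)
```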
and situation? I am currently trying to come up with a way to deploy Databricks with Terraform in a multi-region, multi-tenant environment. I am not talking about simple cases like this (https://docs.databricks.com/data-governance/unity-catalo...
Is it possible to sign DBFS file URLs via a service principal for temporary public access? What I actually want is to integrate Databricks into my app platform. For that purpose, I need to sign DBFS file URLs from my platform (ec2) ...
Signing a file is generally handled by the underlying cloud storage. However, there might be better ways of doing this depending on what you want to share. If you want to share structured data, you can save it as a Delta table in the Me...
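To illustrate what "signing is handled by the underlying storage" means mechanically: pre-signed URLs (S3 pre-signed URLs, ADLS SAS tokens, etc.) embed an expiry plus an HMAC over the path and expiry, so the storage service can validate a request without any lookup. A generic stdlib-only sketch of the idea (this is not the actual S3 or DBFS signing protocol; secret and paths are made up):

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

SECRET = b"demo-secret"  # stand-in for a key the signing service would hold


def sign_url(base_url, path, ttl_seconds=300, now=None):
    """Return base_url+path with an expiry and an HMAC signature appended."""
    now = int(time.time()) if now is None else now
    expires = now + ttl_seconds
    sig = hmac.new(SECRET, f"{path}:{expires}".encode(),
                   hashlib.sha256).hexdigest()
    return f"{base_url}{path}?{urlencode({'expires': expires, 'sig': sig})}"


def verify(path, expires, sig, now=None):
    """Check the signature and that the link has not expired yet."""
    now = int(time.time()) if now is None else now
    expected = hmac.new(SECRET, f"{path}:{expires}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig) and now < expires


print(sign_url("https://files.example.com", "/exports/report.csv", now=1000))
```

Tampering with the path or letting the expiry pass makes `verify` fail, which is the property the cloud providers' real signing schemes give you.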
Starting DBR 12.1 you can run the UNDROP command to undrop a managed or external table. The table must be in Unity Catalog for this feature to work. See https://docs.databricks.com/sql/language-manual/sql-ref-syntax-ddl-undrop-table.html for more det...
We have set up SCIM with Okta at the account level, set up Unity Catalog, and are in the process of migrating groups from workspace-local to account-level. I have an instance profile that was assigned to a workspace-local group. Using `databricks_gro...
Retried this using `databricks_group_role` after the `1.210` release of the `databricks/databricks` provider. This worked with an account-level group using the workspace provider and credentials.
We have a problem with users having permissions to create tables in the legacy hive_metastore. This only happens when using a personal cluster and when the table location is not set. The default catalog has been set to main in the workspace, and users ...
I have a system where the data is governed by (A)AD groups. How do I create personal user sandboxes such that a user can grant permission to another user, but only if that user has access to the originating data? I can hear myself that this sounds a b...
Dear Team, kindly help me change the default EBS volume type from GP2 to GP3 in my Job Compute and All-Purpose Compute. As per the documentation, the EBS volume configuration is not available under Instance. I have attached screenshots for reference. Regards, Sridhar
Hi @667572
We haven't heard from you since the last response from @Kaniz, and I was checking back to see if her suggestions helped you.
Or, if you have a solution, please share it with the community, as it can be helpful to others.
Also, P...