- 3404 Views
- 2 replies
- 1 kudos
We have been successfully managing access to our Unity Catalogs using the databricks_grant resource in Terraform. Now we want to enable the Rudderstack integration for Databricks, but that does not support Unity Catalog and instead puts files inside...
Latest Reply
@Mattias P: Unfortunately, it is not currently possible to manage access to the Hive Metastore catalog (or other external metastores) using the databricks_grant resource in Terraform. This is because the databricks_grant resource is specifically des...
1 More Replies
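Since the reply above notes that databricks_grant only targets Unity Catalog securables, one common workaround for the legacy hive_metastore is to issue plain SQL GRANT statements (legacy table ACLs) from a notebook or job. Below is a minimal sketch of generating such statements; the table and group names are hypothetical examples, not from the thread.

```python
# Sketch: generating legacy table-ACL GRANT statements for hive_metastore
# objects, since databricks_grant targets Unity Catalog securables.
# The table and principal names below are hypothetical examples.

def build_grant(privilege: str, table: str, principal: str) -> str:
    """Build a SQL GRANT statement for a hive_metastore table."""
    return f"GRANT {privilege} ON TABLE {table} TO `{principal}`"

statements = [
    build_grant("SELECT", "hive_metastore.default.events", "data-readers"),
    build_grant("MODIFY", "hive_metastore.default.events", "data-writers"),
]

for stmt in statements:
    print(stmt)
    # On a cluster with table access control enabled, each statement could
    # be executed with spark.sql(stmt).
```

The Databricks Terraform provider also has a databricks_sql_permissions resource aimed at legacy table ACLs, which may be worth evaluating before scripting raw GRANTs.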
- 2326 Views
- 7 replies
- 5 kudos
Hello, I'm confused about the documentation on privilege types when using HMS. The following page is supposed to talk about HMS: https://docs.databricks.com/sql/language-manual/sql-ref-privileges-hms.html but it also mentions READ FILES: Query files directly us...
Latest Reply
Hi @Chris Nawara, I'm sorry you could not find a solution to your problem in the answers provided. Our community strives to provide helpful and accurate information, but sometimes an immediate solution may only be available for some issues. I suggest ...
6 More Replies
- 3495 Views
- 4 replies
- 5 kudos
I tried to create a view in hive_metastore.default which would access a table from a different catalog. Is there any chance to do so? E.g.:
create view myTest as select * from someCatalog.someSchema.someTable
Latest Reply
I suggest the links below will help you. For exposing multiple catalogs from your hive_metastore: 1) Create 3 catalogs, one for each environment, under a single metastore using Unity Catalog. 2) Now expose the Unity Catalog using Delta Sharing to your BI ap...
3 More Replies
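For the cross-catalog view question above, the DDL itself is straightforward; whether the view resolves depends on the cluster's access mode and Unity Catalog support. A minimal sketch of composing the statement (catalog, schema, and table names are the hypothetical ones from the question):

```python
# Sketch: composing a CREATE VIEW statement in hive_metastore.default that
# selects from a table in another catalog. Whether this resolves at query
# time depends on the cluster being Unity Catalog-enabled.

def create_view_sql(view: str, source: str) -> str:
    """Build a CREATE VIEW ... AS SELECT * statement."""
    return f"CREATE VIEW {view} AS SELECT * FROM {source}"

sql = create_view_sql(
    "hive_metastore.default.myTest",
    "someCatalog.someSchema.someTable",
)
print(sql)
# In a notebook this would be executed with spark.sql(sql).
```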
- 580 Views
- 0 replies
- 2 kudos
We are setting up Databricks on our AWS account using a customer-managed VPC and enabling PrivateLink. I can see the private endpoints for the frontend and backend, with which we could spin up our EC2 instances. But the problem we are facing is to connec...
by Shafi • New Contributor III
- 2999 Views
- 4 replies
- 6 kudos
Hi, I tried to create a Delta table from a Spark DataFrame using the command below:
destination_path = "/dbfs/mnt/kidneycaredevstore/delta/df_corr_feats_spark_4"
df_corr_feats_spark.write.format("delta").option("delta.columnMapping.mode", "name").option("path"...
Latest Reply
Pat • Honored Contributor III
Hi @Shafiul Alam, who gave those names to the columns? You can rename your columns, replacing spaces / special characters, for example:
%python
import re
list_of_columns = df_corr_feats_spark.columns
renamed_list_of_columns = [re.sub(r'[^0-9a-zA-Z]+', "_", c) for c in list_of_columns]
3 More Replies
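The regex approach in the reply above can be tried outside Spark as plain Python. A runnable sketch, using hypothetical column names in place of df_corr_feats_spark.columns:

```python
import re

# Sketch: sanitizing DataFrame column names so a Delta write does not need
# column mapping. The list below is a hypothetical stand-in for
# df_corr_feats_spark.columns.
list_of_columns = ["patient id", "creatinine (mg/dL)", "eGFR%"]

# Collapse every run of non-alphanumeric characters into "_", then trim
# any leading/trailing underscores left behind.
renamed_list_of_columns = [
    re.sub(r"[^0-9a-zA-Z]+", "_", c).strip("_") for c in list_of_columns
]
print(renamed_list_of_columns)  # ['patient_id', 'creatinine_mg_dL', 'eGFR']
```

On a real DataFrame the renamed list could then be applied with df_corr_feats_spark.toDF(*renamed_list_of_columns) before writing.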
- 3129 Views
- 6 replies
- 4 kudos
Hi all, I wanted to check if anyone has made an attempt to exploit the Hive Metastore of Databricks for lineage. For example, I loaded the metadata of 2 Databricks databases using the Collibra Marketplace-provided Databricks driver. Here is the scenario - Data...
Latest Reply
@Chetan Kardekar @Kaniz Fatma yes, I still need a standard way (through SQL) to access the Hive Metastore.
5 More Replies
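One SQL-based pattern for the metadata access asked about above is to enumerate objects with SHOW TABLES and then describe each one. A sketch written against an injected run_sql callable, so the same function works with spark.sql on a cluster or with a stub locally; the database and table names are hypothetical:

```python
# Sketch: pulling Hive Metastore metadata through plain SQL, the kind of
# access the thread asks about. `run_sql` stands in for spark.sql(...).collect()
# (or a JDBC call from a cataloging tool); names are illustrative only.

def collect_table_names(run_sql, database: str):
    """Return fully qualified table names for one database via SHOW TABLES."""
    rows = run_sql(f"SHOW TABLES IN {database}")
    return [f"{database}.{row['tableName']}" for row in rows]

# Stub standing in for a real cluster connection, for local experimentation.
def fake_run_sql(stmt):
    assert stmt.startswith("SHOW TABLES")
    return [{"tableName": "patients"}, {"tableName": "labs"}]

print(collect_table_names(fake_run_sql, "kidney_dev"))
# ['kidney_dev.patients', 'kidney_dev.labs']
```

From there, per-table column metadata could be gathered the same way with DESCRIBE TABLE statements and fed to a lineage tool.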