- 2549 Views
- 1 replies
- 0 kudos
I am trying to create an external table using a CSV file stored in ADLS Gen2. My account owner has created a storage credential and an external location. I am a Databricks user who has all privileges on the external location; when trying to create a tabl...
Latest Reply
Hi @Shubhanshu, To overcome the error and create the table, ensure that the client secret token associated with the Azure Active Directory (Azure AD) application service principal is not expired or invalid.
Here are the steps you can follow:
1. Open...
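For reference, once the service principal's client secret is valid again, creating the external table typically looks like the following. This is a hedged sketch: the table name, CSV options, and storage path are placeholders, not details from the original thread.

```sql
-- Assumes the account admin has already created the storage credential
-- and external location covering this path; all names are illustrative.
CREATE TABLE main.default.my_csv_table
USING CSV
OPTIONS (header = "true", inferSchema = "true")
LOCATION 'abfss://mycontainer@mystorageaccount.dfs.core.windows.net/data/csv/';
```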
- 2537 Views
- 2 replies
- 0 kudos
We extensively use Databricks (DBX) for creating Tableau visuals. Whenever DBX data sources are published on our self-hosted Tableau Server, we have to add the connection creds for the data source. These creds can be both personal email id-password o...
Latest Reply
Thanks for your response. While I understood most of it, I still have some doubts to clear, so I am explaining it again. Problem: we want users to create data sources using Databricks via their personal credentials in the local env (tableau desktop...
1 More Replies
by TonyUK • New Contributor II
- 5198 Views
- 7 replies
- 4 kudos
Hi, I have a new workspace that was converted to be controlled via Unity Catalog. There is no data stored here, as this is not in use yet, so I wanted to ask if it is safe to remove this without breaking the workspace? Thank you
Latest Reply
As things gravitate towards Unity Catalog, it would indeed be nice to have the ability to completely remove Hive from existing setups.
6 More Replies
- 2883 Views
- 1 replies
- 0 kudos
I have created an external table in Unity Catalog, and when describing it I am getting the following error: org.apache.hadoop.fs.FileAlreadyExistsException: Operation failed: "The specified path, or an element of the path, exists and its resource type ...
Latest Reply
Hi @Shubhanshu , The error message you're seeing is a FileAlreadyExistsException from the Hadoop File System (HDFS). This error typically occurs when there's an attempt to create a file or directory that already exists, and the operation being perfor...
- 2583 Views
- 4 replies
- 0 kudos
Hi Team, I am looking into implementing a functionality which allows me to create a ServicePrincipal, and I want to create a personal token for this ServicePrincipal using the Java SDK. While trying to do this I am getting this error: ```com.databricks.sdk.c...
Latest Reply
Hi, thank you for your response. How would Databricks know which service principal I want to grant the permission for?
3 More Replies
- 3105 Views
- 3 replies
- 1 kudos
Hi, as part of the data governance/authorization topic, we are working on automating the code for granting access at the CATALOG, SCHEMA, and TABLE levels in Unity Catalog. As USE CATALOG provides access at the catalog level to a user/group (whic...
Latest Reply
Hi @sandeephenkel23, Based on the information provided, if you want to grant a user access to only a particular table from a database/schema and not to all the tables, you should use the SELECT privilege on that particular table. Using USE SCHEMA or ...
2 More Replies
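The privilege chain described in that reply can be sketched as follows. The catalog, schema, table, and group names below are illustrative placeholders:

```sql
-- The user still needs USE CATALOG and USE SCHEMA to reach the table,
-- but SELECT is granted only on the one table, not the whole schema.
GRANT USE CATALOG ON CATALOG my_catalog TO `data_readers`;
GRANT USE SCHEMA ON SCHEMA my_catalog.my_schema TO `data_readers`;
GRANT SELECT ON TABLE my_catalog.my_schema.my_table TO `data_readers`;
```

USE CATALOG and USE SCHEMA by themselves expose no data; they only let the grantee traverse the namespace to objects on which they hold explicit privileges.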
- 2365 Views
- 1 replies
- 1 kudos
We are in the process of integrating the Unity Catalog with the workspaces across our organization. As a preliminary step, we need to designate a Databricks Account Admin. Our Azure tenant encompasses multiple subscriptions, each hosting distinct work...
Latest Reply
Hi @cayisi, The Databricks Account Admin role does not grant authority over individual subscriptions but extends control over all subscriptions within the tenant. This is because the Databricks Account represents a single entity that can include mult...
- 3044 Views
- 2 replies
- 2 kudos
I am attempting to execute the following command from a notebook on a runtime 10.4 cluster, but I'm encountering an error: "current user does not have privilege USAGE on Catalog". To provide some context, I am using DBR 10.4 specifically because I nee...
Latest Reply
Hi @drii_cavalcanti, The error you're encountering is related to the new USAGE privilege that is enforced on Databricks clusters running Databricks Runtime 7.3 LTS and above. This privilege is required to access an object in the database. To resolve ...
1 More Replies
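Under the legacy table access control model that reply refers to, the missing privilege is usually granted along these lines. This is a sketch; the database, table, and group names are placeholders:

```sql
-- USAGE alone grants no data access; it only allows the grantee to
-- perform actions within the object, so pair it with SELECT.
GRANT USAGE ON DATABASE my_db TO `analysts`;
GRANT SELECT ON TABLE my_db.my_table TO `analysts`;
```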
- 1636 Views
- 2 replies
- 0 kudos
Hi everyone, just wondering: if I copy the catalog folders (under the metastore storage) to a different S3 bucket, once I have a new metastore with a new ID, can I restore the tables as they were (<catalog_name>.<schema_name>.<table_name>) into the new me...
Latest Reply
Hi @drii_cavalcanti, copying the catalog folders to a different S3 bucket will not restore the tables in the new metastore. The COPY INTO command loads data from a source (like an S3 bucket or ADLS Gen2 container) into a table in a Databricks workspa...
1 More Replies
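For completeness, the COPY INTO command mentioned in the reply loads files into an existing table rather than re-registering metastore metadata; a minimal sketch with placeholder table and bucket names:

```sql
-- Idempotently loads new files from the source path into the table;
-- the target table must already exist in the metastore.
COPY INTO my_catalog.my_schema.events
FROM 's3://my-bucket/landing/events/'
FILEFORMAT = PARQUET
COPY_OPTIONS ('mergeSchema' = 'true');
```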
by Jas • New Contributor II
- 1838 Views
- 2 replies
- 2 kudos
In our Production tenant we have the majority of our workspaces in the North Europe region. We have disaster recovery set up, and these workspaces are in West Europe, so they will require 2 metastores. If Unity Catalog were to get implemented, is there a way to s...
Latest Reply
Thank you Kaniz - I **bleep** relay this information to my team and reach out if we have any issues when implementing.
1 More Replies
- 7980 Views
- 12 replies
- 12 kudos
Hello, I want to grant 'select' on some columns in my table, so I created a view, and now people can access my view but cannot access the table. However, they cannot do 'select' on the view, because they do not have privileges on the table. What can...
Latest Reply
To sum up, it is not possible to create a usable view based on a table that someone does not have permission on. It's a pity, because this option would be very useful to limit access to batch-fed tables. Thanks all for the help and your time! Regards, Łukasz
11 More Replies
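The column-restricted view pattern discussed in that thread looks like the following. This is a sketch with placeholder names; as the thread concludes, readers may still need privileges on the underlying table, depending on which access-control model the workspace uses.

```sql
-- Expose only non-sensitive columns through a view.
CREATE VIEW my_schema.customer_public AS
SELECT customer_id, region
FROM my_schema.customers;

GRANT SELECT ON VIEW my_schema.customer_public TO `analysts`;
```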
- 1266 Views
- 2 replies
- 1 kudos
Hi, a Unity metastore can currently only be provisioned once for each Azure region. I have three environments in Azure, each consisting of a separate Subscription and Resource Group (RG). Each RG contains a Databricks Workspace and a Storage Account holdi...
Latest Reply
Ask and ye shall receive: https://github.com/databricks/terraform-provider-databricks/releases/tag/v1.24.0 They've implemented the account-level UC API in Terraform now!
1 More Replies
- 2679 Views
- 3 replies
- 0 kudos
Our regular Databricks clusters are configured to use an external Hive metastore (Azure SQL Server). I am able to set the external Hive configuration in the Advanced Options Spark config setting within the cluster settings. In our setting...
Latest Reply
@RafaelGomez61 I had the same problem, and after some exploring I've configured it and also wrote a blog covering all the steps. You can have a look here: https://medium.com/@judy3.yang/how-to-configure-external-hive-meta-store-for-databricks-sql-wareh...
2 More Replies
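For SQL warehouses, external-metastore settings typically go into the workspace-level SQL admin "Data Access Configuration" rather than a cluster's Spark config. A hedged sketch of the usual keys follows; the JDBC URL, secret scope, and versions below are placeholders to adapt, not values from the thread:

```
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:sqlserver://myserver.database.windows.net:1433;database=hivemetastore
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.hadoop.javax.jdo.option.ConnectionUserName {{secrets/hive/user}}
spark.hadoop.javax.jdo.option.ConnectionPassword {{secrets/hive/password}}
spark.sql.hive.metastore.version 3.1.0
spark.sql.hive.metastore.jars maven
```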
- 7789 Views
- 7 replies
- 3 kudos
I'm looking to migrate onto Unity Catalog, but a number of my data ingestion notebooks throw SecurityException/whitelist errors for numerous spark.* functions. Is there some configuration setting I need to enable to whitelist the spark.* methods/functi...
Latest Reply
Hi @Jakub K, I'm sorry you could not find a solution to your problem in the answers provided. Our community strives to provide helpful and accurate information, but sometimes an immediate solution may only be available for some issues. I suggest provid...
6 More Replies
by Marra • New Contributor III
- 9104 Views
- 12 replies
- 18 kudos
Hi! So I've been looking into trying Unity Catalog, since it seems to add many great features. But one thing I can't get my head around is the fact that we can't (shouldn't?) use multiple metastores in the same region in UC. Let me explain my use case: we ha...
Latest Reply
I think the answer to this issue is to have separate accounts per environment. It would be better if Databricks introduced an Organizations feature, as per AWS.
11 More Replies