I'm trying to store MLlib model instances in Unity Catalog Volumes. I think volumes are a great way to keep things organized. I can save to a volume without any issues, and I can access the data using spark.read and with plain Python open(). However, when I ...
Just to add: if the ML model is saved and then loaded within the same execution, calling load() does not raise the mentioned exception. Copying the model directory from the UC volume to ephemeral storage attached to the driver node is also a w...
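The copy-to-driver workaround mentioned above can be sketched like this (the volume path and model class are placeholders, not from the thread; only the copy step is shown runnable):

```python
import shutil
from pathlib import Path

def copy_to_driver(volume_path: str, local_dir: str = "/tmp/models") -> str:
    """Copy a saved model directory from a UC volume path to driver-local storage.

    volume_path is expected to look like /Volumes/<catalog>/<schema>/<volume>/<model_dir>
    (the exact layout is an assumption for illustration).
    """
    dest = Path(local_dir) / Path(volume_path).name
    if dest.exists():
        shutil.rmtree(dest)  # replace any stale copy
    shutil.copytree(volume_path, dest)
    return str(dest)

# Usage on a Databricks driver (hypothetical paths and model class):
# local_path = copy_to_driver("/Volumes/main/ml/models/my_lr_model")
# model = LogisticRegressionModel.load(f"file:{local_path}")
```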
I have a spreadsheet containing table and column descriptions (comments). Is there a way to upload this against the schema in Unity Catalog? Basically, instead of running ALTER TABLE <> ALTER COLUMN <> COMMENT "description" for every colum...
Hi @prasad_vaze , Let’s explore how you can manage column comments in the Unity Catalog within Azure Databricks.
Unity Catalog Overview:
The Unity Catalog is a fine-grained governance solution for data and AI on the Databricks platform. It simplifie...
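One scripted approach to the question above (a sketch, not from the thread): read the spreadsheet with pandas and emit one ALTER TABLE ... ALTER COLUMN ... COMMENT statement per row. The column names `table_name`, `column_name`, and `comment` are assumptions about the spreadsheet layout.

```python
import pandas as pd

def build_comment_statements(df: pd.DataFrame) -> list:
    """Turn rows of (table_name, column_name, comment) into ALTER TABLE statements."""
    stmts = []
    for row in df.itertuples(index=False):
        # Escape single quotes so the comment stays a valid SQL string literal.
        comment = row.comment.replace("'", "\\'")
        stmts.append(
            f"ALTER TABLE {row.table_name} "
            f"ALTER COLUMN {row.column_name} COMMENT '{comment}'"
        )
    return stmts

# In a Databricks notebook you would then run each statement
# (the CSV path is hypothetical):
# df = pd.read_csv("/Volumes/main/meta/uploads/column_comments.csv")
# for stmt in build_comment_statements(df):
#     spark.sql(stmt)
```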
Hi All, I have created a new Unity Catalog (UC) metastore (new metastore ID) in West Europe and now I want to import all catalogs/schemas/tables from the old metastore's (West Europe) container to the new UC metastore. Note: copying all tables from the old...
Hi @Barney, To import all catalog/schema/tables from the old metastore to the new Unity Catalog (UC) metastore, you can use the Data Explorer upgrade wizard provided by Databricks.
Here are the steps:
1. Open the Data Explorer by clicking Data in th...
To list all the shares in the workspace, the API does not support include_shared_data the way the get-share-details endpoint does. Is there any way we can list the shares together with their objects? https://docs.databricks.com/api/azure/workspace/shares/list
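A common workaround (a sketch, assuming the Databricks SDK for Python): list the shares first, then call get() per share with include_shared_data=True and merge the results. The merge logic is factored out so it can be shown independently of a live workspace.

```python
from typing import Callable, Iterable

def shares_with_objects(list_shares: Callable[[], Iterable],
                        get_share: Callable[[str], object]) -> list:
    """List shares, then fetch each share's detail (which includes its objects)."""
    return [get_share(s.name) for s in list_shares()]

# With the real SDK (method names assumed from the public API docs):
# from databricks.sdk import WorkspaceClient
# w = WorkspaceClient()
# detailed = shares_with_objects(
#     lambda: w.shares.list(),
#     lambda name: w.shares.get(name, include_shared_data=True),
# )
```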
I can't create a metastore. Getting an error "This region already contains a metastore. Only a single metastore per region is allowed." But that was my first time creating it. Afterwards I have checked if it was region specific... but that was also n...
RESOLVED: Open the Databricks Workspace --> Admin Settings --> Workspace Settings --> scroll to Storage --> click Purge for 'Permanently purge workspace storage' and 'Permanently purge all revision history'. FYI: it may work by just purging the workspace stor...
Hello community, I'm trying to create a Unity Catalog metastore in Azure Databricks, but I'm facing an issue: while creating it, an error says the metastore already exists, when in reality this is the first time I'm creating one. In the background it created the metastore as half...
Hi @venkateshkallam, If you are still facing the issue, please try purging storage & revisions from the Databricks Workspace. You can do this by going to Admin Settings --> Storage --> Purge. I did it for my workspace where residual files may be causing th...
Hi all, I got an internal server error (500) when creating the metastore in Azure. This is the endpoint that throws the 500 error: https://accounts.azuredatabricks.net/api/2.0/accounts/4bfcxxxx-bfea-483f-b59b-c2b2e2xxxx4/metastores/0b056xxx-6552-4bed-96c...
Recently it has not been possible to create a metastore; the error "Internal Server Error" appears, despite the Managed Identity being authorized as Storage Blob Data Contributor. Apparently the error has started showing up for a few more people recently, ...
Hi all, I got a similar error to the one described in this discussion: an internal server error (500) on https://accounts.azuredatabricks.net/api/2.0/accounts/4bfcxxxx-bfea-483f-b59b-c2b2e2xxxx4/metastores/0b056xxx-6552-4bed-96cd-69bab2xxxx62/storage-credential...
I am interested to hear from anyone who has set up the Security Analysis Tool (SAT) on a GCP-hosted Databricks environment. I am in the process of getting the tool set up and I'm experiencing issues running the security_analysis_initializer notebook. Th...
Thanks @Kaniz, I have been able to get past this error by recreating the cluster with an absolutely barebones config. It was potentially a custom configuration (unknown at this time) that was causing the failure. I will try to reproduce it once I get s...
When I run CREATE OR REPLACE VIEW on an existing view in Unity Catalog, the grants that were made on that object are removed. This seems like a bug. Is it intentional or not?
How to replicate:
1. Create the view.
2. Run the CREATE OR REPLACE statement: ...
Hi @vmpmreistad, It appears that the issue you're facing is a known behaviour in Databricks when you execute a CREATE OR REPLACE VIEW statement on an existing view. This action overwrites the existing view definition, including any previously granted...
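Given that behaviour, a common workaround (a sketch, not an official fix) is to capture the output of SHOW GRANTS before replacing the view and re-issue the GRANT statements afterwards. The (principal, privilege) pairs and the SHOW GRANTS column names used in the comments are assumptions about Unity Catalog's output.

```python
def regrant_statements(grants: list, view_name: str) -> list:
    """Rebuild GRANT statements from (principal, privilege) pairs captured
    via SHOW GRANTS before the view was replaced."""
    return [
        f"GRANT {privilege} ON VIEW {view_name} TO `{principal}`"
        for principal, privilege in grants
    ]

# In a notebook (column names assumed):
# rows = spark.sql("SHOW GRANTS ON VIEW main.sales.v_orders").collect()
# grants = [(r.principal, r.action_type) for r in rows]
# spark.sql("CREATE OR REPLACE VIEW main.sales.v_orders AS SELECT ...")
# for stmt in regrant_statements(grants, "main.sales.v_orders"):
#     spark.sql(stmt)
```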
Hello, I am working with Unity Catalog in Azure Databricks. I have enabled the system schemas for my workspace but am unable to figure out a way to send these system table logs to ADLS, which I have mounted using the Azure Databricks Connector. Can someone ...
@Amber-26, you can try this approach. Also, if you want a graphical representation of everything, you can use the Lakehouse Monitoring feature: after enabling these tables, you can consume them into dashboards and analyze them there.
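One way to land the system table logs in ADLS (a minimal sketch; the table name and abfss container are placeholders, and this requires a UC-enabled cluster with read access to the system schemas and write access to the target path):

```python
def export_system_table(spark, table: str, target: str) -> None:
    """Append the current contents of a system table to an ADLS location as Delta."""
    (spark.read.table(table)
         .write.format("delta")
         .mode("append")
         .save(target))

# Hypothetical usage in a scheduled job:
# export_system_table(
#     spark,
#     "system.access.audit",  # one of the enabled system schemas
#     "abfss://logs@mystorageacct.dfs.core.windows.net/system/audit",
# )
```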
I get an error message saying "Error getting sample data" when I try to view sample data from a table in a schema I created in Unity Catalog. I dropped the schema and table and got a colleague to recreate them, and still the same message. We are both Uni...
Hi, we are trying to move some of our code from a ‘legacy’ cluster to a ‘Multi-node/Shared’ cluster so that we can start using Unity Catalog. However, we have run into an issue with some of our code, which calls stored procedures, on the new cluster....
We face the same issue. Our code depends heavily on stored procedures (best practice, no). Unfortunately, when moving to shared clusters and DBR 13.3, this does not work anymore:

driver_manager = spark._sc._gateway.jvm.java.sql.DriverManager
connection...
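Since shared/UC clusters block the private spark._sc._gateway JVM bridge, one commonly cited alternative is a native Python SQL driver such as pyodbc, installed on the cluster. A sketch under that assumption (server, database, and procedure names are placeholders):

```python
# import pyodbc  # install on the cluster first, e.g. %pip install pyodbc

def exec_proc_sql(proc: str, params: list) -> str:
    """Build a parameterized EXEC statement with one '?' placeholder per parameter."""
    placeholders = ", ".join("?" for _ in params)
    return f"EXEC {proc} {placeholders}" if params else f"EXEC {proc}"

# Hypothetical usage against Azure SQL:
# conn = pyodbc.connect(
#     "Driver={ODBC Driver 18 for SQL Server};"
#     "Server=myserver.database.windows.net;Database=mydb;"
#     "Uid=app_user;Pwd=...;Encrypt=yes;"
# )
# with conn.cursor() as cur:
#     cur.execute(exec_proc_sql("dbo.refresh_sales", ["2024-01-01"]), ["2024-01-01"])
#     conn.commit()
```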
Hello, I activated the "Table access control" option and changed the cluster access mode to Shared, with the aim of granting access rights to the tables without using Unity Catalog. Since this change I can't access DBFS files with Python:
Hi @Yahya24 ,
When you change the cluster access mode to "Shared" on Databricks, the cluster is associated with a Databricks-managed IAM role that is used to access AWS resources. This role might not have the necessary permissions to access DBFS res...