- 6689 Views
- 1 reply
- 0 kudos
Databricks Architect certification
Hello Team, I am planning to pursue the Databricks Architect certification. Could you please let me know which certification I should opt for? If you have any study material or relevant links, kindly share them. Your support would be highly appreciated....
The Architect credential is an accreditation, not a certification. Accreditations are less rigorous and less expensive than certifications. You didn't say which platform you are on, so here are links to the learning plans (which have the exams) for...
- 1868 Views
- 1 reply
- 1 kudos
Pros and cons of putting all Databricks workspaces (dev, qa, prod) under one metastore
Hi there, if we have separate workspaces for each environment, how should we go about structuring the metastore? What are the pros and cons of putting all workspaces under one metastore instead of having a separate metastore for each? Tha...
Hello Fatima, many thanks for your question. Please first note that if all the workspaces belong to the same account id and are on the same cloud region, they will all need to be associated with the same metastore as you can only have 1 metastore per...
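If dev/qa/prod do end up sharing one regional metastore, a common mitigation is to bind each environment's catalogs only to that environment's workspaces. A minimal sketch with the Databricks Python SDK, assuming a hypothetical catalog name and workspace ID, and assuming the catalog's isolation mode has already been set to isolated:

```python
# Hypothetical names/IDs; requires a principal allowed to manage catalog bindings,
# and the catalog must be in ISOLATED mode for the binding to be enforced.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up auth from env vars or ~/.databrickscfg
w.workspace_bindings.update(
    name="dev_catalog",                    # hypothetical catalog
    assign_workspaces=[1234567890123456],  # hypothetical dev workspace ID
)
```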
- 7718 Views
- 7 replies
- 8 kudos
Resolved! Databricks best practices for Azure storage account
Hello everyone, we are currently in the process of building Azure Databricks and have some doubts regarding best practices to follow for the Azure storage account we will be using to store data. Can anyone help me find best practices to follow for sto...
Thanks for sharing @Rjdudley @szymon_dybczak @filipniziol
- 3084 Views
- 1 reply
- 0 kudos
Budget Policy - Service Principals don't seem to be allowed to use budget policies
Objective: transfer an existing DLT pipeline to a new owner (service principal), with budget policies enabled. Steps to reproduce: created a service principal; assigned it membership of a group that is allowed to use a budget policy; ensured it has access to the ...
“Thanks for your question! I’m looking into this and will get back to you.”
- 931 Views
- 3 replies
- 1 kudos
Resolved! How to add existing recipient to existing delta share
I created a recipient in the Databricks console and also set up a Delta Share. Now, I’d like to link this existing recipient to the Delta Share. Is there a way to accomplish this using Terraform?
Hi @Naïm, thanks for your response. It seems your answer is helping me, but I'm facing another issue: the owner of my recipient is a group, not an individual user. I'm running this Terraform script using a service principal that is a member of that gr...
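For reference, the link is typically expressed as a grant of SELECT on the share to the recipient (in Terraform, via the provider's grants resource on the share). An equivalent sketch with the Databricks Python SDK, using hypothetical share and recipient names:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.catalog import PermissionsChange, Privilege

w = WorkspaceClient()
# Granting SELECT on the share to the recipient is what "links" the two.
w.shares.update_permissions(
    name="my_share",  # hypothetical existing share
    changes=[PermissionsChange(principal="my_recipient",  # hypothetical recipient
                               add=[Privilege.SELECT])],
)
```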
- 2041 Views
- 3 replies
- 0 kudos
Resolved! Control plane set-up
Dear all, in this video from Databricks, Azure Databricks Security Best Practices (https://www.youtube.com/watch?v=R1X8ydIR_Bc&t=623s), during 13:25 - 14:35 the presenter talks about the benefits of private endpoints. He makes the ...
Hi @noorbasha534, "Does this control plane then contain management services for several customers?" - Yes, the control plane has management services that are used across customers in the region, which is why the presenter says traffic can be isolated fro...
- 3519 Views
- 5 replies
- 1 kudos
policy_id in Databricks asset bundle workflow
We are using Databricks asset bundles for code deployment, and the biggest issue I am facing is that policy_id is different in each environment. I tried environment variables in Azure DevOps and also declaring the variables in databricks.yaml and ...
Solved by the lookup function: https://docs.databricks.com/en/dev-tools/bundles/variables.html#retrieve-an-objects-id-value
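A minimal sketch of that lookup in databricks.yaml, assuming a cluster policy with the same (hypothetical) name exists in every target workspace, so the ID resolves per environment at deploy time:

```yaml
variables:
  job_policy_id:
    description: Cluster policy ID, resolved per target workspace
    lookup:
      cluster_policy: "Shared Job Policy"  # hypothetical policy name

# ...then reference it wherever a job cluster is defined:
#   policy_id: ${var.job_policy_id}
```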
- 854 Views
- 1 reply
- 0 kudos
Is it possible to disable file download in Volumes interface?
The workspace security administration panel offers options to disable downloads in notebook folders and workspaces. However, it seems that even if all those downloads are disabled, the "Volumes" panel of Unity Catalog still offers a file download button. Is it p...
Hi @staskh, unfortunately I don't think it is currently possible to disable it via the UI. But volumes are governed by UC permissions, so you could set read/write permissions for an approved group of users and revoke the permissions of users who should...
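A minimal sketch of that permission-based workaround from a notebook, with hypothetical volume and group names:

```python
# Grant read access only to an approved group and revoke it from the broad
# built-in "account users" group; volume and group names are hypothetical.
spark.sql("GRANT READ VOLUME ON VOLUME main.raw.landing TO `approved-readers`")
spark.sql("REVOKE READ VOLUME ON VOLUME main.raw.landing FROM `account users`")
```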
- 3449 Views
- 1 reply
- 0 kudos
Issues with Delta Sharing API when using Service Principal Token
Hello, I am currently working with the Delta Sharing API and have encountered an issue when using a Service Principal token for authentication. The API call returns the following error: [CANNOT_INFER_EMPTY_SCHEMA] Can not infer schema from empty datase...
Please find the response below: 1) The Delta Sharing API supports both personal access tokens and service principal tokens for authentication. 2) Service principals need to be granted specific roles and permissions to access data. This includes assi...
- 3180 Views
- 1 reply
- 0 kudos
The documentation below shows how to install libraries on a cluster: https://docs.databricks.com/en/libraries/cluster-libraries.html#install-a-library-on-a-cluster
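Alternatively, a minimal sketch of the same operation with the Databricks Python SDK, using a hypothetical cluster ID and package:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import Library, PythonPyPiLibrary

w = WorkspaceClient()
w.libraries.install(
    cluster_id="0123-456789-abcdefgh",  # hypothetical cluster ID
    libraries=[Library(pypi=PythonPyPiLibrary(package="pandas==2.2.2"))],
)
```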
- 876 Views
- 3 replies
- 2 kudos
GCP Databricks GKE cluster with 4 nodes
I am working on setting up GCP Databricks and successfully created the first GCP Databricks workspace, but what I observed is that it is incurring additional charges even though I am using the 14-day free trial. It is a GKE cluster with 4 nodes which are spun up as part o...
Thank you @BigRoux. Just want to dig more into this: is there any way to reduce these nodes using the CLI or by creating a customer-managed network?
- 3366 Views
- 2 replies
- 2 kudos
Resolved! Databricks All-purpose compute Pricing
Hello, I am struggling with how to calculate the cost of my job cluster. My configuration is as below: if I have to run the above cluster 18 hours per day, in the Standard tier and East Asia region, how much will the cluster cost? Any help provi...
@karen_c Let me make a small correction. It seems that you have checked the option for Spot Instances, which should make the cost slightly lower. Please refer to the far-right column of the attached pricing table for more details. Additionally, you hav...
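The arithmetic itself is simple once the rates are known; a sketch with purely illustrative numbers (real DBU emission rates and VM prices depend on instance type, tier, and region, so check the Azure pricing page):

```python
# All rates below are hypothetical placeholders, not real prices.
hours_per_day = 18
days_per_month = 30
dbu_per_hour = 2.0      # DBUs the whole cluster emits per hour (depends on VM types)
usd_per_dbu = 0.40      # all-purpose compute rate, Standard tier (illustrative)
vm_usd_per_hour = 0.50  # combined VM cost per hour for all nodes (illustrative)

hours = hours_per_day * days_per_month
dbu_cost = hours * dbu_per_hour * usd_per_dbu
vm_cost = hours * vm_usd_per_hour
print(f"DBU cost: ${dbu_cost:,.2f}  VM cost: ${vm_cost:,.2f}  total: ${dbu_cost + vm_cost:,.2f}")
```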
- 4980 Views
- 5 replies
- 2 kudos
Resolved! Azure Databricks Unity Catalog - Cannot access Managed Volume in notebook
The problem: after setting up Unity Catalog and a managed volume, I can upload/download files to/from the volume in the Databricks workspace UI. However, I cannot access the volume from a notebook. I created an all-purpose compute and ran dbutils.fs.ls("/Vo...
I found the reason and a solution, but I feel this is a bug, and I wonder what the best practice is. When I enable the ADLS Gen2 account's public network access from all networks as shown below, I can access the volume from a notebook. However, if I enable the...
- 1457 Views
- 2 replies
- 1 kudos
Access to system.billing.usage tables
I have the Account, Marketplace, and Billing Admin roles, but I have visibility into the system.billing.list_prices table only. How do I get access to the system.billing.usage tables? The Databricks instance is on AWS. Thanks
Hi @Alberto_Umana, thanks for your response. I needed Metastore Admin permissions too. In the account console, I changed the Metastore Admin to a group and became a member of that group. With this, the other tables were visible. With this permission, using the gr...
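For anyone else hitting this, a minimal sketch of the grants a metastore admin can issue, with a hypothetical group name:

```python
# Run by a metastore admin; `billing-readers` is a hypothetical group.
spark.sql("GRANT USE CATALOG ON CATALOG system TO `billing-readers`")
spark.sql("GRANT USE SCHEMA ON SCHEMA system.billing TO `billing-readers`")
spark.sql("GRANT SELECT ON TABLE system.billing.usage TO `billing-readers`")
```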
- 2453 Views
- 3 replies
- 0 kudos
Best Practices for Daily Source-to-Bronze Data Ingestion in Databricks
How can we effectively manage source-to-bronze data ingestion from a project perspective, particularly when considering daily scheduling strategies using either Auto Loader or Serverless Warehouse COPY INTO commands?
No, it is not a strict requirement. You can have a single node job cluster run the job if the job is small.
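For the Auto Loader option, a minimal daily source-to-bronze sketch with hypothetical paths and target table; trigger(availableNow=True) processes all new files and then stops, which fits a daily job schedule:

```python
# Hypothetical source path, schema/checkpoint locations, and target table.
(spark.readStream
    .format("cloudFiles")  # Auto Loader
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/Volumes/main/bronze/_schemas/events")
    .load("abfss://landing@myaccount.dfs.core.windows.net/events/")
    .writeStream
    .option("checkpointLocation", "/Volumes/main/bronze/_checkpoints/events")
    .trigger(availableNow=True)  # process everything new, then stop
    .toTable("main.bronze.events"))
```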