- 1864 Views
- 5 replies
- 2 kudos
Resolved! How to create Storage Credential using Service Principal [Azure]
As the document indicates, an Azure Databricks access connector is a first-party Azure resource that lets you connect managed identities to an Azure Databricks account. You must have the Contributor role or higher on the access connector resource in ...
- 2 kudos
Thank you, @szymon_dybczak. This is what I thought. After deploying the Databricks workspace, it automatically creates the Databricks-managed `Access Connector for Azure Databricks` in the Databricks-managed resource group. As I understand, I should c...
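For reference, a minimal sketch of creating such a storage credential with the Databricks SDK for Python, assuming the SDK is installed and already authenticated; the credential name and the access connector resource ID below are placeholders, not values from this thread:

```python
# Sketch only: create a Unity Catalog storage credential backed by the
# access connector's managed identity. The resource ID below is a placeholder.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.catalog import AzureManagedIdentityRequest

w = WorkspaceClient()  # uses your configured Databricks authentication

credential = w.storage_credentials.create(
    name="adls_credential",  # hypothetical credential name
    azure_managed_identity=AzureManagedIdentityRequest(
        access_connector_id=(
            "/subscriptions/<sub-id>/resourceGroups/<rg>"
            "/providers/Microsoft.Databricks/accessConnectors/<connector-name>"
        )
    ),
    comment="Credential backed by the workspace access connector",
)
print(credential.name)
```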
- 1779 Views
- 1 replies
- 1 kudos
Cluster Upsize Issue: Storage Download Failure Slow
Hi, we're currently experiencing the following issue across our entire Databricks workspace when starting a cluster, running a workflow, or upscaling a running cluster. The errors we receive on our AP clusters and job clusters are bel...
- 1 kudos
Hi @sdick_vg, the error points to connectivity issues when trying to reach Azure Storage. Have you maybe enabled any kind of firewall in your organization recently? Could you run, for example, code to test DNS resolution to your storage account (see the sketch below)? Have you m...
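For anyone hitting the same issue, a minimal sketch of such a DNS check, runnable from a notebook cell; the storage account name is a placeholder:

```python
# Sketch: resolve the storage account endpoint from the cluster to rule out DNS problems.
import socket

storage_account = "mystorageaccount"  # hypothetical; use your own account name
host = f"{storage_account}.dfs.core.windows.net"

try:
    addresses = sorted({info[4][0] for info in socket.getaddrinfo(host, 443)})
    print(f"{host} resolves to: {addresses}")
except socket.gaierror as err:
    print(f"DNS resolution failed for {host}: {err}")
```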
- 757 Views
- 1 replies
- 0 kudos
Error: PERMISSION_DENIED: AWS IAM role does
Hello, we are trying to set up a new workspace. However, we are getting the following error; the workspace failed to launch. Error: PERMISSION_DENIED: AWS IAM role does not have READ permissions on url s3://jk-databricks-prods3/unity-catalog/742920957025975. Pl...
- 0 kudos
Hey! I'm experiencing this with the latest Terraform release. Try 1.51.0 if you are deploying via TF; downgrading fixed this for me.
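If it helps to confirm the underlying permission problem independently of Terraform, here is a hedged boto3 sketch that assumes the cross-account role and attempts a read against the path from the error message; the role ARN is a placeholder, while the bucket and prefix are taken from the error:

```python
# Sketch: verify the IAM role can actually read the Unity Catalog path.
# The role ARN is a placeholder; bucket and prefix come from the error message.
import boto3

role_arn = "arn:aws:iam::111111111111:role/databricks-uc-access"  # hypothetical ARN
creds = boto3.client("sts").assume_role(
    RoleArn=role_arn, RoleSessionName="uc-read-check"
)["Credentials"]

s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
# A failure here reproduces the READ permission error Databricks reports.
resp = s3.list_objects_v2(Bucket="jk-databricks-prods3", Prefix="unity-catalog/", MaxKeys=5)
print([obj["Key"] for obj in resp.get("Contents", [])])
```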
- 711 Views
- 2 replies
- 0 kudos
table deployment (DDL) from one catalog to another
Hello. We have a development, a test, and a production environment. How do you generally deploy DDL changes? So, alter a table in development and apply to test, then production. E.g., table1 has column1, column2, column3. I add column4. I now want to deploy this ch...
- 0 kudos
Thanks. I'll step through this solution and see if I can get it working.
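For readers following along, a minimal sketch of applying the same additive DDL to each environment's catalog from a notebook or deployment job; the catalog, schema, and table names are placeholders:

```python
# Sketch: roll the same ADD COLUMNS change through dev, test, and prod catalogs.
# Names are placeholders; a real pipeline would usually run this per environment,
# with approvals in between, rather than in one loop.
ddl_template = "ALTER TABLE {catalog}.my_schema.table1 ADD COLUMNS (column4 STRING)"

for catalog in ["dev", "test", "prod"]:
    spark.sql(ddl_template.format(catalog=catalog))
```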
- 310 Views
- 1 replies
- 0 kudos
Testing and Issues Related to Admin Role Changes
Hello, I would like to ask a question regarding user permissions. Currently, all team members are admins. We now plan to change the admin roles so that only I and another user, A, will be admins. The other members will retain general usage permis...
- 0 kudos
Hi @Kaniz_Fatma, can you please help me delete my post, as I accidentally resubmitted it? Thank you.
- 1878 Views
- 2 replies
- 0 kudos
Resolved! Cannot use Terraform to create Databricks Storage Credential
Hi all, when I use Terraform in an Azure DevOps pipeline to create a Databricks storage credential, I get the following error. Has anybody encountered the same error before? Or any idea how to debug it? Error: cannot create storage credential: failed d...
- 0 kudos
I found the reason. Because I did not configure the `auth_type` for the Terraform Databricks provider, it uses the default auth type, `azure-cli`. However, in my pipeline, I did not log in to the Azure CLI using `az login`. Therefore, the authentication of t...
- 409 Views
- 0 replies
- 0 kudos
DNS resolution across vnet
Hi, I have created a new Databricks workspace in Azure with back-end private link. Settings: Required NSG rules - No Azure Databricks Rule. NSG rules for AAD and azfrontdoor were added as per the documentation. Private endpoint with subresource databric...
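A minimal sketch of a check that can be run from a VM in the peered VNet to confirm the workspace FQDN resolves to the private endpoint rather than a public address; the workspace hostname below is a placeholder:

```python
# Sketch: confirm the workspace URL resolves to a private IP across the peered VNet.
# The hostname below is a placeholder for your workspace URL.
import ipaddress
import socket

host = "adb-1234567890123456.7.azuredatabricks.net"  # hypothetical workspace URL
for addr in sorted({info[4][0] for info in socket.getaddrinfo(host, 443)}):
    kind = "private" if ipaddress.ip_address(addr).is_private else "public"
    print(f"{host} -> {addr} ({kind})")
```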
- 488 Views
- 1 replies
- 0 kudos
How to fully enable row-level concurrency on Databricks 14.1
Hey guys, I hope whoever's reading this is doing well. We're trying to enable row-level concurrency on Databricks 14.1. However, the documentation seems a bit contradictory as to whether this is possible, and whether the whole capability of row-level c...
- 0 kudos
Yes, your understanding is correct. As a best practice, use the 14.3 LTS DBR, since LTS releases are supported much longer than the six months for non-LTS releases.
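Since row-level concurrency relies on deletion vectors being enabled on the table, here is a minimal sketch of turning them on for an existing Delta table; the three-level table name is a placeholder:

```python
# Sketch: enable deletion vectors, a prerequisite for row-level concurrency.
# The table name is a placeholder.
spark.sql(
    "ALTER TABLE main.my_schema.my_table "
    "SET TBLPROPERTIES ('delta.enableDeletionVectors' = 'true')"
)
```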
- 296 Views
- 1 replies
- 0 kudos
Unable to publish notebook
Hello, I receive an error each time I attempt to publish my notebook. The error message does not provide any additional information on why it is occurring. I need to publish this for a bootcamp assignment, so please let me know what I can do to resol...
- 0 kudos
Try it from a different browser, or try it in incognito mode.
- 548 Views
- 2 replies
- 0 kudos
Allow non-admin users to view the driver logs from a Unity Catalog-enabled pipeline
We are trying to enable the option that allows non-admin users to read logs in the Databricks workspace, but we are not able to see an obvious way to check them. The documentation does not show, after enabling the below property, where to check the...
- 0 kudos
Do they see these links in the pipeline?
- 1399 Views
- 2 replies
- 0 kudos
Resolved! Help with Databricks SQL Queries
Hi everyone, I'm relatively new to Databricks and trying to optimize some SQL queries for better performance. I've noticed that certain queries take longer to run than expected. Does anyone have tips or best practices for writing efficient SQL in Data...
- 0 kudos
Hi @alexacas, the best thing would be to share the queries and table structures, but my general approach is: 1. Use partitioning/Z-ordering, or, if you can upgrade the runtime to 15.4, use liquid clustering, which is the newer optimization technique. 2. Make sure you ...
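To illustrate the first point, a minimal sketch of both options mentioned in the reply, with a hypothetical table and clustering column:

```python
# Sketch: the two layout-optimization options from the reply.
# Table and column names are placeholders; pick one approach per table.

# Option 1: Z-order existing files by a frequently filtered column.
spark.sql("OPTIMIZE main.sales.transactions ZORDER BY (customer_id)")

# Option 2 (newer runtimes): switch the table to liquid clustering instead.
spark.sql("ALTER TABLE main.sales.transactions CLUSTER BY (customer_id)")
spark.sql("OPTIMIZE main.sales.transactions")
```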
- 1411 Views
- 3 replies
- 0 kudos
Grant permissions to groups on catalogs linked to the same metastore
Hi everyone! I am configuring several projects using Databricks, and I have a question regarding permission management in Unity Catalog. Here's the situation: currently, I have two different Databricks resources in an Azure account, each with its resp...
- 0 kudos
You could bind the catalog to specific workspaces, making them accessible only from the workspaces they are bound to: https://docs.databricks.com/en/catalogs/binding.html In your example, if `my_catalog_2` is bound to `my_workspace_2`, a user in `my_workspac...
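The binding itself is configured through the account or workspace settings described in the linked doc; the catalog-level grants that sit alongside it look roughly like this sketch, where the catalog name follows the example above and the group name is a placeholder:

```python
# Sketch: grants for the project group on the bound catalog.
# The group name is hypothetical; the catalog name follows the example above.
spark.sql("GRANT USE CATALOG ON CATALOG my_catalog_2 TO `project_2_team`")
spark.sql("GRANT USE SCHEMA, SELECT ON CATALOG my_catalog_2 TO `project_2_team`")
```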
- 350 Views
- 1 replies
- 0 kudos
Databricks Free Trial - can I link it back from AWS to Azure?
I created a 14-day trial account on Databricks.com and linked it to my AWS account. I have not used the trial yet and want to link it to Azure instead. Question: can I do it, and if so, how? Remark: please note that I have not used the trial yet, and the only rele...
- 0 kudos
The easy way would be to create a new account in portal.azure.com and launch Azure Databricks with pay-as-you-go billing.
- 489 Views
- 0 replies
- 0 kudos
Implementing Databricks Persona in
Hi all, I am looking to implement "persona"-based access control across multiple workspaces for multiple user groups in Azure Databricks. Specifically: - I have a "DEV" workspace where the developer groups (Data Engineers and ML Engineer...
- 365 Views
- 0 replies
- 0 kudos
How to install private repository as package dependency in Databricks Workflow
I am a member of the development team in our company, and we use Databricks as a sort of ETL tool. We utilize Git integration for our program and run Workflows on a daily basis. Recently, we created another company-internal private Git repository and wan...
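One common pattern, shown as a sketch only with hypothetical secret scope, key, organization, repository, and tag names, is to keep a Git access token in a Databricks secret and pip-install the private package at the start of the job:

```python
# Sketch: install a private Git package from a job/notebook task using a token
# stored in a Databricks secret. Scope, key, org, repo, and tag are placeholders.
import subprocess
import sys

token = dbutils.secrets.get(scope="ci-tokens", key="git-pat")  # hypothetical secret
package = f"git+https://x-access-token:{token}@github.com/my-org/my-internal-lib.git@v1.2.0"
subprocess.check_call([sys.executable, "-m", "pip", "install", package])
```

Alternatively, the private package can be built as a wheel and attached to the workflow task as a library.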