- 1503 Views
- 2 replies
- 0 kudos
Resolved! Cannot use Terraform to create Databricks Storage Credential
Hi all, when I use Terraform in an Azure DevOps pipeline to create a Databricks storage credential, I get the following error. Has anybody run into this error before, or does anyone have an idea how to debug it? Error: cannot create storage credential: failed d...
I found the reason. Because I did not configure `auth_type` for the Terraform Databricks provider, it used the default auth type `azure-cli`. However, in my pipeline I had not logged in to the Azure CLI with `az login`, so the authentication of t...
- 295 Views
- 0 replies
- 0 kudos
DNS resolution across vnet
Hi, I have created a new Databricks workspace in Azure with a back-end Private Link. The "Required NSG rules" setting is "No Azure Databricks Rules". NSG rules for AAD and azfrontdoor were added as per the documentation. A private endpoint with subresource databric...
- 1381 Views
- 8 replies
- 3 kudos
Resolved! Deploy Workflow only to specific target (Databricks Asset Bundles)
I am using Databricks Asset Bundles to deploy Databricks workflows to all of my target environments (dev, staging, prod). However, I have one specific workflow that is supposed to be deployed only to the dev target environment. How can I implement tha...
Thanks for getting back and clarifying @szymon_dybczak
- 363 Views
- 1 reply
- 0 kudos
How to fully enable row-level concurrency on Databricks 14.1
Hey guys, I hope whoever's reading this is doing well. We're trying to enable row-level concurrency on Databricks 14.1. However, the documentation seems a bit contradictory as to whether this is possible, and whether the whole capability of row-level c...
Yes, your understanding is correct. As a best practice, use the 14.3 LTS DBR, since LTS releases have longer support than the six-month support window of non-LTS releases.
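For anyone landing on the same question: a minimal sketch of what enabling the usual prerequisite looks like, assuming that on DBR 14.x row-level concurrency relies on deletion vectors being enabled on the table (check the documentation's version matrix for your exact runtime). The table name is a placeholder.

```python
# Sketch only: table name is hypothetical; `spark` is the SparkSession that
# Databricks notebooks provide automatically.
spark.sql("""
    ALTER TABLE main.sales.fact_orders
    SET TBLPROPERTIES ('delta.enableDeletionVectors' = 'true')
""")

# Confirm the property is set before relying on row-level concurrency.
spark.sql("SHOW TBLPROPERTIES main.sales.fact_orders").show(truncate=False)
```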
- 233 Views
- 1 reply
- 0 kudos
Unable to publish notebook
Hello, I receive an error each time I attempt to publish my notebook. The error message does not provide any additional information about why it is occurring. I need to publish this for a bootcamp assignment, so please let me know what I can do to resol...
Try it from a different browser, or try it in incognito mode.
- 386 Views
- 2 replies
- 0 kudos
Allow non-admin users to view the driver logs from a Unity Catalog-enabled pipeline
We are trying to enable the option that allows non-admin users to read logs in the Databricks workspace, but we cannot see an obvious way to view them. After enabling the property below, the documentation does not show where to check the...
Do they see these links in the pipeline?
- 1238 Views
- 2 replies
- 0 kudos
Resolved! Help with Databricks SQL Queries
Hi everyone, I’m relatively new to Databricks and trying to optimize some SQL queries for better performance. I’ve noticed that certain queries take longer to run than expected. Does anyone have tips or best practices for writing efficient SQL in Data...
Hi @alexacas, the best thing would be to share the queries and table structures, but my general approach is: 1. Use partitioning/Z-ordering, or, if you can upgrade the runtime to 15.4, use liquid clustering, which is the newer optimization technique. 2. Make sure you ...
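To make the two techniques from the reply concrete, here is a hedged sketch. The table and column names are made up for illustration, and liquid clustering requires a runtime that supports it.

```python
# Hypothetical table/column names. Classic approach: compact files and
# Z-order on the columns most commonly used in filters and joins.
spark.sql("OPTIMIZE main.analytics.events ZORDER BY (event_date, customer_id)")

# Liquid clustering alternative (newer runtimes): declare clustering
# columns on the table instead of Z-ordering it.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.analytics.events_clustered (
        event_date  DATE,
        customer_id BIGINT,
        payload     STRING
    )
    CLUSTER BY (event_date, customer_id)
""")
```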
- 956 Views
- 3 replies
- 0 kudos
Grant permissions to groups on catalogs linked to the same metastore
Hi everyone! I am configuring several projects using Databricks, and I have a question regarding permission management in Unity Catalog. Here's the situation: currently, I have two different Databricks resources in an Azure account, each with its resp...
You could bind the catalog to specific workspaces, making it accessible only from the workspaces it is bound to: https://docs.databricks.com/en/catalogs/binding.html In your example, if `my_catalog_2` is bound to `my_workspace_2`, a user in `my_workspac...
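For reference, a sketch of what updating the binding can look like over the REST API described in that link. The endpoint path and payload are based on the workspace-bindings API; the host, token, and workspace ID are placeholders, so verify everything against the current API reference before using it.

```python
import requests

# Sketch only -- endpoint path and payload are assumptions based on the
# workspace-bindings API referenced in the linked docs; host, token and
# workspace ID are placeholders.
HOST = "https://adb-1111111111111111.11.azuredatabricks.net"
TOKEN = "<personal-access-token>"
CATALOG = "my_catalog_2"
WORKSPACE_ID = 2222222222222222  # numeric ID of my_workspace_2

resp = requests.patch(
    f"{HOST}/api/2.1/unity-catalog/workspace-bindings/catalogs/{CATALOG}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"assign_workspaces": [WORKSPACE_ID], "unassign_workspaces": []},
)
resp.raise_for_status()
print(resp.json())
```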
- 258 Views
- 1 reply
- 0 kudos
Databricks Free Trial - can I link it back from AWS to Azure?
I created a 14-day trial account on Databricks.com and linked it to my AWS account. I have not used the trial yet and want to link it to Azure instead. Question: can I do it, and if so, how? Remark: please note that I have not used the trial yet and the only rele...
The easy way would be to create a new account at portal.azure.com and launch Azure Databricks with pay-as-you-go.
- 367 Views
- 0 replies
- 0 kudos
Implementing Databricks Persona in
Hi all, I am looking to implement "persona"-based access control across multiple workspaces for multiple user groups in Azure Databricks. Specifically, I have a "DEV" workspace where the developer groups (Data Engineers and ML Engineer...
- 269 Views
- 0 replies
- 0 kudos
How to install private repository as package dependency in Databricks Workflow
I am a member of the development team in our company, and we use Databricks as a sort of ETL tool. We use Git integration for our program and run a Workflow on a daily basis. Recently, we created another internal private Git repository and wan...
- 689 Views
- 4 replies
- 1 kudos
Set access control list for job cluster
Hello, currently we are using an init script that calls the DBW API to add "can_attach_to" permissions for a specific group when initializing the job cluster. How can we set an access control list for job clusters? Is it possible to add it to a policy?
Hi @mkieszek, you can use the REST API to set an access control list for job clusters. The endpoint for doing that is /api/2.0/permissions/clusters/{cluster_id} (https://docs.databricks.com/api/workspace/clusters/setpermissions). And it's n...
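A hedged sketch of calling that endpoint from Python. The host, token, cluster ID, and group name are placeholders; PATCH is used so the entry is added to the existing ACL, whereas a PUT on the same path replaces it.

```python
import requests

# Sketch of the permissions endpoint mentioned in the reply.
# Host, token, cluster ID and group name are placeholders.
HOST = "https://adb-1111111111111111.11.azuredatabricks.net"
TOKEN = "<personal-access-token>"
CLUSTER_ID = "0101-123456-abcdefgh"

resp = requests.patch(
    f"{HOST}/api/2.0/permissions/clusters/{CLUSTER_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "access_control_list": [
            {"group_name": "data-engineers", "permission_level": "CAN_ATTACH_TO"}
        ]
    },
)
resp.raise_for_status()
print(resp.json())
```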
- 913 Views
- 0 replies
- 7 kudos
Azure Databricks Multi Tenant Solution
Hello everyone, for the past few months we’ve been extensively exploring the use of Databricks as the core of our data warehousing product. We provide analytics dashboards to other organizations and are particularly interested in the Column-Level Sec...
- 717 Views
- 5 replies
- 2 kudos
Resolved! Databricks bundle custom variables feature does not work for workspace host
Hi all, I am not sure whether this is a bug. I followed the documentation. My databricks.yml file: bundle: name: test variables: DATABRICKS_HOST: description: The host of the Databricks workspace. DATABRICKS_JOB_RUNNING_SERVICE_PRINCIPAL_CLIENT_I...
Hi @AlbertWang, the documentation states that you cannot map the workspace host.
- 325 Views
- 0 replies
- 1 kudos
Troubleshooting Cluster
We had a failure on a previously running fact table load (our biggest one) and it looked like an executor was failing due to a timeout error. As a test we upped the cluster size and changed the spark.executor.heartbeatinterval to 300s and the spark....