- 2306 Views
- 1 reply
- 1 kudos
Resolved! How to Pass Azure variable to databricks.yml file
Hello, I would like to find a way to pass a variable from my Azure variables to my databricks.yml file. For example, I would like to pass the variable BUNDLE_TARGET to the location in this databricks.yml file. Is there a way to do something like this?...
- 1 kudos
Hi @TheManOfSteele, Here are some examples of how to achieve that. I think the simplest way would be to set environment variables. azure devops - How can I pass parameters to databricks.yml in Databricks Asset Bundles? - Stack Overflow | Databricks Asset ...
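Following the environment-variable approach from the reply, here is a minimal, untested sketch. The variable name `bundle_target`, the deploy target `dev`, and the workspace path are assumptions; the Azure DevOps variable `BUNDLE_TARGET` is mapped onto the bundle variable via the `BUNDLE_VAR_<name>` convention.

```yaml
# databricks.yml (excerpt) -- declare a bundle variable and use it where needed
variables:
  bundle_target:
    description: Environment name passed in from the Azure pipeline
    default: dev

targets:
  dev:
    workspace:
      root_path: /Workspace/deployments/${var.bundle_target}   # example location

# azure-pipelines.yml (excerpt) -- feed the Azure variable to the bundle
steps:
  - script: databricks bundle deploy -t dev
    env:
      BUNDLE_VAR_bundle_target: $(BUNDLE_TARGET)   # Azure DevOps pipeline variable
```

The same value can also be passed on the command line with `--var="bundle_target=$(BUNDLE_TARGET)"`.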
- 3741 Views
- 0 replies
- 0 kudos
Azure Pipeline Release Bundle Validate Failure
Hello, I am trying to create a CI/CD pipeline. After the build pipeline, I am creating a release to Databricks that runs a notebook, and I am running this as a service principal. During the bundle validate step I am getting this error: "Erro...
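For reference, a hypothetical sketch of what the validate step can look like when the CLI authenticates as an Azure service principal; the pipeline variable names and the target are placeholders, and the truncated error above would still need its own diagnosis.

```yaml
# Azure DevOps step (sketch): authenticate the Databricks CLI with an Azure
# service principal via environment variables, then validate the bundle.
steps:
  - script: databricks bundle validate -t prod
    env:
      DATABRICKS_HOST: $(DATABRICKS_HOST)      # e.g. https://adb-xxxx.azuredatabricks.net
      ARM_TENANT_ID: $(ARM_TENANT_ID)
      ARM_CLIENT_ID: $(ARM_CLIENT_ID)          # service principal application (client) ID
      ARM_CLIENT_SECRET: $(ARM_CLIENT_SECRET)
```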
- 2243 Views
- 3 replies
- 0 kudos
Resolved! Workspace selector not working
On the top-right of the Databricks GUI is a selector to select workspaces from. Since yesterday morning (approximately the same time the Microsoft outage happened) that selector stopped working. Instead of a dropdown of workspaces, we only get a spin...
- 0 kudos
Glad to know the issue has stopped occurring, but in case it recurs in the future, we could collect some backend logs to understand what is causing the slowness while the issue is happening and resolve the service. I would request you to involve the Azure team...
- 3823 Views
- 2 replies
- 0 kudos
Azure Network Settings with Regard to Databricks Table Monitoring
I have set up my Unity Catalog on an Azure Data Lake which uses the company's virtual network to allow access. I have all privileges on my account, so I am able to create, alter or delete catalogs, schemas and tables. I can do these things either usin...
- 0 kudos
Hi Kaniz, Thanks for the response and for identifying the problem. I would like some steps on how to adjust the network settings, as everything I have tried so far hasn't seemed to work.
- 1528 Views
- 1 reply
- 0 kudos
databricks-connect 14.3 spark error against 14.3 cluster with data_security_mode = NONE
I am running into an issue when trying to use a 14.3 cluster with databricks-connect 14.3. My cluster config: { "autoscale": { "min_workers": 2, "max_workers": 10 }, "cluster_name": "Developer Cluster", "spark_version": "14.3.x-scala2...
- 0 kudos
Are you running the latest version of Databricks Connect?
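For context, a minimal, untested sketch of how a Databricks Connect 14.3 session is typically created; the host, token, and cluster ID are placeholders. Note that Databricks Connect on recent runtimes generally expects a Unity Catalog access mode (single user or shared), so a cluster with `data_security_mode = NONE` may be the sticking point.

```python
from databricks.connect import DatabricksSession

# Placeholders -- replace with your workspace URL, PAT, and cluster ID.
spark = (
    DatabricksSession.builder.remote(
        host="https://adb-1234567890123456.7.azuredatabricks.net",
        token="dapiXXXXXXXXXXXXXXXX",
        cluster_id="0614-123456-abcdefgh",
    ).getOrCreate()
)

# Simple round-trip to confirm the remote session works.
print(spark.range(5).collect())
```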
- 3661 Views
- 2 replies
- 3 kudos
Unable to read resources - Unsupported Protocol Scheme (Terraform AWS)
Hello everyone! Over the last few weeks my company has been trying to deploy a Databricks workspace on AWS adapted to the customer's needs, using Terraform. To do this, we started from base code in Databricks' own GitHub (https://github.com/databrick...
- 3 kudos
What's the solution for this? I'm facing the same issue.
- 3622 Views
- 2 replies
- 0 kudos
Databricks dashboard programmatically
Hi, How can I create a Databricks dashboard, filters and visuals programmatically (API, Terraform, SDK, CLI...)? Thanks, Pawel
- 0 kudos
Maybe slightly late (maybe because development was late :P), but hopefully it will also help others. 1. There seems to be support added in the newest Terraform Databricks provider - 1.49.0 - here. 2. Another solution would be to use the Databricks CLI (e.g. `d...
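Expanding on the API route, here is an untested sketch that creates a dashboard through the Lakeview dashboards REST endpoint; the host, token, warehouse ID, workspace path, and the (empty) dashboard definition are all placeholders, so check the current API reference before relying on the exact field names.

```python
import json
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "dapiXXXXXXXXXXXXXXXX"                                # placeholder

# Minimal (empty) dashboard definition; datasets, pages, and widgets go here.
dashboard_definition = {"datasets": [], "pages": []}

resp = requests.post(
    f"{HOST}/api/2.0/lakeview/dashboards",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "display_name": "My programmatic dashboard",
        "warehouse_id": "abcdef1234567890",                     # placeholder
        "parent_path": "/Workspace/Users/someone@example.com",  # placeholder
        "serialized_dashboard": json.dumps(dashboard_definition),
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```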
- 1359 Views
- 1 reply
- 2 kudos
List deployed Databricks asset bundles (DABs)?
Is there a Databricks CLI command or REST API to list all the DABs that have been deployed to a workspace?
- 2 kudos
Hi @Schofield, Unfortunately, I don't think there is an out-of-the-box command that will provide you this information yet. As a workaround, you can try writing some code that will extract this information from the REST API. For example, you can use /api/2.1/j...
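A rough, untested sketch of that workaround: page through `/api/2.1/jobs/list` and keep the jobs whose settings carry bundle deployment metadata. The host and token are placeholders, and the exact shape of the `deployment` field should be confirmed against the Jobs API reference.

```python
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "dapiXXXXXXXXXXXXXXXX"                                # placeholder

def list_bundle_jobs():
    """Return jobs that appear to have been deployed by a Databricks Asset Bundle."""
    bundle_jobs, page_token = [], None
    while True:
        params = {"page_token": page_token} if page_token else {}
        resp = requests.get(
            f"{HOST}/api/2.1/jobs/list",
            headers={"Authorization": f"Bearer {TOKEN}"},
            params=params,
            timeout=30,
        )
        resp.raise_for_status()
        data = resp.json()
        for job in data.get("jobs", []):
            deployment = job.get("settings", {}).get("deployment", {})
            if deployment.get("kind") == "BUNDLE":  # assumption: bundle-deployed jobs expose this
                bundle_jobs.append(job)
        page_token = data.get("next_page_token")
        if not page_token:
            break
    return bundle_jobs

for job in list_bundle_jobs():
    print(job["job_id"], job["settings"]["name"])
```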
- 1712 Views
- 2 replies
- 2 kudos
Resolved! Delta sharing issue after enabling predictive optimization
Some of our Delta Sharing tables are not working. It may be related to this, or maybe not: we enabled predictive optimization on all tables a few days ago, and those tables are not working any more, but any new tables created work fine after setting this: SET TBLPROPERTIES (de...
- 2 kudos
After some debugging, I found a very unique cause: we used a JSON string in a column comment, and it makes sense that a JSON string in a column comment breaks Delta Sharing. Example: column COMMENT {"key": "primary_key", "is_identity": "true"}. The erro...
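A small, hypothetical follow-up sketch of the obvious workaround (not confirmed as the official fix): overwrite the JSON-valued column comment with plain text so the shared table's metadata no longer trips up Delta Sharing. It assumes a notebook where `spark` is available, and the catalog, schema, table, and column names are placeholders.

```python
# Replace the JSON string in the column comment with plain text.
spark.sql("""
    ALTER TABLE main.sales.customers
    ALTER COLUMN customer_id
    COMMENT 'primary key (identity column)'
""")

# Confirm the comment no longer contains raw JSON.
spark.sql("DESCRIBE TABLE main.sales.customers").show(truncate=False)
```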
- 4176 Views
- 2 replies
- 1 kudos
Delta Lake S3 multi-cluster writes - DynamoDB
Hi there! I'm trying to figure out how the multi-writer architecture for Delta Lake tables is implemented under the hood. I understand that a DynamoDB table is used to provide mutual exclusion, but the question is: where is the table located? Is it in...
- 1 kudos
Hi, could you please help me here? How can I use this configuration in Databricks? So I will maintain my transaction logs there, and in parallel I can use the delta-rs job. spark.conf.set("spark.delta.logStore.s3a.impl", "io.delta.storage.S3Dynam...
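For reference, an untested sketch of the S3DynamoDBLogStore settings that snippet is pointing at (from the open-source Delta Lake multi-cluster S3 setup). On Databricks these are normally set in the cluster's Spark config rather than at runtime, and the DynamoDB table name and region below are placeholders.

```python
# LogStore implementation for S3 multi-cluster writes (open-source Delta Lake).
spark.conf.set("spark.delta.logStore.s3a.impl", "io.delta.storage.S3DynamoDBLogStore")

# DynamoDB table used for mutual exclusion between concurrent writers.
spark.conf.set("spark.io.delta.storage.S3DynamoDBLogStore.ddb.tableName", "delta_log")
spark.conf.set("spark.io.delta.storage.S3DynamoDBLogStore.ddb.region", "us-east-1")
```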
- 5339 Views
- 1 reply
- 0 kudos
Technical Architecture - Feedback
Hello Members, I have designed a Technical Architecture (image attached). I would like some feedback on the current design (especially from 5.1 onwards) and maybe some more ideas, or anything else I can use instead of Azure Service Bus and Cosmos DB...
- 0 kudos
In step 3 you will want to consider using Databricks Workflows for orchestration. The ADF Databricks notebook activity is not actively developed by Microsoft, and the API it uses is considered legacy by Databricks, so neither vendor is actively supporting t...
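To make that suggestion concrete, here is an untested sketch of triggering an existing Databricks Workflows job from an external orchestrator (for example an ADF Web activity) via the Jobs API `run-now` endpoint; the host, token, job ID, and parameter names are placeholders.

```python
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "dapiXXXXXXXXXXXXXXXX"                                # placeholder

resp = requests.post(
    f"{HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "job_id": 123456789,                           # placeholder job ID
        "job_parameters": {"run_date": "2024-07-01"},  # requires job-level parameters
    },
    timeout=30,
)
resp.raise_for_status()
print("Triggered run:", resp.json()["run_id"])
```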
- 20553 Views
- 4 replies
- 0 kudos
Resolved! Help Needed: Errors with df.display() and df.show() in Databricks
Dear Databricks Community, I am reaching out to you for assistance with some issues I'm encountering in my Databricks environment. I'm hoping the community can provide some guidance to help me resolve these problems. 1. Error with df.display(): When I ...
- 0 kudos
Dear Databricks Community, I wanted to share some updates regarding the issues I've been encountering in my Databricks environment. After raising a ticket with Microsoft and collaborating with their team for approximately a week, we undertook several t...
- 3829 Views
- 2 replies
- 0 kudos
Asset bundle yml factorization
Hello, I have a project using asset bundles in which I have several jobs using roughly the same job definition (tags and job cluster definitions are always the same). Is there a way to put everything in common in a YAML file and reuse that in each indiv...
- 0 kudos
@erigaud What might work (I actually never tried it myself so far) is this: define your complex variables in a separate YAML file (complex variables are supported since v0.222.0), import this file using include, and reference these variables accord...
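An untested sketch of that layout, following the reply's suggestion (complex variables require Databricks CLI v0.222.0 or later); the file names, tags, and cluster values are placeholders.

```yaml
# variables.yml -- shared definitions
variables:
  common_tags:
    type: complex
    default:
      team: data-platform
      cost_center: "1234"
  default_job_cluster:
    type: complex
    default:
      spark_version: "14.3.x-scala2.12"
      node_type_id: "Standard_DS3_v2"
      num_workers: 2

# databricks.yml (excerpt) -- pull the shared file in
include:
  - variables.yml
  - resources/*.yml

# resources/my_job.yml (excerpt) -- reuse the shared values in each job
resources:
  jobs:
    my_job:
      name: my_job
      tags: ${var.common_tags}
      job_clusters:
        - job_cluster_key: main
          new_cluster: ${var.default_job_cluster}
```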
- 955 Views
- 1 reply
- 1 kudos
Capture error for databricks job
Greetings! We have created a Databricks job using a notebook. This notebook has 6 cells. Can we capture the success and failure (along with the error) and store it for monitoring and analysis? For example, if we want to capture the below error
- 1 kudos
Hi @sukanya09, You can use the Jobs API; each run will have information about the status of each task in the job: https://docs.databricks.com/api/workspace/jobs/getrunoutput
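Building on that, an untested sketch that pulls per-task status from `runs/get` and the error message from `runs/get-output` for failed tasks; the host, token, and run ID are placeholders.

```python
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "dapiXXXXXXXXXXXXXXXX"                                # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Fetch one job run (placeholder run ID) with its per-task states.
run = requests.get(
    f"{HOST}/api/2.1/jobs/runs/get",
    headers=HEADERS,
    params={"run_id": 987654321},
    timeout=30,
).json()

for task in run.get("tasks", []):
    state = task.get("state", {})
    print(task["task_key"], state.get("result_state"), state.get("state_message"))
    if state.get("result_state") == "FAILED":
        # get-output takes the task-level run ID and returns the error details.
        output = requests.get(
            f"{HOST}/api/2.1/jobs/runs/get-output",
            headers=HEADERS,
            params={"run_id": task["run_id"]},
            timeout=30,
        ).json()
        print("  error:", output.get("error"))
```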
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (60)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)