- 5142 Views
- 7 replies
- 8 kudos
Hi, we currently leverage Azure DevOps to source-control our notebooks and use CI/CD to publish the notebooks to different environments, and this works very well. We do not have the same functionality available for Databricks jobs (the ability to sourc...
Latest Reply
To manage Databricks jobs within a DevOps pipeline, start by exporting the job configuration as a JSON file from the Databricks workspace. Parameterize this JSON by replacing environment-specific values with placeholders. Integrate the parameterized ...
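The export-and-parameterize step above can be sketched in Python. This is a minimal illustration, not code from the thread: the job settings, placeholder names, and the `parameterize` helper are all hypothetical, and the settings JSON would normally come from `GET /api/2.1/jobs/get` or `databricks jobs get`.

```python
import json

# Hypothetical job settings as exported from the workspace
# (e.g. via GET /api/2.1/jobs/get or `databricks jobs get`).
job_config = {
    "name": "nightly-etl-dev",
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Repos/dev/etl/main"},
            "existing_cluster_id": "0101-000000-abc123",
        }
    ],
}

# Environment-specific literals to swap for placeholders; the release
# pipeline later substitutes real values per environment (DEV/UAT/PROD).
PLACEHOLDERS = {
    "nightly-etl-dev": "__JOB_NAME__",
    "/Repos/dev/etl/main": "__NOTEBOOK_PATH__",
    "0101-000000-abc123": "__CLUSTER_ID__",
}

def parameterize(config: dict, replacements: dict) -> dict:
    """Replace environment-specific literals with placeholders."""
    text = json.dumps(config)
    for old, new in replacements.items():
        text = text.replace(old, new)
    return json.loads(text)

template = parameterize(job_config, PLACEHOLDERS)
print(template["name"])  # __JOB_NAME__
```

The resulting template JSON is what gets committed to the repo; the pipeline reverses the substitution with each environment's values before calling the Jobs API.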
6 More Replies
- 3143 Views
- 3 replies
- 1 kudos
Hi! This is my CI configuration. I added the databricks jobs configure --version=2.1 command but it still shows this error; any idea what I could be doing wrong? Error: Resetting Databricks Job with job_id 1036...WARN: Your CLI is configured to use...
Latest Reply
I managed to solve this by downgrading the Databricks runtime to 13.3 and used the commands below for optimization, which worked well in my case:
spark.conf.set("spark.sql.shuffle.partitions", "200")
spark.conf.set("spark.sql.execution.arrow.pyspark.enabled...
2 More Replies
- 13368 Views
- 15 replies
- 8 kudos
A number of people have questions on using Databricks in a productionized environment. What are the best practices to enable CI/CD automation?
Latest Reply
Any leads/posts for Databricks CI/CD integration with a Bitbucket pipeline? I am facing the below error while creating my CI/CD pipeline:
pipelines:
  branches:
    master:
      - step:
          name: Deploy Databricks Changes
          image: docker:19.03.12
          services:
            - docker
          script:
            # U...
14 More Replies
- 63228 Views
- 7 replies
- 11 kudos
Hi everyone, do you know if it's possible to automate Databricks workflow deployment through Azure DevOps (like what we do with the deployment of notebooks)?
Latest Reply
Did you get a chance to try Brickflow? https://github.com/Nike-Inc/brickflow
You can find the documentation here: https://engineering.nike.com/brickflow/v0.11.2/
Brickflow uses Databricks Asset Bundles (DAB) under the hood but provides a Pythonic w...
6 More Replies
- 4285 Views
- 5 replies
- 3 kudos
Hi, I followed the example to create one user. It's working; however, I want to create multiple users. I have tried many ways but still cannot get it to work, please share some ideas. https://registry.terraform.io/providers/databricks/databricks/latest/docs/res...
Latest Reply
What if I want to give a user name along with the email ID? I used the code below, but it's not helping (the code is not failing, but it is not adding the user name). It seems this line: "display_name = each.key" is not working. Please suggest. terraform {required_provider...
4 More Replies
by
Suman
• New Contributor III
- 1503 Views
- 2 replies
- 2 kudos
I want to implement linting in my project as part of CI/CD, but my notebooks have a mix of Python and Scala. Can linting be implemented there?
Latest Reply
Suman
New Contributor III
@Vidula Khanna I didn't hear from anyone. I am looking for this answer to implement in the Databricks code of my project. Can you please assist?
1 More Replies
by
DJey
• New Contributor III
- 15141 Views
- 4 replies
- 3 kudos
Hi all. I have a scenario where there are a few .sql scripts present in my repo. Is there any way we can execute those SQL scripts on Databricks via an Azure DevOps CI/CD pipeline? Please help.
Latest Reply
Hi @Divyansh Jain, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers...
3 More Replies
- 775 Views
- 0 replies
- 0 kudos
In Databricks, is the CI/CD process decoupled or coupled?
by
SK21
• New Contributor II
- 2267 Views
- 3 replies
- 1 kudos
I have created jobs to trigger the respective notebooks in Databricks Workflows. Now I need to move them to further environments. Would you please help me with a CI/CD process to promote jobs to further environments?
Latest Reply
Please use Jobs API 2.1. You can get a job and save the JSON for that job to git. In git, set variables defining the Databricks workspaces (URL and token), and on push trigger an API call with the JSON stored in git.
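The reply's flow can be sketched in Python. This is a hypothetical illustration: the job ID and settings are invented, and `settings` stands in for a JSON file that would actually be committed to git.

```python
# Sketch of promoting a job via Jobs API 2.1; values are illustrative.
def build_reset_payload(job_id: int, settings: dict) -> dict:
    """Body for POST /api/2.1/jobs/reset: the target job_id plus its new settings."""
    return {"job_id": job_id, "new_settings": settings}

# In practice, this dict is loaded from a settings JSON exported once
# (GET /api/2.1/jobs/get) and stored in the repo.
settings = {"name": "nightly-etl", "max_concurrent_runs": 1}
payload = build_reset_payload(123, settings)

# In the pipeline, with workspace URL and token set as CI variables:
# import requests
# requests.post(f"{host}/api/2.1/jobs/reset",
#               headers={"Authorization": f"Bearer {token}"},
#               json=payload).raise_for_status()
```

Each environment's pipeline stage posts the same committed JSON against its own workspace URL and token, so the job definition stays identical across DEV/UAT/PROD.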
2 More Replies
- 1624 Views
- 1 replies
- 1 kudos
My team has a shared codebase, and we are running into issues as we migrate to Databricks when two people are developing on connected sections of our codebase. For example, if I add a column to a table for changes on my branch, other members on m...
Latest Reply
@Coleman Milligan It's really hard to create something like this without basic knowledge of how CI/CD should work, or even Terraform. You can start here to understand some basics: https://servian.dev/how-to-hardening-azure-databricks-using-terraform...
- 2653 Views
- 2 replies
- 1 kudos
Hello, I was wondering if there is a way to deploy Databricks Workflows and Delta Live Tables pipelines across workspaces (DEV/UAT/PROD) using Azure DevOps.
Latest Reply
Yes, for sure, using REST API calls to https://docs.databricks.com/workflows/delta-live-tables/delta-live-tables-api-guide.html
You can create the DLT pipeline manually from the GUI and take the JSON representation of it, tweak it (so it uses your env variables, for examp...
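The "take the JSON and tweak it per environment" idea from the reply above can be sketched as follows. The pipeline spec, storage account name, and `${...}` placeholder convention are all hypothetical, assumed for illustration; the JSON would actually come from the DLT pipeline's settings view in the workspace.

```python
import json

# Hypothetical pipeline spec captured from the workspace GUI, with
# environment-specific values replaced by ${...} placeholders.
pipeline_template = """{
  "name": "sales_dlt_${ENV}",
  "storage": "abfss://dlt@${STORAGE_ACCOUNT}.dfs.core.windows.net/sales",
  "target": "sales_${ENV}",
  "libraries": [{"notebook": {"path": "/Repos/${ENV}/pipelines/sales"}}]
}"""

def render(template: str, env: dict) -> dict:
    """Substitute ${KEY} placeholders, then parse the result as JSON."""
    for key, value in env.items():
        template = template.replace("${" + key + "}", value)
    return json.loads(template)

spec = render(pipeline_template, {"ENV": "uat", "STORAGE_ACCOUNT": "myacct"})
# The Azure DevOps stage would then send this spec to the Delta Live
# Tables REST API of the target workspace (e.g. via requests).
```

Each DevOps stage supplies its own ENV values, so the same committed template produces a DEV, UAT, or PROD pipeline.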
1 More Replies
- 21617 Views
- 6 replies
- 7 kudos
How to Develop Locally on Databricks with your Favorite IDE
dbx is a Databricks Labs project that allows you to develop code locally and then submit against Databricks interactive and job compute clusters from your favorite local IDE (AWS | Azure | GC...
Latest Reply
I'm actually not a fan of dbx. I prefer the AWS Glue interactive sessions way of using the IDE. It's exactly like the web notebook experience. I can see the reason why dbx exists, but I'd still like to use a regular notebook experience in my IDE.
5 More Replies
by
Pat
• Honored Contributor III
- 2792 Views
- 3 replies
- 19 kudos
Hi, do you know if there is a way to create a Unity Catalog metastore using a Service Principal? Here I can see that for creating account-level resources we need to provide a user and password (https://registry.terraform.io/providers/databricks/databricks/...
Latest Reply
Pat
Honored Contributor III
This is supported right now in Azure, but not yet in AWS; there is a plan for AWS support as well.
2 More Replies
by
RantoB
• Valued Contributor
- 8952 Views
- 10 replies
- 8 kudos
Hi, I would like to import a Python notebook to my Databricks workspace from my local machine using a Python script. I managed to create the folder, but then I get a status code 400 when I try to import a file:
create_folder = requests.post(
'{}/api/...
Latest Reply
Hi, thanks for your answer. Actually, both your code and mine are working. However, I cannot write in the Repos directory, which is reserved (but I can create subdirectories...). Thanks to your code I got an error message which helped me to understand. Wi...
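For readers landing on this thread, a notebook import of the kind discussed above can be sketched like this. The target path and notebook source are invented for illustration; only the request body is built here, with the actual HTTP call shown commented out since it needs a live workspace URL and token.

```python
import base64

def import_payload(local_source: str, workspace_path: str) -> dict:
    """Build the JSON body for POST /api/2.0/workspace/import.

    The Workspace API expects the notebook source base64-encoded.
    """
    return {
        "path": workspace_path,
        "format": "SOURCE",
        "language": "PYTHON",
        "overwrite": True,
        "content": base64.b64encode(local_source.encode()).decode(),
    }

payload = import_payload("print('hello')", "/Shared/ci/demo_notebook")

# With a workspace URL and token:
# import requests
# requests.post(f"{host}/api/2.0/workspace/import",
#               headers={"Authorization": f"Bearer {token}"},
#               json=payload).raise_for_status()
```

Note that, as the reply above observes, the top-level /Repos directory itself is reserved, so a path like this one under /Shared (or a subdirectory of /Repos) is needed.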
9 More Replies
- 2335 Views
- 0 replies
- 4 kudos
Best Practices for CI/CD on Databricks
For CI/CD and software engineering best practices with Databricks notebooks, we recommend checking out this best practices guide (AWS, Azure, GCP). For CI/CD and local development using an IDE, we recommend dbx, a ...