- 4992 Views
- 15 replies
- 8 kudos
A number of people have questions on using Databricks in a production environment. What are the best practices for enabling CI/CD automation?
Latest Reply
Any leads/posts for Databricks CI/CD integration with Bitbucket Pipelines? I am facing an error while creating my CI/CD pipeline:
pipelines:
  branches:
    master:
      - step:
          name: Deploy Databricks Changes
          image: docker:19.03.12
          services:
            - docker
          script:
            # U...
14 More Replies
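For context, a minimal bitbucket-pipelines.yml for this kind of deploy step might look like the sketch below. It is an illustration only: the repository variables DATABRICKS_HOST and DATABRICKS_TOKEN and the /Shared/notebooks target path are assumptions, not values from the thread.

```yaml
pipelines:
  branches:
    master:
      - step:
          name: Deploy Databricks Changes
          image: python:3.10
          script:
            # Install the (legacy) Databricks CLI; it reads DATABRICKS_HOST
            # and DATABRICKS_TOKEN from the environment.
            - pip install databricks-cli
            # Push the repo's notebooks into the workspace (assumed paths).
            - databricks workspace import_dir ./notebooks /Shared/notebooks --overwrite
```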
- 21211 Views
- 7 replies
- 11 kudos
Hi everyone, do you know if it's possible to automate Databricks workflow deployment through Azure DevOps (like what we do with the deployment of notebooks)?
Latest Reply
Did you get a chance to try Brickflow? https://github.com/Nike-Inc/brickflow You can find the documentation here: https://engineering.nike.com/brickflow/v0.11.2/ Brickflow uses Databricks Asset Bundles (DAB) under the hood but provides a Pythonic w...
6 More Replies
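Since the reply mentions Databricks Asset Bundles (DAB), a minimal databricks.yml sketch may help picture what such tools drive under the hood. The bundle name, job name, notebook path, and workspace host below are placeholders, not values from the thread.

```yaml
bundle:
  name: my_project

resources:
  jobs:
    nightly_job:
      name: nightly-job
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/main.py

targets:
  dev:
    workspace:
      host: https://<your-workspace>.azuredatabricks.net
```

With the newer Databricks CLI, deployment to a target is then `databricks bundle deploy -t dev`.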
- 1781 Views
- 5 replies
- 3 kudos
Hi, I followed the example to create one user and it's working; however, I want to create multiple users. I have tried many ways but still cannot get it to work. Please share some ideas. https://registry.terraform.io/providers/databricks/databricks/latest/docs/res...
Latest Reply
What if I want to give a user name along with the email ID? I used the below code but it's not helping (the code is not failing, but it is not adding the user name). It seems the line "display_name = each.key" is not working. Please suggest. terraform {required_provider...
4 More Replies
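One way to get both values per user is to iterate over a map of email to display name instead of a set of emails. A sketch, assuming the databricks Terraform provider is already configured (the emails and names below are placeholders):

```hcl
variable "users" {
  type = map(string) # email => display name
  default = {
    "alice@example.com" = "Alice Example"
    "bob@example.com"   = "Bob Example"
  }
}

resource "databricks_user" "this" {
  for_each     = var.users
  user_name    = each.key   # the email ID
  display_name = each.value # the human-readable name
}
```

With a map, `each.key` stays the email and `each.value` carries the display name; `display_name = each.key` can only ever repeat the email.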
by
Suman
• New Contributor III
- 617 Views
- 2 replies
- 2 kudos
I want to implement linting in my project as part of CI/CD, but my notebooks have a mix of Python and Scala. Can linting be implemented there?
Latest Reply
Suman
New Contributor III
@Vidula Khanna​ I didn't hear back from anyone. I am looking for this answer to implement in the Databricks code of my project. Can you please assist?
1 More Replies
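One common approach, sketched below under assumptions not stated in the thread: export the notebooks as source files, then route each file to a language-appropriate linter by extension. The tool names (flake8, scalafmt) are stand-ins for whatever linters your team actually uses.

```python
import subprocess
from pathlib import Path

# Map file extensions to linter invocations (assumed tools; substitute
# your team's own linters for Python and Scala sources).
LINTERS = {
    ".py": ["flake8"],
    ".scala": ["scalafmt", "--test"],
}

def lint_command(path):
    """Return the linter command for a file, or None if unsupported."""
    cmd = LINTERS.get(Path(path).suffix)
    return cmd + [str(path)] if cmd else None

def run_lint(paths):
    """Lint each file with its matching tool; collect the failures."""
    failures = []
    for p in paths:
        cmd = lint_command(p)
        if cmd is None:
            continue  # no linter configured for this extension
        if subprocess.run(cmd).returncode != 0:
            failures.append(p)
    return failures
```

A CI step would call `run_lint` over the exported notebook files and fail the build if the returned list is non-empty.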
by
DJey
• New Contributor III
- 3049 Views
- 4 replies
- 3 kudos
Hi all, I have a scenario where a few .sql scripts are present in my repo. Is there any way we can execute those SQL scripts on Databricks via an Azure DevOps CI/CD pipeline? Please help.
Latest Reply
Hi @Divyansh Jain​ Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers...
3 More Replies
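One possible shape for this, sketched under assumptions not confirmed in the thread: have the pipeline read each .sql file from the repo and submit it to the SQL Statement Execution API (/api/2.0/sql/statements). The host, token, and warehouse ID stand in for pipeline variables you would define yourself.

```python
import json
from pathlib import Path
from urllib import request

def build_statement_payload(sql_text, warehouse_id):
    """Request body for one statement; one request per script."""
    return {
        "statement": sql_text,
        "warehouse_id": warehouse_id,
        "wait_timeout": "30s",  # wait synchronously up to 30 seconds
    }

def submit(host, token, payload):
    """POST one statement to the workspace (network call; runs in CI)."""
    req = request.Request(
        f"{host}/api/2.0/sql/statements",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    return request.urlopen(req)

def payloads_for_repo(sql_dir, warehouse_id):
    """Build one payload per .sql file, in sorted (deterministic) order."""
    return [build_statement_payload(p.read_text(), warehouse_id)
            for p in sorted(Path(sql_dir).glob("*.sql"))]
```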
- 367 Views
- 0 replies
- 0 kudos
In Databricks, is the CI/CD process decoupled or coupled?
by
SK21
• New Contributor II
- 976 Views
- 3 replies
- 1 kudos
I created jobs to trigger the respective notebooks in Databricks Workflows. Now I need to move them to further environments. Would you please help me with a CI/CD process to promote jobs to those environments?
Latest Reply
Please use Jobs API 2.1. You can get a job definition and save its JSON to git. In git, set variables defining the Databricks workspaces (URL and token), and on push trigger an API call that uses the JSON stored in git.
2 More Replies
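The export half of that reply can be sketched as follows. This is an illustration, not a definitive implementation: it fetches a job with GET /api/2.1/jobs/get and keeps only the portable `settings` block, on the assumption that the saved JSON will later be fed to POST /api/2.1/jobs/create in the target workspace.

```python
import json
from urllib import request, parse

# Fields that are per-environment metadata rather than job configuration
# (an assumed list; extend it for your own setup).
ENV_SPECIFIC = {"job_id", "created_time", "creator_user_name", "run_as_user_name"}

def portable_settings(get_response):
    """Keep only the 'settings' block, minus environment-specific keys."""
    settings = get_response.get("settings", {})
    return {k: v for k, v in settings.items() if k not in ENV_SPECIFIC}

def fetch_job(host, token, job_id):
    """GET one job definition (network call; runs in your CI job)."""
    url = f"{host}/api/2.1/jobs/get?" + parse.urlencode({"job_id": job_id})
    req = request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with request.urlopen(req) as r:
        return json.load(r)
```

In CI, `json.dumps(portable_settings(fetch_job(...)))` is what gets committed to git and replayed against the next environment.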
- 2290 Views
- 7 replies
- 6 kudos
Hi, we currently leverage Azure DevOps to source control our notebooks and use CI/CD to publish the notebooks to different environments, and this works very well. We do not have the same functionality available for Databricks jobs (the ability to sourc...
Latest Reply
My team is currently looking at establishing repo(s) for source control to start. I know I've seen some documentation for auto-updating the main branch in the Databricks remote repo when a merge is completed. Does anyone have a template and/or best practices ...
6 More Replies
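The "auto-update main after a merge" idea can be sketched with the Repos API: a post-merge pipeline step issues PATCH /api/2.0/repos/{repo_id} to check out and pull the latest branch in the workspace repo. The repo ID and host below are placeholders, and the wiring to a specific CI trigger is left out.

```python
import json
from urllib import request

def build_update_request(host, token, repo_id, branch="main"):
    """Build the PATCH request that points the workspace repo at `branch`."""
    return request.Request(
        f"{host}/api/2.0/repos/{repo_id}",
        data=json.dumps({"branch": branch}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="PATCH",
    )

# In the post-merge pipeline step (network call):
#   request.urlopen(build_update_request(host, token, repo_id))
```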
- 573 Views
- 1 reply
- 1 kudos
My team has a shared codebase, and we are running into issues as we migrate to Databricks when two people are developing on connected sections of the codebase. For example, if I add a column to a table for changes on my branch, other members on m...
Latest Reply
@Coleman Milligan​ It's really hard to create something like this without basic knowledge of how CI/CD should work, or even Terraform. You can start here to understand some basics: https://servian.dev/how-to-hardening-azure-databricks-using-terraform...
- 1418 Views
- 2 replies
- 1 kudos
Hello, I was wondering if there is a way to deploy Databricks Workflows and Delta Live Tables pipelines across workspaces (DEV/UAT/PROD) using Azure DevOps.
Latest Reply
Yes, for sure, using REST API calls to https://docs.databricks.com/workflows/delta-live-tables/delta-live-tables-api-guide.html You can create a DLT pipeline manually from the GUI, take its JSON representation, and tweak it (so it uses your env variables, for examp...
1 More Replies
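The "tweak the JSON" step from the reply can be sketched as simple template substitution: keep the exported pipeline spec in git with ${VAR} placeholders, fill them per environment in the release pipeline, then POST the result to /api/2.0/pipelines. The spec fields and variable names below are illustrative placeholders.

```python
import json
from string import Template

def render_spec(template_text, env):
    """Fill ${VAR} placeholders in a JSON spec from a dict of env values."""
    return json.loads(Template(template_text).substitute(env))

# A toy spec as it might be stored in git (placeholder fields).
spec_template = (
    '{"name": "dlt-${ENV}", "target": "${TARGET_SCHEMA}", "continuous": false}'
)

# The release pipeline would then POST json.dumps(rendered_spec) to
# {host}/api/2.0/pipelines with a bearer token (network call omitted here).
```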
- 7511 Views
- 6 replies
- 6 kudos
How to Develop Locally on Databricks with your Favorite IDE: dbx is a Databricks Labs project that allows you to develop code locally and then submit it against Databricks interactive and job compute clusters from your favorite local IDE (AWS | Azure | GC...
Latest Reply
I'm actually not a fan of dbx. I prefer the AWS Glue interactive sessions way of using the IDE. It's exactly like the web notebook experience. I can see the reason why dbx exists, but I'd still like to use a regular notebook experience in my IDE.
5 More Replies
by
Pat
• Honored Contributor III
- 1188 Views
- 4 replies
- 19 kudos
Hi, do you know if there is a way to create a Unity Catalog metastore using a Service Principal? Here I can see that for creating account-level resources we need to provide a user and password (https://registry.terraform.io/providers/databricks/databricks/...
Latest Reply
Pat
Honored Contributor III
This is supported right now in Azure, but not yet in AWS; there is a plan for AWS support as well.
3 More Replies
by
RantoB
• Valued Contributor
- 4133 Views
- 10 replies
- 8 kudos
Hi, I would like to import a Python notebook to my Databricks workspace from my local machine using a Python script. I managed to create the folder, but then I get a status code 400 when I try to import a file:
create_folder = requests.post(
'{}/api/...
Latest Reply
Hi, thanks for your answer. Actually, both your code and mine are working. However, I cannot write in the Repos directory, which is reserved (but I can create subdirectories...). Thanks to your code I got an error message which helped me understand. Wi...
9 More Replies
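A minimal sketch of the import call under discussion, using POST /api/2.0/workspace/import: the notebook source must be base64-encoded, and (per the accepted answer) the top-level /Repos folder itself is reserved, so the target should be a subdirectory such as /Users/... or /Repos/&lt;user&gt;/&lt;repo&gt;/... The workspace path below is a placeholder.

```python
import base64
import json
from urllib import request

def build_import_payload(source_text, workspace_path):
    """Request body for workspace/import; content is base64-encoded."""
    return {
        "path": workspace_path,
        "format": "SOURCE",
        "language": "PYTHON",
        "overwrite": True,
        "content": base64.b64encode(source_text.encode()).decode(),
    }

def import_notebook(host, token, payload):
    """POST the notebook to the workspace (network call)."""
    req = request.Request(
        f"{host}/api/2.0/workspace/import",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    return request.urlopen(req)
```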
- 1267 Views
- 1 reply
- 5 kudos
Best Practices for CI/CD on Databricks: For CI/CD and software engineering best practices with Databricks notebooks, we recommend checking out this best practices guide (AWS, Azure, GCP). For CI/CD and local development using an IDE, we recommend dbx, a ...
Latest Reply
Thank you, @Isaac Gritz​ , for sharing such a fantastic post!
by
mehdi1
• New Contributor III
- 5314 Views
- 9 replies
- 12 kudos
I know that dbutils.widgets.text creates a widget in a notebook. So for me the workflow is: 1. Have a notebook. 2. Use dbutils.widgets.text (or another widget type) once in a notebook cell to create a widget. 3. Remove the cell containing dbutils.widget...
Latest Reply
@Mehdi BEN ABDESSELEM​, Steps for Creating a Basic Widget. Step 1: Create a New Project. To create a new project in Android Studio, please refer to How to Create/Start a New Project in Android Studio. We are implementing it for both Java and Kotlin lang...
8 More Replies
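The workflow in the question (create the widget once, then read it on later runs) can be wrapped so the same code also runs outside Databricks. A sketch under one assumption: `dbutils` is injected by the Databricks runtime, so you pass `dbutils.widgets` in explicitly and fall back to a default when it is absent.

```python
def get_param(name, default, widgets=None):
    """Read a widget value; `widgets` is dbutils.widgets on Databricks,
    None when running locally (in which case the default is returned)."""
    if widgets is None:
        return default
    try:
        widgets.text(name, default)  # creates the widget if missing
        return widgets.get(name)
    except Exception:
        return default

# On Databricks: env = get_param("env", "dev", dbutils.widgets)
# Locally:       env = get_param("env", "dev")
```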