Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Rithwik_Malla_2
by New Contributor
  • 3081 Views
  • 3 replies
  • 5 kudos

Resolved! Terraform - Databricks CI/CD pipeline

Can anyone help me configure CI/CD for Azure Databricks (ADB) Terraform code? The problem I am facing is with authentication; something went wrong there. The pipeline is implemented in Azure DevOps.

Latest Reply
Ravi
Databricks Employee
  • 5 kudos

Hi @Rithwik Aditya Manoj Malla​, as requested by @Prabakar Ammeappin​ earlier, could you please share the code block and the error details? Also, you can refer to the doc below to authenticate ADB from TF code: https://registry.terraform.io/providers/...

2 More Replies
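For the authentication problem above, one common approach is to let the Databricks Terraform provider pick up Azure service-principal credentials from environment variables set by the Azure DevOps pipeline. This is only a sketch under that assumption; all IDs and secrets below are placeholders, not real values:

```python
# Hypothetical sketch: exporting the ARM_* environment variables the Databricks
# Terraform provider can use for Azure service-principal authentication, as an
# Azure DevOps pipeline step might before invoking terraform. All values are
# placeholders.
import os

os.environ.update({
    "ARM_CLIENT_ID": "00000000-0000-0000-0000-000000000000",   # placeholder app ID
    "ARM_CLIENT_SECRET": "<service-principal-secret>",         # placeholder secret
    "ARM_TENANT_ID": "00000000-0000-0000-0000-000000000000",   # placeholder tenant
})

# The pipeline step would then shell out to Terraform, e.g.:
# import subprocess
# subprocess.run(["terraform", "plan"], check=True)
```

In a real pipeline these would be secret pipeline variables rather than literals in code.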
mcharl02
by New Contributor III
  • 10585 Views
  • 9 replies
  • 6 kudos

How do I restore auto-close quote & auto-close parentheses functionality?

Over the last two days, my team's Databricks notebooks (using the Python interpreter) have stopped automatically adding a closing single quote (') with the cursor between the two. Same issue with automatically adding closing parentheses. The cluster has been re...

Latest Reply
Sajesh
Databricks Employee
  • 6 kudos

The fix for this issue will most likely be released to all regions/workspaces by 18 Nov 2021.

8 More Replies
kjoth
by Contributor II
  • 6683 Views
  • 7 replies
  • 12 kudos

Resolved! Databricks cluster Encryption keystore_password

How to set up this value? Is this any value we can provide, or the default value we have to p...

#!/bin/bash
keystore_file="/dbfs/<keystore_directory>/jetty_ssl_driver_keystore.jks"
keystore_password="gb1gQqZ9ZIHS"
sasl_secret=$(sha256sum $keystore_file...

Latest Reply
Prabakar
Databricks Employee
  • 12 kudos

Hi @karthick J​, please refer to this notebook: https://docs.microsoft.com/en-us/azure/databricks/_static/notebooks/cluster-encryption-init-script.html Further, if you will be using the %pip magic command, the below post will be helpful: https://community.dat...

6 More Replies
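The shell snippet in the question derives a SASL secret by hashing the keystore file together with a password. The exact recipe is truncated in the post, so the following is only an illustrative stand-alone equivalent of that hashing step; the keystore bytes are a placeholder, not a real keystore:

```python
# Illustrative only: derive a secret from keystore contents plus a password
# with SHA-256, mirroring the spirit of the truncated bash snippet above.
# The real init script's exact combination of inputs may differ.
import hashlib

def sasl_secret(keystore_bytes: bytes, keystore_password: str) -> str:
    # Hash the keystore contents concatenated with the password; hexdigest()
    # yields a 64-character hex string suitable for use as a shared secret.
    return hashlib.sha256(keystore_bytes + keystore_password.encode()).hexdigest()

secret = sasl_secret(b"example-keystore-bytes", "gb1gQqZ9ZIHS")
```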
sarvesh
by Contributor III
  • 4422 Views
  • 0 replies
  • 0 kudos

Can we read an Excel file with many sheets by their indexes?

I am trying to read an Excel file which has 3 sheets that have integers as their names: sheet 1 name = 21, sheet 2 name = 24, sheet 3 name = 224. I got this data from a user so I can't change the sheet names, but reading these with Spark is an issue. code - v...

StephanieAlba
by Databricks Employee
  • 7081 Views
  • 2 replies
  • 3 kudos

Resolved! Best Data Model for moving from DW to Delta lake

I’m curious what Databricks recommends for how we model the data. Do they recommend that the data be in third normal form (3NF)? Or should it be dimensionally modeled (facts and dimensions)?

Latest Reply
-werners-
Esteemed Contributor III
  • 3 kudos

It all depends on the use case. 3NF is ideal for transactional systems, so for a data warehouse/lakehouse it might not be ideal. However, there certainly are cases where it is interesting. Star schemas are definitely still relevant, BUT with the processing p...

1 More Replies
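To make the 3NF-versus-dimensional trade-off concrete, here is a tiny made-up star-schema fragment: the fact table stays narrow (surrogate keys plus measures) and descriptive attributes live in a dimension, so an aggregation only touches one join. The table contents are invented for illustration:

```python
# Made-up star-schema fragment: the dimension holds descriptive attributes,
# the fact table holds surrogate keys and measures.
dim_product = {
    1: {"name": "widget", "category": "hardware"},
    2: {"name": "gizmo",  "category": "hardware"},
}

fact_sales = [
    {"product_key": 1, "amount": 10.0},
    {"product_key": 2, "amount": 4.5},
    {"product_key": 1, "amount": 2.5},
]

def sales_by_category(facts, dim):
    # Aggregate measures by a dimension attribute: one key lookup per fact row,
    # which is the access pattern a star schema optimizes for.
    totals = {}
    for row in facts:
        category = dim[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals
```

In 3NF the same query would typically traverse several normalized tables instead of one dimension lookup.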
Junee
by New Contributor III
  • 6496 Views
  • 5 replies
  • 3 kudos

Resolved! What happens to the clusters whose jobs are canceled or terminated due to failures? (Jobs triggered through Jobs API 2.1 using runs/submit)

I am using the Databricks Jobs API 2.1 to trigger and run my jobs. The "jobs/runs/submit" API helps in starting the cluster as well as creating the job and running it. This API works great for normal jobs, as it also cleans up the cluster once the job has finished suc...

Latest Reply
User16871418122
Contributor III
  • 3 kudos

@Junee, Anytime! It is crisply mentioned in the doc too. https://docs.databricks.com/clusters/index.html

4 More Replies
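For reference, a runs/submit call with a new_cluster spec creates a job cluster that exists only for that run, which is why cleanup is automatic whether the run succeeds, fails, or is canceled. A sketch of such a payload follows; the host, token, node type, and notebook path are placeholders, and the HTTP call itself is left commented out so the sketch stays self-contained:

```python
# Sketch of a Jobs API 2.1 runs/submit payload. The new_cluster block asks for
# a job cluster, which is created for this run and terminated when it ends.
import json

payload = {
    "run_name": "one-off-run",
    "new_cluster": {
        "spark_version": "9.1.x-scala2.12",   # placeholder runtime version
        "node_type_id": "Standard_DS3_v2",    # placeholder node type
        "num_workers": 2,
    },
    "notebook_task": {"notebook_path": "/path/to/notebook"},  # placeholder
}
body = json.dumps(payload)

# Submitting would look roughly like (placeholders for host and token):
# import requests
# requests.post("https://<workspace-url>/api/2.1/jobs/runs/submit",
#               headers={"Authorization": "Bearer <token>"}, data=body)
```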
francescocamuss
by New Contributor III
  • 21474 Views
  • 12 replies
  • 10 kudos

Resolved! Databricks rbase container: RStudio doesn't work

Hello, how are you? I hope you are doing well! I'm trying to use a Databricks image (link: containers/ubuntu/R at master · databricks/containers (github.com)) to run a container when starting a cluster. I need RStudio installed on the contain...

Latest Reply
Prabakar
Databricks Employee
  • 10 kudos

If the issue is resolved, would you be happy to mark the answer as best so that others can quickly find the solution in the future?

11 More Replies
Chris_Shehu
by Valued Contributor III
  • 8989 Views
  • 7 replies
  • 2 kudos

Resolved! Can I disable the workspace directory for specific user groups?

We want to use the REPO directory in our production environment only and have a dev environment with fewer restrictions. If I use the checkbox on the group admin screen to disable workspace access, it locks out the entire Data Engineering section.

Latest Reply
Chris_Shehu
Valued Contributor III
  • 2 kudos

So I found a way to get 85% of the way there:
1) Disable workspace access for the users group.
2) Create a new group or use another group that you created for the next step.
3) Go to the workspace and right click on whitespace in the root directory.
4) A...

6 More Replies
bdc
by New Contributor III
  • 8736 Views
  • 4 replies
  • 5 kudos

Resolved! Is it possible to show multiple cmd output in a dashboard?

I have a loop that outputs a dataframe for each value in a list. I can create a dashboard if there is only one df, but in the loop I'm only able to see the charts in the notebook if I switch the view to charts, not in the dashboard. In t...

Latest Reply
Wanda11
New Contributor II
  • 5 kudos

If you want to be able to easily run and kill multiple processes with ctrl-c, this is my favorite method: spawn multiple background processes in a (…) subshell, and trap SIGINT to execute kill 0, which will kill everything spawned in the subshell group...

3 More Replies
Prabakar
by Databricks Employee
  • 7336 Views
  • 2 replies
  • 5 kudos

Resolved! %pip/%conda doesn't work with encrypted clusters starting DBR 9.x

While trying to use the magic command %pip/%conda with DBR 9.x or above, it fails with the following error: %pip install numpy org.apache.spark.SparkException: %pip/%conda commands use unencrypted NFS and are disabled by default when SSL encryption ...

Latest Reply
Prabakar
Databricks Employee
  • 5 kudos

If you are not aware of the traffic encryption between cluster worker nodes, you can refer to the link below: https://docs.microsoft.com/en-us/azure/databricks/security/encryption/encrypt-otw

1 More Replies
SailajaB
by Valued Contributor III
  • 12900 Views
  • 5 replies
  • 7 kudos

Resolved! Best mechanism to logging the notebook run/metadata and error details

Hi, how can we integrate Log Analytics with Databricks to log notebook run details and code validations? Thank you.

Latest Reply
-werners-
Esteemed Contributor III
  • 7 kudos

I think you are looking to send application logs. I'd use Log4j, as this is already used by Databricks. The link does not use notebooks, but it should work in notebooks too.

4 More Replies
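As a complement to the Log4j suggestion above, run metadata can also be emitted from the Python side with the standard logging module; those records land in the driver logs, which a log shipper can then forward to Log Analytics. This is only a sketch, and the notebook path and field names are illustrative:

```python
# Sketch: emit one flat key=value record per notebook run via Python's
# standard logging module. Field names and values are illustrative.
import logging

logger = logging.getLogger("notebook_run")
logger.addHandler(logging.StreamHandler())
logger.setLevel(logging.INFO)

def format_run_record(notebook, status, rows_processed):
    # One flat key=value line per run: easy for a log collector to parse.
    return f"notebook={notebook} status={status} rows={rows_processed}"

def log_run(notebook, status, rows_processed):
    logger.info(format_run_record(notebook, status, rows_processed))

log_run("/jobs/ingest_daily", "SUCCEEDED", 1234)
```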
SarahDorich
by New Contributor II
  • 11346 Views
  • 2 replies
  • 4 kudos

Resolved! Parameterize a notebook

I was wondering if there's a way to parameterize a notebook similar to how the Papermill library allows you to parameterize Jupyter notebooks?

Latest Reply
-werners-
Esteemed Contributor III
  • 4 kudos

You can try with widgets: https://docs.databricks.com/notebooks/widgets.html Not exactly the same as Papermill, but it works fine. You can pass the values from your job orchestration tool into a widget so the notebook gets executed with the correct val...

1 More Replies
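The widget flow described above (define a widget with a default, override it at run submission, read it in the notebook body) can be sketched outside Databricks with a small stand-in, since dbutils.widgets only exists inside a notebook. The stub class below is purely illustrative:

```python
# Stand-in sketch of the widget pattern. WidgetStub is a hypothetical stub
# mimicking the shape of dbutils.widgets for illustration; it is not a real
# Databricks API.
class WidgetStub:
    def __init__(self, overrides=None):
        # Overrides play the role of parameter values passed in by the job.
        self._values = dict(overrides or {})

    def text(self, name, default, label=None):
        # Like dbutils.widgets.text: the default applies only when no
        # override was supplied at run submission.
        self._values.setdefault(name, default)

    def get(self, name):
        return self._values[name]

# Value supplied by the orchestrator wins over the notebook's default:
widgets = WidgetStub(overrides={"run_date": "2021-11-15"})
widgets.text("run_date", "1970-01-01", "Run date")
run_date = widgets.get("run_date")
```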
