Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

brickster_2018
by Databricks Employee
  • 7414 Views
  • 3 replies
  • 0 kudos

Resolved! How many notebooks/jobs can I run in parallel on a Databricks cluster?

Is there a limit on it and is the limit configurable?

Latest Reply
brickster_2018
Databricks Employee
  • 0 kudos

There is a hard limit of 145 active execution contexts on a cluster. This is to ensure the cluster is not overloaded with too many parallel threads starving for resources. The limit is not configurable. If there are more than 145 parallel jobs to be ...
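For anyone hitting that ceiling, a minimal sketch (not from the thread) of fanning notebook runs out from a driver notebook while capping concurrency below the limit; the notebook path, parameters, and the cap of 48 workers are assumptions:

from concurrent.futures import ThreadPoolExecutor

notebook_path = "/Shared/etl/process_partition"   # hypothetical child notebook
partitions = [str(i) for i in range(200)]          # more tasks than the context limit

def run_one(partition):
    # dbutils.notebook.run blocks until the child notebook finishes
    # (dbutils is available inside a Databricks notebook)
    return dbutils.notebook.run(notebook_path, 3600, {"partition": partition})

# keep the pool well below the 145 execution contexts mentioned above
with ThreadPoolExecutor(max_workers=48) as pool:
    results = list(pool.map(run_one, partitions))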

2 More Replies
data_serf
by New Contributor
  • 11889 Views
  • 3 replies
  • 1 kudos

Resolved! How to integrate java 11 code in Databricks

Hi all, We're trying to attach Java libraries which are compiled/packaged using Java 11. After doing some research, it looks like even the most recent runtimes use Java 8, which can't run the Java 11 code ("wrong version 55.0, should be 52.0" errors). Is t...

Latest Reply
matthewrj
New Contributor II
  • 1 kudos

I have tried setting JNAME=zulu11-ca-amd64 under Cluster > Advanced options > Spark > Environment variables, but it doesn't seem to work. I still get errors indicating Java 8 is the JRE, and in the Spark UI under "Environment" I still see: Java Home: /u...
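For reference, a quick way to verify which JVM the driver actually picked up (a sketch that assumes it runs in a Databricks notebook where `spark` is defined; whether JNAME takes effect depends on the runtime version):

# ask the driver JVM directly via py4j
print(spark._jvm.java.lang.System.getProperty("java.version"))  # expect 11.x if Zulu 11 is active
print(spark._jvm.java.lang.System.getProperty("java.home"))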

2 More Replies
齐木木
by New Contributor III
  • 2842 Views
  • 1 reply
  • 3 kudos

Resolved! The case class reports an error when running in the notebook

As shown in the figure, the case class and the JSON string are converted using fasterxml.jackson, but an unexpected error occurred while running the code. I think this problem may be related to how the notebook loads classes. Because...

(attached image: image.png)
Latest Reply
齐木木
New Contributor III
  • 3 kudos

code:
var str = "{\"app_type\":\"installed-app\"}"
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
val mapper = new ObjectMapper()
mapper.registerModule(DefaultScalaModule)
...

WBM1
by New Contributor
  • 974 Views
  • 0 replies
  • 0 kudos

wbm.com.pk

WBM is the best online supermarket in Pakistan, providing fast home delivery of your complete grocery, home cleaning, skincare, baby products, and mosquito repellent collection. https://wbm.com.pk/

Deepak_Kandpal
by New Contributor III
  • 8519 Views
  • 3 replies
  • 2 kudos

Resolved! Enable credential passthrough Option is not available in new UI for Job Cluster

Hi All, I am trying to add a new workflow which requires credential passthrough, but when I try to create a new Job Cluster from Workflow -> Jobs -> My Job, the Enable credential passthrough option is not available. Is there any other way t...

Latest Reply
Rostislaw
New Contributor III
  • 2 kudos

Assuming your Excel file is located on ADLS, you can add a service principal to the cluster configuration. See: https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/azure-storage#--access-azure-data-lake-storage-gen2-or-blob-stora...
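For context, a hedged sketch of the service-principal Spark configuration that the linked page describes; the storage account, secret scope, and key names below are placeholders:

storage_account = "<storage-account>"                                   # placeholder
client_id     = dbutils.secrets.get("my-scope", "sp-client-id")       # hypothetical scope/keys
client_secret = dbutils.secrets.get("my-scope", "sp-client-secret")
tenant_id     = dbutils.secrets.get("my-scope", "sp-tenant-id")

spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
               f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")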

2 More Replies
vamsi0132
by New Contributor II
  • 2029 Views
  • 0 replies
  • 2 kudos

BUG in TIME ZONE EST function

Hi, I found a bug while using the "from_utc_timestamp" function to convert a UTC timestamp to an EST timestamp. Below is the query: select trim(current_timestamp()) as Current, trim(from_utc_timestamp(current_timestamp(),'EST')) as EST, trim(from...
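For what it's worth, this may be a timezone-ID subtlety rather than a bug: 'EST' is a fixed UTC-5 offset that never observes daylight saving, whereas a region ID such as 'America/New_York' does. A small illustration (column aliases are just for display):

df = spark.sql("""
    SELECT current_timestamp()                                          AS utc_now,
           from_utc_timestamp(current_timestamp(), 'EST')               AS est_fixed_offset,
           from_utc_timestamp(current_timestamp(), 'America/New_York')  AS eastern_local
""")
df.show(truncate=False)  # the two converted columns differ by an hour while DST is in effect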

TutorBees_Net
by New Contributor
  • 3226 Views
  • 2 replies
  • 0 kudos

(TutorBees logo image)

We provide online tutoring for students from Grade 5 and all the way up to professionals. You can find the best tutors for Maths, Biology, Physics, Chemistry, English, Social Sciences, Urdu in the comfort of your home. You can also find professional ...

Latest Reply
frillow
New Contributor II
  • 0 kudos

The academized review is not that clear. The company seems legitimate enough, but the anonymous profiles make customers and users doubt its legitimacy. While Academized does list the number of custom feedbacks it offers and the fields of specializati...

1 More Replies
Raj4
by New Contributor III
  • 5178 Views
  • 7 replies
  • 0 kudos

www.databricks.com

Hi Team, I attended the virtual instructor-led training on 23-08-2022 (https://www.databricks.com/p/webinar/databricks-lakehouse-fundamentals-learning-plan). As per the steps mentioned, I have completed all of the steps for getting the voucher, but ...

Latest Reply
amit
New Contributor II
  • 0 kudos

Thanks @Nadia Elsayed for the quick response. I have booked my exam with the supplied coupon without any issue. Thanks, Amit

6 More Replies
sp334
by New Contributor II
  • 12629 Views
  • 4 replies
  • 3 kudos

Resolved! NPIP tunnel setup failed [WaitForNgrokTunnel]

Hello All, we have deployed a new Databricks instance in the Azure cloud. 1) The Databricks service is attached to a public subnet/private subnet (delegated to Microsoft.Databricks/workspaces). 2) I created a job with a cluster runtime (1 worker: Standard_DS3_v2, 7.3 LTS...

Latest Reply
fabienv
Databricks Employee
  • 3 kudos

In case others run into this in the future, here is something additional to check: is your account/workspace enabled for the "compliance security profile"? If yes, you should see a little shield icon in the lower left-hand corner of the workspace. Once...

3 More Replies
tompile
by New Contributor III
  • 6475 Views
  • 6 replies
  • 9 kudos

Resolved! Is it possible to make use of pygit2 or GitPython packages to reference git repositories from within databricks?

I am making use of Repos in Databricks and am trying to reference the current git branch from within the notebook session. For example:
from pygit2 import Repository
repo = Repository('/Workspace/Repos/user@domain/repository')
The code above throws an er...

Latest Reply
niburg123
New Contributor III
  • 9 kudos

You cannot use this as far as I know, but you can put a workaround in a notebook if you are calling code from your repo via a notebook:
repo_path = "/Repos/xyz_repo_path/xyz_repo_name"
repo_path_fs = "/Workspace" + repo_path
repo_branch = "main"
def chec...
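A hedged alternative (not the workaround from this thread): query the Databricks Repos REST API for the branch a repo is currently on; the host, token, and repo path below are placeholders:

from typing import Optional
import requests

def get_repo_branch(host: str, token: str, repo_path: str) -> Optional[str]:
    # list repos whose workspace path starts with repo_path and read the branch field
    resp = requests.get(
        f"{host}/api/2.0/repos",
        headers={"Authorization": f"Bearer {token}"},
        params={"path_prefix": repo_path},
    )
    resp.raise_for_status()
    repos = resp.json().get("repos", [])
    return repos[0].get("branch") if repos else None

# e.g. get_repo_branch("https://adb-1234567890123456.7.azuredatabricks.net", "<pat>",
#                      "/Repos/user@domain/repository")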

5 More Replies
Sadiq
by New Contributor III
  • 4606 Views
  • 5 replies
  • 4 kudos

Fixed length file from Databricks notebook (Spark SQL)

Hi, I need help writing data from an Azure Databricks notebook into a fixed-length .txt file. The notebook has 10 lakh (1 million) rows and 86 columns. Can anyone suggest an approach?
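One possible approach, sketched under assumptions (the column widths, source table, and output path below are illustrative): right-pad each column to a fixed width, concatenate, and write a single text column.

from pyspark.sql import functions as F

widths = {"col1": 12, "col2": 30, "col3": 8}            # hypothetical widths, one entry per column

df = spark.table("my_source_table")                      # hypothetical source
fixed = df.select(
    F.concat(*[F.rpad(F.coalesce(F.col(c).cast("string"), F.lit("")), w, " ")
               for c, w in widths.items()]).alias("value")
)
# coalesce(1) yields a single part file; rename/move it afterwards if a specific filename is needed
fixed.coalesce(1).write.mode("overwrite").text("/mnt/output/fixed_length")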

Latest Reply
Vidula
Honored Contributor
  • 4 kudos

Hi @sadiq vali Hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

4 More Replies
PrebenOlsen
by New Contributor III
  • 4440 Views
  • 4 replies
  • 1 kudos

GroupBy in delta live tables fails with error "RuntimeError: Query function must return either a Spark or Koalas DataFrame"

I have a delta live table that I'm trying to run GroupBy on, but I'm getting an error: "RuntimeError: Query function must return either a Spark or Koalas DataFrame". Here is my code:
@dlt.table
def groups_hierarchy():
    df = dlt.read_stream("groups_h...
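A hedged guess at the usual cause of that RuntimeError (table and column names below are illustrative): the decorated query function has to return the DataFrame, so assigning the groupBy result without returning it fails. Shown here with dlt.read for simplicity, since streaming aggregations carry extra constraints:

import dlt
from pyspark.sql import functions as F

@dlt.table
def groups_hierarchy():
    df = dlt.read("groups_history")   # hypothetical source table
    # the key point: return the aggregated DataFrame from the function
    return df.groupBy("group_id").agg(F.max("updated_at").alias("last_updated"))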

Latest Reply
Vidula
Honored Contributor
  • 1 kudos

Hi @Preben Olsen Does @Debayan Mukherjee's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

3 More Replies
190809
by Contributor
  • 1877 Views
  • 0 replies
  • 0 kudos

Pulling Data From Stripe to Databricks using the Webhook

I am investigating how to connect Databricks and Stripe. Stripe has really good documentation, and I have decided to set up a webhook in Django as per their recommendation. This function handles events as they occur in Stripe: -----------...
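Roughly along the lines of Stripe's standard Django webhook recipe (a sketch, not the poster's elided code; the endpoint secret and the save_event_to_storage helper are placeholders, and landing files for a Databricks job to pick up is only one possible handoff):

import stripe
from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt

@csrf_exempt
def stripe_webhook(request):
    payload = request.body
    sig_header = request.META.get("HTTP_STRIPE_SIGNATURE", "")
    try:
        # verify the signature before trusting the event
        event = stripe.Webhook.construct_event(payload, sig_header, "whsec_<endpoint-secret>")
    except (ValueError, stripe.error.SignatureVerificationError):
        return HttpResponse(status=400)

    # hypothetical helper: land the raw payload where a Databricks job can read it
    save_event_to_storage(event["type"], payload.decode("utf-8"))
    return HttpResponse(status=200)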

Munni
by New Contributor II
  • 813 Views
  • 0 replies
  • 0 kudos

Hi, I need some help. I am reading a CSV file through PySpark, in which one field is enclosed in double quotes; I should get that value along with double quo...

Hi, I need some help. I am reading a CSV file through PySpark in which one field is enclosed in double quotes, and I should get that value along with the double quotes. Spark version is 3.0.1.
INPUT:
col1,col2,col3
"A",""B,C"","D"
OUTPUT:
A , "B,C" , D

