Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

sakuraDev
by New Contributor II
  • 458 Views
  • 1 reply
  • 0 kudos

I keep getting PARSE_SYNTAX_ERROR on Auto Loader foreachBatch run

Hey guys, I keep getting this error message when trying to call a function with Soda DQ checks: [PARSE_SYNTAX_ERROR] Syntax error at or near '{'. SQLSTATE: 42601 File <command-81221799516900>, line 4 1 dfBronze.writeStream \ 2 .foreachB...

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

Hi @sakuraDev, this looks like a Soda syntax issue. Try fixing the "fail" and "warn" fields in your Soda checks. For example, instead of writing "- missing_count(site) = 0: name: Ensure no null values fail: 1 warn: 0", use Soda's thres...
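
For reference, a minimal sketch of how such a check set is typically wired into a foreachBatch handler with soda-core. The table, column, and check names are placeholders rather than the poster's actual code, and the soda-core call names are from memory and should be verified against the version in use.

```python
# Hedged sketch: running SodaCL checks from a Structured Streaming foreachBatch
# handler. Table/column names and thresholds are assumptions, not from the thread.
from soda.scan import Scan  # soda-core-spark-df

# SodaCL expects warn/fail thresholds nested under the metric, e.g. "warn: when > 0",
# rather than "fail: 1 warn: 0" on a single line.
SODA_CHECKS = """
checks for bronze_events:
  - missing_count(site):
      name: Ensure no null values
      warn: when > 0
      fail: when > 10
"""

def run_soda_checks(batch_df, batch_id):
    batch_df.createOrReplaceTempView("bronze_events")
    scan = Scan()
    scan.set_scan_definition_name(f"bronze_batch_{batch_id}")
    scan.set_data_source_name("spark_df")
    scan.add_spark_session(batch_df.sparkSession, data_source_name="spark_df")
    scan.add_sodacl_yaml_str(SODA_CHECKS)
    scan.execute()

# Usage: dfBronze.writeStream.foreachBatch(run_soda_checks).start()
```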

Jorge3
by New Contributor III
  • 610 Views
  • 1 reply
  • 2 kudos

Databricks Asset Bundle artifacts with a module outside the bundle root (sync path)

Hello everyone! I’m currently working on a project with shared functionalities across different Databricks bundles. I have separate folders for each bundle, along with a common libs/ folder that holds some Python modules intended to be shared across b...

Latest Reply
VZLA
Databricks Employee
  • 2 kudos

Hi @Jorge3, were you able to get this issue resolved? I believe your artifact build path points outside the synced directory structure, and after syncing ../libs, libs should be available within the bundle root, so the artifact path should be update...

Data_Engineer07
by New Contributor II
  • 376 Views
  • 1 reply
  • 0 kudos

Looking for 75% coupon code for Data Engineering Associate Certification

Hi everyone, I am looking for a 75% coupon code for the Data Engineering Associate certification. Can anyone guide me on how to get a coupon code for the certification?

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

Hi @Data_Engineer07, please reach out through https://www.databricks.com/company/contact for such requests. The corresponding team will guide you on this and let you know whether a coupon is available.

829023
by New Contributor
  • 1574 Views
  • 2 replies
  • 1 kudos

Why doesn't Databricks query federation support Oracle Database?

Hi, based on the documentation (https://docs.databricks.com/en/query-federation/index.html), Databricks query federation does not support Oracle as a source. 1. Do you know the reason? (Does it depend on something specific to Oracle?) 2. Is there another way to ru...

Latest Reply
VZLA
Databricks Employee
  • 1 kudos

@829023 There's limited support with respect to pushdown and data type mapping, as documented on our website: https://docs.databricks.com/en/query-federation/oracle.html. This was published recently, I believe in October; given your question was ra...
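
For anyone landing here, a hedged sketch of what a Lakehouse Federation setup for Oracle typically looks like, wrapped in spark.sql calls. The connection name, host, credentials, and option names are assumptions and should be verified against the linked Oracle federation page.

```python
# Hedged sketch: create an Oracle connection and a foreign catalog via SQL.
# All identifiers and option names below are placeholders to verify against
# https://docs.databricks.com/en/query-federation/oracle.html.
spark.sql("""
  CREATE CONNECTION IF NOT EXISTS oracle_conn TYPE oracle
  OPTIONS (
    host 'oracle-host.example.com',
    port '1521',
    user 'federation_user',
    password secret('my_scope', 'oracle_password')
  )
""")

spark.sql("""
  CREATE FOREIGN CATALOG IF NOT EXISTS oracle_catalog
  USING CONNECTION oracle_conn
  OPTIONS (service_name 'ORCLPDB1')
""")

# Federated tables are then queryable like any other Unity Catalog table:
spark.sql("SELECT * FROM oracle_catalog.hr.employees LIMIT 10").show()
```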

1 More Replies
NhanNguyen
by Contributor III
  • 226 Views
  • 2 replies
  • 0 kudos

Table properties differ for liquid clustering across Databricks versions

Dear all, today I tried liquid clustering in Databricks, but after running it with two Databricks engine versions, it showed different properties in Catalog Explorer. 1. Run with DBR version 14.3 LTS (includes Apache Spark 3.5.0, Scala 2.12) it...

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

Correct. As @holly rightly said, this is just an updated, more structured way of representing the columns; it may also correspond to a new value type. In both cases the table property reflects that liquid clustering was enabled. Our sugges...
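
If it helps to compare the two runtimes side by side, here is a small sketch that prints the clustering-related table properties on a given DBR version; the three-level table name is a placeholder.

```python
# Hedged sketch: inspect how clustering metadata surfaces as table properties
# on the current DBR version. Replace the table name with your own.
props = spark.sql(
    "SHOW TBLPROPERTIES main.default.my_clustered_table"
).collect()

for row in props:
    # Liquid clustering shows up under clustering-related keys whose exact
    # names and value format can differ between DBR releases.
    if "cluster" in row.key.lower():
        print(row.key, "=", row.value)
```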

1 More Replies
Flying_Rico
by New Contributor II
  • 365 Views
  • 3 replies
  • 0 kudos

Passing Parameters in a Workflow pipeline

Hello mates, I’m currently working on four workflows, all of which are connected to my own notebook. The four workflows should be started automatically one after the other, and the only thing that should be passed is the output of Workflow 1. The workflo...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Yes, the problem is that the .set and .get functions work within the same job run; the value does not pass to another job.
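
For context, a minimal sketch of task values working between two tasks of the same job run; the task and key names are placeholders. Passing a value into a different job would instead require job parameters or a Run Job task.

```python
# Hedged sketch: task values are scoped to a single job run.
# In the notebook of an upstream task named "produce_output":
dbutils.jobs.taskValues.set(key="output_path", value="/Volumes/main/default/out")

# In the notebook of a downstream task of the SAME job run:
output_path = dbutils.jobs.taskValues.get(
    taskKey="produce_output",
    key="output_path",
    default="",
    debugValue="/tmp/debug",  # only used when running the notebook interactively
)
print(output_path)
```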

2 More Replies
dollyb
by Contributor
  • 519 Views
  • 6 replies
  • 1 kudos

Logging to an external location via UC volume

The way I understand it, mount points are deprecated in UC. db.fs.mount() doesn't even seem to work in newer DB runtimes. But what is the solution when Databricks features don't allow using UC volumes? E.g. specifying a compute's logging path won't wo...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

As you cannot use volumes, it seems that this will indeed be your only option.

5 More Replies
Arihant
by New Contributor
  • 7316 Views
  • 1 reply
  • 0 kudos

Unable to login to Databricks Community Edition

Hello all, I have successfully created a Databricks account and went to log in to the Community Edition with the exact same login credentials as my account, but it tells me that the email/password are invalid. I can log in with these same exact credenti...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello Arihant! You can find helpful resources for Databricks Community Edition here. If the available resources don’t resolve your concern, feel free to submit a ticket with the Databricks Support team for further assistance. Thank you.

17abhishek
by New Contributor III
  • 425 Views
  • 2 replies
  • 1 kudos

How to skip a step in an existing workflow

Hi, can anyone guide me in the scenario below? Suppose we have created a workflow with 10 steps and our batches are running properly, but due to a business requirement or for testing purposes we have to skip step 4 and run the rest of the job from step 5 onwards t...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @17abhishek, you can try If/else condition tasks between the tasks, driven by a job parameter such as "IsActiveTask" set to true or false. But it would be great if the Databricks team simply added the ability to disable a task from the UI.
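
To illustrate the idea, a hedged sketch of the relevant fragment of a Jobs API job definition, written as a Python dict. The task names, parameter name, and notebook paths are placeholders, and the exact field names should be checked against the Jobs API reference.

```python
# Hedged sketch: gate an optional task behind an If/else condition task driven
# by a job parameter, so it can be skipped without editing the workflow.
job_fragment = {
    "parameters": [{"name": "IsActiveTask", "default": "true"}],
    "tasks": [
        {
            "task_key": "check_step4_active",
            "condition_task": {
                "op": "EQUAL_TO",
                "left": "{{job.parameters.IsActiveTask}}",
                "right": "true",
            },
        },
        {
            "task_key": "step4",
            # Runs only when the condition task evaluates to true.
            "depends_on": [{"task_key": "check_step4_active", "outcome": "true"}],
            "notebook_task": {"notebook_path": "/Workspace/jobs/step4"},
        },
        {
            "task_key": "step5",
            # ALL_DONE lets step5 run even when step4 was skipped.
            "depends_on": [{"task_key": "step4"}],
            "run_if": "ALL_DONE",
            "notebook_task": {"notebook_path": "/Workspace/jobs/step5"},
        },
    ],
}
```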

1 More Replies
HaydenZhou
by New Contributor II
  • 402 Views
  • 3 replies
  • 0 kudos

DBR 16.0 Spark read of an Azure Blob file failed

   ala:570) at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:354) at java.base/java.lang.Thread.run(Thread.java:840) Caused by: java.lang.NullPointerException at java.base/java.lang.Class.forName0(Native Method) at java.b...

Latest Reply
HaydenZhou
New Contributor II
  • 0 kudos

I found the solution to this problem. Closing this post.

2 More Replies
Tonny_Stark
by New Contributor III
  • 12123 Views
  • 7 replies
  • 1 kudos

FileNotFoundError: [Errno 2] No such file or directory: when I try to unzip .tar or .zip files it gives me this error

Hello, how are you? I have a small problem. I need to unzip some .zip, .tar, and .gz files; each of these may contain multiple files. Trying to unzip the .zip files I got this error: FileNotFoundError: [Errno 2] No such file or directory: but the files are in ...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Alfredo Vallejos, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feed...

6 More Replies
adurand-accure
by New Contributor III
  • 546 Views
  • 4 replies
  • 1 kudos

Serverless job error - spark.rpc.message.maxSize

Hello, I am facing this error when moving a workflow to serverless mode: ERROR: SparkException: Job aborted due to stage failure: Serialized task 482:0 was 269355219 bytes, which exceeds max allowed: spark.rpc.message.maxSize (268435456 bytes). Consid...

Latest Reply
adurand-accure
New Contributor III
  • 1 kudos

Hello PiotrMi, we found out that the problem was caused by a collect() and managed to fix it by changing some code. Thanks for your quick replies. Best regards, Antoine
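
For anyone hitting the same limit on serverless compute (where spark.rpc.message.maxSize can't simply be raised), a minimal sketch of the kind of change that avoids it; the table names and filter are placeholders, not the poster's actual code.

```python
# Hedged sketch: avoid collect()-style driver round-trips that can exceed
# spark.rpc.message.maxSize when the result set is large.

# Before (problematic): serializes the entire result back to the driver.
# rows = spark.table("bronze.events").collect()

# After: keep the data distributed and persist it instead of collecting it.
(spark.table("bronze.events")
     .filter("event_date = current_date()")
     .write.mode("overwrite")
     .saveAsTable("silver.events_today"))

# If a small sample really is needed on the driver, bound it explicitly.
sample = spark.table("silver.events_today").limit(100).collect()
```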

3 More Replies
SimonXu
by New Contributor II
  • 10610 Views
  • 7 replies
  • 15 kudos

Resolved! Failed to launch pipeline cluster

Hi there. I encountered an issue when I was trying to create my Delta Live Tables pipeline. The error is "DataPlaneException: Failed to launch pipeline cluster 1202-031220-urn0toj0: Could not launch cluster due to cloud provider failures. azure_error...

Latest Reply
Yaadhu
New Contributor II
  • 15 kudos

You can create the instance pool in Databricks under Compute > Pools and reference it in the JSON of the DLT pipeline. With this, you control the pool's min and max workers and let other pipelines reuse the available pools. "node_...
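
As a rough illustration, a hedged sketch of the pipeline-settings fragment this refers to, written as a Python dict; the pool IDs and worker counts are placeholders, and the exact field names should be checked against the DLT pipeline settings reference.

```python
# Hedged sketch: point a DLT pipeline's cluster at a pre-created instance pool
# instead of requesting fresh VMs, which helps with cloud capacity/quota errors.
pipeline_clusters_fragment = {
    "clusters": [
        {
            "label": "default",
            "instance_pool_id": "1202-000000-pool0000-pool-abcdefgh",
            "driver_instance_pool_id": "1202-000000-pool0000-pool-abcdefgh",
            "autoscale": {"min_workers": 1, "max_workers": 4},
        }
    ]
}
```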

6 More Replies
amoralca
by New Contributor
  • 3149 Views
  • 3 replies
  • 0 kudos

Exploring the Use of Databricks as a Transactional Database

Hey everyone, I’m currently working on a project where my team is thinking about using Databricks as a transactional database for our backend application. We're familiar with Databricks for analytics and big data processing, but we're not sure if it’...

Latest Reply
movmarcos
New Contributor II
  • 0 kudos

I have a similar situation in my data quality check process. During this stage, I frequently find errors or potential issues that can stop the pipeline. Each of these errors requires manual intervention, which might involve making edits or supplying ...

2 More Replies
pora
by New Contributor
  • 714 Views
  • 1 reply
  • 0 kudos

Databricks:null error message: Cannot resolve hostname: Caused by: UnknownHostException

Hello, we are suddenly getting the following error message while running any code from Databricks that accesses Blob storage. We checked our app registration key and it's not expired. If we run "dbutils.fs.mount" we are able to get some info and...

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

Hi @pora, just checking whether this is still an issue and, if so, where help is still required. Could you also please elaborate on the setup and requirements?


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group