Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

karthik-kobai
by New Contributor II
  • 2007 Views
  • 0 replies
  • 0 kudos

Databricks-jdbc and vulnerabilities CVE-2021-36090 CVE-2023-6378 CVE-2023-6481

The latest version of Databricks-jdbc available through Maven (2.6.36) now has these three vulnerabilities: https://www.cve.org/CVERecord?id=CVE-2021-36090, https://www.cve.org/CVERecord?id=CVE-2023-6378, https://www.cve.org/CVERecord?id=CVE-2023-6481. All ...

  • 2007 Views
  • 0 replies
  • 0 kudos
Christoph
by New Contributor II
  • 1900 Views
  • 3 replies
  • 0 kudos

Internal Error when querying a doubleType column of a delta table using ">" "<" operators

Hi there, we are currently facing a pretty confusing issue: we have a Delta table (~2 TB) that has been working just fine over the last few years and months. For a few days or weeks now, querying the table on one of its columns, let's call it double_co...

  • 1900 Views
  • 3 replies
  • 0 kudos
Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

It might be a bug that is already logged, or a new one. You can check the Spark Jira pages.

  • 0 kudos
2 More Replies
Mohamednazeer
by New Contributor III
  • 3515 Views
  • 1 reply
  • 0 kudos

Resolved! IllegalArgumentException: Mount failed due to invalid mount source

We are trying to create mounts for containers from two different storage accounts. We are using an Azure Storage Account and Azure Databricks. We were able to create a mount for containers from one storage account, but when we try to create the mount for ...

  • 3515 Views
  • 1 reply
  • 0 kudos
Latest Reply
Mohamednazeer
New Contributor III
  • 0 kudos

Hi community, the issue was because of cross-vnet access. The storage account and the Databricks workspace are in different vnets. Because of that, we had to create a private endpoint to access the cross-vnet resources. Once we created the private endpoint ...
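For reference, a minimal mount sketch (storage account, container, and secret scope names below are placeholders, and the OAuth settings follow the standard ADLS Gen2 service-principal pattern; it assumes the private endpoint / network access is already in place for the second storage account):

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": dbutils.secrets.get("my-scope", "sp-client-id"),
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get("my-scope", "sp-client-secret"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# The same call works for containers in either storage account once network access is resolved.
dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<container>",
    extra_configs=configs,
)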

  • 0 kudos
rhevarr
by New Contributor II
  • 1741 Views
  • 0 replies
  • 0 kudos

Course: Apache Spark Programming with Databricks ID: E-P0W7ZV // Issue Classroom-Setup

Hello, I am trying to run the Classroom-Setup from the course files notebook (ASP 1.1 - Databricks Platform) (Course: Apache Spark™ Programming with Databricks, ID: E-P0W7ZV). Instructions: "Setup: Run classroom setup to mount Databricks training datasets an...

Data Engineering
academy
Course
Databricks
spark
  • 1741 Views
  • 0 replies
  • 0 kudos
Hardy
by New Contributor III
  • 12733 Views
  • 6 replies
  • 3 kudos

upload files to dbfs:/volume using databricks cli

In our Azure pipeline we are using the databricks-cli command to upload jar files to the dbfs:/FileStore location, and that works perfectly fine. But when we try to use the same command to upload files to dbfs:/Volume/dev/default/files, it does not work and g...

  • 12733 Views
  • 6 replies
  • 3 kudos
Latest Reply
saikumar246
Databricks Employee
  • 3 kudos

@Hardy I think you are using the word Volume in the path, but it should be Volumes (plural), not Volume (singular). Copy the volume path directly from the Workspace and try again.
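For illustration, a hedged sketch of the corrected command (the catalog/schema/volume and file names are placeholders based on the path in the question):

databricks fs cp ./my-app.jar dbfs:/Volumes/dev/default/files/my-app.jar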

  • 3 kudos
5 More Replies
Volker
by Contributor
  • 2763 Views
  • 2 replies
  • 2 kudos

Preferred compression format for ingesting large amounts of JSON files with Autoloader

Hello Databricks Community, in an IoT context we plan to ingest a large number of JSON files (~2 million per day). The JSON files are in JSON Lines format and need to be compressed on the IoT devices. We can provide suggestions for the type of compres...

  • 2763 Views
  • 2 replies
  • 2 kudos
Latest Reply
Volker
Contributor
  • 2 kudos

Hi, sorry, I guess my response wasn't sent. The source is JSON files that are uploaded to an S3 bucket. The sink will be a Delta table, and we are using Auto Loader. The question was about the compression format of the incoming JSON files, e.g. if it wo...
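For context, a minimal Auto Loader sketch over gzip-compressed JSON Lines files (bucket, paths, and table name are placeholders; Spark's JSON reader decompresses .json.gz files transparently based on the file extension, and since parallelism comes from the large number of small files, gzip's lack of splittability is usually not a problem in this pattern):

df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "s3://my-bucket/_schemas/iot")
    .load("s3://my-bucket/iot-landing/")   # e.g. files named device-*.json.gz
)

(
    df.writeStream
    .option("checkpointLocation", "s3://my-bucket/_checkpoints/iot")
    .trigger(availableNow=True)
    .toTable("bronze.iot_events")
)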

  • 2 kudos
1 More Replies
FerArribas
by Contributor
  • 1732 Views
  • 1 reply
  • 0 kudos

Custom JobGroup in Spark UI for cluster with multiple executions

Does anyone know what the first digits of the job group shown in the Spark UI mean when using all-purpose clusters to launch multiple jobs? Right now the pattern is something like: [id_random]_job_[job_id]_run-[run_id]_action_[action].

  • 1732 Views
  • 1 reply
  • 0 kudos
Latest Reply
saikumar246
Databricks Employee
  • 0 kudos

Hi @FerArribas, the first digits of the job group shown in the Spark UI are the execContextId and cmdId (command ID). You can think of the execContextId as a kind of "REPL ID". For example, if you take the below job group ID as an example, jobGr...
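For comparison, when you set the job group yourself from a notebook, Spark's own API looks like the sketch below (purely illustrative of the underlying mechanism; on Databricks all-purpose clusters the platform generates its own job group per command, which is the [id_random]_job_[job_id]_run-[run_id]_action_[action] pattern asked about):

sc = spark.sparkContext
# Tags all jobs triggered on this thread until the group is cleared.
sc.setJobGroup("my-etl-step", "loading raw events", interruptOnCancel=True)
spark.range(10).count()
sc.clearJobGroup()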

  • 0 kudos
luriveros
by New Contributor
  • 5921 Views
  • 1 reply
  • 0 kudos

implementing liquid clustering for DataFrames directly

Hi! I have a question: is it possible to implement liquid clustering for DataFrames saved directly to Delta files (df.write.format("delta").save("path")), as opposed to the conventional approach involving table creation?

  • 5921 Views
  • 1 reply
  • 0 kudos
Latest Reply
brockb
Databricks Employee
  • 0 kudos

Hi, hopefully this question is related to testing and any production data would get persisted to a table, but one example is:
df = (spark.range(10).write.format("delta").mode("append").save("file:/tmp/data"))
ALTER TABLE delta.`file:/tmp/data` CLUSTER BY...
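Spelling that pattern out as a hedged sketch (path and column name are placeholders; clustering still has to be applied via SQL plus an OPTIMIZE even for a path-only Delta table, and it needs a DBR version that supports liquid clustering):

spark.range(10).withColumnRenamed("id", "device_id") \
    .write.format("delta").mode("append").save("file:/tmp/data")

# Path-based Delta tables can be addressed in SQL via delta.`<path>`.
spark.sql("ALTER TABLE delta.`file:/tmp/data` CLUSTER BY (device_id)")
spark.sql("OPTIMIZE delta.`file:/tmp/data`")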

  • 0 kudos
pshuk
by New Contributor III
  • 1706 Views
  • 2 replies
  • 0 kudos

file transfer through CLI to DBFS, working manually but not in python code...

Hi, I ran my code successfully in the past but suddenly it stopped working. I have Python code that transfers local files to a DBFS location using the CLI. When I run the command manually on the screen, it works, but in the code it gives me the error "retu...

  • 1706 Views
  • 2 replies
  • 0 kudos
Latest Reply
feiyun0112
Honored Contributor
  • 0 kudos

The 127 error code indicates "command not found". Try using the full path of the databricks command.
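A hedged illustration of that fix in the calling Python code (the binary location and file paths are placeholders; running "which databricks" in the same environment shows the real install location):

import subprocess

# Exit code 127 means the shell could not find the executable, so point
# at the CLI with an absolute path instead of relying on PATH.
databricks_bin = "/usr/local/bin/databricks"

result = subprocess.run(
    [databricks_bin, "fs", "cp", "/local/path/file.txt", "dbfs:/tmp/file.txt"],
    capture_output=True,
    text=True,
)
print(result.returncode, result.stdout, result.stderr)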

  • 0 kudos
1 More Replies
kazinahian
by New Contributor III
  • 3745 Views
  • 2 replies
  • 1 kudos

Resolved! How can I Learn Databricks Data Pipeline in Azure environment?

Hello Esteemed Community, I have a fundamental question to ask, and I approach it with a sense of humility. Your guidance in my learning journey would be greatly appreciated. I am eager to learn how to build a hands-on data pipeline within the Databri...

  • 3745 Views
  • 2 replies
  • 1 kudos
Latest Reply
Palash01
Valued Contributor
  • 1 kudos

Hey @kazinahian I completely understand your hesitation and appreciate your approach to seeking guidance! Embarking on a learning journey can be daunting, especially when financial considerations are involved. I'm happy to offer some advice on buildi...

  • 1 kudos
1 More Replies
Tripalink
by New Contributor III
  • 8270 Views
  • 4 replies
  • 1 kudos

Error. Another git operation is in progress.

I am getting an error every time I try to view another branch or create a branch. Sometimes this has happened in the past, but usually seems to fix itself after about 10-30 minutes. This error has been lasting for over 12 hours, so I am now concerned...

git_error_message
  • 8270 Views
  • 4 replies
  • 1 kudos
Latest Reply
Hakuna_Madata
New Contributor II
  • 1 kudos

I had the same problem and I resolved it by creating the repo again with a trailing ".git" in the Git repository URL. For example, use this: https://gitlab.mycompany.com/my-project/my-repo.git, not this: https://gitlab.mycompany.com/my-project/my-repo...

  • 1 kudos
3 More Replies
Arnold_Souza
by New Contributor III
  • 4838 Views
  • 3 replies
  • 0 kudos

Unable to enable entitlements to account groups in a workspace

Currently, I am both an account administrator and a workspace administrator in Databricks. When I try to enable the entitlements "Workspace access" and "Databricks SQL access" for account groups, I receive the error "Failed to enable entitlem...

  • 4838 Views
  • 3 replies
  • 0 kudos
Latest Reply
saikumar246
Databricks Employee
  • 0 kudos

Hi @Arnold_Souza, The error "Failed to enable entitlement.: Group not found" that you're experiencing when trying to enable the entitlements “Workspace access” and “Databricks SQL access” for account groups is likely due to the fact that Identity Fed...

  • 0 kudos
2 More Replies
Martinitus
by New Contributor III
  • 6896 Views
  • 4 replies
  • 0 kudos

CSV Reader reads quoted fields inconsistently in last column

I just opened another issue: https://issues.apache.org/jira/browse/SPARK-46959. It corrupts data even when read with mode="FAILFAST"; I consider it critical, because basic stuff like this should just work!

  • 6896 Views
  • 4 replies
  • 0 kudos
Latest Reply
Martinitus
New Contributor III
  • 0 kudos

Either: [ 'some text', 'some text"', 'some text"' ]
Alternatively: [ '"some text"', 'some text"', 'some text"' ]
Probably the sanest behavior would be a parser error (with mode="FAILFAST"). Just parsing garbage without warning the user is certainly not...
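For anyone trying to reproduce the behavior, a minimal sketch of the read under discussion (the file path is a placeholder):

# Read a CSV whose last column contains quoted fields; FAILFAST should
# raise on malformed rows rather than silently return corrupted values.
df = (
    spark.read
    .option("header", "true")
    .option("quote", '"')
    .option("escape", '"')
    .option("mode", "FAILFAST")
    .csv("file:/tmp/quoted.csv")
)
df.show(truncate=False)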

  • 0 kudos
3 More Replies
tomph
by New Contributor II
  • 3099 Views
  • 2 replies
  • 0 kudos

Resolved! Databricks Asset Bundles - Manage existing jobs

Hello, we are starting to experiment with Databricks Asset Bundles, especially to keep jobs aligned between workspaces. Is there a way to start managing existing jobs, to avoid erasing previous run history? Thank you, Tommaso

  • 3099 Views
  • 2 replies
  • 0 kudos
Latest Reply
tomph
New Contributor II
  • 0 kudos

Great news, thanks!

  • 0 kudos
1 More Replies
matt_stanford
by New Contributor III
  • 3705 Views
  • 1 reply
  • 0 kudos

Resolved! Type 2 SCD when using Auto Loader

Hi there! I'm pretty new to using Auto Loader, so this may be a really obvious fix, but it's stumped me for a few weeks, so I'm hoping someone can help! I have a small csv file saved in ADLS with a list of pizzas for an imaginary pizza restaurant. I'...

  • 3705 Views
  • 1 reply
  • 0 kudos
Latest Reply
matt_stanford
New Contributor III
  • 0 kudos

So, I figured out what the issue was. I needed to delete the checkpoint folder. After I did this and re-ran the notebook, everything worked fine!
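For reference, a hedged sketch of that cleanup (the checkpoint path is a placeholder; deleting it makes the stream reprocess everything from scratch, so it is only appropriate when a full reload is acceptable):

# Remove the Auto Loader / Structured Streaming checkpoint folder recursively.
dbutils.fs.rm("abfss://<container>@<account>.dfs.core.windows.net/checkpoints/pizzas", True)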

  • 0 kudos
